
Associated Events.

The European Project SAME

Experimentation with Prototypes for Musical Interaction

The SAME project, funded by the European Commission through the ICT program, experiments with different technological systems, especially in mobile telephones, that give music lovers new means of interaction with music: gestural control, spatialized sound synthesis and processing, context-aware recommendation, and collaborative systems.

This event is designed to introduce the public to the first research findings in this project and to gather participants’ opinions on their experience with a short questionnaire.

Partners: Università degli Studi di Genova - UGDIST, Italy (project coordinator, Prof. Antonio Camurri), Nokia Research Center (Finland), Royal Institute of Technology - KTH (Sweden), Pompeu Fabra University - UPF (Spain), Helsinki University of Technology - TKK (Finland), IRCAM (France).

Studio 4

Audio Explorer

Audio Explorer is a mobile active-listening application that lets users interactively de-mix commercial stereo recordings into separate channels while they are streamed to their mobile devices, and then interactively re-mix the previously separated channels. To do so, it offers two main modes: a de-mix mode and a re-mix mode.
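Audio Explorer's actual channel separation is more sophisticated, but the de-mix/re-mix idea can be sketched with a simple mid/side decomposition of a stereo signal (an illustrative assumption, not the project's algorithm):

```python
import numpy as np

def demix_mid_side(stereo):
    """Split a stereo signal of shape (N, 2) into mid and side channels."""
    left, right = stereo[:, 0], stereo[:, 1]
    mid = 0.5 * (left + right)    # content common to both channels (center)
    side = 0.5 * (left - right)   # content panned to the sides
    return mid, side

def remix(mid, side, mid_gain=1.0, side_gain=1.0):
    """Rebuild a stereo signal from separately weighted channels."""
    left = mid_gain * mid + side_gain * side
    right = mid_gain * mid - side_gain * side
    return np.stack([left, right], axis=1)
```

With unit gains the round trip reconstructs the original; changing the gains re-mixes the separated channels.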
Contributors: Esteban Maestre (UPF): conception, coordination | Jordi Llop (UPF): user interfaces | Vassilis Pantazis (UPF): signal processing and VST integration | Alberto Massari (UGDIST): VST integration

Fishing Game

Pour yourself a glass of champagne or brush your teeth! Mimic one of these actions while holding your cell phone, and you'll hear the corresponding sound. Can you do it? Then try a harder one. This game illustrates novel technologies for gestural sound control and embodied listening: a gesture recognition and analysis system drives a sound engine, demonstrating emerging uses of the sensors embedded in mobile phones.
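As a rough sketch of how this kind of gesture recognition can work, the snippet below compares an incoming sensor sequence against recorded template gestures using dynamic time warping and picks the closest match (the IRCAM system is more elaborate; the template names and data here are illustrative):

```python
def dtw_distance(a, b):
    """Dynamic time warping distance between two 1-D feature sequences."""
    inf = float("inf")
    n, m = len(a), len(b)
    d = [[inf] * (m + 1) for _ in range(n + 1)]
    d[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            d[i][j] = cost + min(d[i - 1][j], d[i][j - 1], d[i - 1][j - 1])
    return d[n][m]

def recognize(gesture, templates):
    """Return the name of the closest recorded template gesture."""
    return min(templates, key=lambda name: dtw_distance(gesture, templates[name]))
```

A recognized gesture would then trigger the matching sound in the engine.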
Contributors: Pierre Jodlowski (IRCAM): artistic design and coordination | Baptiste Caramiaux, Grace Leslie, Norbert Schnell, Diemo Schwarz, Bruno Zamborlin (IRCAM): technical design and development | Frédéric Bevilacqua, Hugues Vinet, Olivier Warusfel (IRCAM): design and coordination.

Studio 5

Grain Stick

The Grain Stick installation offers a collaborative interactive experience featuring music by Pierre Jodlowski. One or two participants shake a virtual tube by means of two handheld sensors, setting off a cascade of sound grains (like a rain stick) in a sound space spatialized with WFS technology. The grain sounds are layered over a surrounding soundscape and percussive sounds triggered by the users' movements. The virtual stick can be used by one person with both hands or shared by two users.
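The rain-stick behavior can be sketched as a mapping from sensor motion to a density of triggered grains (the mapping, thresholds, and function names below are illustrative assumptions, not the installation's actual control logic):

```python
import random

def grains_for_motion(tilt_angle, shake_energy, max_grains=20):
    """Decide how many sound grains to trigger for one sensor frame.
    Steeper tilt and stronger shakes release more grains, rain-stick style."""
    density = min(1.0, abs(tilt_angle) / 90.0 + shake_energy)
    return int(density * max_grains)

def schedule_grains(n, now, spread=0.25):
    """Spread n grain onsets over `spread` seconds with random jitter."""
    return sorted(now + random.random() * spread for _ in range(n))
```

Each scheduled onset would then play one short grain through the WFS spatialization system.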
Contributors: Pierre Jodlowski (IRCAM): artistic design and coordination | Grace Leslie, Markus Noisternig, Norbert Schnell, Joseph Sanson, Diemo Schwarz, Bruno Zamborlin (IRCAM): technical design and development | Frédéric Bevilacqua, Hugues Vinet, Olivier Warusfel (IRCAM): coordination.

Orchestra Explorer

The Orchestra Explorer installation offers an active experience of prerecorded music. Users navigate and express themselves in a shared (physical or virtual) "orchestra space" populated by the sections or individual instruments of an orchestra, activating and listening to one or more sections as they move. Mobile phones are used to detect the user's movements, to activate and control the music sections, and to display the user's position in the orchestra space on the phone's screen. The music is rendered either in 3D over loudspeakers (using WFS) or on the mobile phone through headphones.
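Position-based section activation can be sketched as a proximity test against section locations (the layout, radius, and names below are hypothetical, not the installation's actual geometry):

```python
import math

# Hypothetical layout: four orchestra sections occupy a unit square
# (illustrative values, not the installation's actual configuration).
SECTIONS = {
    "strings":    (0.25, 0.75),
    "woodwinds":  (0.75, 0.75),
    "brass":      (0.25, 0.25),
    "percussion": (0.75, 0.25),
}
ACTIVATION_RADIUS = 0.3  # how close the listener must be to hear a section

def active_sections(x, y):
    """Return the sections within reach of the listener's position (x, y)."""
    return sorted(name for name, (sx, sy) in SECTIONS.items()
                  if math.hypot(x - sx, y - sy) <= ACTIVATION_RADIUS)
```

As the user walks through the space, the set of active sections changes, and the corresponding stems are faded in and out.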
Contributors: Antonio Camurri, Corrado Canepa, Paolo Coletta, Gualtiero Volpe (UGDIST): Orchestra Explorer design | Alberto Massari (UGDIST): software development for EyesWeb XMI, accelerometer tracking for Nokia S60 | Maurizio Mancini (UGDIST): accelerometer tracking and analysis in EyesWeb XMI | Markus Noisternig, Joseph Sanson, Olivier Warusfel (IRCAM): WFS spatialization system.

Sync'n'Move

Sync'n'Move lets users experience novel forms of social interaction based on music and gesture, using mobile phones. Users move rhythmically (e.g., dancing) while holding their mobile phones, and the group takes part in a synchronization task. The degree of synchronization is measured and used to modify the performance of pre-recorded music in real time. Whenever the users succeed in the synchronization task, the music's orchestration and rendering are enriched; when synchronization is low, i.e., collaborative interaction is poor, the music gradually degrades, losing sections and rendering features until it becomes a bare monophonic audio signal.
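A standard way to quantify group phase synchronization is the mean resultant length of the participants' movement phases; the sketch below maps that index to a number of active orchestration layers (the mapping and layer count are illustrative assumptions, not the project's actual algorithm):

```python
import cmath

def sync_index(phases):
    """Mean resultant length of a set of phases (0 = no sync, 1 = perfect sync)."""
    z = sum(cmath.exp(1j * p) for p in phases) / len(phases)
    return abs(z)

def orchestration_level(phases, n_layers=5):
    """Map the group's synchronization index to a number of active music layers;
    at least one layer (a monophonic signal) always remains."""
    return max(1, round(sync_index(phases) * n_layers))
```

Perfectly aligned phases keep the full orchestration; opposed phases strip the music down to a single layer.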
Contributors: Giovanna Varni, Paolo Coletta, Gualtiero Volpe (UGDIST): phase synchronization software | Antonio Camurri, Corrado Canepa (UGDIST): design of the prototype and of the active listening paradigm | Maurizio Mancini, Barbara Mazzarino, Giovanna Varni (UGDIST): software development.

pyDM: Expressive Control of a Piano Performance

In this demo, a computer-controlled piano performs a piece of music, and the performance is controlled by a mobile phone. Each command on the phone controls a different aspect of the performance, such as tempo, dynamics, and articulation. These values can be adjusted separately or grouped together in a dedicated space where basic emotions (e.g., happiness, sadness, tenderness, anger) can be expressed with a moving dot whose color and size change according to the emotion. The program can be controlled via the phone's graphical interface, by tilting the phone, or by shaking it in different ways to express different emotions.
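The emotion-space idea can be sketched as interpolation between per-emotion parameter presets based on the dot's position (the corner positions and preset values below are hypothetical, not KTH's actual parameters):

```python
# Hypothetical presets: each emotion gets a corner of a 2-D plane plus
# (tempo scale, dynamics scale, articulation from legato=0 to staccato=1).
EMOTIONS = {
    #              x,    y,   tempo, dyn, artic
    "happiness":  ( 1.0,  1.0, 1.2, 1.1, 0.7),
    "anger":      (-1.0,  1.0, 1.3, 1.4, 0.9),
    "sadness":    (-1.0, -1.0, 0.7, 0.6, 0.1),
    "tenderness": ( 1.0, -1.0, 0.85, 0.7, 0.2),
}

def performance_params(x, y):
    """Interpolate performance parameters from the dot position (x, y) in
    [-1, 1]^2 using inverse-distance weighting over the emotion corners."""
    weights, total = {}, 0.0
    for name, (ex, ey, *_params) in EMOTIONS.items():
        w = 1.0 / ((x - ex) ** 2 + (y - ey) ** 2 + 1e-9)
        weights[name] = w
        total += w
    tempo = sum(w * EMOTIONS[n][2] for n, w in weights.items()) / total
    dyn = sum(w * EMOTIONS[n][3] for n, w in weights.items()) / total
    artic = sum(w * EMOTIONS[n][4] for n, w in weights.items()) / total
    return tempo, dyn, artic
```

Moving the dot toward a corner pulls tempo, dynamics, and articulation toward that emotion's preset.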
Contributors: Marco Fabiani, Roberto Bresin, Gaël Dubus (KTH): design, development | Frédéric Bevilacqua, Bruno Zamborlin (IRCAM): gesture recognition.

Mobile Expressive Music Performance

In this demo, a mobile phone is used to control the emotional expression of ringtones. The user chooses an emotion for his or her ringtone; the ringtone is sent to a server, processed with the KTH performance system for expressive music performance, and returned to the user's handset with the desired emotional expression. The KTH performance system controls different aspects of the performance, such as tempo, dynamics, articulation, and orchestration, by applying pre-assigned values for each emotion.
Contributors: Roberto Bresin (KTH): design, development | Jarno Seppänen (Nokia): server development.

Zagora

Zagora is a context-aware mobile music player that detects the ambient situation using audio analysis and retrieves a playlist of suitable music for you. The player performs advanced audio processing to distinguish situations such as street, restaurant, car, office, and meeting, and uses this information to filter an online music catalog. In a few clicks, you can see the current audio-analysis results, generate a playlist online, and start streaming music. Finally, any resulting playlist can be browsed for similar online music in a single click.
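Audio context recognition of this kind is typically a classifier over acoustic features; the sketch below uses a nearest-centroid decision over two toy features (the feature choice and centroid values are illustrative assumptions, not Nokia's model):

```python
import math

# Hypothetical feature centroids (energy, zero-crossing rate) per situation.
CONTEXTS = {
    "street":     (0.8, 0.6),
    "restaurant": (0.5, 0.4),
    "office":     (0.2, 0.3),
    "car":        (0.7, 0.2),
}

def classify_context(energy, zcr):
    """Pick the situation whose feature centroid is closest to the input."""
    return min(CONTEXTS, key=lambda c: math.dist((energy, zcr), CONTEXTS[c]))
```

The recognized situation would then drive the playlist filter on the online catalog.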
Contributors: Antti Eronen (Nokia): similarity-based music recommendation | Jussi Leppänen (Nokia): audio context recognition | Jarno Seppänen (Nokia): prototype design and implementation; context-aware recommendation.

Mobile Sonic Playground

This prototype demonstrates several individual and collective musical games that use mobile phones as musical instruments. In the first example, the telephone is played like a musical instrument; in the second, it makes car sounds. The user interacts with the phone's accelerometers and keypad, generating control events that are captured and rendered as sound by the phone's embedded Mobilophone framework. Several game levels and synthesized-sound selections are available.
Contributors: Jari Kleimola (TKK): interaction and audio synthesis framework, mobile musical instruments | Sami Oksanen (TKK): mobile sound toys | Vesa Välimäki (TKK): project lead.

June 16, 17, 19, IRCAM, Studios 4 & 5

  • Schedule: June 16 and 19, 5pm-8pm | June 17, 5pm-7pm
  • Access conditions: Free entry during the opening hours; limited seating available.
  • Website: SAME