Équipe Analyse/Synthèse
Abstract:
This paper presents experiments in two real-time frameworks for gesture-controlled music performance. The SensorGlove (constructed at the Technical University of Berlin) is used as a multi-parametric input device for music, and experiments with other devices are also reported. Two alternative frameworks are used for data processing and sound synthesis: a Java framework providing a graphical user interface, connected to the MIDAS real-time open distributed processing system, and a framework built with SuperCollider, the music programming environment by James McCartney. Both frameworks represent gesture data received from the Glove as well as data for performance control and sound-synthesis configuration. Gesture data processing involves three steps: preprocessing (data-range calibration and scaling, preliminary feature extraction), gesture recognition, and mapping to or generation of performance parameters. Both simple mappings of input data to synthesis parameters and feature extraction with Time Delay Neural Networks are applied, in order to combine direct control of sound with higher-level semantic processing of gestures. The aim is to combine the features of these two frameworks in an integrated system for real-time open distributed processing with high-level programming, design, and control tools.
* Staatliches Institut für Musikforschung,
Berlin, Germany - e-mail: iani@sim.spk-berlin.de
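
To make the three-step gesture-processing pipeline described in the abstract concrete, the following is a minimal Java sketch (Java being one of the paper's two frameworks). All names here (GesturePipeline, calibrate, recognize, mapToSynthesis), the sensor ranges, and the trivial threshold standing in for the Time Delay Neural Network are illustrative assumptions, not the authors' actual implementation.

```java
// Hypothetical sketch of the pipeline: preprocessing -> recognition -> mapping.
public class GesturePipeline {

    // Step 1: preprocessing -- scale raw sensor values into [0, 1]
    // using per-channel calibration ranges (assumed 10-bit sensors).
    static double[] calibrate(int[] raw, int[] min, int[] max) {
        double[] out = new double[raw.length];
        for (int i = 0; i < raw.length; i++) {
            double range = Math.max(1, max[i] - min[i]);
            out[i] = Math.min(1.0, Math.max(0.0, (raw[i] - min[i]) / range));
        }
        return out;
    }

    // Step 2: gesture recognition -- placeholder for a classifier such as
    // the Time Delay Neural Network mentioned in the abstract; a simple
    // mean-energy threshold stands in for the network here.
    static int recognize(double[] features) {
        double energy = 0.0;
        for (double f : features) energy += f;
        return (energy / features.length > 0.5) ? 1 : 0; // 1 = "grasp", 0 = "rest"
    }

    // Step 3: mapping -- direct mapping of calibrated sensor data to
    // synthesis parameters, switched by the recognized gesture class.
    static double[] mapToSynthesis(double[] features, int gestureClass) {
        double amplitude = (gestureClass == 1) ? features[0] : 0.0;
        double frequency = 200.0 + 800.0 * features[1]; // Hz, arbitrary range
        return new double[] { amplitude, frequency };
    }

    public static void main(String[] args) {
        int[] raw = { 512, 300, 700, 100, 900 };  // one frame of glove data
        int[] min = { 0, 0, 0, 0, 0 };
        int[] max = { 1023, 1023, 1023, 1023, 1023 };
        double[] features = calibrate(raw, min, max);
        int gesture = recognize(features);
        double[] params = mapToSynthesis(features, gesture);
        System.out.printf("gesture=%d amp=%.2f freq=%.1f%n",
                          gesture, params[0], params[1]);
    }
}
```

The separation into three methods mirrors the abstract's point that direct control (step 3 acting on calibrated data) and higher-level semantic processing (step 2) can coexist: the recognizer only gates or reconfigures the mapping rather than replacing it.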