Ircam - Centre Georges-Pompidou, Équipe Analyse/Synthèse


Gestural Research in Music

Bibliography

Marcelo M. Wanderley
IRCAM - Analysis/Synthesis Team

June, 1999


  • M. Akamatsu, I. S. MacKenzie, and T. Hasbroucq, ``A comparison of tactile, auditory and visual feedback in a pointing task using a mouse-type device,'' Ergonomics, vol. 38, pp. 816-827, 1995.
  • M. Akamatsu and I. S. MacKenzie, ``Movement characteristics using a mouse with tactile and force feedback,'' International Journal of Human-Computer Studies, vol. 45, pp. 483-493, 1996.
  • A. G. Almeida, ``Methodologies for design and evaluation of interactive musical interfaces,'' in Proceedings of the 3rd Brazilian Symposium on Computer Music, 1996.
  • Y. Aono, H. Katayose, and S. Inokuchi, ``An improvisational accompaniment system observing performer's musical gesture,'' in Proc. Int. Computer Music Conf. (ICMC'95), pp. 106-107, 1995.
  • V. Auer, M. M. Maes, M. C. Bonfim, M. M. Wanderley, and M. V. Lamar, ``3D position signal acquisition system with application in real-time processing,'' in Proceedings of the ICSPAT'96 - International Congress on Signal Processing Applications Technology, 1996.
  • A. Azarbayejani, C. Wren, and A. Pentland, ``Real-time 3-D tracking of the human body,'' in Proceedings of the IMAGE'COM96, May 1996.
  • R. M. Baecker and W. A. S. Buxton, Readings in Human-Computer Interaction: A Multidisciplinary Approach, ch. 8, 9 - The Audio Channel, pp. 393-399. Morgan Kaufmann, 1989.
  • R. M. Baecker, J. Grudin, W. A. S. Buxton, and S. Greenberg, Readings in Human-Computer Interaction: Toward the Year 2000. Morgan Kaufmann, 2nd ed., 1995. Part III, Chapter 7.
  • N. J. Bailey, A. Purvis, I. W. Bowler, and P. D. Manning, ``Applications of the phase vocoder in the control of real-time electronic musical instruments,'' Interface, vol. 22, pp. 259-275, 1993.
  • N. Bailey, ``Personal communication,'' Dec. 1995.
  • N. Bailey, ``Personal communication,'' Aug. 1997.
  • R. Balakrishnan and I. S. MacKenzie, ``Performance differences in the fingers, wrist and forearm in computer input control,'' in Proc. Conf. on Human Factors in Computing Systems (CHI'97), pp. 303-310, 1997.
  • R. Bargar, ``Authoring intelligent sound for synchronous human-computer interaction,'' in Proc. KANSEI - The Technology of Emotion Workshop, 1997.
  • R. Bargar, ``Multi-modal synchronization and the automation of an observer's point of view,'' in Proceedings of the 1998 IEEE International Conference on Systems, Man, and Cybernetics (SMC'98), 1998.
  • M. Battier, Les musiques électroacoustiques et l'environnement informatique. Thèse de doctorat, 1981. Available at: http://www.mygale.org/00/bam/d-TXT/TH/T-F.html
  • M. Battier, ``Une nouvelle géométrie du son,'' Cahiers de l'IRCAM - recherche musique n. 7, pp. 43-56, 1995.
  • M. Battier, ``Entre l'idée et l'oeuvre - Parcours de l'informatique musicale,'' in Esthétique des Arts Médiatiques, tome I, Louise Poissant, ed., Presses de l'Université du Québec, Sainte-Foy (Québec), pp. 319-335, 1995.
  • M. Battier, ``La musique électronique au regard de la mécanisation,'' text presented at the Séminaire Observatoire Musical Français/AMEFA, 11 May 1998, Université Paris-Sorbonne (Paris IV).
  • M. Battier, ``L'approche gestuelle dans l'histoire de la lutherie électronique. Étude d'un cas : le theremin,'' to be published in the Proceedings of the Colloque International "Les Nouveaux Gestes de la Musique".
  • T. Baudel, L'Interaction gestuelle : définitions, état de l'art et perspectives d'industrialisation, in Nouvelles interfaces homme-machine, ch. III-C. Observatoire Français des Techniques Avancées, 1996.
  • W. Bauer and B. Foss, ``GAMS: An integrated media controller system,'' Computer Music J., vol. 16, no. 1, pp. 19-24, 1992.
  • G. Bertini and P. Carosi, ``Light baton: a system for conducting computer music performance,'' in Proc. Int. Computer Music Conf. (ICMC'92), pp. 73-76, 1992.
  • A. Bobick, ``Movement, Activity, and Action: The Role of Knowledge in the Perception of Motion,'' MIT Media Laboratory Perceptual Computing Section Tech. Rep. 413, 1997. To appear in Proceedings of the Royal Society.
  • R. Boie, M. Mathews, and A. Schloss, ``The radio drum as a synthesizer controller,'' in Proc. Int. Computer Music Conf. (ICMC'89), pp. 42-45, 1989.
  • B. Bongers, ``The use of active tactile and force feedback in timbre controlling electronic instruments,'' in Proc. Int. Computer Music Conf. (ICMC'94), pp. 171-174, 1994.
  • B. Bongers, ``An interview with Sensorband,'' Computer Music J., vol. 22, no. 1, pp. 13-24, 1998.
  • B. Bongers, ``Tactual display of sound properties in electronic musical instruments,'' in Displays, vol. 18, pp. 129-133, 1998.
  • J. Borchers and M. Mühlhäuser, ``Design Patterns for Interactive Musical Systems,'' IEEE Multimedia, vol. 5, no. 3, pp. 36-46, 1998.
  • R. Boulanger and M. V. Mathews, ``The 1997 Mathews' Radio Baton and improvisation modes,'' in Proc. Int. Computer Music Conf. (ICMC'97), pp. 395-398.
  • I. Bowler, A. Purvis, P. Manning, and N. Bailey, ``On Mapping N Articulation onto M Synthesiser-Control Parameters,'' in Proc. Int. Computer Music Conf. (ICMC'90), pp. 181-184, 1990.
  • I. Bowler, P. Manning, A. Purvis, and N. Bailey, ``New techniques for a real-time phase vocoder,'' in Proc. Int. Computer Music Conf. (ICMC'90), pp. 178-180, 1990.
  • B. Brecht and G. Garnett, ``Conductor follower,'' in Proc. Int. Computer Music Conf. (ICMC'95), pp. 185-186, 1995.
  • S. Brennan, Conversation as Direct Manipulation: An Iconoclastic View, in B. Laurel (ed.) The Art of Human-Computer Interface Design, pp. 393-404, Addison Wesley, 1990.
  • M.-A. Bromwich, ``The metabone: An interactive sensory control mechanism for virtuoso trombone,'' in Proc. Int. Computer Music Conf. (ICMC'97), pp. 473-475.
  • M.-A. Bromwich and J. Wilson, ``"Bodycoder": A sensor suit and vocal performance mechanism for real-time performance,'' in Proc. Int. Computer Music Conf. (ICMC'97), pp. 292-295.
  • D. Buchla, ``Lightning II midi controller.'' http://www.buchla.com/ Buchla and Associates' Homepage.
  • W. Buxton, The "Natural" Language of Interaction: A perspective on nonverbal dialogues, in B. Laurel (ed.) The Art of Human Computer Interface Design, pp. 405-416, Addison-Wesley, 1990.
  • C. Cadoz, A. Luciani, and J. L. Florens, ``Responsive input devices and sound synthesis by simulation of instrumental mechanisms: The CORDIS system,'' Computer Music J., vol. 8, no. 3, pp. 60-73, 1984.
  • C. Cadoz, ``Instrumental gesture and musical composition,'' in Proc. Int. Computer Music Conf. (ICMC'88), pp. 1-12, 1988.
  • C. Cadoz and C. Ramstein, ``Capture, representation and «composition» of the instrumental gesture,'' in Proc. Int. Computer Music Conf. (ICMC'90), pp. 53-56, 1990.
  • C. Cadoz, L. Lisowski, and J. L. Florens, ``A modular feedback keyboard design,'' Computer Music J., vol. 14, no. 2, pp. 47-51, 1990.
  • C. Cadoz, L. Lisowski, and J. L. Florens, ``Modular feedback keyboard,'' in Proc. Int. Computer Music Conf. (ICMC'90), pp. 379-382, 1990.
  • C. Cadoz, ``Réalité du timbre ? Virtualité de l'instrument !,'' Analyse Musicale, pp. 68-72, 1990.
  • C. Cadoz, A. Luciani, and J.-L. Florens, ``Cordis-anima: A modeling and simulation system for sound and image synthesis - the general formalism,'' Computer Music J., vol. 17, no. 1, pp. 19-29, 1993.
  • C. Cadoz, ``Le geste, canal de communication homme-machine. La communication instrumentale,'' Sciences Informatiques - Numéro Spécial : Interface Homme-Machine, vol. 13, no. 1, pp. 31-61, 1994.
  • C. Cadoz, Les réalités virtuelles. Collection DOMINOS, Paris: Flammarion, 1994.
  • C. Cadoz, ``Musique, geste, technologie.'' Unpublished private version of 6 March 1998.
  • L. Campbell, D. Becker, A. Azarbayejani, A. Bobick, and A. Pentland, ``Invariant features for 3-D gesture recognition,'' in Second International Workshop on Face and Gesture Recognition, Oct. 1996.
  • A. Camurri, Applications of Artificial Intelligence methods and tools for music description and processing, vol. 9 of The Computer Music and Digital Audio Series, pp. 233-266. A-R Editions, 1992.
  • A. Camurri, ``Interactive dance/music systems,'' in Proc. Int. Computer Music Conf. (ICMC'95), pp. 245-252, 1995.
  • A. Camurri, A. Catorcini, C. Innocenti, and A. Massari, ``Music and multimedia knowledge representation and reasoning: The harp system,'' Computer Music J., vol. 19, no. 2, pp. 34-58, 1995.
  • A. Camurri, ``Network models for music and motor control,'' chapter of the book "Self-Organization, Computational Maps and Motor control", Elsevier, 1997.
  • A. Camurri, ed., KANSEI - The Technology of Emotion Workshop, 1997.
  • A. Camurri, A. Coglio, P. Coletta, and C. Massucco, ``An architecture for multimodal environment agents,'' in Proc. KANSEI - The Technology of Emotion Workshop, 1997.
  • A. Camurri et al., ``Toward kansei evaluation of movement and gesture in music/dance interactive multimodal environments,'' in Proc. KANSEI - The Technology of Emotion Workshop, 1997.
  • A. Camurri and P. Ferrentino, ``A computational model of artificial emotion,'' in Proc. KANSEI - The Technology of Emotion Workshop, 1997.
  • A. Camurri, M. Ricchetti, M. Di Stefano, and A. Stroscio, ``EyesWeb - toward gesture and affect recognition in dance/music interactive systems,'' in Proc. Colloquio di Informatica Musicale CIM'98, AIMI.
  • A. Camurri and P. Ferrentino, ``The Other Way - A Change of Viewpoint in Artificial Emotions,'' in Proc. Colloquio di Informatica Musicale CIM'98, AIMI.
  • A. Camurri and P. Ferrentino, ``The other way - a change of viewpoint in the field of artificial emotions,'' in Proceedings of the 1998 IEEE International Conference on Systems, Man and Cybernetics (SMC'98), 1998.
  • J. Cassell, A Framework for Gesture Generation and Interpretation. In Computer Vision in Human-Machine Interaction, Cambridge University Press, in press.
  • X. Chabot, ``Performance with electronics: Gesture interfaces and software toolkit,'' in Proc. Int. Computer Music Conf. (ICMC'89), pp. 65-68, 1989.
  • X. Chabot, ``Gesture interfaces and a software toolkit for performance with electronics,'' Computer Music J., vol. 14, no. 2, pp. 15-27, 1990.
  • X. Chabot, ``To listen and to see: Making and using electronic instruments,'' Leonardo Music Journal, vol. 3, 1993.
  • B. de Chénerilles, ``Contrôle MIDI par infrarouges,'' Audiorama Homepage - recherche et création en cours, 1998.
  • I. Choi, R. Bargar, and C. Goudeseune, ``A manifold interface for a high dimensional control space,'' in Proc. Int. Computer Music Conf. (ICMC'95), pp. 385-392, 1995.
  • I. Choi, ``Interactivity vs. control: human-machine performance basis of emotion,'' in Proc. KANSEI - The Technology of Emotion Workshop, 1997.
  • I. Choi, A. Betts, and R. Bargar, ``Scoregraph: Dynamically activated connectivity among parallel processes for interactive computer music performance,'' in Proc. Int. Computer Music Conf. (ICMC'98), pp. 527-535.
  • I. Choi, ``Cognitive engineering of gestural primitives for multi-modal interaction in a virtual environment,'' in Proceedings of the 1998 IEEE International Conference on Systems, Man and Cybernetics (SMC'98), 1998.
  • L. Chu, ``Haptic feedback in computer music performance,'' in Proc. Int. Computer Music Conf. (ICMC'96), pp. 57-58, 1996.
  • J. Coutaz, L. Nigay, D. Salber, A. Blandford, J. May, and R. Young, ``Four easy pieces for assessing the usability of multimodal interaction: The CARE properties,'' in Proc. of Interact'95, 1995.
  • J. Coutaz and J. Crowley, ``Interpreting human gesture with computer vision,'' in Proc. Conf. on Human Factors in Computing Systems (CHI'95), 1995.
  • F. Delalande, ``La gestique de Gould : éléments pour une sémiologie du geste musical,'' in Glenn Gould, Pluriel, G. Guertin (ed.), Louise Courteau éditrice inc., pp. 83-111, 1988.
  • F. Delalande, ``Il gesto musicale - dal senso motorio al simbolico,'' in Dall'Atto motorio alla interpretazione musicale - Atti del secondo colloquio internazionale di psicologia della musica, 1992.
  • T. J. Darrell and A. Pentland, ``Recognition of space-time gestures using a distributed representation,'' Tech. Rep. 197, MIT Media Laboratory Vision and Modeling Group.
  • T. Darter, ``Computer music's big noise,'' Keyboard, pp. 34-40, July 1991.
  • T. Demeyer, ``Bigeye 1.10 - realtime video to midi software,'' 1996.
  • P. Depalle, S. Tassart, and M. Wanderley, ``Instruments Virtuels,'' Résonance, pp. 5-8, Sept. 1997.
  • G. Dubost, ``Technologies de capteurs et leurs applications musicales,'' Master's thesis, Université Paris-Sud Orsay, 1993.
  • E. B. Egozy, ``Deriving musical control features from a real-time timbre analysis of the clarinet,'' Master's thesis, Massachusetts Institute of Technology, 1995.
  • H. C. Fantapie, ``L'analyse de la partition dans la pratique du chef d'orchestre. De l'analyse de l'écrit à l'analyse du geste,'' Revue d'Analyse Musicale, pp. 26-30, January 1988.
  • S. Favilla, ``Non-linear controller mapping for gestural control of the gamaka,'' in Proc. Int. Computer Music Conf. (ICMC'96), pp. 89-92, 1996.
  • S. Fels, K. Nishimoto, and K. Mase, ``MusiKalscope: A Graphical Musical Instrument,'' IEEE Multimedia, vol. 5, No. 3, pp. 26-35, 1998.
  • J. Fineberg, ``IRCAM instrumental data base,'' tech. rep., IRCAM, 1996.
  • S. Fisher, Virtual Interface Environments, in B. Laurel (ed.) The Art of Human Computer Interface Design, pp. 423-438, Addison-Wesley, 1990.
  • R. Fletcher, ``Applying transduction materials for human-technology interfaces,'' IBM Systems Journal, vol. 35, no. 3/4, pp. 630-638, 1996.
  • E. Flety, ``Sonars à ultrasons,'' tech. rep., IRCAM, 1997.
  • E. Flety and M. H. Serra, ``Utilisations récentes de capteurs gestuels en création musicale à l'IRCAM,'' in Proc. Journées d'Informatique Musicale, pp. D3-1-D3-4, 1998.
  • J. L. Florens, C. Cadoz, and A. Luciani, ``A real-time workstation for physical model of multi-sensorial and gesturally controlled instrument,'' in Proc. Int. Computer Music Conf. (ICMC'98), pp. 518-525.
  • A. Freed and D. Wessel, ``Max objects for media integration,'' in Proc. Int. Computer Music Conf. (ICMC'91), pp. 397-400, 1991.
  • A. Freed, ``New tools for rapid prototyping of musical sound synthesis algorithms and control strategies,'' in Proc. Int. Computer Music Conf. (ICMC'92), pp. 178-181, 1992.
  • A. Freed and M. Wright, ``Communication of musical gesture using the AES/EBU digital audio standard,'' in Proc. Int. Computer Music Conf. (ICMC'98), pp. 220-223.
  • N. Gershenfeld and J. Paradiso, ``Musical applications of electric field sensing,'' Computer Music J., vol. 21, no. 2, pp. 69-89, 1997.
  • S. Gibet, Codage, Représentation et Traitement du Geste Instrumental. PhD thesis, Institut National Polytechnique de Grenoble, 1987.
  • S. Gibet and J. L. Florens, ``Instrumental gesture modeling by identification with time-varying mechanical models,'' in Proc. Int. Computer Music Conf. (ICMC'88), pp. 28-40, 1988.
  • S. Gibet and P. F. Marteau, ``Gestural control of sound synthesis,'' in Proc. Int. Computer Music Conf. (ICMC'90), pp. 387-391, 1990.
  • K. C. S. R. Gillett and R. Pritchard, ``Maddm - dance directed music,'' in Proc. Int. Computer Music Conf. (ICMC'85), pp. 329-330, 1985.
  • N. Griffith and M. Fernstrom, ``Litefoot - a floor space for recording dance and controlling media,'' in Proc. Int. Computer Music Conf. (ICMC'98), pp. 475-481.
  • M. Goldstein, ``Gestural coherence and musical interaction design,'' in Proceedings of the 1998 IEEE International Conference on Systems, Man and Cybernetics (SMC'98), 1998.
  • L. Haken, R. Abdullah, and M. Smart, ``The continuum: A continuous music keyboard,'' in Proc. Int. Computer Music Conf. (ICMC'92), pp. 81-84, 1992.
  • L. Haken, K. Fritz, E. Tellmann, P. Wolfe, and P. Christensen, ``A Continuous Music Keyboard controlling polyphonic morphing using Bandwidth-Enhanced Oscillators,'' in Proc. Int. Computer Music Conf. (ICMC'97), pp. 375-378.
  • L. Haken, E. Tellman, and P. Wolfe, ``An indiscrete music keyboard,'' Computer Music J., vol. 22, no. 1, pp. 31-48, 1998.
  • C. Hand, ``A survey of 3-d input devices,'' tech. rep., De Montfort University - Leicester - UK, 1993.
  • P. Hartono, K. Asano, W. Inoue, and S. Hashimoto, ``Adaptive timbre control using gesture,'' in Proc. Int. Computer Music Conf. (ICMC'94), pp. 151-158, 1994.
  • S. Hashimoto, ``Kansei as the third target of information processing and related topics in Japan,'' in Proc. KANSEI - The Technology of Emotion Workshop, 1997.
  • V. Hayward, J. Choksi, G. Lanvin, and C. Ramstein, Design and multi-objective optimization of a linkage for a haptic interface, pp. 352-359. Kluwer Academic, 1994, in: "Advances in Robot Kinematics".
  • V. Hayward, ``Toward a seven axis haptic interface,'' in Proc. IROS'95, Int. Workshop on Intelligent Robots and Systems, vol. 2, IEEE Press, 1995.
  • V. Hayward and O. R. Astley, Performance Measures for Haptic Interfaces, pp. 195-207. Springer Verlag, 1996, in Robotics Research: The 7th International Symposium.
  • V. Hayward, Opportunities for haptics in human-machine communication. MIT Press, 1998. In: "Human and Machine Haptics", in preparation.
  • S. Hirai, H. Katayose, T. Kanamori, and S. Inokuchi, ``Software sensors for interactive digital art,'' in Proc. Int. Computer Music Conf. (ICMC'98), pp. 514-517.
  • F. Iazzetta, ``A semiotic approach to music interaction,'' in Proc. Int. Computer Music Conf. (ICMC'95), pp. 583-584, 1995.
  • F. Iazzetta, ``Formalization of computer music interaction through a semiotic approach,'' Journal of New Music Research, vol. 25, pp. 213-230, 1996.
  • D. Jaffe and A. Schloss, ``The making of "wildlife",'' in Proc. Int. Computer Music Conf. (ICMC'92), pp. 269-272, 1992.
  • D. Jaffe and A. Schloss, ``The computer-extended ensemble.'' Transcription of a speech at CNMAT.
  • K. Jensen, ``The control mechanism of the violin,'' in Proceedings of the Nordic Acoustic Meeting, (Helsinki, Finland), pp. 373-378, 1996.
  • K. Jensen, ``The control of musical instruments,'' in Proceedings of the Nordic Acoustic Meeting, (Helsinki, Finland), pp. 379-384, 1996.
  • T. Kanamori, H. Katayose, Y. Aono, S. Inokuchi, and T. Sakaguchi, ``Sensor integration for interactive digital art,'' in Proc. Int. Computer Music Conf. (ICMC'95), pp. 265-268, 1995.
  • T. Kanamori, H. Katayose, S. Simura, and S. Inokuchi, ``Gesture sensor in virtual performer,'' in Proc. Int. Computer Music Conf. (ICMC'93), pp. 127-129, 1993.
  • D. Kaplowitcz, ``Cesium Sound Flex Processor 1.3 (Mac),'' in Electronic Musician Magazine, Vol. 14, No.3, pp. 168-173, March 1998.
  • H. Katayose, T. Kanamori, K. Kamei, Y. Nagashima, K. Sato, S. Inokuchi, and S. Simura, ``Virtual performer,'' in Proc. Int. Computer Music Conf. (ICMC'93), pp. 138-145, 1993.
  • H. Katayose, T. Kanamori, S. Simura, and S. Inokuchi, ``Demonstration of gesture sensors for the shakuhachi,'' in Proc. Int. Computer Music Conf. (ICMC'94), pp. 196-199, 1994.
  • H. Katayose, T. Kanamori, and S. Inokuchi, ``An environment for interactive art - sensor integration and applications,'' in Proc. Int. Computer Music Conf. (ICMC'96), pp. 173-176, 1996.
  • H. Katayose, H. Shirakabe, T. Kanamori, and S. Inokuchi, ``A toolkit for interactive digital art,'' in Proc. Int. Computer Music Conf. (ICMC'97), pp. 476-478.
  • H. Katayose, S. Hirai, T. Kanamori, H. Kato, and S. Inokuchi, ``Physiological measurement of performers' tension and its utilisation for media control,'' in Proc. Int. Computer Music Conf. (ICMC'98), pp. 211-214.
  • D. Keane and P. Gross, ``The midi baton,'' in Proc. Int. Computer Music Conf. (ICMC'89), pp. 151-154, 1989.
  • D. Keane, G. Smecca, and K. Wood, ``The midi baton II,'' in Proc. Int. Computer Music Conf. (ICMC'90), 1990.
  • R. B. Knapp and H. S. Lusted, ``A bioelectric controller for computer music applications,'' Computer Music J., vol. 14, no. 1, pp. 42-47, 1990.
  • M. Kieslinger and T. Ungvary, ``Aspects of Visualizing Information for a Realtime Hypermedia Musical Environment,'' in Multimedia Technology and Applications, Chow (Ed.), Springer-Publications Singapore, 1997. ISBN: 981-3083-16-6.
  • V. Krefeld, ``The hand in the web: An interview with Michel Waisvisz,'' Computer Music J., vol. 14, no. 2, pp. 28-33, 1990.
  • G. Kurtenbach and E. A. Hulteen, Gestures in Human-Computer Communication, in B. Laurel (ed.) The Art of Human-Computer Interface Design, pp. 309-317, Addison Wesley, 1990.
  • M. Laliberté, ``Archétypes et paradoxes des nouveaux instruments de musique,'' L'Aventure humaine, pp. 11-22, Autumn-Winter 1995.
  • O. Laske, ``Toward a theory of interfaces for computer music systems,'' Computer Music J., pp. 53-60, 1977.
  • S. de Laubier, ``Le méta-instrument.'' Brochure from Espace Musical describing the Méta-Instrument.
  • S. de Laubier, ``Le méta-instrument a-t-il un son ? Émergence de lois ou de constantes dans le développement d'instruments virtuels,'' in Colloque Les Nouveaux Gestes de la Musique, GMEM - Marseille, 1997.
  • S. de Laubier, ``The meta-instrument,'' Computer Music J., vol. 22, no. 1, pp. 25-29, 1998.
  • S. de Laubier, ``Midi Formers,'' Computer Music J., vol. 21, no. 1, pp. 39-40, 1997.
  • M. Lee, A. Freed, and D. Wessel, ``Real-time neural network processing of gestural and acoustic signals,'' in Proc. Int. Computer Music Conf. (ICMC'91), pp. 277-280, 1991.
  • M. Lee and D. Wessel, ``Connectionist models for real-time control of synthesis and compositional algorithms,'' in Proc. Int. Computer Music Conf. (ICMC'92), pp. 277-280, 1992.
  • M. Lee, G. Garnett, and D. Wessel, ``An adaptive conductor follower,'' in Proc. Int. Computer Music Conf. (ICMC'92), pp. 454-455, 1992.
  • M. Lee and D. Wessel, ``Real-time neuro-fuzzy systems for adaptive control of musical processes,'' in Proc. Int. Computer Music Conf. (ICMC'93), pp. 172-175, 1993.
  • G. H. T. Lima, M. Maes, M. Bonfim, M. V. Lamar, and M. M. Wanderley, ``Dance-music interface based on ultrasound sensors and computers,'' in Proceedings of the 3rd Brazilian Symposium on Computer Music, pp. 12-16, 1996.
  • R. Linz, ``Towards the design of a real-time interactive performance sound system,'' Leonardo Music Journal, vol. 6, pp. 99-107, 1996.
  • C. Lippe, ``A look at performer/machine interaction using real-time systems,'' in Proc. Int. Computer Music Conf. (ICMC'96), pp. 116-117, 1996.
  • F. Liu and R. Picard, ``Detecting and segmenting periodic motion,'' Tech. Rep. 400, MIT Media Laboratory Perceptual Computing Section.
  • F. Lopez-Lezcano, ``Padmaster: An improvisation environment for real-time performance,'' in Proc. Int. Computer Music Conf. (ICMC'95), pp. 104-105, 1995.
  • F. Lopez-Lezcano, ``Padmaster: Banging on algorithms with alternate controllers,'' in Proc. Int. Computer Music Conf. (ICMC'96), pp. 425-427, 1996.
  • T. Machover, ``Hyperinstruments - a composer's approach to the evolution of intelligent musical instruments,'' Cyberarts, pp. 67-76, ???
  • T. Machover and J. Chung, ``Hyperinstruments: Musically intelligent and interactive performance and creativity systems,'' in Proc. Int. Computer Music Conf. (ICMC'89), pp. 186-190, 1989.
  • T. Machover, ``Hyperinstruments - a progress report 1987 - 1991,'' tech. rep., Massachusetts Institute of Technology, 1992.
  • T. Machover, ``"classic" hyperinstruments - 1986-1992 - a composer's approach to the evolution of intelligent musical instruments.'' Available at: http://brainop.media.mit.edu/Archive/Hyperinstruments/classichyper.html.
  • T. Machover, ``Technology and creative expression,'' October 1995. Available at: http://brainop.media.mit.edu/Archive/Hyperinstruments/creative.html.
  • T. Machover, ``The brain opera and active music.'' Available from Brain Opera's home page: http://brainop.media.mit.edu/.
  • I. S. MacKenzie, Movement Time Prediction in Human-Computer Interfaces. Morgan Kaufmann, in Readings in Human-Computer Interaction: Toward the Year 2000, 1995.
  • N. Malitz, ``Hyperreality - how multimedia whiz Tod Machover is shaking the future of opera,'' Opera News, pp. 28-30, Aug. 1992.
  • J. Mackinlay, S. Card, and G. Robertson, ``A semantic analysis of the design space of input devices,'' Human-Computer Interaction, vol. 5, pp. 145-190, 1990.
  • J. Manzolli, ``The development of a gesture's interface laboratory,'' in Proceedings of the 2nd Brazilian Symposium on Computer Music, pp. 88-91, 1995.
  • T. A. Marrin, ``Toward an understanding of musical gesture: Mapping expressive intention with the digital baton,'' Master's thesis, Massachusetts Institute of Technology, 1996. Available at: http://brainop.media.mit.edu/Archive/Thesis.html.
  • T. A. Marrin, ``Possibilities for the Digital Baton as a General-Purpose Gestural Interface,'' CHI '97 - Conference on Human Factors in Computing Systems, Extended Abstracts, ACM Press, NY, pp. 311-312. Atlanta, March 1997.
  • T. Marrin and J. Paradiso, ``The digital baton: A versatile performance instrument,'' in Proc. Int. Computer Music Conf. (ICMC'97), pp. 313-316.
  • T. A. Marrin and R. Picard, ``The Conductor's Jacket: a Testbed for Research on Gestural and Affective Expression.'' Presented at the XII Colloquium for Musical Informatics, in Gorizia, Italy, September 1998.
  • T. A. Marrin and R. Picard, ``A Methodology for Mapping Gestures to Music Using Physiological Signals.'' Presented at the International Computer Music Conference (ICMC'98), Ann Arbor, Michigan, 1998.
  • M. Mathews and G. Bennett, ``Real-time synthesizer control,'' tech. rep., IRCAM, 1978. Report 5/78.
  • M. V. Mathews and J. R. Pierce, Current Directions in Computer Music Research. System Development Foundation Benchmark Series, MIT Press, 1989.
  • S. Matsuda and T. Rai, ``A visual-to-sound interactive computer performance system "edge",'' in Proc. Int. Computer Music Conf. (ICMC'95), pp. 599-600, 1995.
  • F. Matsumoto, ``Using simple controls to manipulate complex objects: Application to the drum-boy interactive percussion system,'' Master's thesis, Massachusetts Institute of Technology, 1993.
  • B. Merlier, ``À la conquête de l'espace,'' in Proc. Journées d'Informatique Musicale, pp. D1-1-D1-9, 1998.
  • E. Métois, Musical Sound Information - Musical Gestures and Embedding Systems. PhD thesis, Massachusetts Institute of Technology, 1996. Available at: http://brainop.media.mit.edu/Archive/Metois/Thesis0.html.
  • A. Mihalic, ``Pédalophone,'' in Proc. Journées d'Informatique Musicale, pp. D4-1-D4-13, 1998.
  • M.-S. O'Modhrain, ``Feel the music: Narration in touch and sound,'' in Proc. Int. Computer Music Conf. (ICMC'97), pp. 321-324.
  • P. Modler, ``Interactive Computer Systems and Concepts of Gestalt,'' in , Springer Verlag, 1997.
  • P. Modler and I. Zannos, ``Emotional aspects of gesture recognition by a neural network, using dedicated input devices,'' in Proc. KANSEI - The Technology of Emotion Workshop, 1997.
  • P. Modler, F. Hoffmann, and I. Zannos, ``Gesture recognition by neural networks and the expression of emotions,'' in Proceedings of the 1998 IEEE International Conference on Systems, Man and Cybernetics (SMC'98), 1998.
  • R. Moog and T. Rea, ``Evolution of the keyboard interface: The Bösendorfer 290SE recording piano and the Moog multiply-touch-sensitive keyboards,'' Computer Music J., vol. 14, no. 2, pp. 52-60, 1990.
  • F. R. Moore, ``The dysfunctions of MIDI,'' in Proc. Int. Computer Music Conf. (ICMC'87), pp. 256-262, 1987.
  • R. Morales-Manzanares and E. Morales, ``Music composition, improvisation, and performance through body movements,'' in Proc. KANSEI - The Technology of Emotion Workshop, 1997.
  • H. Morita, S. Ohteru, and S. Hashimoto, ``Computer music system which follows a human conductor,'' in Proc. Int. Computer Music Conf. (ICMC'89), pp. 207-210, 1989.
  • H. Morita, H. Watanabe, T. Harada, S. Ohteru, and S. Hashimoto, ``Knowledge information processing in conducting computer music performance,'' in Proc. Int. Computer Music Conf. (ICMC'90), pp. 332-334, 1990.
  • A. Mulder, ``Virtual musical instruments: Accessing the sound synthesis universe as a performer,'' in Proceedings of the First Brazilian Symposium on Computer Music, 1994.
  • A. Mulder, ``Human movement tracking technology,'' tech. rep., Simon Fraser University, 1994. Hand Centered Studies of Human Movement Project.
  • A. Mulder, ``How to build an instrumented glove based on the powerglove flex sensors,'' PCVR Magazine, vol. 16, pp. 10-14, 1994.
  • A. Mulder, ``Human movement tracking technology: Resources,'' tech. rep., Simon Fraser University, Canada, 1995. Addendum to Technical Report 94-1.
  • A. Mulder, ``The i-cube system: moving towards sensor technology for artists,'' in Proceedings of the ISEA, 1995.
  • A. Mulder, ``Getting a grip on alternate controllers: Addressing the variability of gestural expression in musical instrument design,'' Leonardo Music Journal, vol. 6, pp. 33-40, 1996.
  • A. Mulder, ``Hand gestures for HCI,'' tech. rep., Simon Fraser University, 1996. Hand Centered Studies of Human Movement Project.
  • A. Mulder, S. Fels, and K. Mase, ``Empty-handed Gesture Analysis in Max/FTS,'' in Proceedings of the Kansei - The Technology of Emotion Workshop, (Genova - Italy), Oct. 1997.
  • A. Mulder, S. Fels, and K. Mase, ``Mapping virtual object manipulation to sound variation,'' in IPSJ SIG notes Vol. 97, No. 122, 97-MUS-23 (USA/Japan intercollege computer music festival, Tokyo, Japan, 13-16 December 1997, Takayuki Rai, Rick Basset (eds.)), pp. 63-68.
  • A. Mulder and S. Fels, ``Sound Sculpting: Manipulating Sound through Virtual Sculpting,'' in Proceedings of the 1998 Western Computer Graphics Symposium (Whistler, BC, Canada, 23-26 April 1998, Maria Lantin (ed.)), pp. 15-23. Burnaby, BC, Canada: Simon Fraser University.
  • A. Mulder and S. Fels, ``Sound Sculpting: Performing with Virtual Musical Instruments,'' in Proceedings of the Fifth Brazilian Symposium on Computer Music, Belo Horizonte, Minas Gerais, Brazil, 3-5 August 1998.
  • A. Mulder, Design of Gestural Constraints Using Virtual Musical Instruments. PhD thesis, School of Kinesiology, Simon Fraser University, Canada, 1998.
  • Y. Nagashima, ``Biosensorfusion: New interfaces for interactive multimedia art,'' in Proc. Int. Computer Music Conf. (ICMC'98), pp. 129-132.
  • N. Negroponte, Hospital Corners, in B. Laurel (ed.) The Art of Human-Computer Interface Design, pp. 347-353, Addison Wesley, 1990.
  • G. Nottoli, M. Salerno, and G. Constantini, ``A new interactive performance system for real-time sound synthesis,'' in Proc. Int. Computer Music Conf. (ICMC'98), pp. 33-36.
  • N. Orio, ``A gesture interface controlled by the oral cavity,'' in Proc. Int. Computer Music Conf. (ICMC'97), pp. 141-144.
  • N. Orio and C. Pirro, ``Performance with refractions: Understanding musical gestures for interactive live performance,'' in Proc. KANSEI - The Technology of Emotion Workshop, 1997.
  • N. Orio and C. De Pirro, ``Controlled refractions: A two-levels coding of musical gestures for interactive performances,'' in Proc. Int. Computer Music Conf. (ICMC'98), pp. 88-92.
  • N. Orio, ``Progetto di un controllore gestuale azionato dal cavo orale.'' Tesi di Laurea, Università di Padova, Italy, 1998.
  • J. A. Paradiso, ``The interactive balloon: Sensing, actuation, and behavior in a common object,'' IBM Systems Journal, vol. 35, no. 3/4, pp. 473-487, 1996.
  • J. A. Paradiso, ``New Ways to Play: Electronic Music Interfaces,'' IEEE Spectrum, Dec. 1997.
  • D. Phillips, ``The control freaks,'' Electronic Musician, pp. 48-59, June 1992.
  • R. Picard, Affective Computing. MIT Press, 1997.
  • R. Picard, ``Human-Computer Coupling,'' in Proceedings of the IEEE, vol. 86, no. 8, pp. 1803-1807, August 1998.
  • P. Pierrot and A. Terrier, ``Le violon MIDI,'' tech. rep., IRCAM, 1997.
  • R. Pinkston, J. Kerkhoff, and M. McQuilken, ``A touch sensitive dance floor/midi controller,'' in Proc. Int. Computer Music Conf. (ICMC'95), pp. 224-225, 1995.
  • R. Polfreman and J. Sapsford-Francis, ``A human factors approach to computer music systems user-interface design,'' in Proc. Int. Computer Music Conf. (ICMC'95), pp. 381-384, 1995.
  • S. Pope, ``Real-time performance via user interfaces to musical structures,'' Interface, vol. 22, pp. 195-212, 1993.
  • D. Pousset, ``La flûte-MIDI, l'histoire et quelques applications.'' Mémoire de Maîtrise, Université Paris-Sorbonne, 1992.
  • R. Povall, ``Realtime control of audio and video through physical motion: STEIM's BigEye,'' in Proc. Journées d'Informatique Musicale.
  • J. Pressing, ``Nonlinear maps as generators of musical design,'' Computer Music J., vol. 12, no. 2, pp. 35-46, 1988.
  • J. Pressing, ``Cybernetic issues in interactive performance systems,'' Computer Music J., vol. 14, no. 1, pp. 12-25, 1990.
  • M. Puckette and Z. Settel, ``Nonobvious roles for electronics in performance enhancement,'' in Proc. Int. Computer Music Conf. (ICMC'93), pp. 134-137, 1993.
  • C. Ramstein, Analyse, Représentation et Traitement du Geste Instrumental, PhD thesis, Institut National Polytechnique de Grenoble, 1991.
  • C. Ramstein and V. Hayward, ``The pantograph: a large workspace haptic device for a multi-modal human-computer interaction,'' in CHI'94, Conference on Human Factors in Computing Systems, ACM/SIGCHI, 1994.
  • C. Renard, Le Geste Musical. Hachette/Van de Velde, 1982.
  • F. Reynier and V. Hayward, ``Summary of the kinesthetic and tactile function of the human upper extremities,'' Tech. Rep. CIM-93-4, McGill University, 1993.
  • J. C. Risset and S. V. Duyne, ``Real-time performance interaction with a computer-controlled acoustic piano,'' Computer Music J., vol. 20, no. 1, pp. 62-75, 1996.
  • J. C. Risset, ``Nouveaux gestes musicaux: quelques points de repere historiques,'' in Colloque Les Nouveaux Gestes de la Musique, GMEM - Marseille, 1997.
  • C. Roads, ``The second STEIM symposium on interactive composition in live electronic music,'' Computer Music J., pp. 44-50, 1986.
  • C. Roads, Computer Music Tutorial. MIT Press, 1996.
  • X. Rodet, A. Terrier, and P. Pierrot, ``Jerry,'' tech. rep., IRCAM, 1997. Brevet - Boîtier amovible adaptable sur un périphérique du type souris d'ordinateur (N° 96 14759), A. Terrier et X. Rodet.
  • S. Rossignol, ``Segmentation - extraction du vibrato,'' tech. rep., IRCAM, 1997. Rapport d'Activité.
  • J. Rovan, ``Personal communication,'' 1997.
  • B. Rovan, ``Capteurs gestuels.'' Notes from his presentation at IRCAM's Académie d'Été, June 1997.
  • J. Rovan, M. Wanderley, S. Dubnov, and P. Depalle, ``Instrumental gestural mapping strategies as expressivity determinants in computer music performance,'' in Proceedings of the Kansei - The Technology of Emotion Workshop, (Genova - Italy), Oct. 1997.
  • J. B. Rovan and M. Wanderley, ``Gestural controllers: Strategies for expressive application.'' Abstract of the article presented at the SEAMUS Conference, 1998. Hanover, New Hampshire, USA.
  • R. Rowe, Interactive Music Systems - Machine Listening and Composing. MIT Press, 1993.
  • R. Rowe and E. L. Singer, ``Two highly integrated real-time music and graphics performance systems,'' in Proc. Int. Computer Music Conf. (ICMC'97), pp. 133-140.
  • D. Rubine, ``Specifying gestures by example.'' article available from his home page.
  • D. Rubine and P. McAvinney, ``The videoharp,'' in Proc. Int. Computer Music Conf. (ICMC'88), pp. 49-55, 1988.
  • P. Sartor and E. Parisi, ``Expressive control by fuzzy logic of a physical model clarinet in csound,'' in Proc. KANSEI - The Technology of Emotion Workshop, 1997.
  • H. Sawada, S. Ohkura, and S. Hashimoto, ``Gesture analysis using 3d acceleration sensor for music control,'' in Proc. Int. Computer Music Conf. (ICMC'95), pp. 257-260, 1995.
  • H. Sawada, N. Onoe, and S. Hashimoto, ``Acceleration sensor as an input device for musical environment,'' in Proc. Int. Computer Music Conf. (ICMC'96), pp. 421-424, 1996.
  • H. Sawada, N. Onoe, and S. Hashimoto, ``Sounds in hands - a sound modifier using datagloves and twiddle interface,'' in Proc. Int. Computer Music Conf. (ICMC'97), pp. 309-312.
  • A. Schaeffner, Origine des instruments de musique. Mouton éditeur, Paris, 1968, 428 p.
  • E. Scheirer, ``Extracting expressive performance information from recorded music,'' Master's thesis, MIT - Media Lab., 1995.
  • A. Schloss and D. Jaffe, ``Intelligent musical instruments: The future of musical performance or the demise of the performer?,'' Interface, vol. 22, pp. 183-193, 1993.
  • B. Schoner, C. Cooper, C. Douglas, and N. Gershenfeld, ``Data-Driven Modeling and Synthesis of Acoustical Instruments,'' in Proc. Int. Computer Music Conf. (ICMC'98), pp. 66-73.
  • Z. Settel, T. Holton, and D. Zicarelli, ``Remote control applications using "smart-controllers" in versatile hardware configurations,'' in Proc. Int. Computer Music Conf. (ICMC'93), pp. 156-159, 1993.
  • W. Siegel, ``DIEM -the danish institute of electroacoustic music: Studio report,'' in Proc. Int. Computer Music Conf. (ICMC'97), pp. 83-85.
  • W. Siegel, ``DIEM: Studio report,'' in Proc. Int. Computer Music Conf. (ICMC'98), pp. 312-314.
  • J. R. Smith, ``Field mice: Extracting hand geometry from electric field measurements,'' IBM Systems Journal, vol. 35, no. 3/4, pp. 587-608, 1996.
  • M. Starkier and P. Prevot, ``Real-time gestural control,'' in Proc. Int. Computer Music Conf. (ICMC'86), pp. 423-426, 1986.
  • C. Sul, K. Lee, and K. Wohn, ``Virtual Stage: A Location-Based Karaoke System,'' IEEE Multimedia, vol. 5, no. 2, pp. 42-52, 1998.
  • K. Suzuki, A. Camurri, S. Hashimoto, and P. Ferrentino, ``Intelligent Agent System for Human-Robot Interaction through Artificial Emotion,'' in Proceedings of the 1998 IEEE International Conference on Systems, Man and Cybernetics (SMC'98), 1998.
  • A. Takanishi and M. Maeda, ``Development of an anthropomorphic flutist robot WF-3RIV,'' in Proc. Int. Computer Music Conf. (ICMC'98), pp. 328-331.
  • A. Tanaka, ``Musical Technical Issues in Using Interactive Instrument Technology with Application to the Biomuse,'' in Proc. Int. Computer Music Conf. (ICMC'93), pp. 124-126, 1993.
  • L. Tarabella and G. Bertini, ``Original gesture interfaces for interactive computer music performances,'' in Journées d'Informatique Musicale - JIM, 1997.
  • L. Tarabella, ``Studio report of the computer music lab of cnuce/c.n.r.,'' in Proc. Int. Computer Music Conf. (ICMC'97), pp. 86-88, 1997.
  • L. Tarabella, M. Magrini, and G. Scapellato, ``A system for recognizing shape, position and rotation of the hands,'' in Proc. Int. Computer Music Conf. (ICMC'98), pp. 288-291.
  • E. Tobenfeld, ``A system for computer assisted gestural improvisation,'' in Proc. Int. Computer Music Conf. (ICMC'92), pp. 93-96, 1992.
  • F. Tobey, ``The ensemble member and the conducted computer,'' in Proc. Int. Computer Music Conf. (ICMC'95), pp. 529-530, 1995.
  • F. Tobey, ``Extraction of conducting gestures in 3D space,'' in Proc. Int. Computer Music Conf. (ICMC'96), pp. 305-307, 1996.
  • T. Todoroff, C. Traube, and J.-M. Ledent, ``Interfaces graphiques pour la commande d'outils de transformations sonores,'' in Proc. Journées d'Informatique Musicale, 1997.
  • T. Todoroff, C. Traube, and J.-M. Ledent, ``Nextstep graphical interfaces to control sound processing and spatialization instruments,'' in Proc. Int. Computer Music Conf. (ICMC'97), pp. 325-328.
  • T. Ungvary and M. Kieslinger, ``Creative and Interpretative Processmilieu for Live-Computermusic with the Sentograph,'' in Schriften zur Musikpsychologie und Musikästhetik 12: Controlling Creative Processes in Music, Kopiez and Auhagen (Eds.), Peter Lang Verlag, Frankfurt am Main, 1998.
  • T. Ungvary and R. Vertegaal, ``The SensOrg: A musical cyberinstrument with a cognitive ergonomical touch,'' in Proceedings of the 1998 IEEE International Conference on Systems, Man and Cybernetics (SMC'98), 1998.
  • S. Usa and Y. Mochida, ``A Multi-modal conducting simulator,'' in Proc. Int. Computer Music Conf. (ICMC'98), pp. 25-32, 1998.
  • R. Vertegaal, ``An evaluation of input devices for timbre space navigation,'' Master's thesis, Department of Computing - University of Bradford, 1994.
  • R. Vertegaal and E. Bonis, ``ISEE: An intuitive sound editing environment,'' Computer Music J., vol. 18, no. 2, pp. 21-29, 1994.
  • R. Vertegaal, ``The Standard Instrument Space Libraries: Demonstrating the Power of ISEE,'' in Proc. Int. Computer Music Conf. (ICMC'95), 1995.
  • R. Vertegaal and T. Ungvary, ``The Sentograph: Input devices and the communication of bodily expression,'' in Proc. Int. Computer Music Conf. (ICMC'95), pp. 253-256, 1995.
  • R. Vertegaal and B. Eaglestone, ``Comparison of input devices in an ISEE direct timbre manipulation task,'' Interacting with Computers, vol. 8, no. 1, pp. 13-30, 1996.
  • R. Vertegaal, T. Ungvary, and M. Kieslinger, ``Towards a musician's cockpit: Transducer, feedback and musical function,'' in Proc. Int. Computer Music Conf. (ICMC'96), pp. 308-311, 1996.
  • R. Vertegaal, Look Who's Talking to Who: Mediating Joint Attention in Multiparty Communication & Collaboration. PhD thesis, Cognitive Ergonomics Department, University of Twente, Netherlands, 1998.
  • A. Waibel, B. Suhm, M. T. Vo, and J. Yang, ``Multimodal interfaces for multimedia information agents,'' in Proc. ICASSP'97, pp. 167-170, 1997.
  • M. Waisvisz, ``The hands, a set of remote midi-controllers,'' in Proc. Int. Computer Music Conf. (ICMC'85), pp. 313-318, 1985.
  • M. Waisvisz, J. Ryan, and N. Collins, ``STEIM - de zoetgevooisde Bliksem,'' (chief editor: S. Sjollema), STEIM, 1993.
  • M. Wanderley et al., ``Gestural research at Ircam: A progress report,'' in Proc. Journées d'Informatique Musicale, 1998.
  • M. Wanderley, N. Schnell and J. B. Rovan, ``Escher - modeling and performing composed instruments in real-time,'' in Proceedings of the 1998 IEEE International Conference on Systems, Man and Cybernetics (SMC'98), 1998.
  • Y. Weiss and E. H. Adelson, ``Motion estimation and segmentation using a recurrent mixture of experts architecture,'' in IEEE Workshop on Neural Networks for Signal Processing, 1995.
  • D. Wessel, ``Improvisation with highly interactive real-time performance systems,'' in Proc. Int. Computer Music Conf. (ICMC'91), pp. 344-347, 1991.
  • D. Wessel, ``Instruments that learn, refined controllers, and source model loudspeakers,'' Computer Music J., vol. 15, no. 4, pp. 82-86, 1991.
  • A. D. Wilson and A. F. Bobick, ``Recognition and interpretation of parametric gesture,'' in International Conference on Computer Vision, 1998. Submitted for publication.
  • A. D. Wilson and A. F. Bobick, ``Using configuration states for the representation and recognition of gesture,'' in ICCV, 1995. Extended version.
  • A. D. Wilson, A. F. Bobick, and J. Cassell, ``Recovering the temporal structure of natural gesture,'' in Proceedings of the Second International Conference on Automatic Face and Gesture Recognition, October 1996.
  • T. Winkler, ``Making motion musical: Gestural mapping strategies for interactive computer music,'' in Proc. Int. Computer Music Conf. (ICMC'95), pp. 261-264, 1995.
  • T. Winkler, ``Motion-sensing music: Artistic and technical challenges in two works for dance,'' in Proc. Int. Computer Music Conf. (ICMC'98), pp. 471-474.
  • R. Woehrmann, ``Das Studio On-Line Projekt am IRCAM,'' 1997.
  • M. Wright and A. Freed, ``Open SoundControl: A New Protocol for Communicating with Sound Synthesizers,'' in Proc. Int. Computer Music Conf. (ICMC'97), pp. 101-104.
  • M. Wright, D. Wessel, and A. Freed, ``New musical control structures from standard gestural controllers,'' in Proc. Int. Computer Music Conf. (ICMC'97), pp. 387-390.
  • M. Wright, ``Implementation and performance issues with opensound control,'' in Proc. Int. Computer Music Conf. (ICMC'98), pp. 224-227.
  • J. Yokono and S. Hashimoto, ``Center of gravity sensing for motion interface,'' in Proceedings of the 1998 IEEE International Conference on Systems, Man and Cybernetics (SMC'98), 1998.
  • S. Ystad, Sound Modeling Using a Combination of Physical and Signal Models. PhD thesis, Université Aix-Marseille II, 1998.
  • B. Zagonel, ``Le geste musical,'' Master's thesis, IRCAM - École des Hautes Études en Sciences Sociales, 1990. Mémoire de DEA en Musique et Musicologie du XXème siècle.
  • I. Zannos, P. Modler, and K. Naoi, ``Gesture controlled music performance in a real-time network,'' in Proc. KANSEI - The Technology of Emotion Workshop, 1997.
  • I. Zannos, ``Designing an audio interaction modeling language: Some basic concepts and techniques,'' in Proc. Int. Computer Music Conf. (ICMC'98), pp. 443-446.
  • D. Zicarelli, ``Music technology as a form of parasite,'' in Proc. Int. Computer Music Conf. (ICMC'92), pp. 69-72, 1992.
  • T. Zimmerman, J. Smith, J. Paradiso, and N. Gershenfeld, ``Applying electric field sensing to human-computer interfaces,'' in Proc. Conf. on Human Factors in Computing Systems (CHI'95), 1995.



  • ``Atau Tanaka : le musicien cyber,'' PC Team - hors série, no. 2, pp. 86-87, Spring 1997.
  • Infusion Systems Ltd., The I-Cube System.
  • INPI, ``Exposition : Touches à touches.'' Interfaces - Marseille, April 1997.
  • Yamaha Corp., WX7 Wind MIDI Controller. Owner's manual.
  • Nouvelles interfaces homme-machine. Observatoire Français des Techniques Avancées, 1996.







    Marcelo Wanderley

    280 references as of November 3, 1998