DocumentCode :
2309471
Title :
ESCHER-modeling and performing composed instruments in real-time
Author :
Wanderley, Marcelo M. ; Schnell, Norbert ; Rovan, Joseph
Author_Institution :
Ircam, Paris, France
Volume :
2
fYear :
1998
fDate :
11-14 Oct 1998
Firstpage :
1080
Abstract :
This article presents ESCHER, a sound synthesis environment based on Ircam's real-time audio environment jMax. ESCHER is a modular system providing synthesis-independent prototyping of gesturally-controlled instruments by means of parameter interpolation. The system divides into two components: gestural controller and synthesis engine. Mapping between components takes place on two independent levels, coupled by an intermediate abstract parameter layer. This separation allows a flexible choice of controllers and/or sound synthesis methods.
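The two-level mapping described in the abstract can be illustrated with a minimal sketch: controller data is first mapped to an intermediate abstract parameter layer, which is then mapped to synthesis parameters by interpolating between presets. All names, parameters, and presets below are hypothetical illustrations, not ESCHER's actual API.

```python
# Hypothetical sketch of the two-level mapping with an intermediate
# abstract parameter layer, as described in the abstract. All names
# (sensor ranges, "brightness", preset values) are illustrative only.

def lerp(a, b, t):
    """Linear interpolation between a and b for t in [0, 1]."""
    return a + (b - a) * t

# Level 1: gestural controller -> abstract parameters.
# Here a raw MIDI-style sensor reading (0..127) becomes "brightness".
def controller_to_abstract(sensor_value):
    return {"brightness": sensor_value / 127.0}

# Level 2: abstract parameters -> synthesis-engine parameters,
# by interpolating between two stored presets. Swapping the engine
# only requires replacing this level, not the controller mapping.
PRESET_DULL = {"cutoff_hz": 400.0, "gain": 0.3}
PRESET_BRIGHT = {"cutoff_hz": 4000.0, "gain": 0.8}

def abstract_to_synthesis(abstract):
    t = abstract["brightness"]
    return {k: lerp(PRESET_DULL[k], PRESET_BRIGHT[k], t) for k in PRESET_DULL}

# Chain the two levels: a full-scale sensor value yields the bright preset.
params = abstract_to_synthesis(controller_to_abstract(127))
```

Because the two mapping levels meet only at the abstract layer, either side can be exchanged independently, which is the flexibility the abstract attributes to this separation.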
Keywords :
electronic music; humanities; real-time systems; signal synthesis; ESCHER; Ircam audio environment; gestural controller; jMax; modular system; music; parameter interpolation; real-time system; sound synthesis; Acoustic applications; Control system synthesis; Engines; Guidelines; Instruments; Interpolation; Keyboards; Prototypes; Real time systems; Vocabulary;
fLanguage :
English
Publisher :
IEEE
Conference_Titel :
1998 IEEE International Conference on Systems, Man, and Cybernetics
Conference_Location :
San Diego, CA
ISSN :
1062-922X
Print_ISBN :
0-7803-4778-1
Type :
conf
DOI :
10.1109/ICSMC.1998.727836
Filename :
727836