Development of an intuitive control and sound-manipulation system for improvised electro-acoustic music performance
My goal is to improve and expand certain aspects of my instrumental practices and live setup, making it a flexible and expressive system for both solo and ensemble situations.
Specifically, I want to focus on the development of DSP software and a control system, involving gestural and hardware control.
It should allow for the intuitive, spontaneous and instantaneous action/reaction needed in the context of free improvisation.
In my live performance practice, I usually combine a set of instruments in the traditional sense (like saxophone, bass clarinet, theremin) with electronic sound processing.
Originally, I transformed their sounds into soundscapes with hardware effects – a modified delay stomp box, a reverb and a multi-effects pedal.
In recent years, however, I have envisioned a more complex sound universe. When I brought Max/MSP (which I had previously used primarily in the context of live visuals) into my setup, together with Max for Live, a huge range of sound-shaping and transformation possibilities opened up.
The source sound material that is fed into the system is ‘harvested’ live by sampling the output of my ‘traditional’ instruments. I repurpose these sounds into loops (broken by bufferShuffling) or create layers that accompany further improvisation while morphing and evolving in (semi-)automated and generative ways.
This temporal reorganization of the sounds, together with filtering/shaping, mixing and further DSP, results in an increasingly complex system of variable elements that must be controlled in a live performance context.
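As a rough illustration outside Max/MSP, the bufferShuffling idea – cutting a live-recorded buffer into slices and re-concatenating them in a new order to break up a loop – could be sketched as follows. The function name, the NumPy implementation and the equal-slice assumption are mine, not the actual patch:

```python
import numpy as np

def buffer_shuffle(buf: np.ndarray, n_slices: int,
                   rng: np.random.Generator) -> np.ndarray:
    """Cut a mono sample buffer into equal slices and stitch them back
    together in random order -- a minimal sketch of loop-breaking by
    buffer shuffling (illustrative, not the actual Max/MSP patch)."""
    usable = len(buf) - (len(buf) % n_slices)    # drop remainder so slices are equal
    slices = buf[:usable].reshape(n_slices, -1)  # one row per slice
    order = rng.permutation(n_slices)            # random playback order
    return slices[order].reshape(-1)             # re-concatenated loop

# Example: an 8-sample 'recorded' buffer shuffled as 4 slices of 2 samples.
rng = np.random.default_rng(0)
buf = np.arange(8.0)
loop = buffer_shuffle(buf, 4, rng)
```

In a live patch the same operation would of course act on a buffer of recorded instrument audio, with the slice order re-drawn on each loop pass.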
The use of general-purpose hardware controllers to manage this complexity, however, feels far more detached from mind and body than, for example, the saxophone reed, the keys of the saxophone or even the knobs of my DD3 delay stomp box; a problem I have often encountered in performances combining computer-based DSP with ‘traditional’ instruments.
Especially in the context of (free) improvisation, it is necessary to facilitate as direct a link as possible between the ‘colonized neurons’ involved (Joel Ryan, Sonic Acts 2012) and the musical action.
My research thus involves not only developing the DSP further, but also condensing the many available parameters, through meaningful mapping, into an intuitive control system. By adding gestural controls to my system, I hope to create an expressive and meaningful control ‘vocabulary’, giving my system the properties of a true instrument – one that will require practicing, experimentation and performance experience.
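The kind of mapping meant here – one expressive gesture driving several DSP parameters at once, each along its own curve – could look like this in a minimal sketch. The parameter names, ranges and curves are illustrative assumptions, not the actual mapping:

```python
def map_gesture(x: float) -> dict:
    """Map one normalized gesture value (0..1) onto several DSP
    parameters at once -- a sketch of condensing many parameters
    into a single expressive control (names/curves are assumptions)."""
    x = min(max(x, 0.0), 1.0)  # clamp to the controller's range
    return {
        # linear: more gesture, more feedback
        "delay_feedback": 0.2 + 0.7 * x,
        # exponential sweep 200 Hz .. 20 kHz, perceptually even
        "filter_cutoff_hz": 200.0 * (20000.0 / 200.0) ** x,
        # quadratic: most sensitive at the top of the gesture
        "grain_density": round(1 + 31 * x * x),
    }

low, high = map_gesture(0.0), map_gesture(1.0)
```

The point of such a mapping layer is that one practiced physical motion produces a coherent, instrument-like change across the whole sound, rather than three unrelated knob turns.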