Gestural Mixing in Live Performance

After working on a number of projects with large channel count sound systems, I decided to focus on the designer/live-engineer relationship, specifically the way in which typically mono/stereo content is translated to a multi-point sound system. While this is not novel or interesting in itself, the standard ways of doing it are extremely time consuming, and the accepted technology that allows for flexible mixing control in real time typically does not lend itself to the creative, collaborative environment of devised and experimental performance. The ability to make quick, expressive choices that allow the creative conversation to continue is paramount.

I wanted a director, choreographer, or conductor to be able to express an idea for the mix of an instrument or sound through gesture or another physical means.
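One way such a gesture could drive a multi-point mix is by mapping a tracked position to per-speaker gains. The sketch below is purely illustrative, not the system described here: it assumes a 2D gesture coordinate and known speaker positions, and uses simple distance-based amplitude panning (in the spirit of DBAP) so that speakers nearer the gesture point receive more level. The function name and `rolloff` parameter are hypothetical.

```python
import math

def gesture_to_gains(gesture_xy, speaker_positions, rolloff=6.0):
    """Map a 2D gesture position to per-speaker gains using a
    simple distance-based amplitude panning scheme (DBAP-style).
    Speakers closer to the gesture point receive higher gains."""
    gx, gy = gesture_xy
    weights = []
    for sx, sy in speaker_positions:
        d = math.hypot(sx - gx, sy - gy)
        # inverse-distance weight; small offset avoids divide-by-zero
        # when the gesture sits exactly on a speaker
        weights.append(1.0 / (d + 0.1) ** (rolloff / 6.0))
    # normalize so total power stays constant as the gesture moves
    norm = math.sqrt(sum(w * w for w in weights))
    return [w / norm for w in weights]

# four speakers at the corners of a unit square
speakers = [(0, 0), (1, 0), (0, 1), (1, 1)]
# a gesture near the front-left corner favors that speaker
gains = gesture_to_gains((0.1, 0.1), speakers)
```

In a live setting a mapping like this would be fed continuously from a sensor or tracking system, with the gain vector sent on to the mixing engine.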

I developed and used the first iteration of this idea for the piece romeoandjuliet/VOID at Arizona State University, directed by Stephen Wrentmore.