Beatscape is a mixed virtual-physical environment for musical ensembles where sound objects interact with temporal waves to create rhythmic grooves. Musical outcomes in the virtual world are determined by the ensemble’s actions in the physical world. Part of the ensemble manipulates physical objects representing sounds while the other part triggers the sound objects by generating waves with hand gestures.
I built this instrument with Akito Van Troyer, Aaron Albin, Brian Blosser and Oliver Jan in Spring 2010 as our Technology Ensemble project. Using a camera, the system detects objects placed on top of a transparent glass table. Each object has a location, a pitch and an associated sound, and the objects are projected onto a screen; however, they cannot produce any sound by themselves. There are also waves, triggered by other players, which can be single, repetitive or continuous. They too cannot produce any sound on their own: it is the collision between the objects and the waves that produces the sound. The instrument therefore demands active and effective collaboration within the ensemble. We wrote a paper about the instrument, which appeared at NIME 2011.
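To make the interaction concrete, here is a minimal Processing-style sketch of the core mechanic: sound objects stay silent until a wave front crosses them. The names (SoundObject, Wave, playSample) are illustrative, not from the actual Beatscape code, and the wave is modeled as an expanding circle for simplicity; in Beatscape the samples are actually triggered in Max/MSP.

ArrayList<SoundObject> objects = new ArrayList<SoundObject>();
Wave wave;

void setup() {
  size(640, 480);
  objects.add(new SoundObject(150, 200, 2));
  objects.add(new SoundObject(450, 300, 0));
  wave = new Wave(320, 240, 3.0);   // a single wave, for simplicity
}

void draw() {
  background(0);
  wave.update();
  noFill();
  stroke(255);
  ellipse(wave.cx, wave.cy, wave.radius * 2, wave.radius * 2);
  for (SoundObject o : objects) {
    // Collision: the wave front has reached the object's location.
    if (!o.hit && dist(o.x, o.y, wave.cx, wave.cy) <= wave.radius) {
      o.hit = true;
      playSample(o.pitch);          // stand-in for the real sample trigger
    }
    ellipse(o.x, o.y, 20, 20);
  }
}

void playSample(int pitch) {
  println("trigger sample at pitch step " + pitch);
}

class SoundObject {
  float x, y;    // location on the table/screen
  int pitch;     // one of four discrete pitch steps
  boolean hit;   // already triggered by the current wave?
  SoundObject(float x, float y, int pitch) {
    this.x = x; this.y = y; this.pitch = pitch;
  }
}

class Wave {
  float cx, cy, radius, speed;      // an expanding circular front
  Wave(float cx, float cy, float speed) {
    this.cx = cx; this.cy = cy; this.speed = speed;
  }
  void update() { radius += speed; }
}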
The source code can be downloaded from Google Code under the project name audio-sketch.
The instrument has been performed twice: first at the Listening Machines 2010 concert:
…and second at FutureMedia Fest 2010.
In the first performance Aaron, Akito, Brian, Oliver and I played; in the second it was Aaron, Andrew Colella, Avinash Sastry, Sang Won Lee and me. In both performances we played the same semi-improvisational composition. In this piece, both parts of the ensemble explore static and dynamic contributions, and how combining the two lets the ensemble arrive at its full expressive potential. The composition starts with an intro that familiarizes the audience with how the instrument works. In the first section the objects are stationary, while the waves are single and allowed to move. In the second section the roles reverse: the objects can now be moved, added and removed, while the waves are repetitive and fixed in place. In the third section everything is freed: the waves become continuous and can travel across the screen or be placed anywhere, and the objects can also move freely.
We use the reacTIVision framework for object tracking. The framework communicates over OSC, through Max/MSP, with our software, which is based on Akito Van Troyer's Nular. This software receives messages from the Wiimotes used to trigger waves, handles the object/wave collisions, and renders the visuals projected onto the screen. The waves have two discrete speeds, controlled by the force of the Wiimote gesture. The playback speed (and thus the pitch) of an object is controlled by its rotation angle obtained from reacTIVision, mapped to four discrete values. The object art is drawn in Processing. Finally, the sound samples are triggered in a second Max/MSP patch.
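As a rough illustration of these mappings (a sketch, not the actual Nular code), the Processing snippet below quantizes a fiducial's rotation angle, which reacTIVision reports in radians, into four pitch steps and notifies the Max/MSP playback patch using the oscP5 library. The OSC address "/beatscape/trigger" and the port numbers are made-up placeholders.

import oscP5.*;
import netP5.*;

OscP5 osc;
NetAddress maxPatch;

void setup() {
  osc = new OscP5(this, 12000);                  // listening port (arbitrary)
  maxPatch = new NetAddress("127.0.0.1", 8000);  // Max/MSP playback patch
}

void draw() { }  // nothing to animate in this fragment

// Quantize a fiducial angle (0..TWO_PI) to one of four pitch steps.
int pitchStep(float angle) {
  return int(angle / (TWO_PI / 4)) % 4;
}

// Called when a wave collides with an object: tells Max/MSP which
// sample to play and at which of the four playback speeds.
void triggerObject(int objectId, float angle) {
  OscMessage msg = new OscMessage("/beatscape/trigger");
  msg.add(objectId);
  msg.add(pitchStep(angle));
  osc.send(msg, maxPatch);
}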
Brian and I worked on setting up the physical table (which was highly sensitive to lighting and to the heights of the table and the camera), the reacTIVision framework, the communication between the framework and our software via Max/MSP and OSC, and the audio playback patch in Max/MSP.