Beatscape is a mixed virtual-physical environment for musical ensembles in which sound objects interact with temporal waves to create rhythmic grooves. Musical outcomes in the virtual world are determined by the ensemble's actions in the physical world: part of the ensemble manipulates physical objects representing sounds, while the rest triggers those sound objects by generating waves with hand gestures.
I built this instrument with Akito Van Troyer, Aaron Albin, Brian Blosser, and Oliver Jan in Spring 2010 as our Technology Ensemble project. Using a camera, the system detects objects placed on top of a transparent glass table. Each object has a location and a pitch, and each is associated with a different sound. The objects are projected onto a screen, but they cannot produce any sound by themselves. Another player can trigger waves, which can be single, repetitive, or continuous; they too cannot produce any sound on their own. It is the collision between the objects and the waves that produces sound. The instrument therefore demands active, effective collaboration within the ensemble. We wrote a paper about the instrument, which appeared at NIME 2011.
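The core collision idea can be sketched in a few lines. This is a minimal illustration, not the actual Beatscape implementation: the names (`SoundObject`, `Wave`, `step`) and the circular-wavefront model are my own assumptions for the sake of the example. A wave front expands outward each update step, and any object whose distance from the wave's origin is crossed by the front during that step gets triggered.

```python
import math
from dataclasses import dataclass

@dataclass
class SoundObject:
    x: float
    y: float
    pitch: int  # e.g. a MIDI note number (hypothetical choice)

@dataclass
class Wave:
    origin_x: float
    origin_y: float
    speed: float       # radius growth per update step
    radius: float = 0.0

def step(wave, objects, dt=1.0):
    """Expand the wave front and return the objects it crossed this step."""
    old_radius = wave.radius
    wave.radius += wave.speed * dt
    hits = []
    for obj in objects:
        d = math.hypot(obj.x - wave.origin_x, obj.y - wave.origin_y)
        # An object sounds exactly when the front passes over its location,
        # so each wave naturally triggers it only once.
        if old_radius < d <= wave.radius:
            hits.append(obj)
    return hits
```

In a real system the returned `hits` would be sent to a synthesizer; a repetitive wave would simply reset `radius` to zero after each pass.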
The source code can be downloaded from Google Code under the name audio-sketch.
The instrument has been performed twice: first at the concert Listening Machines 2010:
…and then at FutureMedia Fest 2010.