Computer Music Journal 34(4), Winter 2010, a special issue on Human-Computer Interaction that I guest-edited, is now online: http://www.mitpressjournals.org/toc/comj/34/4 or http://muse.jhu.edu/journals/computer_music_journal/toc/cmj.34.4.html.
Last weekend, I dragged four pianists into the Sonic Lab to play in a pilot study, run with Cavan Fyans, Javier Jaimovich, and Nick Gillian, on discriminating expressive intent from motion and EMG data. We inaugurated the MuSE group’s new Qualisys motion-capture system and worked out how to synchronize audio, video, EMG, and motion-capture data via SMPTE timecode. Desperately needed: an omnidirectional infrared light source.
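The alignment step can be illustrated with a small sketch. This is not the lab's actual pipeline, just a minimal illustration of the idea: once every stream carries SMPTE timecode, alignment reduces to converting timecodes to a common frame count. The 25 fps non-drop frame rate is an assumption for the example.

```python
# Sketch: aligning separately recorded streams (audio, video, EMG, mocap)
# by converting their SMPTE timecodes to absolute frame counts.
# Assumes 25 fps, non-drop-frame timecode.

def smpte_to_frames(timecode: str, fps: int = 25) -> int:
    """Convert an 'HH:MM:SS:FF' SMPTE timecode to an absolute frame count."""
    hours, minutes, seconds, frames = (int(p) for p in timecode.split(":"))
    return ((hours * 60 + minutes) * 60 + seconds) * fps + frames

def offset_seconds(tc_a: str, tc_b: str, fps: int = 25) -> float:
    """Offset in seconds between two streams' start timecodes."""
    return (smpte_to_frames(tc_b, fps) - smpte_to_frames(tc_a, fps)) / fps

# e.g. a mocap take stamped 2 s after the EMG recorder started:
print(offset_seconds("01:00:00:00", "01:00:02:00"))  # → 2.0
```

With each stream's start timecode in hand, the later stream is simply shifted by the computed offset before analysis.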
An installation for two motorized microphone pendulums. Visitors must use their voices either to stop a swinging pendulum, silencing its machine noises, or to set a stationary one in motion, creating peaceful chime sounds.