Controlling Sound with Your Hands
We have been experimenting with full-body interactions in the lab of late. Yesterday we were invited to install some of our work at NYC MediaLab's Future Interfaces event:
In the installation, the positions of people's hands are used to control variables on virtual synthesizers in real time. The screen also shows a visual display to help participants work out exactly how they are affecting the sound.
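At its core, this kind of mapping is just a linear rescale: a hand coordinate from the sensor's range is mapped onto a synth parameter's range and clamped. The sketch below is illustrative only; the function name and ranges are our own, not taken from the installation code.

```cpp
#include <algorithm>

// Linearly map a hand coordinate from [inMin, inMax] onto a synth
// parameter range [outMin, outMax], clamping at the edges so a hand
// leaving the tracked zone doesn't push the parameter out of bounds.
float mapHandToParam(float hand, float inMin, float inMax,
                     float outMin, float outMax) {
    float t = (hand - inMin) / (inMax - inMin);
    t = std::max(0.0f, std::min(1.0f, t));
    return outMin + t * (outMax - outMin);
}
```

openFrameworks users would typically reach for the built-in `ofMap()` with its `clamp` argument instead; the standalone version is shown here to make the arithmetic explicit.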
Participants' gestures are read in real time by a Kinect2 device just under the screen. The Kinect2 is a great advance on the original Kinect; however, this new breed of Microsoft depth sensor is effectively bound to Windows computers.
From Windows, the KinectV2-OSC app is used to stream the Kinect2 data over to openFrameworks on a Mac. On the Mac, the data is read in using the ofxKinectV2-OSC addon.
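On the receiving side, each OSC message arrives with a slash-delimited address identifying the body and joint it describes. The address pattern below (`/bodies/<id>/joints/<name>`) is an assumption for illustration, not necessarily the addon's documented format, and the parser is a plain C++ sketch rather than the ofxKinectV2-OSC internals.

```cpp
#include <sstream>
#include <string>
#include <vector>

// Split an OSC-style address such as "/bodies/0/joints/HandLeft"
// into its path components, dropping empty segments.
std::vector<std::string> splitAddress(const std::string& address) {
    std::vector<std::string> parts;
    std::stringstream ss(address);
    std::string part;
    while (std::getline(ss, part, '/')) {
        if (!part.empty()) parts.push_back(part);
    }
    return parts;
}
```

In practice the addon hides this plumbing and hands you skeleton objects directly; the point of the sketch is just that the whole Windows-to-Mac bridge is ordinary OSC messages over the network.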
Once inside openFrameworks, the data is routed to control variables on virtual sound synthesizers called Audio Units. These synths are launched and manipulated with the help of Adam Carlucci's ofxAudioUnit addon, which is fairly simple to use but offers a great deal of power. Connecting openFrameworks to Audio Units lets you manipulate a wide variety of sound-generating devices.
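One practical detail when driving synth parameters from skeleton data: raw per-frame joint readings jitter, which can make the sound "zipper". A simple one-pole smoother, sketched below, is a common remedy; this is our own illustrative addition, not necessarily what the installation uses.

```cpp
// One-pole low-pass smoother for a control value: each update moves the
// smoothed value a fraction `alpha` of the way toward the new target.
// alpha in (0, 1]: smaller values are smoother but respond more slowly.
class ParamSmoother {
public:
    explicit ParamSmoother(float alpha) : alpha(alpha), value(0.0f) {}

    float update(float target) {
        value += alpha * (target - value);
        return value;
    }

private:
    float alpha;
    float value;
};
```

The smoothed output would then be written to the Audio Unit parameter each frame instead of the raw hand position.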
If you want to see how the addons relate to each other, check out the audioUnitExperiments repo. Note that to see the code exactly as shown in the video, you'll need to be on this specific commit.