The Ravey Dancer

In this two-week Processing workshop we built an interactive music visualization for live stage performances. We created a music-reactive point cloud using two Kinects facing each other to track any object or person standing in the area between them. The cloud builds up a 360° model of the person dancing between the two Kinects, while the cloud itself reacts to live data from Ableton.
We ran into several major problems when trying to use both Kinects' signals at once and process the data in Processing. In the end we used seven libraries to make the point cloud react the way we wanted.
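One core step in building a 360° model from two opposing Kinects is bringing both point clouds into a single coordinate system: the second sensor looks back at the first, so its points must be rotated 180° around the vertical axis and shifted by the distance between the sensors. The snippet below is a minimal sketch of that transform in plain Java; the class name, the sensor distance, and the sample points are illustrative assumptions, not the workshop's actual code.

```java
import java.util.ArrayList;
import java.util.List;

public class CloudMerge {
    // Assumed distance between the two Kinects along the Z axis, in metres.
    static final float SENSOR_DISTANCE = 3.0f;

    // Map a point from the second Kinect's frame into the first Kinect's frame:
    // a 180° rotation about the vertical (Y) axis flips x and z (x -> -x, z -> -z),
    // then the translation places the second sensor SENSOR_DISTANCE away.
    static float[] toSharedFrame(float x, float y, float z) {
        return new float[] { -x, y, SENSOR_DISTANCE - z };
    }

    public static void main(String[] args) {
        List<float[]> merged = new ArrayList<>();
        // A point as seen by Kinect 1, already in the shared frame.
        merged.add(new float[] { 0.5f, 1.0f, 1.2f });
        // The same physical point as seen by Kinect 2, which faces the opposite way.
        merged.add(toSharedFrame(-0.5f, 1.0f, 1.8f));
        for (float[] p : merged) {
            System.out.println(p[0] + " " + p[1] + " " + p[2]);
        }
    }
}
```

After this transform, both clouds can be drawn in one Processing scene and deformed together by the incoming Ableton data.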


Making Of