Thx! The visuals are a piece I've been kicking around for a while – I was originally planning to shoot them as a music video synced to a track. They're essentially a particle simulation built on layers of sine waves and simplex noise with plenty of randomised variables. The system can produce a lot of variation, and there are certain 'events' that only happen every few hours on average.
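To give a rough idea of what "layers of sine waves and simplex noise" means in practice, here's a minimal sketch of one particle's displacement. It's not the actual code – the real app uses openFrameworks' simplex noise, and the octave counts, frequencies, amplitudes and the 0.15 noise weight here are made-up illustration values. The noise function is a cheap hash-based stand-in, not real simplex noise:

```cpp
#include <cmath>
#include <cstdint>

// Stand-in for a smooth noise function such as openFrameworks' ofNoise():
// hash-based value noise, NOT the simplex noise the real app uses.
float hashNoise(float x, float y) {
    int xi = (int)std::floor(x), yi = (int)std::floor(y);
    auto h = [](int a, int b) {
        uint32_t n = (uint32_t)(a * 374761393 + b * 668265263);
        n = (n ^ (n >> 13)) * 1274126177u;
        return (float)(n & 0xffffff) / (float)0xffffff; // uniform-ish in 0..1
    };
    float tx = x - xi, ty = y - yi;
    // bilinear blend of the four corner hashes
    float a = h(xi, yi),     b = h(xi + 1, yi);
    float c = h(xi, yi + 1), d = h(xi + 1, yi + 1);
    return a + (b - a) * tx + (c - a) * ty + (a - b - c + d) * tx * ty;
}

// One particle's vertical displacement at position x and time t:
// three layered (octave) sine waves plus a small noise term.
float displace(float x, float t) {
    float y = 0.0f;
    float freq = 1.0f, amp = 1.0f;
    for (int octave = 0; octave < 3; ++octave) {
        y += amp * std::sin(x * freq + t * freq);
        freq *= 2.0f;  // each layer doubles the frequency...
        amp  *= 0.5f;  // ...and halves the amplitude
    }
    y += 0.15f * (hashNoise(x, t) * 2.0f - 1.0f); // noise in [-0.15, 0.15]
    return y;
}
```

Stacking octaves like this is what gives the motion structure at several scales at once, and the noise term is what keeps runs from ever repeating exactly.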
I then got approached to show something at Day for Night, and felt that for that festival environment it'd be fun to take it there but add some interactivity, whereby you can push the particles out of the way and generally mess around with the whole thing.
It runs as two custom apps built using openFrameworks: one machine runs tracking software that combines and analyses depth data from two Kinects; the other takes the tracking data and generates the visuals.
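The "combines depth data from two Kinects" step boils down to transforming each camera's depth points into a shared world frame before any blob analysis happens. A minimal sketch of that merge, under the assumption of a simple rigid transform per camera (the `Extrinsics` struct and its calibration values are hypothetical – real calibration would come from measuring the rig):

```cpp
#include <array>
#include <cmath>
#include <vector>

using Vec3 = std::array<float, 3>;

// Hypothetical extrinsics for one Kinect: a yaw rotation plus a translation
// mapping points from that camera's frame into a shared world frame.
struct Extrinsics {
    float yaw;     // rotation about the vertical (y) axis, radians
    Vec3  offset;  // camera position in world space

    Vec3 toWorld(const Vec3& p) const {
        float c = std::cos(yaw), s = std::sin(yaw);
        return { c * p[0] + s * p[2] + offset[0],
                 p[1] + offset[1],
                -s * p[0] + c * p[2] + offset[2] };
    }
};

// Merge depth points from both cameras into one world-space cloud – the
// "combine" half of the tracking app. The analysis (blob detection and
// tracking on the merged cloud) is omitted here.
std::vector<Vec3> mergeClouds(const std::vector<Vec3>& a, const Extrinsics& ea,
                              const std::vector<Vec3>& b, const Extrinsics& eb) {
    std::vector<Vec3> world;
    world.reserve(a.size() + b.size());
    for (const auto& p : a) world.push_back(ea.toWorld(p));
    for (const auto& p : b) world.push_back(eb.toWorld(p));
    return world;
}
```

Keeping tracking and rendering as separate apps (and machines) means the expensive depth processing never competes with the visuals for frame time; the visuals app only ever sees compact tracking data.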
Here's the tracking app in action: