Interactive Room Research
Jason Hendrik, Pascal Champagne, Matt Fernandez, Ralph Nahkle, Nara Van Rossum
The whole is greater than the sum of its parts. We are developing an interactive installation whose “parts” are semi-automated life-forms (that is, moving pixels and lines). We will be mapping these automata onto a sculptural centerpiece. The life-forms will be interactive: depending on varying sets of conditions, they will either be attracted to viewers or repelled by them. The entire space will evolve and be transformed by the interactions viewers have with it. We will be using sound in ambient and sculptural ways, and to allow for variation we will create connections between viewers’ body movements and the sonic environment.
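The attract/repel behavior described above could be sketched along these lines. This is an illustrative Python sketch, not the installation's actual Max/MSP patch; the function name, parameters, and force model are all assumptions:

```python
import math

def steer(agent_pos, target_pos, attract=True, strength=0.5, max_step=2.0):
    """One movement step for an automaton relative to a tracked viewer.

    agent_pos / target_pos: (x, y) tuples in screen or projection space.
    attract: True pulls the life-form toward the viewer, False pushes it away.
    Illustrative only; the real conditions deciding attract vs. repel
    would come from the installation's own rule set.
    """
    dx = target_pos[0] - agent_pos[0]
    dy = target_pos[1] - agent_pos[1]
    dist = math.hypot(dx, dy) or 1e-9  # avoid division by zero
    sign = 1.0 if attract else -1.0
    # Step size grows with distance but is capped so motion stays smooth.
    step = min(max_step, strength * dist)
    return (agent_pos[0] + sign * step * dx / dist,
            agent_pos[1] + sign * step * dy / dist)
```

Calling this once per frame for every life-form, with `target_pos` fed by body tracking, gives the swarm its basic attracted/repelled motion.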
Maya/Blender, Photoshop/Illustrator, Max/MSP Jitter, Synapse, OpenNI, Ableton Live, MadMapper
Projection Surface Mapping, Open Source Kinect Body Tracking, Audio Responsiveness, Computer Vision Tracking, Proximity Sensing, Velocity Calculations, 3D Surface Creation/Manipulation, OSC Data Send/Receive
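The proximity sensing and velocity calculations listed above reduce to a few lines once you have tracked joint positions per frame. A minimal Python sketch, assuming 3D coordinates and a known frame interval (both names and units are placeholders):

```python
import math

def velocity(p_prev, p_curr, dt):
    """Per-axis velocity of a tracked joint between two frames.

    p_prev, p_curr: (x, y, z) positions from successive tracking frames.
    dt: time between frames in seconds. Returns units-per-second per axis.
    """
    return tuple((c - p) / dt for p, c in zip(p_prev, p_curr))

def proximity(a, b):
    """Straight-line distance between two tracked points (e.g. viewer
    and sculpture), used to trigger attract/repel or sonic changes."""
    return math.dist(a, b)
```

Speed (for driving audio parameters) is then just `math.hypot(*velocity(...))`.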
So far we have a room, soundscapes being designed,
interactions being considered, choreographies being contemplated…
Here is some preliminary “making of” in bits and pieces:
– – –
Sonic Profiles of the project:
[audio clips]
Thanks to JON BELLONA’s Max/MSP work, we have access to the UDP/OSC data coming from Synapse, NiMate, Processing, and others.
The following is a quick patch that brings together all the X, Y, Z co-ordinates from JON’s patch into our simplified point of entry.
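For readers outside Max/MSP, the joint data arriving from Synapse is just OSC messages over UDP: an address string, a `,fff` type tag, and three big-endian floats. A stdlib-only Python sketch of the decoding (the address `/righthand_pos_body` is the Synapse naming convention; port and message set depend on your Synapse configuration):

```python
import struct

def _read_padded_string(data, offset):
    """Read a NUL-terminated OSC string, then skip to the next 4-byte boundary."""
    end = data.index(b"\x00", offset)
    text = data[offset:end].decode("ascii")
    offset = end + 1
    offset += (-offset) % 4  # OSC pads strings with NULs to multiples of 4
    return text, offset

def parse_osc_xyz(packet):
    """Parse a single OSC message carrying three floats (',fff'),
    e.g. a Synapse-style joint message such as '/righthand_pos_body x y z'.
    Returns (address, (x, y, z))."""
    address, off = _read_padded_string(packet, 0)
    typetags, off = _read_padded_string(packet, off)
    if typetags != ",fff":
        raise ValueError("expected three float arguments, got " + typetags)
    x, y, z = struct.unpack(">fff", packet[off:off + 12])  # big-endian float32
    return address, (x, y, z)
```

In practice you would bind a UDP socket to Synapse's output port (commonly 12345, but check your setup), call `sock.recvfrom(1024)`, and feed each datagram through `parse_osc_xyz` to get the same X, Y, Z stream the Max patch consolidates.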