KINECT VFX DEV

 

 

View the Processing Kinect v1 points recorder code on GitHub

View the Blender Python script on GitHub

 

The goal of this project is to (re)develop a workflow for recording 3D mesh data from performers and using it for content creation and motion graphics in any pipeline.

I developed a Blender Python script for importing Kinect point cloud data as vertices into Blender's 3D space. From those vertices I can emit particles and smoke and create other effects driven by the 3D information captured from dancers and stage artists.
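For reference, here is a minimal sketch (not the published script) of how recorded points can be brought into Blender as mesh vertices with the bpy API. The "one x y z triple per line" text format and the file path are assumptions; check the actual recorder output.

    # Minimal sketch: load one recorded frame of Kinect points and add them
    # to Blender as a vertex-only mesh (assumed format: one "x y z" per line).
    import bpy

    def load_points(path):
        points = []
        with open(path) as f:
            for line in f:
                parts = line.split()
                if len(parts) >= 3:
                    points.append(tuple(float(v) for v in parts[:3]))
        return points

    def points_to_object(points, name="kinect_frame"):
        # A vertex-only mesh is enough for Blender to emit particles/smoke from.
        mesh = bpy.data.meshes.new(name)
        mesh.from_pydata(points, [], [])   # vertices only, no edges or faces
        mesh.update()
        obj = bpy.data.objects.new(name, mesh)
        # Blender 2.8+; older versions use bpy.context.scene.objects.link(obj)
        bpy.context.collection.objects.link(obj)
        return obj

    # Hypothetical path to a single recorded frame
    points_to_object(load_points("/tmp/kinect_frame_0001.txt"))

Once the object exists, a particle system or a smoke Quick Effect can be added to it in Blender as usual.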

I originally used openFrameworks to record the point cloud data to a text stream.

I have since updated this post with a new recording method based on Processing 3.x.
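Because the recorder writes the point cloud as plain text, a multi-frame recording can be split back into frames on the Blender side before importing. A minimal sketch, assuming one point per line and a blank line between frames (adjust to the recorder's actual output format):

    # Minimal sketch: split a multi-frame text recording into per-frame point
    # lists. Assumes one "x y z" point per line and a blank line between frames.
    def read_frames(path):
        frames, current = [], []
        with open(path) as f:
            for line in f:
                parts = line.split()
                if len(parts) >= 3:
                    current.append(tuple(float(v) for v in parts[:3]))
                elif current:            # blank line ends the current frame
                    frames.append(current)
                    current = []
        if current:
            frames.append(current)
        return frames

    # Each frame can then be passed to points_to_object() above,
    # e.g. one object per frame, shown or hidden per animation frame.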

If you run into Kinect driver issues, use Zadig, found here:
https://zadig.akeo.ie/

 

[Demo: smoke_demo]

 

MOTION ACCUMULATOR


These two Max/MSP patches work together. The main patch contains a particle system
that listens over the network for incoming data, specifically data sent by the
"MotionDetector" patch.

Team: Jason Hendrik & Thierry Dumont

MainParticles.maxpat
MotionDetector.maxpat

To generate particles, or bits of light that add energy to the environment,
we need to be connected and active within these networked environments.
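For testing the main patch without the MotionDetector patch running, motion data can be faked from outside Max. The sketch below is only an illustration and makes several assumptions: that MainParticles.maxpat receives OSC messages over UDP (e.g. via [udpreceive]), and that the port (7400) and address (/motion) match the patch. It uses the python-osc package.

    # Hypothetical test sender standing in for MotionDetector.maxpat.
    # Assumes MainParticles.maxpat listens for OSC over UDP on port 7400
    # at the address "/motion" -- adjust both to match the actual patch.
    # Requires: pip install python-osc
    import math
    import time
    from pythonosc.udp_client import SimpleUDPClient

    client = SimpleUDPClient("127.0.0.1", 7400)

    # Send a slowly varying "motion amount" a few times per second.
    for i in range(200):
        amount = (math.sin(i * 0.1) + 1.0) / 2.0   # 0.0 .. 1.0
        client.send_message("/motion", amount)
        time.sleep(0.05)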