View the Processing Kinect v1 points recorder code on GitHub
View the Blender Python script on GitHub
The goal of this project is to (re)develop a way to record 3D mesh data from performers and to use it for content creation and motion graphics in any pipeline.
I developed a Blender Python script that imports Kinect point-cloud data as vertices into Blender's 3D space. Using this 3D information from dancers and stage artists, I am able to emit particles and smoke and create other effects.
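As a minimal sketch of this kind of import, the recorded points can be parsed and handed to Blender's `Mesh.from_pydata`. The whitespace-separated `x y z` text layout and the function names here are my own illustrative assumptions, not the exact format of the linked scripts:

```python
# Hedged sketch: parse recorded Kinect points and build a vertex-only Blender mesh.
try:
    import bpy  # only available when running inside Blender
except ImportError:
    bpy = None  # allows the parsing half to run outside Blender

def parse_points(lines):
    """Turn 'x y z' text lines into (x, y, z) float tuples, skipping malformed lines."""
    points = []
    for line in lines:
        parts = line.split()
        if len(parts) == 3:
            points.append(tuple(float(p) for p in parts))
    return points

def make_point_mesh(points, name="kinect_frame"):
    """Create a vertex-only mesh object from the parsed points (Blender 2.8+ API)."""
    mesh = bpy.data.meshes.new(name)
    mesh.from_pydata(points, [], [])  # vertices only: no edges, no faces
    obj = bpy.data.objects.new(name, mesh)
    bpy.context.collection.objects.link(obj)
    return obj
```

A vertex-only mesh like this is enough to serve as a particle or smoke emitter, since Blender can emit from vertices directly.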
I originally used openFrameworks to record my point-cloud data to a text stream.
I have updated this post to include a new Processing 3.x-based recording method.
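The writer side of such a recorder just streams one point per text line. The frame layout below (an `x y z` line per point, a blank line between frames) is an assumed, illustrative format rather than the exact output of the linked Processing/openFrameworks recorders; a Python analogue:

```python
import io

def write_frame(out, points):
    """Write one frame of (x, y, z) points as 'x y z' text lines.
    A blank line marks the end of the frame.
    NOTE: this layout is an illustrative assumption, not the recorders' exact format."""
    for x, y, z in points:
        out.write(f"{x} {y} {z}\n")
    out.write("\n")  # frame separator

# usage: stream one frame into an in-memory buffer instead of a file
buf = io.StringIO()
write_frame(buf, [(0.0, 0.0, 0.0), (1.5, 2.0, 3.0)])
```

Writing plain text keeps the recording readable and trivially importable from any tool in the pipeline, at the cost of file size compared to a binary dump.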
For issues with Kinect drivers, please use Zadig, found here:
These two Max/MSP patches work together. The main patch contains a particle system that listens over the network for incoming data, specifically data from the “MotionDetector” patch.
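Outside of Max, the same listen-over-the-network idea can be sketched with a plain UDP socket. This is a Python analogue, not the actual patch logic: the comma-separated `x,y,strength` payload and the loopback demo are assumptions for illustration, and the real patches use whatever port and message format the “MotionDetector” patch sends:

```python
import socket

def recv_motion_packet(sock):
    """Receive one datagram and decode it as comma-separated floats.
    The payload layout (e.g. 'x,y,strength') is an illustrative assumption."""
    data, _addr = sock.recvfrom(1024)
    return [float(v) for v in data.decode().split(",")]

# loopback demo: a sender and a receiver on the same machine
recv_sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
recv_sock.bind(("127.0.0.1", 0))        # let the OS pick a free port
port = recv_sock.getsockname()[1]

send_sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
send_sock.sendto(b"0.5,0.25,1.0", ("127.0.0.1", port))

values = recv_motion_packet(recv_sock)
recv_sock.close()
send_sock.close()
```

In practice, Max environments usually exchange this kind of data over UDP/OSC, which is why a datagram socket is the natural analogue here.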
Team: Jason Hendrik & Thierry Dumont
To generate particles or bits of light (to add energy into the environment), we need to be connected and active in these networked environments.
Production Year: 2012
Software: Photoshop, Processing, After Effects
Hardware: Laptop, projector, re-arrangeable wood sculpture
MARIA JULIA GUIMARAES
I mapped a small corner of wooden shapes using Blender 3D and AE/Photoshop.
A collaboration with Eric Grice.