The goal of this project is to (re)develop a way to record 3D mesh data from performers and to use it for content creation and motion graphics in any pipeline.
I developed a Blender Python script for importing Kinect point cloud data as vertices into Blender's 3D space. Using the 3D information from dancers and stage artists, I am able to emit particles and smoke and create other effects.
I originally used openFrameworks to record my point cloud data to a text stream.
I have updated this post to include a new Processing 3.x based recording method.
For issues with Kinect drivers, please use Zadig, found here: