This creative tool comes in two parts: a Processing sketch that records Kinect frame data, and a Blender script that imports those frames as a mesh.
The goal of this project is to record 3D mesh data from performers and to use it for content creation and motion graphics in any pipeline.
I developed a script for animating vertices in Blender's 3D space. I am now able to emit particles and smoke and create other effects using 3D information captured from dancers and other stage performers.
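To illustrate the import side, here is a minimal sketch in Python, assuming a hypothetical frame format of one "x y z" triple per line per file (the function name and format are illustrative, not the actual script's API):

```python
def load_frame(path):
    """Parse a frame file (one 'x y z' triple per line) into vertex tuples."""
    verts = []
    with open(path) as f:
        for line in f:
            x, y, z = map(float, line.split())
            verts.append((x, y, z))
    return verts

# Write a tiny sample frame, then parse it back
with open("sample_frame.txt", "w") as f:
    f.write("0.0 0.0 1.0\n0.5 -0.2 1.3\n")
verts = load_frame("sample_frame.txt")

# Inside Blender, the parsed vertices could become a point-cloud mesh:
#   mesh = bpy.data.meshes.new("kinect_frame")
#   mesh.from_pydata(verts, [], [])   # vertices only, no edges/faces
```

With a vertex-only mesh like this, Blender's particle and smoke systems can emit from the imported points, which is what enables the performer-driven effects described above.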
I originally used openFrameworks to record my point-cloud data to a text stream. I have updated this post to include a new Processing 3.x based recording method.
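The recording step can be sketched as follows. The real recorder is a Processing sketch, but the frame serialization itself is shown here as a Python stand-in; the format (one text file per frame, one "x y z" triple per line) and the function name are assumptions for illustration:

```python
def write_frame(points, path):
    """Write a list of (x, y, z) point-cloud tuples to a plain-text frame file."""
    with open(path, "w") as f:
        for x, y, z in points:
            f.write(f"{x} {y} {z}\n")

# Example: two points standing in for a depth-camera frame
write_frame([(0.1, 0.2, 1.5), (0.3, 0.4, 1.6)], "frame_0001.txt")
```

Writing one file per frame keeps the Blender import simple: each file maps to one mesh state on the timeline.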
If you run into issues with Kinect drivers, please use Zadig, found here: