View the Processing Kinect v1 points recorder code on GitHub
View the blenderPython script on GitHub
The goal of this project is to (re)develop a way to record 3D mesh data from performers and to use it for content creation and motion graphics in any pipeline.
I developed a blenderPython script for importing Kinect point-cloud data as vertices into Blender's 3D space. With it I can emit particles and smoke and create other effects using the 3D information captured from dancers and stage artists.
I originally used openFrameworks to record my point-cloud data to a text stream.
I have updated this post to include a new Processing 3.x based recording method.
For issues with Kinect drivers, please use Zadig, found here:
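Since the recorder writes point-cloud frames to a text stream, the Blender side mostly comes down to parsing that stream back into vertex coordinates. The exact file format lives in the linked recorder code; as a minimal sketch, assuming one `x y z` triple per line (an assumption, not necessarily the recorder's actual layout):

```python
def parse_point_cloud(text):
    """Parse recorded Kinect point data into (x, y, z) float tuples.

    Assumes one whitespace-separated 'x y z' triple per line; the
    real recorder's output format may differ.
    """
    points = []
    for line in text.splitlines():
        parts = line.split()
        if len(parts) != 3:
            continue  # skip blank or malformed lines
        points.append(tuple(float(p) for p in parts))
    return points

# Inside Blender, the parsed points can become mesh vertices, e.g.:
#   mesh = bpy.data.meshes.new("kinect_cloud")
#   mesh.from_pydata(parse_point_cloud(data), [], [])
#   mesh.update()
```

A vertex-only mesh like this (no edges or faces) is all the particle and smoke emitters need, since they can emit directly from vertices.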
In an attempt to create an experience that lets one revisit their childhood feelings of curiosity and exploration, I built an interactive pinwheel with a visual counterpart. The viewer draws particles onto the surface using the pinwheel, then can "blow them away" by blowing on the pinwheel.
Max/MSP programming, Arduino, RF send/receive modules, an IR LED, and one small motor for speed input.
A play on/in stereo audiophonics.
Phased from Jason Hendrik on Vimeo.
I mapped a small corner of wooden shapes, using Blender 3D and After Effects/Photoshop.
A collaboration with Eric Grice.
Timen LadyFace from Jason Hendrik on Vimeo.
Made for Freida Abtan’s CART 211 class in 2011.
Homework piece about Creative Commons.
The video explores aspects of connectivity and networking, and the birth of ideas and creative energies in motion; it is a poetic exploration of would-be generic content, mulched into a mash of color-laced video art. The sound is an adventure in frequency tweaking: originating from a piano tune, it has been transformed into a playful electro-sonic excursion for your ears.