Wednesday, February 29, 2012

Blender and Kinect again

In my previous post I introduced my first experiments with Blender and Kinect using the Blender Game Engine. I've revisited the project and added some parts I researched a while ago: a softbody mesh that is also controlled by the Kinect (hand movements), plus a live video/webcam feed.
I think Blender is the only free tool out there that lets you render a softbody mesh in real time together with a live video/webcam feed and, now, with Kinect support.
This opens a whole new world for Machinima puppetry. Now it just needs some networking code: it would be nice to control multiple characters/objects with different users (and Kinects) and to stream the webcam/video over the net.

Monday, February 20, 2012

Blender and Kinect

Having played around with the Kinect for some time already, using it as an input device to control some of my favourite tools (Processing, vvvv, and SuperCollider), I was excited to see NI Mate released.
NI Mate is a small but powerful piece of software that takes real-time motion capture data from a Kinect or another supported device and turns it into two industry-standard protocols: OSC (Open Sound Control) and MIDI (Musical Instrument Digital Interface). They have already shown how to use it to control characters in Blender.
I joined their beta program, but unfortunately it didn't work with the drivers I have installed (SensorKinect, OpenNI, NITE). Uninstalling them and switching to their versions of the drivers was not an option, since I feared some of my other applications might no longer work. So I looked at their Blender examples, and after a weekend of Python coding and testing I finally got my own Kinect/Blender setup to work. It uses OSCeleton as middleware to convert the Kinect skeleton into OSC data, which is then sent to Blender. Although NI Mate is more advanced, OSCeleton is simple and open source. I've read that NI Mate may become a commercial product after the beta.
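As I understand it, OSCeleton sends each tracked joint as a plain OSC packet: an address like /joint, type tags ,sifff, and then the joint name, user id, and x/y/z coordinates. Just to sketch what the Blender side has to do before it can move a bone, here is a minimal parser for one such packet (the joint name and coordinates below are only illustrative; a real setup would read packets from a UDP socket and feed the coordinates to objects in the Game Engine):

```python
import struct

def _read_padded_string(data, offset):
    """Read a null-terminated OSC string and skip to the next 4-byte boundary."""
    end = data.index(b'\x00', offset)
    s = data[offset:end].decode('ascii')
    offset = end + 1
    offset += (-offset) % 4  # OSC pads strings to multiples of 4 bytes
    return s, offset

def parse_osc_message(packet):
    """Decode a single OSC message (address, type tags, arguments).

    Handles the tags OSCeleton-style joint messages use:
    's' = string, 'i' = big-endian int32, 'f' = big-endian float32.
    """
    address, off = _read_padded_string(packet, 0)
    tags, off = _read_padded_string(packet, off)
    args = []
    for tag in tags.lstrip(','):
        if tag == 's':
            s, off = _read_padded_string(packet, off)
            args.append(s)
        elif tag == 'i':
            args.append(struct.unpack_from('>i', packet, off)[0])
            off += 4
        elif tag == 'f':
            args.append(struct.unpack_from('>f', packet, off)[0])
            off += 4
    return address, args

if __name__ == '__main__':
    # A hand-built '/joint' packet: joint "r_hand", user 1, at (0.5, 0.3, 2.0).
    pkt = (b'/joint\x00\x00' + b',sifff\x00\x00' + b'r_hand\x00\x00'
           + struct.pack('>i', 1) + struct.pack('>fff', 0.5, 0.3, 2.0))
    address, args = parse_osc_message(pkt)
    print(address, args[0], args[1])  # the joint name and user id
```

In the actual setup the parsed x/y/z values end up driving the position of an armature bone or object every logic tick, which is what makes the bunny follow my movements.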
Now let's welcome the "Kinect Blender Bunny". This example was rendered in real time using the Blender Game Engine.



There's still a long way to go before using this in a machinima movie project, but I'm excited to have this technology ready to use. Controlling a 3D character in real time with markerless motion capture is amazing.