In my previous post I introduced my first experiments with Blender and the Kinect using the Blender Game Engine. I've revisited the project and added some features I researched a while ago: a softbody mesh that is also controlled by the Kinect (via hand movements), plus a live video/webcam feed.
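To give a rough idea of how hand-driven control can work, here is a minimal sketch of mapping a tracked hand position onto a Blender object. It assumes the Kinect middleware reports normalized hand coordinates in the 0..1 range (that, and all names below, are my assumptions, not the exact code from this project):

```python
# Sketch: map a normalized Kinect hand sample (x, y, z in [0, 1]) into
# Blender world-space coordinates for the object that drives the softbody.
def hand_to_world(hand, scale=4.0, offset=(0.0, 0.0, 1.0)):
    """hand: (x, y, z) normalized to [0, 1]; returns a Blender (x, y, z)."""
    x, y, z = hand
    return (
        (x - 0.5) * scale + offset[0],  # center horizontally around origin
        z * scale + offset[1],          # Kinect depth -> Blender y (forward)
        (0.5 - y) * scale + offset[2],  # Kinect y grows downward; flip for Blender z
    )

# Inside the Blender Game Engine you would then run, each logic tick,
# something along the lines of:
#   import bge
#   own = bge.logic.getCurrentController().owner
#   own.worldPosition = hand_to_world(latest_hand_sample)
```

Attaching the softbody mesh to that controlled object (e.g. as a goal/anchor) is what makes the mesh follow the hand with the nice floppy dynamics.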
I think Blender is the only free tool out there that lets you render a softbody mesh in real time with a live video/webcam feed, and now with Kinect support as well.
This opens up a whole new world for Machinima puppetry. Now it just needs some networking code. It would be nice to control multiple characters/objects from different users (each with their own Kinect) and to stream the webcam/video over the net.
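One possible shape for that networking layer: each user's machine fires its Kinect hand samples at the host as small UDP datagrams, which suits real-time puppetry better than TCP's retransmission delays. This is only a sketch under those assumptions; the message format and function names are mine:

```python
import json
import socket

def send_hand(sock, addr, user_id, hand):
    # Pack one hand sample as a small JSON datagram and send it over UDP.
    # UDP is lossy, but a dropped sample just means the puppet lags one tick.
    msg = {"user": user_id, "hand": list(hand)}
    sock.sendto(json.dumps(msg).encode("utf-8"), addr)

def recv_hand(sock):
    # Receive one sample; the host would call this each logic tick and
    # route the coordinates to the character owned by that user id.
    data, _ = sock.recvfrom(1024)
    msg = json.loads(data.decode("utf-8"))
    return msg["user"], tuple(msg["hand"])
```

The host would keep a `user -> character` mapping and apply each incoming sample to the matching object; the video feed itself would need a separate streaming path (it's far too heavy for per-tick datagrams like these).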
(Embedded video: "NONE" by Ash Thorpe)