In my previous post I introduced my first experiments with Blender and Kinect using the Blender Game Engine. I've since revisited the project and added some parts I researched a while ago: a soft-body mesh that is also controlled by the Kinect (hand movements), and a live video/webcam feed.
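To give an idea of how the hand data could drive the soft body, here is a minimal sketch of a BGE logic script. It assumes the Kinect hand position is relayed to Blender as a plain UDP text packet ("x y z"), for example by a small bridge app sitting on top of the Kinect drivers; the port number and packet format are placeholders, not part of my actual setup.

```python
# Attach to an Always sensor (true level triggering) on the control Empty
# that the soft-body mesh is hooked/anchored to. A hypothetical bridge app
# is assumed to forward the Kinect hand position as a UDP packet "x y z".
import socket

import bge

PORT = 9000  # placeholder port used by the assumed Kinect bridge

def get_socket():
    # Keep the socket in globalDict so it survives between logic ticks.
    if "kinect_sock" not in bge.logic.globalDict:
        sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
        sock.bind(("127.0.0.1", PORT))
        sock.setblocking(False)
        bge.logic.globalDict["kinect_sock"] = sock
    return bge.logic.globalDict["kinect_sock"]

def main():
    own = bge.logic.getCurrentController().owner
    sock = get_socket()
    try:
        data, _ = sock.recvfrom(1024)
        x, y, z = (float(v) for v in data.split())
        # The soft body, anchored to this Empty, follows the hand.
        own.worldPosition = (x, y, z)
    except socket.error:
        pass  # no new packet arrived this tick

main()
```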
I think Blender is the only free tool out there that lets you render a soft-body mesh in real time alongside a live video/webcam feed, and now with Kinect support on top.
This opens up a whole new world for Machinima puppetry. Now it just needs some networking code: it would be nice if multiple characters/objects could be controlled by different users, each with their own Kinect, and if the webcam/video could be streamed over the net.
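As a starting point, here is a rough sketch of what the pose routing could look like, assuming each Kinect machine tags its skeleton packets with a player ID so the receiving Blender instance can route them to the matching character. The port number and JSON layout are invented for illustration; streaming the actual video would need something heavier than this.

```python
# Rough sketch: every Kinect machine sends tagged skeleton packets; the
# receiver maps player IDs to characters. Port and layout are placeholders.
import json
import socket

PORT = 9001  # placeholder

def send_pose(sock, host, player_id, joints):
    """joints: dict such as {"hand_l": [0.1, 0.2, 1.5], ...}"""
    packet = json.dumps({"player": player_id, "joints": joints})
    sock.sendto(packet.encode("utf-8"), (host, PORT))

def receive_pose(sock):
    data, _ = sock.recvfrom(4096)
    msg = json.loads(data.decode("utf-8"))
    return msg["player"], msg["joints"]

# Sender, on each Kinect machine:
#   sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
#   send_pose(sock, "192.168.1.10", 1, {"hand_l": [0.1, 0.2, 1.5]})
#
# Receiver, inside the game engine's logic tick: look up which character
# belongs to the player ID and apply the joint positions to it.
```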
This is very cool, I really enjoy reading your posts. Several years ago I was experimenting with Machinima/digital puppetry in Blender, but the technology wasn't quite there yet. I think that has definitely changed.
What I would really like to see is a method for recording armature movements in the game engine so they could be edited as animation curves, similar to the way physics simulations can be recorded and incorporated into animation.
That would give you an incredible amount of flexibility and really make Blender a Machinima powerhouse.
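For what it's worth, here is a rough sketch of how something like that could be hacked together with two small scripts: one that logs the armature pose inside the game engine, and one that bakes the log into editable keyframes afterwards. This is not an existing Blender feature; the file names and hookups are illustrative assumptions.

```python
# Pass 1 -- inside the game engine: snapshot each bone's rotation every
# logic tick (hook this to an Always sensor with true level triggering
# on the armature object). BL_ArmatureChannel exposes the pose rotations.
import json
import bge

def record():
    arm = bge.logic.getCurrentController().owner
    log = bge.logic.globalDict.setdefault("pose_log", [])
    log.append({ch.name: list(ch.rotation_quaternion) for ch in arm.channels})

def save():
    # Call once when the game ends, e.g. from an ESC-key logic brick.
    with open(bge.logic.expandPath("//pose_log.json"), "w") as fh:
        json.dump(bge.logic.globalDict.get("pose_log", []), fh)
```

```python
# Pass 2 -- back in normal Blender: bake the log into F-curves that can
# be edited like any other animation.
import json
import bpy

def bake(filepath, armature_name):
    with open(filepath) as fh:
        frames = json.load(fh)
    pose = bpy.data.objects[armature_name].pose
    for f, snapshot in enumerate(frames, start=1):
        for bone_name, quat in snapshot.items():
            pbone = pose.bones[bone_name]
            pbone.rotation_quaternion = quat  # assumes quaternion mode
            pbone.keyframe_insert("rotation_quaternion", frame=f)

bake(bpy.path.abspath("//pose_log.json"), "Armature")  # placeholder name
```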