"Sometimes I don't know what to feel
Everything I thought that I knew starts to look so unreal"
— Todd Rundgren, 1973.
Project Pinocchio is our technology preview of the ability to create human characters using just a browser. You can learn more about it on the Autodesk Labs site:
Recently there have been two developments that I want to call your attention to.
Both Director and Star
Technical Evangelist, Brian Pene, did a really interesting thing. He created a character using Project Pinocchio. He exported his character, including its skeletal structure, to a Maya file and downloaded that file to his computer. Along with this, he also installed Faceshift (the trial version is available), which allows a Kinect device to track a person's face. Brian trained Faceshift to recognize his face by making various facial expressions in front of an Xbox Kinect to match those displayed in Faceshift. So instead of the tracking being driven by a fiducial marker in front of a computer camera, it was driven by Brian's face. The result is that Brian could control the skeleton of his Project Pinocchio character just by moving his face. Here's a YouTube video explaining the process.
Brian's files can be downloaded from my Buzzsaw site:
Brian notes that after loading the Maya model, you load the fsmap file by going to File > Load Retargeting… in the Faceshift Maya plug-in menu and selecting the fsmap file provided in the zip above. The mapping will only work with the Pinocchio model provided, though it may work with other models as long as they are not too different topologically or in rig/joint positions.
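For anyone who prefers to script the Maya side of this, here's a minimal Python sketch of the setup. It assumes the Faceshift Maya plug-in is already installed; the plug-in name and file path below are placeholders rather than the actual names from Brian's zip, and the fsmap itself is still loaded through the plug-in's menu as Brian describes.

```python
# Minimal Maya Python sketch: make sure the Faceshift plug-in is loaded and
# open the character exported from Project Pinocchio.
import maya.cmds as cmds

# Placeholder names -- substitute the actual plug-in name and the path to
# the .ma file from Brian's zip.
PLUGIN_NAME = "faceshift"
SCENE_PATH = "C:/downloads/pinocchio_character.ma"

# Load the Faceshift plug-in (quiet suppresses the warning if it is
# already loaded).
cmds.loadPlugin(PLUGIN_NAME, quiet=True)

# Open the Maya file exported from Project Pinocchio.
cmds.file(SCENE_PATH, open=True, force=True)

# The .fsmap retargeting file is then loaded through the plug-in's own menu
# (File > Load Retargeting...), as Brian notes above; no scripted call is
# shown here because the plug-in's command names aren't documented in this post.
```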
Technologist for the Office of the CTO, Shaan Hurley, postulated how much fun it would be to use Brian's approach to read a bedtime story to one's children:
Brian shared some resources for those who want to get started with Faceshift. There are Faceshift plug-ins for both Autodesk Maya and Autodesk MotionBuilder, as well as video tutorials covering Faceshift installation, setup, and working with the Maya or MotionBuilder plug-ins:
- Faceshift: General Faceshift installation and tracking setup
- Faceshift: Working with the Maya Faceshift Plugin
The example Faceshift datasets are great for those who wish to try it without a Kinect or other supported 3D sensor (see the short sketch after this list for reading the test BVH data):
- Faceshift: Test scans, profile, and performance
- Faceshift: Test bvh, c3d sequence
- Faceshift: Data Exported in TXT Format
- Faceshift: Test FBX sequence
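For the curious, here's a rough Python sketch of how you might peek inside the test BVH sequence. A BVH file has a HIERARCHY section describing the skeleton and a MOTION section with one row of channel values per frame; this is a generic BVH reader, not a Faceshift-specific tool, and the file name is a placeholder.

```python
# Generic sketch of reading the MOTION section of a BVH file (frame count,
# frame time, and per-frame channel values). Not specific to Faceshift.

def read_bvh_motion(path):
    with open(path) as f:
        lines = [line.strip() for line in f if line.strip()]

    # Find the start of the MOTION section.
    start = lines.index("MOTION")
    num_frames = int(lines[start + 1].split(":")[1])    # "Frames: N"
    frame_time = float(lines[start + 2].split(":")[1])  # "Frame Time: t"

    # Each remaining line is one frame of space-separated channel values.
    frames = [
        [float(v) for v in line.split()]
        for line in lines[start + 3 : start + 3 + num_frames]
    ]
    return num_frames, frame_time, frames

if __name__ == "__main__":
    # Placeholder file name -- point this at the downloaded test sequence.
    n, dt, frames = read_bvh_motion("faceshift_test.bvh")
    print(f"{n} frames at {dt:.4f}s per frame, {len(frames[0])} channels")
```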
With this technology, now anyone can be both actor and director — you can be both Sam Worthington and James Cameron from Avatar.
The Outer Limits
A second interesting thing was accomplished by Software Architect, Kean Walmsley. Augmented reality has been around for a while. Normally augmented reality is used to replace a fiducial marker with a virtual object. A popular use case is to see a building in the context of other buildings or see how a piece of furniture might look in one's own home. In Kean's case, instead of inserting a building or piece of furniture into the scene, he inserted a character generated by Project Pinocchio.
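For context, marker-based augmented reality typically works by detecting the fiducial marker in the camera image, estimating the camera's pose relative to it, and rendering the virtual object at that pose. Here's a minimal Python sketch of the detection and pose-estimation half using OpenCV's ArUco module (pre-4.7 API; newer releases rename some of these calls). It's a generic illustration rather than Kean's actual implementation, and the camera intrinsics and marker size are placeholder values.

```python
# Minimal marker-based AR sketch: detect an ArUco fiducial marker and
# estimate its pose, which is where a virtual object would be rendered.
import cv2
import numpy as np

camera_matrix = np.array([[800.0, 0.0, 320.0],
                          [0.0, 800.0, 240.0],
                          [0.0, 0.0, 1.0]])  # placeholder intrinsics
dist_coeffs = np.zeros(5)                    # assume no lens distortion
marker_length = 0.05                         # marker side in meters (assumed)

dictionary = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)

cap = cv2.VideoCapture(0)                    # default webcam
while True:
    ok, frame = cap.read()
    if not ok:
        break

    # Detect markers in the current frame.
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    corners, ids, _ = cv2.aruco.detectMarkers(gray, dictionary)

    if ids is not None:
        # Estimate each marker's pose relative to the camera.
        rvecs, tvecs, _ = cv2.aruco.estimatePoseSingleMarkers(
            corners, marker_length, camera_matrix, dist_coeffs)
        for rvec, tvec in zip(rvecs, tvecs):
            # Draw axes where the virtual object (a building, a chair, or a
            # Project Pinocchio character) would be placed.
            cv2.aruco.drawAxis(frame, camera_matrix, dist_coeffs,
                               rvec, tvec, 0.03)

    cv2.imshow("marker AR sketch", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

cap.release()
cv2.destroyAllWindows()
```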
In the 1960s there was an American TV show called The Outer Limits. In one episode, "The Galaxy Being," a character from a TV screen comes out into the real world. Kean's demo reminds me of that.
So both Brian Pene and Kean Walmsley have done something unique. Check 'em out.
Virtual people are alive in the lab.