Most of the movies are in .mp4 format; the most recent version of QuickTime will play them. There are a few .avi movies as well.
This link should take you to the other mocap pages that are closer to puppetry than traditional human mocap. If you have any thoughts or suggestions, please e-mail me. Thanks, Brian
This is the first part of my full body test: one hand driving the mouth while also capturing the rest of the body. This is the first run. All rigging, set up, etc. was done in Maya, so I'm only using MotionBuilder to pass the marker data through, and I build my skeletal set-ups in Maya. The audio is worse than anything else; I used a wireless mic and got too much static, and I even lost large segments of audio. When the quarter is over and I get time to revisit this, I'll update the geometry, create keyframes over the mocap in order to fix pops and jiggles, etc., and post more finished video.
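For the curious, the constrain-and-bake part of that Maya pass-through boils down to something like this. It's only a rough sketch; all of the marker and joint names here are made up, and the real list depends on the rig.

```python
import maya.cmds as cmds

# Hypothetical marker/joint pairs; the real names depend on the rig.
pairs = [("marker_spine01", "rig_spine01"),
         ("marker_head",    "rig_head")]

# Constrain each rig joint to the marker locator that drives it.
constraints = [cmds.pointConstraint(mkr, jnt)[0] for mkr, jnt in pairs]

start = cmds.playbackOptions(q=True, minTime=True)
end = cmds.playbackOptions(q=True, maxTime=True)

# Bake the constrained motion down to plain keyframes...
cmds.bakeResults([jnt for _, jnt in pairs], time=(start, end),
                 simulation=True)

# ...then drop the constraints so the keys can be edited by hand.
cmds.delete(constraints)
```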
Here's the latest version. The audio is still bad and really only exists at the end; I was jabbering away not realizing that the mic was picking up so poorly. This is the same motion as the one above, but there are obvious edits to the body. The edits and changes were done in Maya on the rigs I put together. There are still a few pops and shakes, but the bad audio will probably keep me from doing much more with this project.
This is from when I was working at Giant Studios. Jeremy Garrison modeled and created Squit. Jason Maloney built the body rig, and Kelly Nelson wore the rig to perform the motion. I captured the mocap, mapped the data to the creature, did some editing, and lit the scene. I just recently had the time to pull it together after it sat on Jeremy's hard drive for 4 or 5 years.
These are tests using different parts of the body to add a little life to the puppet. I'm not a big fan of using clusters, but that was part of what I was experimenting with; next time, I think blend shapes will give better results. The .avi files are the initial raw files with nothing done to the mocap data. The .mp4 files have the motion baked on, and the skeletal data is smoothed from there to hopefully give a better end product. Once again, the audio is not the strongest, but it's getting better thanks to some equipment borrowed from Katie Whitlock via the Theater Dept.
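The bake-and-smooth pass is nothing fancy; the idea is something along these lines. The joint naming and the filter settings here are just placeholders, not what I actually used.

```python
import maya.cmds as cmds

joints = cmds.ls("puppet_*_jnt", type="joint")  # hypothetical naming
start = cmds.playbackOptions(q=True, minTime=True)
end = cmds.playbackOptions(q=True, maxTime=True)

# Bake the raw mocap onto the skeleton as ordinary keyframes.
cmds.bakeResults(joints, time=(start, end), simulation=True)

# Simplify/smooth the baked curves to knock down pops and jiggles
# before any hand keying happens on top.
curves = cmds.listConnections(joints, type="animCurve") or []
if curves:
    cmds.filterCurve(curves, filter="simplify", timeTolerance=0.05)
```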
sitting down to perform
Test 1 |
Test 1 Complete |
Test 2 |
Test 2 Complete |
More of the same from Thanksgiving. I just now got around to putting these together. The audio is not so good because the acoustics on the stage are pretty bad. I make no excuses for my lack of dialog material.
New and unfiltered
Newest thing I've attempted. It's extremely raw, but it's my first use of hand and face with audio. This is also done on the really large stage, so I'm not confined to a small area as with previous hand capture. I thought I'd get this up before Thanksgiving.
More digital puppets.
We had our 14-camera Vicon system down for some close-up face and hand animations, so I decided to take advantage of it and use it for a few experiments. By putting a normal set of markers on my hand and recording a little audio, I've created a virtual sock puppet. The sock puppet worked OK, but now I'm trying to move this concept over to driving different characters.
For the first couple of runs, I took the existing marker information and used it to create geometry. A simple shape was created with a few loft surfaces and a few polygons. The extension of this idea was a more solid interpretation that looked a little more like an oven mitt.
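Roughly, the loft setup looks like this in Maya. This is just a sketch of the idea; the marker names, the grouping into rows, and the curve degree are all made up for the example.

```python
import maya.cmds as cmds

# Hypothetical marker names, grouped into the "rows" the loft runs through.
rows = [["mitt_A1", "mitt_A2", "mitt_A3"],
        ["mitt_B1", "mitt_B2", "mitt_B3"],
        ["mitt_C1", "mitt_C2", "mitt_C3"]]

# Build one curve through each row of marker positions.
curves = []
for row in rows:
    pts = [tuple(cmds.xform(m, q=True, ws=True, t=True)) for m in row]
    curves.append(cmds.curve(p=pts, degree=2))

# Loft a surface through the row curves; with construction history on,
# moving the curves moves the surface along with them.
surface = cmds.loft(curves, ch=True, uniform=True)[0]
```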
After this, I started playing around with not having the points drive the geometry directly, but having them drive a joint-driven hand. From working with hand capture in the past, I know how difficult it can be to get hands to look dead-on correct with mocap data, but since I wasn't going for that effect, I could be less precise with my skeletal and IK set-up.
I used the marker points to set up where my joint centers are (again, this is not acceptable for recreating a direct interpretation of hand motion since the markers are off axis, but I can let things like this slide for the sake of just moving an object). I then created simple IK connections from the first "joint" of a digit to the last. I grouped the IK handle under the proper end locator and then set up some other constraints to get everything in place.
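For one digit, that setup is roughly the following. Again, this is a sketch; the locator names are made up, and a single-chain solver stands in for whatever a real hand rig would want.

```python
import maya.cmds as cmds

# Hypothetical marker locators for one digit, base to tip.
digit = ["index_base_loc", "index_mid_loc", "index_tip_loc"]

# Place a joint at each marker position; with nothing selected to start,
# each new joint parents under the previous one, forming a chain.
cmds.select(clear=True)
joints = []
for m in digit:
    pos = cmds.xform(m, q=True, ws=True, t=True)
    joints.append(cmds.joint(p=pos))

# Single-chain IK from the first "joint" of the digit to the last.
handle = cmds.ikHandle(startJoint=joints[0], endEffector=joints[-1],
                       solver="ikSCsolver")[0]

# Park the handle under the end locator so the marker drags the chain,
# and pin the root joint to the base marker.
cmds.parent(handle, digit[-1])
cmds.pointConstraint(digit[0], joints[0])
```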
The geometry is just a poorly deformed NURBS sphere. The eyes are set to follow some markers (a rough sketch of that constraint is below), but this doesn't work too well at the end; a lot of that is because I didn't want to spend much time getting my binding to look right, and I already had other ideas if this worked. I think it worked fairly well. My son (3-1/2 years old) really liked it and started talking to it...yet another really good reason to get this approach working in realtime. If the movie doesn't open, save it off and then open it in QuickTime. I'm using .mp4 files to save on file space.
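The eye-follow setup is just an aim constraint pointed at a marker; something like this, with all the names made up:

```python
import maya.cmds as cmds

# Hypothetical names: each eye aims at a marker riding on the hand.
for eye in ["eye_L_geo", "eye_R_geo"]:
    cmds.aimConstraint("look_target_loc", eye,
                       aimVector=(0, 0, 1), upVector=(0, 1, 0),
                       maintainOffset=True)
```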
The next approach is to get something a little less sock-puppet-like and a little more puppet-like, and I hope to extract from it what information is really useful to this process. I've done this by simplifying some of the motion data and using aim and point constraints. I thought about using IK constraints and took several runs at it, but I couldn't get the mouth to deform very well. I think what I came up with worked out fairly well. I also didn't bother with setting the eyes up correctly. The middle and right images are linked to .mp4 files. I had to experiment with getting rendered images to match my sound: the middle one is where I stretched the frames, and the one on the right is where I compressed the audio.
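The constraint side of this is pretty simple. Here's a rough sketch of the point-and-aim idea, with every name invented for the example; the nice side effect of a multi-target point constraint is that averaging the markers takes some jitter out before it reaches the control.

```python
import maya.cmds as cmds

# Hypothetical names. Averaging the lip markers through a multi-target
# point constraint smooths some of the raw jitter out of the jaw.
cmds.pointConstraint("lip_loc_L", "lip_loc_R", "jaw_ctrl",
                     maintainOffset=True)

# The head just aims at a marker out in front of the "face".
cmds.aimConstraint("nose_loc", "head_ctrl",
                   aimVector=(0, 0, 1), maintainOffset=True)
```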