Puppeteering of Character Faces and Gestures

Many years ago I developed a UI and workflow for another 3D package (Maya) that was used to animate the faces of characters for a television show. I would be very interested to help bring this into Blender. Is there a developer who might want to talk to me about this?

My workflow used a wireless game controller to drive the gestures. It worked pretty well and I think this could be a nice addition to the Blender platform.

Here’s a screenshot from back in the day. It made extensive use of MEL and had a very sweet node tree connecting the UI to the mesh. My work was exclusively around the UI and data management (takes were stored as CSV text files). The end product was a selection of takes which would be baked out as keyframe data.

Screenshot of Maya removed by @sybren

Hit me up and let’s chat and collaborate if anyone is interested.

It’s a great idea and I’m curious about using Blender for something like TV production, so this sounds interesting.

I’m relatively new to the Blender code base, but I have some experience writing code for Blender with Python and C++. One of my early Blender Python projects involved working with external input from a device, which seems like something that could apply here. I’d be interested to hear more about what software artifacts you may have and what timeline you had in mind.

I have my original MEL scripts and some ancient scene files. The scene files are proving problematic, as the Maya 2022 demo is not happy about some of the data blocks present in the now 18-year-old data. I can break it down into basic flow steps, and I’d like to see if I can get the node graph to regenerate so I can grab a screenshot to share with you.

Tell me about the external input setup? I’m unaware of how to get something like this working in Blender, and ideally it should be something that can work universally.

My Python is rudimentary, but I can generally read most code (C++ is a conceptual struggle for me).

We need a way to input data from a device (I was using a wireless game controller that streamed data from two joysticks, four buttons, and a slider (thrust)).

At its most basic, we’d need a way to build a dynamic list of attributes, connect them to a basic node tree driven by the device, record that data as keyframes during playback, and write it out to disk as a text file for reference, or bake/paste it as animation data once the nodes are disconnected.
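
In Blender terms, the recording step might look roughly like the sketch below: a frame-change handler that samples the device once per frame and keys the value. This is untested, the device read is stubbed out, and the shape key name "smile" is just a placeholder.

import bpy

def read_device():
    # Stub standing in for a real controller read; returns 0.0 .. 1.0.
    return 0.5

def record_on_frame(scene, depsgraph=None):
    # Runs once per frame during playback: sample the device and key the
    # value so the performance is captured as animation data as it plays.
    ob = bpy.context.object
    shape_keys = getattr(ob.data, "shape_keys", None) if ob else None
    if shape_keys:
        key_block = shape_keys.key_blocks.get("smile")  # placeholder name
        if key_block:
            key_block.value = read_device()
            key_block.keyframe_insert("value", frame=scene.frame_current)

bpy.app.handlers.frame_change_post.append(record_on_frame)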

Additional features would include being able to “overdub” new animation on top of an existing take.

The tool really came into its own when using the device data to puppeteer poses and blend between them (angry<=>happy with eye blinks and lip compression). My approach was inspired by multi-track music recording.

I connected to the Sensel Morph multi-touch device using a Python library which I added to the embedded Python interpreter in Blender. Here’s a post on the forum with a link to the GitHub: Sensel Morph Raw for Blender - #4 by nikkinemo95 - API, R&D and Technology - Sensel Forum
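
For anyone wanting to repeat that trick: in recent Blender builds, sys.executable points at the bundled interpreter, so a third-party library can be pip-installed into it from Blender’s own Python console. The package name below is just a placeholder.

import subprocess
import sys

# sys.executable is Blender's bundled Python in recent builds, so pip
# installs land in the interpreter that Blender scripts actually use.
subprocess.check_call([sys.executable, "-m", "ensurepip"])
subprocess.check_call([sys.executable, "-m", "pip", "install", "some-device-library"])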

It’s a pretty basic proof of concept, but I’d guess we should be able to read a game controller from Python and create some kind of add-on. Writing to text would not be too difficult, and we could maybe use that as a starting point.
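
A first pass at the text format could be as simple as this sketch; the file name, channel names, and sample values are all made up.

import csv

# Hypothetical take: (frame, channel, value) rows sampled from the
# controller during a performance.
take = [(1, "smile", 0.0), (2, "smile", 0.4), (3, "blink", 1.0)]

with open("take_001.csv", "w", newline="") as csv_file:
    writer = csv.writer(csv_file)
    writer.writerow(["frame", "channel", "value"])
    writer.writerows(take)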

What wireless controller are you using? And do you have a description of the logic? Like what each pad/button should affect?

The puppeteering tool I originally built mapped a shape key to each axis of the game controller. The puppeteer would control the mix factor between shapes using the controller, the changes to the mix factor would be recorded as part of the performance, and that animation data would then be saved as a take for later use.

The concept was that the motion noise from muscles in the hand would add an element of realism to the changes between the poses over time.

As a later addition, I created an option to enable the performance of any animatable parameter (transforms in most cases, but sometimes a custom parameter used to control something else).
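
To make that concrete, the per-frame mapping amounted to something like the following untested Blender sketch; the shape key name "smile" is a placeholder, and the X-location lines show the same idea applied to an arbitrary animatable parameter.

import bpy

def apply_axis(ob, axis_value, frame):
    # Map a -1..1 controller axis onto a 0..1 shape key mix factor and key it.
    mix = (axis_value + 1.0) / 2.0
    key_block = ob.data.shape_keys.key_blocks["smile"]  # placeholder name
    key_block.value = mix
    key_block.keyframe_insert("value", frame=frame)

    # The same idea extends to any animatable property via its data path,
    # e.g. the object's X location:
    ob.location.x = axis_value
    ob.keyframe_insert(data_path="location", index=0, frame=frame)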

I’ve started my own journey to see if I can port any of the interesting bits of my previous code into something that might work with Blender. I have been eagerly watching the excellent tutorial series Scripting for Artists by @sybren, which is providing great insight into how to interact with Blender, but I am getting hung up on how to access certain types of data, like shape keys.

So, this leaves me with a question which I cannot answer:
Is it possible to use Python to query what shape keys are available on a given object/mesh?

[EDIT] The answer is yes, shape keys can be queried with the following.

import bpy

for ob in bpy.context.selected_objects:
    # Shape keys live on each object's mesh data, not in one global list,
    # so query ob.data rather than bpy.data.shape_keys["Key"].
    shape_keys = getattr(ob.data, "shape_keys", None)
    if shape_keys:
        for key_block in shape_keys.key_blocks:
            print(ob.name, key_block.name)
    else:
        print(ob.name, "has no shape keys.")

And, in response to @samuelmiller
I have a (now vintage) Wingman Wireless Game Controller. I also have a sneaky copy of a Windows game-control server, but it was written back in 2004 and I am having trouble getting it working on my current computer. Hopefully there is a newer, simpler way to get this data into Blender.

Not sure what you mean here by data - do you mean the gamepad data?

Since gamepads are an input device, they will likely have to be handled differently for each operating system. I’m not sure if there’s anything for this already in the cross-platform UI that Blender uses. I think a proof of concept could be done in Blender with Python, but it might be platform specific. What operating system are you using? Windows, Mac, Linux?

Yes, gamepad control data into Blender. It appears I am having trouble using my ancient gamepad server software, as it was written for Windows NT, and under Windows 10 there appear to be some firewall issues (though I can’t yet pin them down) that are preventing Maya from receiving any input from it.

My next steps are to deconstruct my original script so I can get a screenshot of the input node graph, and then look at replicating the node setup in Blender. I used shader and utility nodes to build my networks in Maya, and I am hoping I can do the same inside Blender.

I’m eager to work further on this, but my day job of running our brewery is needing my attention for the rest of today (for those who are in the Netherlands, some of our beer is headed your way just now).

I have removed the Maya screenshot, as it violates the rules of this forum. See Copyright guidelines for devtalk for more information.

Puppeteering with a game controller sounds like great fun! Maybe it’s an idea to talk with the VR team (#xr on Blender Chat) to see how they interact with the VR controllers? I’m guessing it’s a different library (OpenXR vs. something like SDL), but I think there is enough overlap to have a talk about it.

For non-VR devices I’d recommend looking at SDL. Blender is already (optionally) built with that library, and I think it’s the best way to get joysticks/gamepads to work.
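
For a quick Python-level proof of concept, pygame (which wraps SDL) could be installed into Blender’s bundled interpreter and polled from a modal timer operator. A rough, untested sketch; the operator and the "smile" shape key name are placeholders:

import bpy
import pygame  # SDL wrapper; must be installed into Blender's Python first


class OBJECT_OT_gamepad_puppet(bpy.types.Operator):
    """Poll the first gamepad on a timer and drive a shape key."""
    bl_idname = "object.gamepad_puppet"
    bl_label = "Gamepad Puppeteering"

    _timer = None
    _joystick = None

    def modal(self, context, event):
        if event.type == 'ESC':
            self.cancel(context)
            return {'CANCELLED'}
        if event.type == 'TIMER':
            pygame.event.pump()  # let SDL refresh the joystick state
            axis = self._joystick.get_axis(0)  # -1.0 .. 1.0
            mix = (axis + 1.0) / 2.0           # remap to 0.0 .. 1.0
            ob = context.object
            shape_keys = getattr(ob.data, "shape_keys", None) if ob else None
            if shape_keys:
                key_block = shape_keys.key_blocks.get("smile")
                if key_block:
                    key_block.value = mix
        return {'PASS_THROUGH'}

    def execute(self, context):
        pygame.init()
        if pygame.joystick.get_count() == 0:
            self.report({'ERROR'}, "No gamepad detected")
            return {'CANCELLED'}
        self._joystick = pygame.joystick.Joystick(0)
        self._timer = context.window_manager.event_timer_add(
            1 / 30, window=context.window)
        context.window_manager.modal_handler_add(self)
        return {'RUNNING_MODAL'}

    def cancel(self, context):
        context.window_manager.event_timer_remove(self._timer)
        pygame.quit()


bpy.utils.register_class(OBJECT_OT_gamepad_puppet)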

It is heaps of fun to puppeteer with a game controller, and it makes it possible to rapidly try out combinations and layered passes using the takes-and-mappings process I worked on. The result can be tweaked and refined by adjusting the recorded keyframes, and it opens up the possibility of rudimentary motion capture via a game controller: driving a handful of base pose groups, then layering more detailed actions on top of the underlying poses for eyebrow compression, cheek puffs, etc.

@sybren, I’m sorry for treading on the Copyright Guidelines and will not do that again. The input UI and workflow used within Maya to connect to a mesh with blendshapes and record input data from a game controller was my own creation and it is this workflow and process that I think would benefit the Blender community.

I’ve seen a few demonstrations from the past 10 years showing attempts to do similar things, but no one appears to be using this as a general-purpose puppeteering tool, and I am excited to cut my teeth on Python in Blender to help make this happen. I am hoping that as I push ahead with scratching together bits and pieces, others might want to join in the process.

I’m going to start placing my related scripts for Blender at this GitHub location. I am new to GitHub so bear with me as I find my way, thanks. https://github.com/kererubrewing/Pupazzare
