NumesSanguis / FACSvatar

An Open Source Modular Framework From Face to FACS Based Avatar Animation (Unity3D / Blender)
GNU Lesser General Public License v3.0

Facsdnnfacs for dynamic AUs? #29

Closed: philippbb closed this issue 3 years ago

philippbb commented 3 years ago

Hi

In the description you wrote:

"Deep Neural Network generation of facial expressions for Human-Agent Interaction (See modules/process_facsdnnfacs)"

I just had a quick look at the module. How is it supposed to work for dynamic facial expressions?

Can I input an emotion label, or a corresponding FACS list, and have it output a new, more dynamic and "realistic" AU dict in (near) real time?

Thanks in advance for an answer.

philippbb commented 3 years ago

Ah sorry, I think I misunderstood something. FACSvatar needs a webcam or some other source to create the base AUs; it does not create AUs itself from emotion labels or anything like that.

NumesSanguis commented 3 years ago

@philippbb Currently the focus has been on animating (real-time) FACS input or responding to it. The process_facsdnnfacs module, for example, takes FACS input, puts it through a DNN, and generates new FACS values in response.
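To make the idea concrete, here is a minimal sketch of that FACS-in, FACS-out pattern: the user's AU intensities go through a trained network and the output is a new AU dict for the agent's reaction. The toy linear layer, AU subset, and function name below are placeholders for illustration, not the actual process_facsdnnfacs code or its trained model.

```python
import numpy as np

AU_KEYS = ["AU01", "AU02", "AU04", "AU06", "AU12"]  # illustrative subset of Action Units


def respond(au_in: dict, weights: np.ndarray, bias: np.ndarray) -> dict:
    """Map a user's AU dict to a responsive AU dict via a toy linear layer."""
    x = np.array([au_in.get(k, 0.0) for k in AU_KEYS])
    # Sigmoid keeps the generated intensities in [0, 1], like the input AUs
    y = 1.0 / (1.0 + np.exp(-(weights @ x + bias)))
    return dict(zip(AU_KEYS, y.round(3).tolist()))


if __name__ == "__main__":
    # Stand-in for a trained network: random weights, just to show the flow
    rng = np.random.default_rng(0)
    W = rng.normal(size=(len(AU_KEYS), len(AU_KEYS)))
    b = rng.normal(size=len(AU_KEYS))
    print(respond({"AU06": 0.8, "AU12": 0.9}, W, b))  # smile-like input -> response AUs
```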

However, the OpenFace input module is just that, a module. If this is something you're working on, or you find another project that turns emotion labels into FACS values, you can still use the rest of the framework for the animation part. The only thing required is to send a dict with FACS values from your new module to the process_bridge module; a sketch of such a module follows below.
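As a rough sketch of what such a custom input module could look like: FACSvatar's modules communicate over ZeroMQ, so the example below publishes an AU dict as a JSON message on a PUB socket. The address, topic string, and message fields here are illustrative assumptions; check the process_bridge module's configuration for the exact format it expects.

```python
import json
import time

import zmq  # pyzmq


def main():
    ctx = zmq.Context()
    pub = ctx.socket(zmq.PUB)
    pub.bind("tcp://127.0.0.1:5570")  # assumed address; match your bridge config

    time.sleep(0.5)  # give subscribers time to connect (ZeroMQ slow-joiner issue)

    # Example FACS dict: Action Unit name -> intensity in [0, 1]
    au_dict = {"AU06": 0.8, "AU12": 0.9}  # a smile-like expression

    msg = {
        "timestamp": time.time(),
        "au": au_dict,
    }
    # Multipart message: topic frame + JSON payload frame
    pub.send_multipart([b"mymodule.facs", json.dumps(msg).encode("utf-8")])

    pub.close()
    ctx.term()


if __name__ == "__main__":
    main()
```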