Upliner / CharMorph


Native Moho Lipsync with Blending & Automation #6

Closed — Hopefullyidontgetbanned closed this 9 months ago

Hopefullyidontgetbanned commented 2 years ago

https://github.com/Hunanbean/Papagayo-NGLipsyncImporterForBlender/blob/main/io_import_lipSync_Importer_2_93.py

Lip-syncing is one of the hardest animations to pull off correctly, so people increasingly automate the process. Two open-source solutions, mainly Rhubarb and Papagayo, can export .dat files. (screenshot: exposure_sheet_02)

The problem with these tools is that they can't control the intensity of mouth movements or convey emotion, and they offer only limited control over mouth speed. As you can see here, the mouth sometimes opens unnaturally wide:

https://user-images.githubusercontent.com/45346421/151741028-664a4dab-5f6f-4f36-ad11-c5b6b27b997c.mp4

Automating blinking would also help, since it's mostly repetition and shouldn't be too hard to implement (e.g. a blink roughly every 180 frames at 30 FPS, i.e. about every 6 seconds).
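To make the blink idea concrete, here is a minimal sketch (not using the Blender API; the function name and frame counts are my own assumptions) that generates the keyframe times an addon would set for automated blinks:

```python
# Hypothetical sketch: compute blink keyframe times for automated blinking.
# Assumes a 30 FPS timeline, one blink every `interval` frames, and a
# blink lasting `blink_len` frames (close -> fully shut -> reopen).

def blink_frames(start, end, interval=180, blink_len=6):
    """Return (close, shut, reopen) frame triples for each blink.

    The eyelid starts closing at `close`, is fully shut at `shut`
    (the midpoint), and is open again at `reopen`.
    """
    frames = []
    f = start + interval
    while f + blink_len <= end:
        frames.append((f, f + blink_len // 2, f + blink_len))
        f += interval
    return frames

# Example: blinks over a 20-second (600-frame) shot at 30 FPS.
print(blink_frames(0, 600))
# → [(180, 183, 186), (360, 363, 366), (540, 543, 546)]
```

An actual implementation would insert keyframes on the eyelid bones or shape keys at those frames, perhaps with a little random jitter on the interval so the blinking doesn't look mechanical.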

Design: Intensity can be controlled by scaling the difference between the rest transform and the phoneme transform. It should be controllable both globally and per phoneme at the same time (e.g. global intensity = 0.5, AI phoneme = 0.1).
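The intensity scheme described above can be sketched as a simple interpolation between the rest pose and the phoneme pose. This is illustrative only: the dict-based pose representation and the choice to multiply the global and per-phoneme factors together are my assumptions, not something the issue specifies.

```python
# Hypothetical sketch: scale a phoneme pose toward the rest pose.
# Poses are dicts mapping bone/shape-key names to scalar values;
# a real addon would operate on Blender transforms instead.

def apply_intensity(rest, phoneme, global_intensity=1.0, phoneme_intensity=1.0):
    """Blend between rest and phoneme by the combined intensity factor.

    k = 0 reproduces the rest pose, k = 1 the full phoneme pose;
    here the global and per-phoneme factors simply multiply.
    """
    k = global_intensity * phoneme_intensity
    return {name: rest[name] + k * (phoneme[name] - rest[name])
            for name in phoneme}

# Example: at global intensity 0.5 the jaw only opens halfway.
pose = apply_intensity({"jaw_open": 0.0}, {"jaw_open": 1.0},
                       global_intensity=0.5)
print(pose)  # → {'jaw_open': 0.5}
```

Whether the per-phoneme value should multiply the global one (as here) or override it is a design choice worth deciding explicitly.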

Emotions can be controlled by doing something similar: compare the neutral rest transform to the desired emotion's rest pose, take the difference between the two rests, and add that difference back onto the phoneme transforms. (Ex. neutral rest bone = 0.5, emotion B rest = 0.75, diff = 0.25, so each phoneme gets +/- the diff.)
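The emotion-offset idea can be sketched the same way: compute the per-bone difference between the emotion's rest pose and the neutral rest pose, then add it onto each phoneme pose. As above, the dict representation and function names are illustrative assumptions.

```python
# Hypothetical sketch: layer an emotion on top of phoneme poses by
# adding the (emotion rest - neutral rest) difference to each phoneme.

def emotion_offset(neutral_rest, emotion_rest):
    """Per-channel difference between an emotion's rest pose and neutral."""
    return {name: emotion_rest[name] - neutral_rest[name]
            for name in neutral_rest}

def apply_emotion(phoneme, offset):
    """Add the emotion offset onto a phoneme pose, channel by channel."""
    names = set(phoneme) | set(offset)
    return {name: phoneme.get(name, 0.0) + offset.get(name, 0.0)
            for name in names}

# Example matching the numbers in the issue: neutral rest 0.5,
# emotion rest 0.75, so every phoneme shifts by +0.25.
off = emotion_offset({"mouth_corner": 0.5}, {"mouth_corner": 0.75})
print(off)                                    # → {'mouth_corner': 0.25}
print(apply_emotion({"mouth_corner": 0.3}, off))
```

In practice the result would likely need clamping to the valid range of each shape key or bone channel so stacked offsets can't push the face past its limits.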

You can download the pose library here