spassvogel / ImmersiveLanguage

Virtual Reality project

Implement lip sync #5

Closed spassvogel closed 8 years ago

spassvogel commented 8 years ago

Some rudimentary form of lip sync would be nice. Or at least, some animation showing a person's mouth moving whenever the player's mic is active (a rough sketch of mic-activity detection follows below).

I think this package might be helpful: https://www.assetstore.unity3d.com/en/#!/content/16944
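A minimal sketch, not taken from the project, of how mic activity could be detected with Unity's `Microphone` API: sample the recording clip, compute an RMS level, and compare it to a threshold. The class name, threshold value, and window size are assumptions and would need tuning.

```csharp
using UnityEngine;

// Rough sketch: estimate whether the player's mic is currently picking up
// speech by sampling the recording AudioClip and computing an RMS level.
public class MicActivity : MonoBehaviour
{
    public float threshold = 0.02f;   // hypothetical RMS cut-off, needs tuning
    public bool IsTalking { get; private set; }

    AudioClip micClip;
    string device;
    const int sampleWindow = 256;     // number of recent samples to inspect

    void Start()
    {
        if (Microphone.devices.Length == 0) return;
        device = Microphone.devices[0];
        micClip = Microphone.Start(device, true, 1, 44100);
    }

    void Update()
    {
        if (micClip == null) return;

        // Read the most recent window of samples from the looping mic clip.
        int pos = Microphone.GetPosition(device) - sampleWindow;
        if (pos < 0) return;

        var samples = new float[sampleWindow];
        micClip.GetData(samples, pos);

        float sum = 0f;
        for (int i = 0; i < samples.Length; i++)
            sum += samples[i] * samples[i];

        IsTalking = Mathf.Sqrt(sum / sampleWindow) > threshold;
    }
}
```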

spassvogel commented 8 years ago

I looked into this. It seems that for lip sync to work, the model needs BlendShapes (a type of keyframe-based vertex animation). The models we use do not have these, and adding them will be difficult, as I believe the models do not have any 'inside mouth'. I contacted the creator of the models; perhaps they can assist.
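For context, a hypothetical sketch of what a blend-shape-driven mouth would look like *if* the models had one (which is exactly what they lack). The shape name `MouthOpen` and the field names are assumptions; the Unity calls (`GetBlendShapeIndex`, `SetBlendShapeWeight`) are the standard API for this.

```csharp
using UnityEngine;

// Hypothetical driver for a "MouthOpen" blend shape. Our models do not have
// blend shapes, so this is only an illustration of what they would enable.
public class BlendShapeMouth : MonoBehaviour
{
    public SkinnedMeshRenderer face;          // renderer carrying the blend shapes
    public string shapeName = "MouthOpen";    // assumed shape name
    [Range(0f, 1f)] public float openAmount;  // 0 = closed, 1 = fully open

    int shapeIndex = -1;

    void Start()
    {
        shapeIndex = face.sharedMesh.GetBlendShapeIndex(shapeName);
    }

    void Update()
    {
        if (shapeIndex < 0) return;
        // Blend shape weights in Unity run from 0 to 100.
        face.SetBlendShapeWeight(shapeIndex, openAmount * 100f);
    }
}
```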

spassvogel commented 8 years ago

I got in contact with the supplier of the models and they told me that, unfortunately, the models do not have blend shapes (I knew this already, but I was hoping they would consider adding them to increase the value of their product). I tried to add the blend shapes myself, but due to my lack of modelling skills it did not work as expected. In the end I managed to add the blend shapes in Blender, but on export the entire armature rig was broken, so rather than pursue this any further I opted for another approach: animating with the armature.

Luckily the bones in the face are rigged, so there was something to work with. I first considered using Unity's Mecanim animation system, but in the end I went with a simpler approach: I just rotate the jaw bone along its Z axis in a random manner. Obviously this is not truly realistic (e.g. there is no phoneme-based deformation of the mouth), but at least it communicates visually that the avatar is talking. https://youtu.be/qYLYXCW4B2k
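A minimal sketch of that idea, assuming a `jawBone` transform from the face rig and some external flag (e.g. the mic-activity check above) that says the avatar is talking. The angle range, smoothing speed, and field names are assumptions, not the project's actual values.

```csharp
using UnityEngine;

// Sketch of the approach described above: while the avatar is "talking",
// rotate the jaw bone around its local Z axis by a small random amount,
// smoothed over time so the mouth does not snap between poses.
public class JawFlap : MonoBehaviour
{
    public Transform jawBone;        // jaw bone from the face rig
    public float maxAngle = 12f;     // hypothetical maximum opening in degrees
    public float speed = 8f;         // smoothing speed
    public bool isTalking;           // e.g. driven by mic activity

    Quaternion closedRotation;
    float currentAngle;
    float targetAngle;

    void Start()
    {
        closedRotation = jawBone.localRotation;
    }

    void Update()
    {
        // Pick a new random target angle while talking, close the jaw otherwise.
        targetAngle = isTalking ? Random.Range(0f, maxAngle) : 0f;
        currentAngle = Mathf.Lerp(currentAngle, targetAngle, Time.deltaTime * speed);
        jawBone.localRotation = closedRotation * Quaternion.Euler(0f, 0f, currentAngle);
    }
}
```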