ButzYung / SystemAnimatorOnline

XR Animator, AI-based Full Body Motion Capture and Extended Reality (XR) solution, powered by System Animator Online
https://sao.animetheme.com/XR_Animator.html

Emotion Tracking #37

Open ghost opened 1 year ago

ghost commented 1 year ago

I'm running into an issue where I assume the emotion tracking isn't working as intended. After I do the full calibration on startup, the line below the first one says "Neutral"; however, it is never the neutral face that I created through VRoid. Also, it never switches from Neutral to any other emotion on that line.

My tracking is working fine outside of that. I do wish I could adjust the sensitivity of the mouth movements, though, as I feel I have to open my mouth pretty wide to see the model's mouth move.

The angry face, sad face, etc. never display in XR Animator; they do work as expressions in, for example, VSeeFace.

MioBlackHeart commented 1 year ago

Try vNyan instead of VSeeFace; it has expressions you can adjust. Just make sure your VRoid file's expressions are all set to 0. You can adjust them afterwards with both the VRoid expression settings and vNyan to make it work for you. I can't get everything to work, but I'm fine with just my eyes, eyelids, and mouth tracking. Not everything works, but it's close enough until these programs get better over time.
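For reference, "expressions all set to 0" can be sanity-checked on the exported .vrm file itself. Below is a minimal sketch using the @pixiv/three-vrm library; this is only an illustration of how VRM expression weights can be read and zeroed at runtime, not how XR Animator, vNyan, or VRoid Studio do it internally, and the `model.vrm` path is a placeholder.

```ts
// Minimal sketch (assumes three.js and @pixiv/three-vrm v1+ are installed).
// Loads a VRM, prints each expression's current weight, then resets it to 0.
import { GLTFLoader } from 'three/examples/jsm/loaders/GLTFLoader.js';
import { VRMLoaderPlugin, VRM } from '@pixiv/three-vrm';

const loader = new GLTFLoader();
loader.register((parser) => new VRMLoaderPlugin(parser));

loader.load('model.vrm', (gltf) => {
  const vrm: VRM = gltf.userData.vrm;
  const manager = vrm.expressionManager;
  if (!manager) return;

  for (const expression of manager.expressions) {
    // Log the weight the expression currently has, then zero it out.
    console.log(expression.expressionName, manager.getValue(expression.expressionName));
    manager.setValue(expression.expressionName, 0.0);
  }

  // Push the new weights onto the model's morph targets / materials.
  manager.update();
});
```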

ghost commented 1 year ago

I guess you are just using XR Animator for the body tracking? vNyan is working great with the expressions, by the way. Thanks, this is the only way I've found to make it work! @MioBlackHeart

MioBlackHeart commented 1 year ago

Yes. I am using XRA for full- or half-body tracking. vNyan works wonderfully with XRA; VSeeFace isn't that great for expressions, as it doesn't have any individual settings for tuning them. While you can't get every kind of expression out of it just yet, keep in mind that XRA is still under ongoing development and takes time to get better. The same goes for vNyan.

ButzYung commented 1 year ago

It is actually normal that the "Netural" indicator stays and never changes. It originated from another emotion AI model which showed the emotion status, but that module has been removed for quite some time and now it just stays at "Neutral".

If you want to know how to actually trigger the emotion expressions with facial motions, check this out: https://youtu.be/00Mj-wzX5Jk?si=aKgw8PJG9jr6NSTX&t=125