lyuma / Av3Emulator

Emulator for VRChat's Avatars 3.0 system, built on the Unity PlayableGraph API

User-friendly way of quickly testing face/eye tracking shapes #143

Open Casuallynoted opened 8 months ago

Casuallynoted commented 8 months ago

Hello!

It would be useful to have a menu for simulating face/eye tracking input over OSC, to test and see which parts of an avatar are working and which are not. At the moment, Av3Emulator has an OSC implementation, which is great, but it is still kind of wonky to set up with real hardware.
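
For reference, sending a single test value from outside is already possible with any OSC client. Below is a minimal sketch (not part of Av3Emulator), assuming the `python-osc` package and that the receiver (VRChat, or the emulator's OSC support) listens on the default port 9000; the parameter name is a placeholder and has to match whatever your avatar actually exposes:

```python
# Minimal sketch: push one face-tracking value over OSC for testing.
# Assumptions: python-osc is installed, the OSC receiver listens on 127.0.0.1:9000
# (VRChat's default), and "MouthSmile" is a placeholder for a real avatar parameter.
from pythonosc.udp_client import SimpleUDPClient

client = SimpleUDPClient("127.0.0.1", 9000)
# VRChat exposes avatar parameters at /avatar/parameters/<name>.
client.send_message("/avatar/parameters/MouthSmile", 0.75)
```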

jellejurre commented 8 months ago

I use VRCFaceTracking, and it works fine for me to just enable OSC and have it talk to VRCFaceTracking. What kind of changes would you want to see?

Casuallynoted commented 8 months ago

> I use VRCFaceTracking, and it works fine for me to just enable OSC and have it talk to VRCFaceTracking. What kind of changes would you want to see?

Honestly, just a drop-down menu with face expression OSC options to toggle through, so you can test and make sure each expression is working, would be great.

jellejurre commented 8 months ago

I'm not too into the face tracking scene, but I believe the main issue with that is that we have to keep up with the different types of expressions and parameters that people use. It's far from standardized, and people sometimes only use subsets of the OSC parameters, so we'd have to perform some form of detection of which OSC parameters would work for your avatar, and I think this is kind of out of scope for the AV3 Emulator. I recommend using an OSC tool to emit standardized values that are tuned to your avatar instead. (Maybe something like https://github.com/Duinrahaic/dOSC might work for that? Unsure, as I haven't used it myself.)

Casuallynoted commented 8 months ago

Nowadays a lot of people will be using VRCFT's OSC implementation. I don't fully understand the intricacies of OSC, but I would imagine that covering those parameters with something like this would cover the majority of use cases, at least for VRC users.

jellejurre commented 8 months ago

Hm, it does seem they only use a single expression list (the Unified Expressions), but there is a bit of complexity due to them being able to detect prefixed parameters, and bool-based multiplexed parameters as well as float-based parameters. I'll keep the issue open as it seems doable, but I personally don't expect it to be implemented soon.
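
For illustration, here is a rough sketch of the bool-based multiplexing mentioned above. It assumes the common convention where a float parameter Foo is split into bools Foo1, Foo2, Foo4, ... (bit weights) plus a FooNegative sign bool; the exact names and bit depth depend on how the avatar was set up, so treat this as a sketch rather than the actual VRCFT implementation:

```python
# Sketch of bool-multiplexed ("binary") float parameters as described above.
# Assumption: bit-weight suffixes (Foo1, Foo2, Foo4, ...) plus an optional
# FooNegative sign bool; actual names and bit depth vary per avatar.
def encode_binary(name: str, value: float, bits: int = 4) -> dict:
    """Map a float in [-1, 1] to the bool parameters an OSC driver would send."""
    out = {f"{name}Negative": value < 0}
    quantized = round(abs(value) * (2 ** bits - 1))  # e.g. 0..15 for 4 bits
    for i in range(bits):
        out[f"{name}{2 ** i}"] = bool(quantized & (1 << i))
    return out

print(encode_binary("MouthSmile", 0.6))
# {'MouthSmileNegative': False, 'MouthSmile1': True, 'MouthSmile2': False,
#  'MouthSmile4': False, 'MouthSmile8': True}
```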

lyuma commented 6 months ago

I think the place to test OSC / VRCFT expressions is the "User Inputs" float values.

The new Av3Emulator inspector makes it easier to manually drive synced parameters using the User Inputs section of the inspector. It even includes a search so you can quickly filter for the parameter you want.

I'd be curious whether this is good enough or whether you would like more polish on the UI. Do you have a mockup, or how would you format it to make it easier to test?

Casuallynoted commented 6 months ago

> I think the place to test OSC / VRCFT expressions is the "User Inputs" float values.
>
> The new Av3Emulator inspector makes it easier to manually drive synced parameters using the User Inputs section of the inspector. It even includes a search so you can quickly filter for the parameter you want.
>
> I'd be curious whether this is good enough or whether you would like more polish on the UI. Do you have a mockup, or how would you format it to make it easier to test?

So I have been using User Inputs as a means of testing when I'm working on FT/ET blend shapes, but the reason they aren't especially effective at actually gauging how things work is that a lot of setups have it so that a single OSC value can set off an animation that controls multiple things. So while I can manually toggle, say, the boolean values for MouthSmile1, MouthSmile2, MouthSmile4, etc., I may not even know that the template I'm using is actually going to turn on all of those plus MouthStretch1 and MouthStretch2, leaving a very different look in VRC than what I think I'm getting from Av3Emulator's User Inputs. Being able to actually simulate OSC calls being sent eliminates the black box of not knowing which values the template is enabling when data is sent over OSC.

lyuma commented 6 months ago

Ok, let's say we made Av3Emulator support the bit-quantized parameters (MouthSmile1, 2, 3, 4) so that each one becomes a slider; then you could filter for "mouth" and get two sliders. Would that do what you want?

> a single OSC value can set off an animation that controls multiple things

And that single OSC value should already show up in the emulator, because OSC values in VRChat are defined as parameters. If you can control something with OSC, then there is a slider for it (except for the boolean-quantized stuff, which I admit we should try to add support for).

I don't understand what a "template" is or how that UI would look. You could turn on OSC and send these from outside, too.
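
For the original request (stepping through each expression to verify it), an external script can already do a rough version of this over OSC. Here is a sketch, assuming `python-osc`, the default port 9000, and placeholder parameter names that would need to match the avatar's actual (possibly prefixed) parameters:

```python
# Sketch: cycle through a list of face-tracking parameters over OSC so each
# expression can be checked one at a time. Parameter names are placeholders and
# must be replaced with the avatar's actual (possibly prefixed) parameter names.
import time
from pythonosc.udp_client import SimpleUDPClient

client = SimpleUDPClient("127.0.0.1", 9000)  # default VRChat OSC port (assumed here)
parameters = ["MouthSmile", "MouthStretch", "EyeLidLeft", "EyeLidRight"]

for name in parameters:
    address = f"/avatar/parameters/{name}"
    client.send_message(address, 1.0)   # drive the expression fully on
    time.sleep(2.0)                     # time to eyeball the result in the scene
    client.send_message(address, 0.0)   # reset before moving to the next one
```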