Open Stellanora64 opened 11 months ago
Adding some context for this as someone who's currently struggling to get a hand gesture system set up:
Hand gesture systems are difficult. A component / node to determine whether an individual finger is "down" or "up" would massively simplify those setups. Most existing systems rely on finger rotation data, but that isn't trivial. For the thumb in particular it's difficult to find an expression that reliably detects "down", since that finger moves around more than a single axis. It also depends heavily on the controllers used.

Looking at implementations from other platforms, most get around the thumb issue by having controller-specific bindings to check whether the thumb is touching any button. That complicates setting up a generic custom system further, since I for example don't have access to the hardware to verify my system would actually work for all controller types. The fact that SteamVR moves the fingers to a "resting" position if you rotate your hand downwards isn't helping either, especially for the thumb.
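For a sense of what these custom setups end up doing today, here's a minimal Python sketch of the usual curl-threshold heuristic; all names, inputs, and thresholds are hypothetical and not any Resonite API:

```python
# Hypothetical sketch: classify a finger as "down" by summing joint flexion (curl).
# None of these names or thresholds come from Resonite; they are made-up examples.

def finger_is_down(joint_flexion_degrees, threshold=120.0):
    """joint_flexion_degrees: flexion angles of the proximal/intermediate/distal joints."""
    return sum(joint_flexion_degrees) > threshold

# The thumb is the awkward case: it moves around more than one axis, so a single
# curl sum is unreliable. Other platforms often fall back to controller-specific
# "thumb is touching a button" bindings instead of angle data.
def thumb_is_down(curl_degrees, abduction_degrees, touching_any_button):
    if touching_any_button:  # controller-specific binding, if you have one
        return True
    return curl_degrees > 40.0 and abs(abduction_degrees) < 25.0  # fragile guess
```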
I do think it would make sense for an official implementation to simplify individual avatar setups and provide a system developers can rely on that works decently across all controllers. This also comes up quite frequently with (new) users I try to help with importing their avatars.
"A component / node to determine wether a invidual finger is "down" or "up" would massively simplify those setups." that would probably be covered by the solution Frooxius outlined in #42.
We'd probably start with a component that allows you to "simulate" hand poses, by pre-defining a set of them in a number of ways and then selecting with an index, to interpolate.
This would serve as a building block to build arbitrary systems on top of it and give a lot of flexibility in how you want to approach this.
"A component / node to determine wether a invidual finger is "down" or "up" would massively simplify those setups." that would probably be covered by the solution Frooxius outlined in https://github.com/Yellow-Dog-Man/Resonite-Issues/issues/42.
In light of today's patch log, I'm re-reading this comment and am a bit confused. This issue is about detecting finger poses, not setting a custom finger pose. Does the FingerReferencePoseSource suit this requirement? #42 seems to go the other way around, providing a way to set arbitrary hand poses, but this issue is mostly concerned with detecting finger poses and driving something with them (in this case expressions, but it could be any arbitrary thing).
In 2024.9.4.357 I've introduced a set of components for making custom hand poses, which allow building systems like this. Does this help/resolve your issue?
I haven't looked into them yet, I will later. From the description it sounded like they pose the fingers, though, rather than detect the finger poses you're making with your hands. I might have misunderstood the patch logs, will verify later.
So just to clarify, the components allow you to create a bunch of reference poses, and then give you an output based on how close your current finger pose is to one of the reference poses to trigger effects like avatar expressions?
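In other words, something roughly like this (a hypothetical Python sketch just to illustrate the kind of output I mean, not a claim about how the actual components work):

```python
# Hypothetical illustration only: compare live per-finger curl values against
# stored reference poses and report how close the hand is to each one.

REFERENCE_POSES = {
    "fist":       [1.0, 1.0, 1.0, 1.0, 1.0],  # thumb..pinky curl, 0 = open, 1 = closed
    "point":      [1.0, 0.0, 1.0, 1.0, 1.0],
    "finger_gun": [0.0, 0.0, 1.0, 1.0, 1.0],
}

def pose_similarity(live_curls, reference_curls):
    """1.0 for a perfect match, falling toward 0.0 as the pose diverges."""
    error = sum(abs(a - b) for a, b in zip(live_curls, reference_curls)) / len(reference_curls)
    return max(0.0, 1.0 - error)

def closest_pose(live_curls):
    return max(REFERENCE_POSES, key=lambda name: pose_similarity(live_curls, REFERENCE_POSES[name]))
```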
Per the patch notes, the referenced set of components is as follows:
- Added FingerReferencePoseSource which provides a finger pose based on an in-world reference skeleton
  - This can be used as a basis to build custom finger posing systems, combined with the additional components below (based on requests by @Toni Kat, @Jack, @Stella, InconsolableCellist, @Furf, @Dogbold_system 🐧, issues #42, #935 and #1307)
  - To ease creating poses, the inspector actions provide several methods to capture existing poses
  - You can capture a pose from a pose source
  - You can also capture your current left or right hand pose - this lets you make a pose with your controller or finger tracking solution
  - This will automatically create the necessary skeleton for capture, which you can then modify
  - Note that the skeleton MUST use the coordinate conventions of Resonite's input system; you cannot use an arbitrary skeleton
- Added FingerPoseLerp
  - This will accept two finger pose sources as input and compute a new pose that's a blend between them
- Added FingerPoseMultiplexer
  - This allows you to switch between multiple sources of finger poses
  - When switching to a different pose, the poses are briefly interpolated during the transition
- Added FingerPoseModifier
  - This accepts a finger pose as a source and will offset the poses of individual fingers
  - You can offset the curl/splay of individual fingers on the hand
- Added FingerSplayModifier
  - This accepts a finger pose source and will modify the global splay of the fingers
  - You can use this to essentially make the fingers more spread apart or closer together
- Added FingerPosePreset
  - This provides common finger pose presets for easy development and testing
  - Currently Idle, Fist and Point presets are added; more can be added in the future
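For context, the blend FingerPoseLerp describes is conceptually just a per-joint interpolation between two poses; a rough Python sketch of the idea (not the actual implementation):

```python
# Rough illustration of blending two finger poses; NOT Resonite's actual
# FingerPoseLerp implementation (a real one would interpolate joint rotations).

def lerp_pose(pose_a, pose_b, t):
    """pose_a / pose_b: dicts of joint name -> curl value (0..1); t: blend factor 0..1."""
    return {joint: (1.0 - t) * pose_a[joint] + t * pose_b[joint] for joint in pose_a}
```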
Based on my understanding of this issue / request, they may not suit the purpose requested, but could you verify, @Stellanora64 @JackTheFoxOtter?
From my understanding of the request, you aren't looking to pose the hand / fingers of the avatar, but rather to read the pose data from the controllers and have some mechanism for determining when the fingers approximately match a given position, based on live skeletal input rather than pre-made poses. Does that sound about right?
As per issue description:
Changing avatar expressions using hand gestures is a common feature in other social VR platforms and when new players try out Resonite they are often confused how to set it up.
This issue is requesting a simple way to detect finger poses from skeletal / controller input to then control avatar gestures / expressions with.
In which case, it sounds like the mentioned components are not suitable for the needs of this issue, @Frooxius.
Is your feature request related to a problem? Please describe.
Changing avatar expressions using hand gestures is a common feature in other social VR platforms and when new players try out Resonite they are often confused how to set it up.
Describe the solution you'd like
A component that has inputs for blend shapes (like a blend shape to smile) and then monitors hand gestures to change the expressions based on the component configuration; the specifics of how specific gestures are assigned to specific (or multiple) expressions are up to however the devs want to implement this component.
This component could possibly be applied automatically through the avatar creator, with a check box similar to how the volume meter can automatically be set up on the avatar or not (this would require the avatar creator rework).
An example of its functionality would be to assign a smile expression to a finger gun gesture (all fingers except pointer and thumb closed) on the right hand. When the user makes that gesture with that hand, the smile blend shape is interpolated from 0 to 1.
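Something along these lines, behaviour-wise (a hypothetical Python sketch, not a proposed API):

```python
# Hypothetical behaviour sketch: while the bound gesture is detected, ease the
# assigned blend shape toward 1; otherwise ease it back toward 0.

def update_smile_blend_shape(current_value, finger_gun_detected, delta_time, speed=8.0):
    target = 1.0 if finger_gun_detected else 0.0
    step = speed * delta_time
    if current_value < target:
        return min(target, current_value + step)
    return max(target, current_value - step)
```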
Describe alternatives you've considered
It is possible to re-create this functionality using ProtoFlux, and it has been done before. However, these systems are often specific to the avatar they were created for, and they are often quite complex systems that a new user would not be able to implement without the help of an experienced player.
Additional Context
No response