Jazzneo opened 5 years ago
Thanks for the suggestion! Having some simple system for expressions would definitely help casual users.
Touching the touchpad would work for this with the Vive controllers; however, I'm not sure what to do with Oculus Touch controllers, since moving the joystick makes you move.
Any ideas?
Hmm... Oculus Touch is tricky. Really the only way I can see it working is maaaaaybe either pushing in the joystick and moving it a specific way, or binding to finger gestures. Neither is really ideal.
Pushing in the joystick is already used for jump, so unfortunately that's not an option either.
A lot of the finger gestures are done with other buttons too, so it would trigger actions that you're not meant to do (e.g. grabbing something) or you'd be making faces by just interacting with things.
The only thing I can think of is either sacrificing movement with one controller (so one joystick is gestures, another is movement) or making them menu options you could quickly invoke and select - e.g. holding the face for 5 seconds by default, with the ability to lock it.
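The "hold for 5 seconds, with the ability to lock it" idea could be sketched roughly like this. All names and the 5-second default here are assumptions for illustration, not anything Neos actually implements:

```python
import time

# Hypothetical sketch: an expression stays active while locked, and
# otherwise expires HOLD_SECONDS after it was invoked.
HOLD_SECONDS = 5.0

class HeldExpression:
    def __init__(self, name, now=None):
        self.name = name
        self.locked = False
        self._started = time.monotonic() if now is None else now

    def lock(self):
        """Pin the expression so it no longer expires."""
        self.locked = True

    def active(self, now=None):
        """True while the expression should still be applied."""
        now = time.monotonic() if now is None else now
        return self.locked or (now - self._started) < HOLD_SECONDS
```

Passing `now` explicitly just makes the timeout easy to test; a real implementation would use the engine's own clock.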
I definitely know of a few people who might be interested in gestures via the Vive wand touchpad. Does WMR have gesture support as well?
WMR has no finger gestures because, as far as I've heard, the WMR controllers lack a skeletal model for skeletal input via SteamInput/SteamVR.
As is, community-made gesture wheels such as ProbablePrime's exist and do a good job of filling this gap for now, @Casuallynoted. You assign up to eight blend shapes per controller, driven by the touchpad directions.
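The eight-direction touchpad mapping boils down to turning a touch position into one of eight angular sectors. A minimal sketch, where the slot names and dead-zone threshold are assumptions rather than anything from an actual gesture wheel:

```python
import math

# Eight hypothetical blend shape slots, one per touchpad direction,
# starting at "right" and going counter-clockwise.
SLOTS = ["Smile", "Frown", "Wink", "Surprised",
         "Angry", "Sad", "Tongue", "Blink"]
DEAD_ZONE = 0.3  # ignore touches near the touchpad center

def touchpad_to_slot(x: float, y: float):
    """Return the blend shape slot for a touch at (x, y) in [-1, 1]."""
    if math.hypot(x, y) < DEAD_ZONE:
        return None
    angle = math.atan2(y, x) % (2 * math.pi)       # 0 = right, CCW
    sector = int((angle + math.pi / 8) / (math.pi / 4)) % 8
    return SLOTS[sector]
```

The half-sector offset centers each slot on its cardinal/diagonal direction, so a touch straight up lands cleanly in the "up" slot instead of on a boundary.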
How about pressing both joysticks simultaneously as the command to toggle expression mode? Maybe whichever joystick was pressed first, or something like which hand is higher/lower, could decide which hand controls the blend shapes.
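That dual-press idea could be sketched as a small state machine, where pressing both sticks toggles the mode and the first stick pressed picks the gesture hand. All class and method names here are illustrative assumptions:

```python
class ExpressionModeToggle:
    """Toggle expression mode when both joysticks are clicked;
    the stick clicked first becomes the gesture hand."""

    def __init__(self):
        self.expression_mode = False
        self.gesture_hand = None      # "left" or "right" while active
        self._first_pressed = None

    def on_stick_press(self, hand, left_down, right_down):
        if self._first_pressed is None:
            self._first_pressed = hand
        if left_down and right_down:
            self.expression_mode = not self.expression_mode
            self.gesture_hand = (self._first_pressed
                                 if self.expression_mode else None)

    def on_stick_release(self, left_down, right_down):
        # Reset the ordering once both sticks are released.
        if not left_down and not right_down:
            self._first_pressed = None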
For Oculus controllers I don't think a touchpad-style input for emotes is a good fit; it's usually a lot more intuitive to trigger different gestures based on which face buttons you're touching, since the Touch controllers have capacitive buttons. It would be nice to be able to say "this combo of buttons touched and/or pressed triggers this emote".
For Index controllers it would be nice to do a similar thing, since the Index also has capacitive buttons, but it would also be nice to be able to register various hand shapes as gesture triggers. E.g. if I make a peace sign by holding up only my index and middle fingers, with all the other controls touched down, that would be the finger input for the gesture.
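A combo-of-touched-buttons trigger is essentially a set lookup: the current set of capacitively touched controls maps to an expression. A minimal sketch, where the button and expression names are made-up placeholders, not real Touch/Index input identifiers:

```python
# Each expression is keyed by the exact set of "touched" buttons.
GESTURE_MAP = {
    frozenset({"trigger", "grip", "thumbrest"}): "Fist",
    frozenset({"grip", "thumbrest"}): "Point",
    frozenset({"trigger", "grip"}): "ThumbsUp",
    frozenset({"grip"}): "PeaceSign",
}

def gesture_for(touched_buttons):
    """Look up the expression for the currently touched buttons."""
    return GESTURE_MAP.get(frozenset(touched_buttons), "Neutral")
```

Using `frozenset` keys means the combo matches regardless of the order the buttons were touched in.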
This component would let a person drag and drop blend shapes into slots for each axis on the touchpad, so people could have expressions on their avatar without a lot of knowledge of LogiX. The person would just have to make a button on the avatar to turn the component on and off, so they can use tools without making weird faces while using them.
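The proposed component might look something like this in outline: per-direction slots that blend shapes get dropped into, plus an enable toggle so tool use doesn't drive the face. Everything here is a hypothetical sketch, not Neos component API:

```python
class TouchpadExpressionDriver:
    """Drive blend shapes from touchpad directions, with an on/off toggle."""

    def __init__(self):
        self.slots = {"up": None, "down": None, "left": None, "right": None}
        self.enabled = True   # a button on the avatar would flip this

    def assign(self, direction, blend_shape):
        """Drag-and-drop equivalent: put a blend shape in a slot."""
        self.slots[direction] = blend_shape

    def drive(self, direction, strength):
        """Return (blend_shape, weight) to apply, or None when disabled
        or the slot is empty."""
        shape = self.slots.get(direction)
        if not self.enabled or shape is None:
            return None
        return (shape, max(0.0, min(1.0, strength)))
```

Clamping the weight to [0, 1] keeps a noisy touchpad axis from over- or under-driving the blend shape.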