Yellow-Dog-Man / Resonite-Issues

Issue repository for Resonite.
https://resonite.com

Node to obtain user's locomotion direction mode (head/hand) #2356

Open 5H4D0W-X opened 2 months ago

5H4D0W-X commented 2 months ago

Is your feature request related to a problem? Please describe.

Avatar or world systems, such as flight, could benefit from being able to read a user's preferred locomotion direction mode.

Describe the solution you'd like

A node that outputs a bool mirroring the input user's "Use head direction for walking" setting.

Describe alternatives you've considered

Manual checks comparing a user's walking direction against their head-facing direction and joystick input (roughly as sketched below). This approach is neither reliable nor compact.
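For illustration, here is a minimal sketch of that manual heuristic, assuming the world can sample the user's world-space velocity, head forward vector, hand forward vector, and joystick input. All names here are hypothetical and do not correspond to an existing Resonite/ProtoFlux node; the sketch also shows why the check is unreliable: whenever head and hand point roughly the same way, or the joystick isn't clearly deflected, the two modes can't be told apart.

```typescript
// Hypothetical manual heuristic: guess whether walking follows the head
// or the hand by comparing the actual movement direction against both.
// None of these inputs map to a real Resonite node; this is a sketch.

interface Vec3 { x: number; y: number; z: number; }

// Project a vector onto the ground plane and normalize it.
function flatten(v: Vec3): Vec3 {
  const len = Math.hypot(v.x, v.z);
  return len > 1e-6 ? { x: v.x / len, y: 0, z: v.z / len } : { x: 0, y: 0, z: 0 };
}

function dot(a: Vec3, b: Vec3): number {
  return a.x * b.x + a.y * b.y + a.z * b.z;
}

// Returns "head", "hand", or null when the sample is ambiguous.
function guessDirectionMode(
  velocity: Vec3,          // user's current world-space walking velocity
  headForward: Vec3,       // head facing direction
  handForward: Vec3,       // controller pointing direction
  joystickForward: number  // forward joystick deflection in [0, 1]
): "head" | "hand" | null {
  if (joystickForward < 0.5) return null; // need a clear forward input
  const move = flatten(velocity);
  const headAlign = dot(move, flatten(headForward));
  const handAlign = dot(move, flatten(handForward));
  // If head and hand roughly agree, one sample can't distinguish the modes.
  if (Math.abs(headAlign - handAlign) < 0.2) return null;
  return headAlign > handAlign ? "head" : "hand";
}
```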

Additional Context

No response

Requesters

ShadowX

Frooxius commented 2 months ago

This might be a bit too direct. We generally don't want users checking against specific settings directly, because those might change.

The way this works is that the input system samples a "movement vector". This has all of the user's settings baked into it and provides an abstraction layer.

I'd be much more willing to expose that instead. Would that work well enough for you?
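As a rough illustration of that abstraction, the sketch below shows what a hypothetical exposed movement vector sample might look like to a consumer such as a flight system: the head/hand setting, deadzones, and other input settings would already be resolved inside the sample, so the consumer never touches the setting itself. The type and function names are invented for this sketch, not actual Resonite API.

```typescript
// Hypothetical shape of an exposed "movement vector" sample. The input
// system is assumed to have already baked in the user's settings
// (head vs. hand direction, deadzones, device quirks, etc.).

interface Vec3 { x: number; y: number; z: number; }

interface MovementSample {
  direction: Vec3;   // world-space intended movement direction (unit vector)
  magnitude: number; // input strength in [0, 1], after deadzone handling
}

// A custom flight system can consume the resolved vector directly,
// with no branching on which direction mode the user prefers.
function flightVelocity(sample: MovementSample, flightSpeed: number): Vec3 {
  const scale = sample.magnitude * flightSpeed;
  return {
    x: sample.direction.x * scale,
    y: sample.direction.y * scale,
    z: sample.direction.z * scale,
  };
}
```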

5H4D0W-X commented 2 months ago

That does seem useful. It would be nice if this movement vector could be accessed while in an anchor as well, perhaps with a dedicated node or as an addition to AnchorLocomotionData.

Foohy commented 2 months ago

Wanted to second exposing the movement vector, as that'd be especially useful for custom locomotion modes (instead of having to read device data yourself in ProtoFlux and reinvent the wheel).