From what I've seen, it's really hard (or even impossible) to set up controller-agnostic inputs in most game engine XR middleware. Most developers seem to end up hardcoding support for specific controllers by checking the controller name.
The abstracted input systems in OpenXR and SteamVR Input should, in theory, make controller-agnostic input easier, but they require good support from both game engine and runtime developers. Both Unity and Unreal have standardized XR interfaces, but in my opinion they haven't really been battle-proven yet. They would really need first-party games made for them to make sure all the necessary features actually work.
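For reference, controller-agnostic input on the application side of SteamVR Input roughly looks like the sketch below (the action names and manifest path are placeholders I've made up; error handling omitted). The game only ever asks for abstract actions, and the per-controller bindings live in JSON that users can rebind:

```cpp
// Minimal sketch of controller-agnostic input via the OpenVR IVRInput API.
// The action manifest path and action names are placeholders; bindings for
// each controller type live in the manifest/binding JSON, not in game code.
#include <openvr.h>

vr::VRActionSetHandle_t g_actionSet = vr::k_ulInvalidActionSetHandle;
vr::VRActionHandle_t g_grabAction = vr::k_ulInvalidActionHandle;

void InitInput()
{
    vr::VRInput()->SetActionManifestPath("C:/mygame/actions.json");
    vr::VRInput()->GetActionSetHandle("/actions/main", &g_actionSet);
    vr::VRInput()->GetActionHandle("/actions/main/in/grab", &g_grabAction);
}

bool IsGrabbing()
{
    vr::VRActiveActionSet_t activeSet = {};
    activeSet.ulActionSet = g_actionSet;
    vr::VRInput()->UpdateActionState(&activeSet, sizeof(activeSet), 1);

    vr::InputDigitalActionData_t data = {};
    vr::VRInput()->GetDigitalActionData(g_grabAction, &data, sizeof(data),
                                        vr::k_ulInvalidInputValueHandle);
    return data.bActive && data.bState;
}
```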
Valve could also probably help things a lot by publishing best-practices on how things are supposed to be set up.
Sorry for the (very) late reply, but I thought it would be useful to share what we've found for other devs who come across this skeletal input compatibility issue:
Keith mentioned in an email to us that Valve was aware of games implementing skeletal input poorly, in that it only becomes active if the game detects an Index controller present, much like the scenario you mentioned above but with OpenVR. As a result, making a new custom controller with skeletal input produces no actual skeletal animation in these games, as it seemingly gets disabled for those controllers. That forces us to simulate being an Index controller (exposing the properties of one) to get finger tracking working, which isn't ideal, as it's then not possible to ship custom bindings for a different type of controller.
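For context, this is roughly what exposing skeletal input from a custom OpenVR driver looks like (a fragment of a device's Activate, with the usual right-hand component and skeleton paths; error handling omitted). This is the data that goes unused when a game gates skeletal input on detecting an Index controller:

```cpp
// Rough sketch of how a driver exposes skeletal input, inside
// ITrackedDeviceServerDriver::Activate. Paths are the conventional
// right-hand ones; the handle would normally be a class member.
#include <openvr_driver.h>

vr::VRInputComponentHandle_t g_skeletonHandle = vr::k_ulInvalidInputComponentHandle;

vr::EVRInitError Activate(uint32_t unObjectId)
{
    vr::PropertyContainerHandle_t props =
        vr::VRProperties()->TrackedDeviceToPropertyContainer(unObjectId);

    vr::VRDriverInput()->CreateSkeletonComponent(
        props,
        "/input/skeleton/right",        // component name
        "/skeleton/hand/right",         // skeleton path
        "/pose/raw",                    // base pose path
        vr::VRSkeletalTracking_Partial, // e.g. curls measured, rest estimated
        nullptr, 0,                     // no grip-limit override
        &g_skeletonHandle);

    return vr::VRInitError_None;
}

// Per frame: push the 31 hand-skeleton bone transforms for each motion range.
void UpdateSkeleton(const vr::VRBoneTransform_t* bones, uint32_t boneCount)
{
    vr::VRDriverInput()->UpdateSkeletonComponent(
        g_skeletonHandle, vr::VRSkeletalMotionRange_WithController, bones, boneCount);
    vr::VRDriverInput()->UpdateSkeletonComponent(
        g_skeletonHandle, vr::VRSkeletalMotionRange_WithoutController, bones, boneCount);
}
```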
However, what we've found is that in some games (ones which weren't working before), if you change the controller type property (`Prop_ControllerType_String`) to something custom and keep everything else the same (controller model, etc.), you're able to get both custom bindings and skeletal animations working, at least in all the games we've tested so far. It's a bit of a mucky solution and not really ideal, though.
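In driver terms, the workaround amounts to something like this (the custom controller type string and input profile path are made-up placeholders):

```cpp
// Sketch of the workaround: keep the rest of the device setup Index-like, but
// report a custom controller type so SteamVR can load custom default bindings.
// "my_custom_controller" and the input profile path are illustrative placeholders.
#include <openvr_driver.h>

void SetControllerTypeProperties(uint32_t unObjectId)
{
    vr::PropertyContainerHandle_t props =
        vr::VRProperties()->TrackedDeviceToPropertyContainer(unObjectId);

    // Outright spoofing an Index controller would be:
    //   vr::VRProperties()->SetStringProperty(props, vr::Prop_ControllerType_String, "knuckles");
    // which rules out custom bindings. Instead, report a custom type:
    vr::VRProperties()->SetStringProperty(props, vr::Prop_ControllerType_String,
                                          "my_custom_controller");
    vr::VRProperties()->SetStringProperty(props, vr::Prop_InputProfilePath_String,
                                          "{my_driver}/input/my_custom_controller_profile.json");
}
```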
> Valve could also probably help things a lot by publishing best-practices on how things are supposed to be set up.
I agree, and it's a shame that the interface isn't being adopted properly by game devs.
@zite you mentioned that Valve were looking into alternative ways to remedy this issue, but we haven't heard anything back. Can we expect anything to come of this soon, or can we help in any way?
This is becoming even more important now that everything is starting to switch over to OpenXR. There is very little documentation available on how to set up input (or anything else about the API), and most of what exists is written by Oculus and Microsoft, who don't need to deal with third-party devices. OpenXR exposes a lot less data about what equipment the user has in order to force designing things in a device-agnostic way, but without proper documentation people end up trying to work around the system instead. Since a lot less information is exposed, these workarounds could get even worse than the current ones.
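To illustrate what "device agnostic" means in core OpenXR: the game declares abstract actions and only suggests bindings per interaction profile, so it never branches on a device name. A minimal sketch (instance/session setup and error handling omitted):

```cpp
// The app declares abstract actions, then suggests bindings per interaction
// profile; the runtime (or the user) can rebind them for unknown hardware
// without any game-side changes.
#include <openxr/openxr.h>
#include <cstring>

XrActionSet CreateGameplayActions(XrInstance instance, XrAction* outGrab)
{
    XrActionSetCreateInfo setInfo{XR_TYPE_ACTION_SET_CREATE_INFO};
    strcpy(setInfo.actionSetName, "gameplay");
    strcpy(setInfo.localizedActionSetName, "Gameplay");
    XrActionSet actionSet;
    xrCreateActionSet(instance, &setInfo, &actionSet);

    XrActionCreateInfo actionInfo{XR_TYPE_ACTION_CREATE_INFO};
    actionInfo.actionType = XR_ACTION_TYPE_BOOLEAN_INPUT;
    strcpy(actionInfo.actionName, "grab");
    strcpy(actionInfo.localizedActionName, "Grab");
    xrCreateAction(actionSet, &actionInfo, outGrab);

    // One suggested-bindings call per interaction profile we know about.
    auto suggest = [&](const char* profile, const char* bindingPath) {
        XrPath profilePath, inputPath;
        xrStringToPath(instance, profile, &profilePath);
        xrStringToPath(instance, bindingPath, &inputPath);
        XrActionSuggestedBinding binding{*outGrab, inputPath};
        XrInteractionProfileSuggestedBinding suggested{XR_TYPE_INTERACTION_PROFILE_SUGGESTED_BINDING};
        suggested.interactionProfile = profilePath;
        suggested.suggestedBindings = &binding;
        suggested.countSuggestedBindings = 1;
        xrSuggestInteractionProfileBindings(instance, &suggested);
    };
    suggest("/interaction_profiles/valve/index_controller", "/user/hand/right/input/trigger/click");
    suggest("/interaction_profiles/htc/vive_controller",    "/user/hand/right/input/trigger/click");

    return actionSet;
}
```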
I would also like to add that there should be a way to differentiate between tracking levels in OpenXR like you can with OpenVR - estimated for controllers without any finger tracking, partial for curls and full for full tracking of all hand joints, maybe even something more granular, like having the tracking level set per each joint. It would also be nice to get finger curl and splay values directly from the device, rather than estimations from the skeletal input as I believe is happening now. The finger curls and even splays on the Index controllers change, depending on the orientation of the controllers. I know Valve isn't developing OpenXR, so they don't have much agency over it, but they can still make an extension.
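For anyone unfamiliar with the OpenVR side being referenced here, both the tracking level and a curl/splay summary are exposed through IVRInput; a short sketch (the action name is a placeholder):

```cpp
#include <openvr.h>

void QueryHandSummary()
{
    // "/actions/main/in/hand_right" is a placeholder skeleton action name.
    vr::VRActionHandle_t handAction = vr::k_ulInvalidActionHandle;
    vr::VRInput()->GetActionHandle("/actions/main/in/hand_right", &handAction);

    // How trustworthy the skeleton is:
    //   VRSkeletalTracking_Estimated - guessed from buttons/cap-sense (e.g. Vive wands)
    //   VRSkeletalTracking_Partial   - some of the hand measured (e.g. curls on Index)
    //   VRSkeletalTracking_Full      - all joints measured (gloves, optical tracking)
    vr::EVRSkeletalTrackingLevel level;
    vr::VRInput()->GetSkeletalTrackingLevel(handAction, &level);

    // Per-finger curl/splay summary. VRSummaryType_FromDevice requests
    // less-processed data from the driver where available;
    // VRSummaryType_FromAnimation matches the animated skeleton.
    vr::VRSkeletalSummaryData_t summary;
    vr::VRInput()->GetSkeletalSummaryData(handAction, vr::VRSummaryType_FromDevice, &summary);
    float indexCurl = summary.flFingerCurl[vr::VRFinger_Index]; // 0 = straight, 1 = fully curled
    (void)indexCurl;
}
```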
Out of curiosity, what's your use case for different tracking levels for skeletal input?
> having the tracking level set per each joint
I don't think this will work. From what I've seen, the tracking level is an overall measure of "how accurate is the measurement, and how true is it to the hand?", so having it per joint doesn't really make sense. I don't think the estimation is likely to differ for each joint, and is that information really useful to know?
> It would also be nice to get finger curl and splay values directly from the device, rather than estimations from the skeletal input as I believe is happening now

Curl and splay values do get estimated, but letting devices input a full skeleton gives them complete control over what the hand looks like in game. Hardware vendors can do things that aren't possible if you only provide curl and splay. In my opinion, the skeletal input system works fine as it is.
As for OpenXR, there is already a hand tracking extension, which Valve helped develop, taking the best parts of the skeletal input system and putting them into the extension. It works well.
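For reference, a minimal sketch of using XR_EXT_hand_tracking (the extension has to be enabled at instance creation; instance/session/space setup and error handling are omitted, and the tracker would normally be created once rather than per call):

```cpp
#include <openxr/openxr.h>

XrHandJointLocationEXT ReadIndexTip(XrInstance instance, XrSession session,
                                    XrSpace baseSpace, XrTime time)
{
    // Extension functions are loaded through xrGetInstanceProcAddr.
    PFN_xrCreateHandTrackerEXT xrCreateHandTrackerEXT = nullptr;
    PFN_xrLocateHandJointsEXT xrLocateHandJointsEXT = nullptr;
    xrGetInstanceProcAddr(instance, "xrCreateHandTrackerEXT",
                          (PFN_xrVoidFunction*)&xrCreateHandTrackerEXT);
    xrGetInstanceProcAddr(instance, "xrLocateHandJointsEXT",
                          (PFN_xrVoidFunction*)&xrLocateHandJointsEXT);

    XrHandTrackerCreateInfoEXT createInfo{XR_TYPE_HAND_TRACKER_CREATE_INFO_EXT};
    createInfo.hand = XR_HAND_RIGHT_EXT;
    createInfo.handJointSet = XR_HAND_JOINT_SET_DEFAULT_EXT;
    XrHandTrackerEXT handTracker;
    xrCreateHandTrackerEXT(session, &createInfo, &handTracker);

    // Locate all 26 joints relative to baseSpace at the given time.
    XrHandJointLocationEXT joints[XR_HAND_JOINT_COUNT_EXT];
    XrHandJointLocationsEXT locations{XR_TYPE_HAND_JOINT_LOCATIONS_EXT};
    locations.jointCount = XR_HAND_JOINT_COUNT_EXT;
    locations.jointLocations = joints;

    XrHandJointsLocateInfoEXT locateInfo{XR_TYPE_HAND_JOINTS_LOCATE_INFO_EXT};
    locateInfo.baseSpace = baseSpace;
    locateInfo.time = time;
    xrLocateHandJointsEXT(handTracker, &locateInfo, &locations);

    // A real app should also check locations.isActive and the locationFlags.
    return joints[XR_HAND_JOINT_INDEX_TIP_EXT]; // pose + radius of the index fingertip
}
```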
To provide an update to my original message: Valve have recently integrated a compatibility mode into OpenVR that resolves this issue. Thanks @zite!
My use case is this: I'm trying to use the finger tracking from controllers/gloves for whatever they can actually measure, and then use custom IK for the rest, so that someone with gloves that support force feedback has full control over what their fingers do in VR. Since the Index controllers only support curls, I would only use that data to curl the fingers, and then adjust the finger splays to contour to an object the player is trying to grasp, for example. It might be a very niche use case, but I don't like the way some OpenVR drivers convert button presses into skeletal input. For example, on the Vive wands, pressing the trigger closes the entire hand. If I were to handle that conversion in-engine, though, I would still need to write code specifically for each controller, unless the positions of each button could also be passed into OpenXR.
> I would also like to add that there should be a way to differentiate between tracking levels in OpenXR like you can with OpenVR - estimated for controllers without any finger tracking, partial for curls and full for full tracking of all hand joints, maybe even something more granular, like having the tracking level set per each joint. It would also be nice to get finger curl and splay values directly from the device, rather than estimations from the skeletal input as I believe is happening now. The finger curls and even splays on the Index controllers change, depending on the orientation of the controllers. I know Valve isn't developing OpenXR, so they don't have much agency over it, but they can still make an extension.
It might be a good idea to tell Khronos about it, so that they know people have run into limitations with the hand tracking extension. I've already brought it up once on their discussion board. https://community.khronos.org/t/how-is-a-limited-finger-tracking-controller-like-index-controller-supposed-to-work/108265/20?u=rectus
Going to close this - SteamVR recently implemented simulated controller functionality which means we can simulate the index controllers to get skeletal input in the games that weren't previously working. Thanks @zite and all at Valve for implementing the feature, it's been a big help!
This issue relates to problems with support for Skeletal Input in a variety of SteamVR games.
Currently, games appear to enable Skeletal Input only when they detect an Index controller present, making it difficult for other hardware vendors to create their own finger tracking devices; instead, vendors have to pretend to be Index controllers (exposing the device properties of an Index controller) to get this working as intended.
We've so far found that only Valve titles appear to work with custom devices and Skeletal Input.
@zite @lucas-vrtech