Lokathor opened this issue 4 years ago
This should be approached as an OpenXR extension, which will need to begin life as a vendor extension, consisting of a specification of the added interfaces and an implementation for some existing runtime. https://github.com/KhronosGroup/OpenXR-Docs/pull/40 is an example of an in-progress vendor extension being developed external to Khronos, beginning as a Monado extension with the objective of eventually becoming a multi-vendor extension. Monado is the only existing open-source OpenXR runtime and is in early stages (Linux only, major functionality incomplete, little hardware support) but welcomes contribution.
Some care will be necessary: because of its state-oriented interface, OpenXR does not yet provide relative input such as raw mouse motion, which first-person cameras need. New functionality, ideally an input event queue, will have to be introduced, perhaps as a separate extension.
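To make the distinction concrete, here is a purely hypothetical sketch (in Rust; none of these types exist in OpenXR, the openxr crate, or any extension today) of what an event-queue interface for relative motion might look like next to the existing state-oriented model:

```rust
// Hypothetical sketch only: illustrates an event-queue style extension for
// relative (delta) input, as opposed to the current "latest state" model.

/// A single relative-motion sample, e.g. raw mouse or trackball movement.
struct RelativeMotionEvent {
    /// Action the motion is bound to (analogous to an action handle).
    action: u64,
    /// Device-reported delta since the previous event, in device units.
    delta: (f64, f64),
    /// Timestamp in nanoseconds, mirroring OpenXR's XrTime convention.
    time: i64,
}

/// Hypothetical per-session queue the runtime would fill between frames.
trait RelativeInputQueue {
    /// Drain all events accumulated since the last call, oldest first.
    /// Unlike a state query, repeated deltas are never lost or collapsed
    /// into a single snapshot.
    fn drain_events(&mut self) -> Vec<RelativeMotionEvent>;
}

/// Example consumer: integrate mouse deltas into a first-person camera.
fn update_camera(queue: &mut dyn RelativeInputQueue, yaw: &mut f64, pitch: &mut f64) {
    const SENSITIVITY: f64 = 0.002;
    for event in queue.drain_events() {
        *yaw += event.delta.0 * SENSITIVITY;
        *pitch = (*pitch - event.delta.1 * SENSITIVITY).clamp(-1.5, 1.5);
    }
}
```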
Personally I'd love to use the OpenXR action system even outside of VR :) SteamInput is similar but proprietary, and also "earlier" evolution-wise: the OpenXR action system is more capable and incorporates lessons from SteamInput as well as the SteamVR Input action system.
I'd love to hear proposals for an input event queue, as well - I've been assured that it's not strictly necessary for VR, but it still seems more natural to me, and probably essential for non-VR uses.
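For a feel of what the action system looks like from the application side, here's a rough sketch against the openxr crate (signatures written from memory, so double-check against the crate docs; the interaction profile and binding paths are standard ones from the core spec): declare semantic actions, suggest default hardware bindings, then sync and poll once per frame.

```rust
use openxr as xr;

/// Declare a semantic "jump" action, suggest a default hardware binding,
/// and attach the action set to the session.
fn setup_actions<G: xr::Graphics>(
    instance: &xr::Instance,
    session: &xr::Session<G>,
) -> xr::Result<(xr::ActionSet, xr::Action<bool>)> {
    // An action set groups actions that are active together, e.g. "gameplay".
    let action_set = instance.create_action_set("gameplay", "Gameplay", 0)?;

    // The game only ever talks about "jump", never about specific buttons.
    let jump = action_set.create_action::<bool>("jump", "Jump", &[])?;

    // Suggest a default binding for one interaction profile; the runtime
    // (or the user, via a rebinding UI) has the final say.
    instance.suggest_interaction_profile_bindings(
        instance.string_to_path("/interaction_profiles/khr/simple_controller")?,
        &[xr::Binding::new(
            &jump,
            instance.string_to_path("/user/hand/right/input/select/click")?,
        )],
    )?;

    session.attach_action_sets(&[&action_set])?;
    Ok((action_set, jump))
}

/// Per-frame polling: sync the action set, then read the latest state.
/// This is the state-oriented interface discussed above.
fn jump_pressed<G: xr::Graphics>(
    session: &xr::Session<G>,
    action_set: &xr::ActionSet,
    jump: &xr::Action<bool>,
) -> xr::Result<bool> {
    session.sync_actions(&[action_set.into()])?;
    let state = jump.state(session, xr::Path::NULL)?;
    Ok(state.is_active && state.current_state)
}
```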
KhronosGroup/OpenXR-Docs#48 is a vendor extension that was merged (essentially the finished version of #40 above)
Could be quite a neat long-term approach, though having OpenXR available while still keeping the crate lean and mean would likely be a bunch of work and take time.
Pinging @VZout, who recently integrated OpenXR here at Embark (for VR), for visibility.
Discussion in the Monado Discord tentatively concluded that the OpenXR loader architecture is probably a large barrier: a non-VR application on a system with VR-only runtimes installed will have a difficult time selecting a suitable runtime without significant changes to how runtime discovery works. It might be easier to build something new modeled on the same concepts, at least for the time being.
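To illustrate the discovery problem: the official loader resolves a single "active runtime" manifest (overridable via the XR_RUNTIME_JSON environment variable), so there is no notion of choosing among installed runtimes by capability. A rough sketch, with the default path simplified and the loader's real search rules glossed over:

```rust
// Rough illustration only: paths and search rules are approximate; the real
// loader's logic is more involved. The point is that exactly one runtime
// manifest is resolved, so a desktop app can't ask for "a non-VR runtime"
// when the system-wide active runtime is VR-only.
use std::{env, fs, path::PathBuf};

fn active_runtime_manifest() -> Option<PathBuf> {
    // Explicit override honored by the official loader.
    if let Ok(path) = env::var("XR_RUNTIME_JSON") {
        return Some(PathBuf::from(path));
    }
    // Typical system-wide location on Linux (simplified; the loader actually
    // walks the XDG config directories).
    let default = PathBuf::from("/etc/xdg/openxr/1/active_runtime.json");
    if default.exists() { Some(default) } else { None }
}

fn main() {
    match active_runtime_manifest() {
        Some(path) => {
            // The manifest is a small JSON file whose runtime.library_path
            // field points at the runtime's shared library.
            let contents = fs::read_to_string(&path).unwrap_or_default();
            println!("active runtime manifest {}:\n{}", path.display(), contents);
        }
        None => println!("no active OpenXR runtime configured"),
    }
}
```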
That is understandable.
OpenXR is currently the "standard" for working with VR, but unfortunately it only supports VR. It would be nice if the general concept (mapping hardware inputs to semantic actions within the game) could be used with keyboard and mouse too. It is an open standard, so it should be possible to help drive this forward, but we would naturally have to work with many other groups in the process, so we can expect it to be a slow change.
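As a minimal sketch of that concept applied to desktop input (hypothetical code, not tied to OpenXR or any existing crate), the game declares semantic actions and the mapping from physical inputs is data that can be rebound, rather than key codes scattered through gameplay code:

```rust
// Hypothetical action-mapping layer for keyboard/mouse, modeled on the same
// idea as the OpenXR action system: gameplay code queries semantic actions,
// bindings to physical inputs are data.
use std::collections::HashMap;

#[derive(Clone, Copy, PartialEq, Eq, Hash, Debug)]
enum PhysicalInput {
    Key(u32),        // e.g. a scancode from the windowing library
    MouseButton(u8), // e.g. 0 = left, 1 = right
}

#[derive(Clone, Copy, PartialEq, Eq, Hash, Debug)]
enum GameAction {
    Jump,
    Fire,
    Interact,
}

#[derive(Default)]
struct ActionMap {
    bindings: HashMap<PhysicalInput, GameAction>,
    state: HashMap<GameAction, bool>,
}

impl ActionMap {
    fn bind(&mut self, input: PhysicalInput, action: GameAction) {
        self.bindings.insert(input, action);
    }

    /// Fed from the platform's raw input events (winit, SDL, ...).
    fn handle_input(&mut self, input: PhysicalInput, pressed: bool) {
        if let Some(action) = self.bindings.get(&input) {
            self.state.insert(*action, pressed);
        }
    }

    /// Queried by gameplay code, which never sees key codes directly.
    fn is_active(&self, action: GameAction) -> bool {
        self.state.get(&action).copied().unwrap_or(false)
    }
}

fn main() {
    let mut map = ActionMap::default();
    map.bind(PhysicalInput::Key(57), GameAction::Jump); // space on many layouts
    map.handle_input(PhysicalInput::Key(57), true);
    assert!(map.is_active(GameAction::Jump));
}
```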