KhronosGroup / OpenXR-Tutorials

OpenXR Tutorials
https://www.openxr-tutorial.com/
Apache License 2.0

Explain Why Interaction Profiles Exists #30

Closed rbessems closed 1 year ago

rbessems commented 1 year ago

It seems to be somewhat more focused on using the interaction profiles than explaining the why...

It would be good to have something in there describing why interaction profiles exist / why this pattern is used.

rvkennedy commented 1 year ago

I'm actually a little unsure of the reason that an interaction profile path has to be specified. I understand why it can be; but what about the case where you know you want, e.g., /user/hand/left/input/trigger/value, but you don't know what hardware will be used? As it stands, it seems like you have to run through all the known profiles, in case one of them is supported. But if a runtime comes back with a new extension profile, you won't find it. There's no way to query which profiles a runtime or device offers, or, within a profile, to query which paths it offers. Is this a conscious design choice?

lionleaf commented 1 year ago

This was a conscious design decision, let me try a quick summary here.

It was based on a few assumptions:

(There are probably a few more.)

The fundamental flip is that instead of the app asking the runtime "give me the truth of what is available", the API is designed for the app to tell the runtime: "here are the actions that drive my game; I have tested them on these three specific devices, and I suggest you set up the bindings like this."

For example, if you create an application and suggest bindings for /interaction_profiles/oculus/touch_controller, your app will work just fine if the user has a /interaction_profiles/facebook/touch_controller_pro; your app will simply be told the user has a touch_controller. (I guess this can be viewed as a compatibility mode.) It should also work on a wide set of other controllers. But once a developer wants to utilize unique features of, say, Touch Pro, they can just add bindings for that IP as well (say, the new haptics).
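In code, that suggested-bindings flow is a small piece of the OpenXR C API. Here is a minimal sketch, assuming an already-initialized `XrInstance` named `instance` and a created boolean `XrAction` named `triggerAction` (those names are mine); real code would check every `XrResult`:

```c
#include <openxr/openxr.h>

// Sketch: suggest bindings for one interaction profile. If the user's
// actual hardware differs, the runtime is free to remap; the app still
// only ever sees a profile it suggested bindings for.
void suggest_touch_bindings(XrInstance instance, XrAction triggerAction)
{
    XrPath profilePath, triggerPath;
    xrStringToPath(instance, "/interaction_profiles/oculus/touch_controller",
                   &profilePath);
    xrStringToPath(instance, "/user/hand/left/input/trigger/value",
                   &triggerPath);

    XrActionSuggestedBinding bindings[] = {
        {triggerAction, triggerPath},
    };

    XrInteractionProfileSuggestedBinding suggested = {
        .type = XR_TYPE_INTERACTION_PROFILE_SUGGESTED_BINDING,
        .interactionProfile = profilePath,
        .countSuggestedBindings = 1,
        .suggestedBindings = bindings,
    };
    xrSuggestInteractionProfileBindings(instance, &suggested);
}
```

Opting into Touch Pro's extra features would then just be a second call with an `XrInteractionProfileSuggestedBinding` for the Touch Pro profile.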

Now, it would be possible for every runtime to understand every interaction profile, which would make an app work no matter which single IP it suggests bindings for. In practice, though, some headsets require you to suggest certain device-specific interaction profiles. Other than that, the development flow was designed to be:

rpavlik commented 1 year ago

Some of this is redundant with the above but I thought I'd post my little writeup.

The underlying principle is: provide the runtime with high-level information as early as possible, so it can decide (or let the user decide) how best to map actions to the available hardware. (Assume that your localized action names and localized action set names will appear in an input-binding GUI like SteamVR's: that's the intent, and that is actually what happens when running on SteamVR's OpenXR runtime.) And, because we can't conformance-test apps, make the app tell the runtime what data it expects and won't crash with. We will only tell the app things that it would have received when testing.

(We could have had a "get controller name" function that is open-ended returning any string, not for use in setting up input. Older APIs have had things like this. But somebody would switch on that, checking only the two controllers they have access to, and then when a new controller comes out, their app would break, even though "technically" they weren't supposed to switch on that string and were supposed to expect an arbitrary string there. So now there's an appcompat issue, which means either the app is broken in the future, or everybody has to carry around app-specific compatibility code for that one app if it's important enough. This is not a hypothetical situation, it actually happens with older APIs.)

See also https://github.com/KhronosGroup/OpenXR-Guide/blob/main/chapters/goals_design_philosophy.md which is a more complete writeup on some interaction profile stuff.

rvkennedy commented 1 year ago

Thanks @lionleaf @rpavlik, I think this gives me enough. To paraphrase: the app-profile relationship here is a one-way graph, intended to prevent situations where a device/app combination has no usable mapping, so it maximizes compatibility at the expense of flexibility.

rpavlik commented 1 year ago

It's not really that. It is one-way, in that the runtime won't give back anything the app didn't suggest, but it's mostly to maximize compatibility, flexibility, and user choice, at the expense of theoretical flexibility on the app side that would be inherently untestable. (If you can test all the devices, then suggest bindings for all the profiles and keep updating your app!)

There is an assumption that runtimes are more likely to be continuously developed and updated than applications: it's easy to update a runtime to add mappings; it's hard to find the right people and time and rights and... to get a new build of an app.

Having app-specific compatibility code in a runtime really sucks and does not scale. My OpenXR-Guide writeup above talks about it better.

rpavlik commented 1 year ago

OK, here's what I would expect as the process a dev would follow for designing their actions, etc. Note that interaction profiles are pretty late in the process. I also wrote up a quick example of how to use sub-action paths for a teleport mechanic, since sub-action paths are one of the more complicated aspects of the interaction system, but teleport is a pretty easy example for showing why they exist.
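The sub-action-path idea mentioned there can be sketched against the OpenXR C action API roughly as follows. This is my own sketch, not rpavlik's writeup; `instance`, `session`, and `actionSet` are assumed to already exist, and error handling is omitted. One "teleport" action is created for both hands, and the app later asks which hand fired it:

```c
#include <openxr/openxr.h>
#include <string.h>

// Sketch: one "teleport" action shared by both hands via sub-action paths.
void create_and_poll_teleport(XrInstance instance, XrSession session,
                              XrActionSet actionSet)
{
    XrPath handPaths[2];
    xrStringToPath(instance, "/user/hand/left", &handPaths[0]);
    xrStringToPath(instance, "/user/hand/right", &handPaths[1]);

    XrActionCreateInfo info = {.type = XR_TYPE_ACTION_CREATE_INFO};
    strcpy(info.actionName, "teleport");
    strcpy(info.localizedActionName, "Teleport");
    info.actionType = XR_ACTION_TYPE_BOOLEAN_INPUT;
    info.countSubactionPaths = 2;
    info.subactionPaths = handPaths;

    XrAction teleportAction;
    xrCreateAction(actionSet, &info, &teleportAction);

    // Per frame, after xrSyncActions: query the same action once per hand.
    for (int hand = 0; hand < 2; ++hand) {
        XrActionStateGetInfo getInfo = {
            .type = XR_TYPE_ACTION_STATE_GET_INFO,
            .action = teleportAction,
            .subactionPath = handPaths[hand],
        };
        XrActionStateBoolean state = {.type = XR_TYPE_ACTION_STATE_BOOLEAN};
        xrGetActionStateBoolean(session, &getInfo, &state);
        // state.currentState is true for the hand whose bound input is down,
        // so the app knows which hand to teleport from.
    }
}
```

Without the sub-action paths, the runtime would collapse both hands' inputs into one combined state, and the app couldn't tell which hand requested the teleport.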

OK, there, I swear I'm done editing this for now. That was stuff I'd been meaning to write up for a long time, and have been saying verbally in a lot of situations. Hopefully it can be turned into some useful text, whether in the tutorial or in some other OpenXR developer docs.

lionleaf commented 1 year ago

Thanks, Ryan! :D

I'd also add another resource that might be helpful if you haven't seen it. I wrote an example using the Action system with plenty of comments in the Meta OpenXR SDK, you can download it here: https://developer.oculus.com/downloads/package/oculus-openxr-mobile-sdk/

And I'm specifically referring to XrSamples/XrInput/Src/main.cpp once you unzip that package :)

And finally, another resource that might be helpful is looking at how Steam Input works; the API was designed to allow runtimes to build a rebinding flow similar to what Steam offers today (try opening the binding options in Half-Life: Alyx, for instance).

rvkennedy commented 1 year ago

Please post any feedback on Interaction Profile parts of the new Chapter 4 text here.

rbessems commented 1 year ago

Tentatively closing pending feedback.