Currently godot_openxr hardcodes the "trigger", "grab", "menu" and "handpose" actions. For Godot 3.x we probably want to leave those actions hardcoded, and create a unified UI for creating and binding actions in Godot 4.x.
The plugin code also hardcodes bindings for the interaction profiles "/interaction_profiles/khr/simple_controller", "/interaction_profiles/valve/index_controller" and "/interaction_profiles/mnd/ball_on_stick_controller", which essentially has the inputs of a PlayStation Move controller.
It would be nice to be able to add and change bindings without having to recompile the plugin. Perhaps put them in a text file like with OpenVR?
Unlike with OpenVR, there is (currently) no file format for actions and bindings that can be given to a runtime. To support loading them from a file, we would have to define our own file format and write our own parser. We could also choose to write a parser for OpenVR manifests and reuse those, but they do not map perfectly to OpenXR.
For those unfamiliar with the OpenXR action system:
Like with OpenVR, an application creates "actions": an action is an arbitrary name with a type. For example, you might create a boolean `jump` action; querying the state of this action later then results in a true or false value.
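In the OpenXR C API this looks roughly like the following (untested sketch; it assumes an `XrSession` and an `XrActionSet` have already been created and that `xrSyncActions()` is called each frame):

```c
// Create a boolean "jump" action in an existing action set.
XrActionCreateInfo action_info = { XR_TYPE_ACTION_CREATE_INFO };
strcpy(action_info.actionName, "jump");
action_info.actionType = XR_ACTION_TYPE_BOOLEAN_INPUT;
strcpy(action_info.localizedActionName, "Jump");

XrAction jump_action;
xrCreateAction(action_set, &action_info, &jump_action);

// Later, each frame after xrSyncActions(), query the current state.
XrActionStateGetInfo get_info = { XR_TYPE_ACTION_STATE_GET_INFO };
get_info.action = jump_action;

XrActionStateBoolean state = { XR_TYPE_ACTION_STATE_BOOLEAN };
xrGetActionStateBoolean(session, &get_info, &state);
if (state.isActive && state.currentState) {
    /* jump is pressed */
}
```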
Then the application can suggest bindings for those actions, which are supposed to be sane defaults for a set of known controllers. For example you might say "If a Valve Index Controller is used, the `a` button should be bound to the `jump` action. If an HTC Vive Controller is used, the touchpad click should be bound to the `jump` action" and so on.
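Suggesting such a binding looks roughly like this (untested sketch; it assumes a valid `XrInstance` and a previously created boolean `XrAction` called `jump_action`):

```c
// Turn the interaction profile and input path strings into XrPaths.
XrPath index_profile, a_click;
xrStringToPath(instance, "/interaction_profiles/valve/index_controller",
               &index_profile);
xrStringToPath(instance, "/user/hand/right/input/a/click", &a_click);

// Suggest binding the jump action to the Index Controller's a button.
XrActionSuggestedBinding binding = { jump_action, a_click };

XrInteractionProfileSuggestedBinding suggested =
    { XR_TYPE_INTERACTION_PROFILE_SUGGESTED_BINDING };
suggested.interactionProfile = index_profile;
suggested.countSuggestedBindings = 1;
suggested.suggestedBindings = &binding;
xrSuggestInteractionProfileBindings(instance, &suggested);
```

A real application would suggest one such set of bindings per interaction profile it knows about.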
If the user has, for example, an Oculus Touch controller and the application didn't suggest bindings for this controller, then it is not exactly specified what a runtime has to do. Maybe the runtime engages a compatibility mode where it knows how to map, for example, Index Controller bindings to an Oculus Touch Controller. Maybe the runtime pops up a binding dialog listing all the actions the application suggested and requires the user to map the inputs of the current controller to those actions. Etc.
Either way, runtimes are free to allow remapping all of the suggested bindings as they like, with UIs like the one found in SteamVR. For example, a user might not like that the application developer suggested binding the `jump` action to the `a` button of their Valve Index Controller, and might use the runtime's binding UI to map the `jump` action to the `b` button instead, without the application knowing that the source of the `jump` action is not what it suggested (if the application really wants, it can find out with `xrEnumerateBoundSourcesForAction()`).
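For reference, that call uses the usual OpenXR two-call idiom (untested sketch; assumes a valid `XrSession` and an `XrAction` called `jump_action`):

```c
// First call gets the number of bound sources, second call fills them in.
XrBoundSourcesForActionEnumerateInfo enum_info =
    { XR_TYPE_BOUND_SOURCES_FOR_ACTION_ENUMERATE_INFO };
enum_info.action = jump_action;

uint32_t count = 0;
xrEnumerateBoundSourcesForAction(session, &enum_info, 0, &count, NULL);

XrPath *sources = malloc(count * sizeof(XrPath));
xrEnumerateBoundSourcesForAction(session, &enum_info, count, &count, sources);

// Each XrPath can be converted back into a string such as
// "/user/hand/right/input/b/click" with xrPathToString().
free(sources);
```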
This relates to https://github.com/GodotVR/godot_openvr/issues/71 and https://github.com/GodotVR/godot_openvr/pull/80