NeoSpark314 / godot_oculus_quest_toolkit

An easy to use VR toolkit for Oculus Quest development using the Godot game engine
MIT License
368 stars 39 forks

Documentation on Quest Touch Controller Input Mapping #23

Closed Ongnissim closed 4 years ago

Ongnissim commented 4 years ago

Hello!

I'm currently working on a project where I'm using the button and joystick inputs of the touch controllers fairly frequently.

I'm currently using a mix of signals, but this very quickly gets out of hand, since I'm doing a lot of prototyping and debugging outside of VR.

I'd very much like to be able to just map the Touch Controller Button and Joysticks in the Input Map, and haven't seen any information on this.

Thanks for all of your work! Getting everything up and running was very easy with the tools provided.

NeoSpark314 commented 4 years ago

Hi,

so far I have not used the Input Map for VR development. I don't know offhand whether the normal joystick input mappings can already be applied to an ARVRController; if it is not yet possible, it would probably need some additions to the core of Godot itself.

Could you explain a bit more what your needs are? For example, for development on desktop I implemented the VRFeatureSimulator node, which lets you emulate controller movement and buttons directly on the desktop when launching your project with the toolkit.

Ongnissim commented 4 years ago

Hello again!

I found the information that I needed, which was in the vrAutoload script.

It contains an accurate button map, as follows:

enum CONTROLLER_AXIS {
    None = -1,

    JOYSTICK_X = 0,
    JOYSTICK_Y = 1,
    INDEX_TRIGGER = 2,
    GRIP_TRIGGER = 3,
}

# the individual buttons directly identify the left or right controller
enum BUTTON {
    None = -1,

    Y = 1,
    LEFT_GRIP_TRIGGER = 2, # grip trigger pressed over threshold
    ENTER = 3, # Menu Button on left controller

    TOUCH_X = 5,
    TOUCH_Y = 6,
    X = 7,

    LEFT_TOUCH_THUMB_UP = 10,
    LEFT_TOUCH_INDEX_TRIGGER = 11,
    LEFT_TOUCH_INDEX_POINTING = 12,

    LEFT_THUMBSTICK = 14, # left/right thumb stick pressed
    LEFT_INDEX_TRIGGER = 15, # index trigger pressed over threshold

    B = 1 + 16,
    RIGHT_GRIP_TRIGGER = 2 + 16, # grip trigger pressed over threshold
    TOUCH_A = 5 + 16,
    TOUCH_B = 6 + 16,
    A = 7 + 16,

    RIGHT_TOUCH_THUMB_UP = 10 + 16,
    RIGHT_TOUCH_INDEX_TRIGGER = 11 + 16,
    RIGHT_TOUCH_INDEX_POINTING = 12 + 16,

    RIGHT_THUMBSTICK = 14 + 16, # left/right thumb stick pressed
    RIGHT_INDEX_TRIGGER = 15 + 16, # index trigger pressed over threshold
}
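With this mapping, the raw controller state can be polled through Godot 3's standard joystick API, since each Touch controller shows up as a regular joypad device. A minimal sketch (assuming Left controller = Device 0 and Right = Device 1, and the enum values above; if the vr autoload is available, constants like `vr.BUTTON.A` can be used in place of the literals):

```gdscript
extends Node

func _process(_delta):
    # Right thumbstick X axis (JOYSTICK_X = 0), range roughly -1.0 .. 1.0
    var stick_x = Input.get_joy_axis(1, 0)

    # A button on the right controller (A = 7 + 16 = 23 per the enum above)
    if Input.is_joy_button_pressed(1, 23):
        print("A pressed; right stick x = ", stick_x)
```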

I was able to manually assign the buttons to controllers in the Input Map by following this as a guideline, using Device 0 for the left controller and Device 1 for the right controller.

This information might be worth highlighting somewhere, in case anyone else wants to read these buttons from a standard Input.is_action_pressed("") call; for instance, when developing a game that runs both in and out of VR, or, as in my case, when running tests is faster on the computer than after uploading to the headset.

Edit: the Left Controller is Device 0, and the Right Controller is Device 1.
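Besides assigning the buttons in the editor's Input Map dialog, the same binding can be registered programmatically at runtime. A sketch, assuming the device/button values above (the action name "grab" is illustrative, not part of the toolkit):

```gdscript
extends Node

func _ready():
    # Bind the right-controller A button (Device 1, button 7 + 16 = 23)
    # to a custom action named "grab".
    if not InputMap.has_action("grab"):
        InputMap.add_action("grab")
    var ev = InputEventJoypadButton.new()
    ev.device = 1          # right Touch controller
    ev.button_index = 23   # A button, per the enum above
    InputMap.action_add_event("grab", ev)

func _process(_delta):
    if Input.is_action_just_pressed("grab"):
        print("grab pressed")
```

The same action can then also be triggered from a keyboard binding for desktop testing, which is what makes the Input Map route convenient for running outside the headset.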

NeoSpark314 commented 4 years ago

Yes; running on desktop is essential for quick iteration. Did you check out the Feature_VRSimulator (just drop it onto your OQ_ARVROrigin)? It already maps most Oculus Touch controls (and headset movement), and it is really useful for testing, as it behaves the same as in the headset when you press buttons or look around.

For more advanced use I also have the Feature_VRRecorder, which lets you record interaction on the device and save it to a file that you can then play back on desktop.

I will put a link to the information in the wiki.

Cheers, Holger