S-Dafarra opened this issue 1 year ago
Considering that the SRanipal SDK has been deprecated, it would also be better to use these extensions:
- XR_EXT_eye_gaze_interaction
- XR_HTC_facial_tracking
Unfortunately, neither of these two seems to be supported by SteamVR yet. See https://github.khronos.org/OpenXR-Inventory/extension_support.html#valve_steamvr
Nonetheless, if I list the extensions from the OpenXR device, I do see them. Maybe they are added by the VIVE Console application.
This is the output I see:
[INFO] |yarp.device.openxrheadset| Supported extensions:
- XR_EXT_hand_tracking
- XR_HTC_hand_interaction
- XR_EXT_eye_gaze_interaction
- XR_HTC_facial_tracking
- XR_HTC_vive_srworks_pass_through
- XR_MSFT_scene_understanding
- XR_HTC_passthrough
- XR_KHR_vulkan_enable
- XR_KHR_vulkan_enable2
- XR_KHR_D3D11_enable
- XR_KHR_D3D12_enable
- XR_KHR_opengl_enable
- XR_KHR_win32_convert_performance_counter_time
- XR_EXT_win32_appcontainer_compatible
- XR_KHR_binding_modification
- XR_KHR_composition_layer_depth
- XR_KHR_visibility_mask
- XR_EXT_active_action_set_priority
- XR_EXT_dpad_binding
- XR_EXT_frame_composition_report
- XR_EXT_hand_tracking_data_source
- XR_EXT_hand_joints_motion_range
- XR_EXT_hp_mixed_reality_controller
- XR_EXT_local_floor
- XR_EXT_palm_pose
- XR_EXT_uuid
- XR_FB_display_refresh_rate
- XR_HTC_vive_cosmos_controller_interaction
- XR_HTC_vive_focus3_controller_interaction
- XR_HTC_vive_wrist_tracker_interaction
- XR_MND_headless
- XR_VALVE_analog_threshold
- XR_HTCX_vive_tracker_interaction
- XR_EXT_debug_utils
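
For reference, a listing like the one above can be obtained with the core OpenXR API roughly as follows (an illustrative sketch, not necessarily how the device implements it):

```cpp
#include <openxr/openxr.h>
#include <cstdio>
#include <vector>

// Print the instance extensions exposed by the active OpenXR runtime
// (these may include extensions added by implicit API layers, e.g. by the VIVE Console).
void printSupportedExtensions()
{
    // First call: query how many extensions are available.
    uint32_t count = 0;
    xrEnumerateInstanceExtensionProperties(nullptr, 0, &count, nullptr);

    // Second call: retrieve the extension properties.
    std::vector<XrExtensionProperties> properties(count, {XR_TYPE_EXTENSION_PROPERTIES});
    xrEnumerateInstanceExtensionProperties(nullptr, count, &count, properties.data());

    std::printf("Supported extensions:\n");
    for (const auto& p : properties)
    {
        std::printf("- %s\n", p.extensionName);
    }
}
```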
Moreover, if I click on MANAGE OPENXR API LAYERS in the SteamVR settings, I see a list (screenshot not included here) that seems to suggest they are available.
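
The API layers can also be checked programmatically; a minimal sketch (illustrative only):

```cpp
#include <openxr/openxr.h>
#include <cstdio>
#include <vector>

// Print the OpenXR API layers visible to the application.
void printApiLayers()
{
    uint32_t count = 0;
    xrEnumerateApiLayerProperties(0, &count, nullptr);

    std::vector<XrApiLayerProperties> layers(count, {XR_TYPE_API_LAYER_PROPERTIES});
    xrEnumerateApiLayerProperties(count, &count, layers.data());

    std::printf("API layers:\n");
    for (const auto& l : layers)
    {
        std::printf("- %s: %s\n", l.layerName, l.description);
    }
}
```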
I started adding the facial expression extension in https://github.com/ami-iit/yarp-device-openxrheadset/commit/e73e989368e085b49313dc0ed6bd18226db1be14
I still need to expose the weightings and test it.
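
For the weightings, XR_HTC_facial_tracking exposes them through xrGetFacialExpressionsHTC. Below is a rough sketch of how the eye weights could be read (illustrative only, not the device's actual code; error handling is omitted):

```cpp
#include <openxr/openxr.h>
#include <array>

// Sketch: read the eye expression weights through XR_HTC_facial_tracking.
// "instance" and "session" are assumed to be already created with the extension enabled.
void readEyeExpressions(XrInstance instance, XrSession session)
{
    // The extension functions are not exported by the loader and must be
    // retrieved through xrGetInstanceProcAddr.
    PFN_xrCreateFacialTrackerHTC pfnCreateFacialTracker = nullptr;
    PFN_xrGetFacialExpressionsHTC pfnGetFacialExpressions = nullptr;
    PFN_xrDestroyFacialTrackerHTC pfnDestroyFacialTracker = nullptr;
    xrGetInstanceProcAddr(instance, "xrCreateFacialTrackerHTC",
                          reinterpret_cast<PFN_xrVoidFunction*>(&pfnCreateFacialTracker));
    xrGetInstanceProcAddr(instance, "xrGetFacialExpressionsHTC",
                          reinterpret_cast<PFN_xrVoidFunction*>(&pfnGetFacialExpressions));
    xrGetInstanceProcAddr(instance, "xrDestroyFacialTrackerHTC",
                          reinterpret_cast<PFN_xrVoidFunction*>(&pfnDestroyFacialTracker));

    // Create a tracker for the eye expressions (use *_LIP_DEFAULT_HTC for the lips).
    XrFacialTrackerCreateInfoHTC createInfo{XR_TYPE_FACIAL_TRACKER_CREATE_INFO_HTC};
    createInfo.facialTrackingType = XR_FACIAL_TRACKING_TYPE_EYE_DEFAULT_HTC;
    XrFacialTrackerHTC eyeTracker = XR_NULL_HANDLE;
    pfnCreateFacialTracker(session, &createInfo, &eyeTracker);

    // The application provides the output array for the weightings.
    std::array<float, XR_FACIAL_EXPRESSION_EYE_COUNT_HTC> weights{};
    XrFacialExpressionsHTC expressions{XR_TYPE_FACIAL_EXPRESSIONS_HTC};
    expressions.expressionCount = static_cast<uint32_t>(weights.size());
    expressions.expressionWeightings = weights.data();
    pfnGetFacialExpressions(eyeTracker, &expressions);

    // ... use "weights" only if expressions.isActive is XR_TRUE ...

    pfnDestroyFacialTracker(eyeTracker);
}
```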
I also added the gaze extension with https://github.com/ami-iit/yarp-device-openxrheadset/pull/45/commits/d283165887e8aab4d708d981176d96afd9fc7d2e and https://github.com/ami-iit/yarp-device-openxrheadset/pull/45/commits/ebf35b52ef63a2fa80bd9139dbf2714a264b4e8c
Related PR: https://github.com/ami-iit/yarp-device-openxrheadset/pull/45
I still need to test the whole thing.
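
For reference, the gaze input of XR_EXT_eye_gaze_interaction is a pose action bound to /user/eyes_ext/input/gaze_ext/pose. A rough sketch of the binding and of the action space creation (instance, session and a pose action are assumed to already exist; error handling is omitted):

```cpp
#include <openxr/openxr.h>

// Sketch: bind a pose action to the eye gaze input and create the space used
// to locate it. "gazeAction" is assumed to be an XR_ACTION_TYPE_POSE_INPUT
// action and the extension to be enabled at instance creation.
XrSpace createGazeSpace(XrInstance instance, XrSession session, XrAction gazeAction)
{
    XrPath profilePath = XR_NULL_PATH, gazePosePath = XR_NULL_PATH;
    xrStringToPath(instance, "/interaction_profiles/ext/eye_gaze_interaction", &profilePath);
    xrStringToPath(instance, "/user/eyes_ext/input/gaze_ext/pose", &gazePosePath);

    XrActionSuggestedBinding binding{gazeAction, gazePosePath};

    XrInteractionProfileSuggestedBinding suggested{XR_TYPE_INTERACTION_PROFILE_SUGGESTED_BINDING};
    suggested.interactionProfile = profilePath;
    suggested.countSuggestedBindings = 1;
    suggested.suggestedBindings = &binding;
    xrSuggestInteractionProfileBindings(instance, &suggested);

    XrActionSpaceCreateInfo spaceInfo{XR_TYPE_ACTION_SPACE_CREATE_INFO};
    spaceInfo.action = gazeAction;
    spaceInfo.poseInActionSpace.orientation.w = 1.0f; // identity pose
    XrSpace gazeSpace = XR_NULL_HANDLE;
    xrCreateActionSpace(session, &spaceInfo, &gazeSpace);
    return gazeSpace;
}
```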
Today I started testing it. Overall, the results are not satisfactory.
- I am not able to get the eye expressions. This code failed with `VALIDATION_FAILURE` and I am not able to find the root cause.
It was my fault. The fix is in https://github.com/ami-iit/yarp-device-openxrheadset/commit/9b3d2b3bf171f7476e27ea8bbb4f2e09bf7c28d2. Basically, I was using the OpenXR handles in the wrong way.
- The VR performance seems completely degraded.
I tried again today and it did not seem so problematic. Probably a reboot was enough.
- It seems that I am not able to get the fixation point, but only the gaze direction.
I moved to publishing the gaze direction in https://github.com/ami-iit/yarp-device-openxrheadset/commit/5cd1ac1ebb47c86905d539c6a7c58f3c00b86ebd
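
Indeed, XR_EXT_eye_gaze_interaction provides a gaze pose (origin plus orientation) rather than a fixation point, so the direction is obtained by rotating the -Z axis of the pose by its orientation. A rough sketch of the per-frame update (gazeSpace, referenceSpace and displayTime are assumed to come from the application's frame loop):

```cpp
#include <openxr/openxr.h>
#include <array>

// Sketch: locate the gaze action space and extract the gaze direction.
std::array<float, 3> gazeDirection(XrSpace gazeSpace, XrSpace referenceSpace, XrTime displayTime)
{
    XrSpaceLocation location{XR_TYPE_SPACE_LOCATION};
    xrLocateSpace(gazeSpace, referenceSpace, displayTime, &location);

    std::array<float, 3> dir{0.0f, 0.0f, -1.0f}; // fallback: looking straight ahead
    if (location.locationFlags & XR_SPACE_LOCATION_ORIENTATION_VALID_BIT)
    {
        // The gaze direction is the -Z axis of the gaze pose rotated by its orientation.
        const XrQuaternionf& q = location.pose.orientation;
        dir = {-2.0f * (q.x * q.z + q.w * q.y),
                2.0f * (q.w * q.x - q.y * q.z),
                2.0f * (q.x * q.x + q.y * q.y) - 1.0f};
    }
    return dir;
}
```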
Right now, the SRanipalModule needs to run on the operator side since it needs to access the data coming from the headset. At the same time, it needs to open the head remote control board to control the gaze. In case of a delayed network, connecting to the control board could be expensive. Hence, it would be better to separate the part that connects to the headset from the part that connects to the robot.
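
A possible direction for that split (only a sketch, with made-up port names): the headset-side process streams the gaze data on a YARP port, while a robot-side module opens the head remote control board and consumes that stream, so that only a lightweight port connection crosses the slow network.

```cpp
#include <yarp/os/Network.h>
#include <yarp/os/BufferedPort.h>
#include <yarp/os/Bottle.h>
#include <yarp/os/Time.h>

// Headset-side sketch: publish the gaze direction on a port instead of
// opening the head control board directly.
int main()
{
    yarp::os::Network network;
    yarp::os::BufferedPort<yarp::os::Bottle> gazePort;
    gazePort.open("/SRanipalModule/gazeDirection:o"); // made-up port name

    while (true)
    {
        // Gaze direction as obtained from the headset (placeholder values here).
        double x = 0.0, y = 0.0, z = -1.0;

        yarp::os::Bottle& msg = gazePort.prepare();
        msg.clear();
        msg.addFloat64(x);
        msg.addFloat64(y);
        msg.addFloat64(z);
        gazePort.write();

        yarp::os::Time::delay(0.01); // ~100 Hz
    }
    return 0;
}
```

The robot-side counterpart would then read this port and command the head through the remote_controlboard, without the headset-side process ever opening the control board.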