S-Dafarra opened this issue 3 years ago
Using OpenXR would also drastically simplify the CI (at least for compilation, see https://github.com/robotology/yarp-device-ovrheadset/issues/2) and the distribution of binaries, as during compilation we only need to use and link https://github.com/KhronosGroup/OpenXR-SDK, which is licensed under Apache 2.0.
I guess the easiest first step is to modify the existing code, substituting the libovr calls with the equivalent OpenXR calls (so xrCreateInstance instead of ovr_Create, etc.). It would probably be easier to do this on an Oculus, to ensure that the behavior is the same independently of the backend (libovr or OpenXR).
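To make the correspondence concrete, here is a minimal side-by-side sketch of the initialization step. The LibOVR part reflects the calls the device already uses, the OpenXR part uses only functions from the standard loader, and everything else (application name, error handling) is illustrative:

```cpp
// LibOVR (current code, shown for comparison):
//   ovrSession session;
//   ovrGraphicsLuid luid;
//   ovr_Initialize(nullptr);
//   ovr_Create(&session, &luid);
//   ...
//   ovr_Destroy(session);
//   ovr_Shutdown();

// OpenXR equivalent of the creation step:
#include <openxr/openxr.h>
#include <cstring>

XrInstance createOpenXrInstance()
{
    XrInstanceCreateInfo createInfo{XR_TYPE_INSTANCE_CREATE_INFO};
    std::strncpy(createInfo.applicationInfo.applicationName,
                 "yarp-device-ovrheadset", XR_MAX_APPLICATION_NAME_SIZE - 1);
    createInfo.applicationInfo.apiVersion = XR_CURRENT_API_VERSION;

    XrInstance instance{XR_NULL_HANDLE};
    if (xrCreateInstance(&createInfo, &instance) != XR_SUCCESS) {
        return XR_NULL_HANDLE; // no runtime available, or the request was rejected
    }
    return instance; // to be released later with xrDestroyInstance(instance)
}
```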
We have been discussing the adoption of OpenXR on a few occasions, but this probably means rewriting the whole device from scratch, or at least most of it. Also, at the time (~1/1.5 years ago) it wasn't really clear whether the devices we had in the lab supported the library. The situation is probably different now, but I haven't done any research recently.
Anyway, if we are talking about rewriting the device, I think it is worth considering extracting some common interfaces, and using tf instead of the rotations, etc., so that it will be easy to switch between the old and new devices. robotology/yarp#2516 is strictly related to this, since it introduces a way to pass a frame id with the envelope (see the sketch below).
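As a rough sketch of the envelope idea, using YARP's existing envelope API (not the actual convention from robotology/yarp#2516, which defines the real format): the port name, the frame id, and the pose layout below are invented for illustration:

```cpp
#include <yarp/os/Network.h>
#include <yarp/os/Port.h>
#include <yarp/os/Bottle.h>

int main()
{
    yarp::os::Network yarp;

    yarp::os::Port port;
    port.open("/headset/pose:o"); // hypothetical port name

    // Pose payload: position + quaternion; layout invented for this sketch.
    yarp::os::Bottle pose;
    for (double v : {0.0, 0.0, 1.5, 0.0, 0.0, 0.0, 1.0}) {
        pose.addFloat64(v);
    }

    // Attach the frame id as the envelope, so that the consumer knows in
    // which frame the pose is expressed.
    yarp::os::Bottle envelope;
    envelope.addString("headset_frame"); // hypothetical frame id
    port.setEnvelope(envelope);
    port.write(pose);

    port.close();
    return 0;
}
```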
We have been discussing the adoption of OpenXR on a few occasions, but this probably means rewriting the whole device from scratch, or at least most of it.
I am not sure about this. By looking a bit at the code, I think the only files that actually use ovr_ calls are the following (besides headers that pass around ovrSession objects, but those can be made generic by just defining a typedef or a wrapper class covering both ovrSession and XrInstance; fortunately we don't install headers, so it is ok to change the ABI of the classes for each device):
So I think that exploring just re-writing these files may be worthwhile, and if this turns out to be too difficult we can always fall back to a full rewrite. A sketch of the session typedef idea is given below.
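For illustration, here is a minimal sketch of the typedef idea; the macro and alias names are invented, and in practice the switch would presumably be a CMake option of the device:

```cpp
// Sketch: a single alias chosen at compile time, so headers can keep passing
// one "session" type around regardless of the backend in use.
#if defined(YARP_OVRHEADSET_USE_OPENXR) // hypothetical build option
  #include <openxr/openxr.h>
  using HeadsetSessionHandle = XrInstance;
#else
  #include <OVR_CAPI.h>
  using HeadsetSessionHandle = ovrSession;
#endif

// Headers would then declare, e.g.:
//   void processTracking(HeadsetSessionHandle session);
// and only the .cpp files need to know which API is behind the alias.
```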
Anyway, if we are talking about rewriting the device, I think it is worth considering extracting some common interfaces, and using tf instead of the rotations, etc., so that it will be easy to switch between the old and new devices. robotology/yarp#2516 is strictly related to this, since it introduces a way to pass a frame id with the envelope
Could it make sense to open a separate issue to track that? While, as you pointed out, these aspects are related, I think they are also doable without any OpenXR migration, so tracking them separately could be easier.
Also, at the time (~1/1.5 years ago) it wasn't really clear whether the devices we had in the lab supported the library. The situation is probably different now, but I haven't done any research recently.
Unless I am getting something wrong in the docs, I think that both Oculus Rift and Oculus Quest are supported, see:
Since July 2020 https://developer.oculus.com/blog/openxr-for-oculus/?locale=it_IT
I tried, and it turns out that the monado runtime actually provides an OpenXR runtime that works fine even without any physical VR device, simply rendering the two images in a window.
Here are the instructions for running the xrgears demo on Ubuntu 20.04 (real thing, not WSL/WSL2):
# Install monado packages with monado ppa ( https://monado.freedesktop.org/packages-ubuntu.html )
sudo add-apt-repository ppa:monado-xr/monado
sudo apt-get update
sudo apt install libopenxr1-monado
# Install compilation dependencies to compile xrgears
sudo apt-get install build-essential meson libvulkan-dev libopenxr-dev libvulkan1 libopenxr-loader1 xxd libglm-dev glslang-tools
# Compile xrgears
git clone https://gitlab.freedesktop.org/monado/demos/xrgears
cd xrgears
meson build
ninja -C build
Then, in one terminal run the monado-service:
monado-service
and in the other run the xrgears demo:
cd xrgears
./build/src/xrgears
This is what you will obtain: [screenshot of the xrgears demo window]
I don't know if there is any way to emulate the controllers, but even just this is convenient to kick off the development of an OpenXR-based device when no VR hardware is available.
Another example, perhaps more similar to this YARP device as it uses OpenGL and CMake instead of Vulkan and Meson, is https://gitlab.freedesktop.org/monado/demos/openxr-simple-example (assuming monado-service is already running):
sudo apt-get install cmake libsdl2-dev
git clone https://gitlab.freedesktop.org/monado/demos/openxr-simple-example
cd openxr-simple-example
mkdir build
cd build
cmake -DCMAKE_BUILD_TYPE=Release ..
make
./openxr-example
that results in: [screenshot of the openxr-simple-example window]
However, note that to run this example I had to use the Nvidia driver, as with the Intel driver ./openxr-example was failing with this error:
Using OpenGL version: 4.6 (Core Profile) Mesa 20.2.6
Using OpenGL Renderer: Mesa Intel(R) HD Graphics 630 (KBL GT2)
ERROR [client_gl_xlib_compositor_create] client_gl_xlib_compositor_create - Required OpenGL extension GL_EXT_memory_object not available
XR_ERROR_INITIALIZATION_FAILED in xrCreateSession: Failed to create an xlib client compositor
Failed to create session [XR_ERROR_INITIALIZATION_FAILED]
However, note that to run this example I had to use the Nvidia driver, as with the Intel driver ./openxr-example was failing with this error:
Thanks @traversaro for the heads-up, pretty useful. I ran into similar issues, and I had to select "NVIDIA (Performance Mode)" in the "PRIME Profiles" section of the "NVIDIA X Server Settings".
Also, I was having issues in restarting the monado-service. In particular, I had the following:
ERROR [create_listen_socket] Could not bind socket to path /tmp/monado_comp_ipc: is the service running already?
ERROR [create_listen_socket] Or, is the systemd unit monado.socket or monado-dev.socket active?
Apparently, the clean way to close the monado-service is to press Enter in its terminal, rather than CTRL+C. See https://monado.freedesktop.org/getting-started.html#running-openxr-applications
Thanks, I also experienced similar issues but I did not investigate them (and restarting proved to be a useful strategy for my limited needs).
Thanks! I had the same issue. I just did not realize that I needed to delete /tmp/monado_comp_ipc before starting monado-service again. Now it works!
Right now we are pretty limited in the set of VR headsets we support, basically only Oculus devices. In particular, we strongly rely on the "old" Oculus setup, where a set of external receivers is used to triangulate the position of the hands and of the headset. Recently, Oculus has moved to a new "inside-out" tracking design, and the Oculus model we have is out of stock. With inside-out tracking, the hand position is tracked by means of a set of cameras installed in the headset. The problem with this is that the hands may end up outside the field of view of the cameras. This is not a problem for VR applications, as the user is interested only in what is within their field of view. On the other hand, it is more of a problem for telerobotics, since the references sent to the robot should always be meaningful.
In addition, for the ANA Avatar XPRIZE competition we have to push on the emotional engagement of the person interacting with the Avatar. For this reason, it would be nice to use headsets like https://enterprise.vive.com/us/product/vive-pro-eye-office/ that, for example, allow tracking the eye movements. It would also be nice to investigate how the immersion level changes when we retarget the fixation point of the operator onto the robot.
Recently, a new "standard", OpenXR (https://www.khronos.org/openxr/), has been developed to try to unify the use of such devices, and it is backed by the main actors in the VR field.
OpenXR consists of an interface (https://github.com/KhronosGroup/OpenXR-SDK) that can then be implemented by different "runtimes". Basically, every headset has its own supported "runtime", but at least the interface is a single one. This allows writing the application once and then relying on these runtimes to run the same application on different hardware (see the sketch below).
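As a rough illustration of this "one interface, many runtimes" idea, the following sketch asks the loader which runtime is behind the API; the calls are from the standard OpenXR loader, while the application name is invented for the example. The same binary would report, e.g., the Oculus, SteamVR, or Monado runtime depending on what is installed:

```cpp
#include <openxr/openxr.h>
#include <cstdio>
#include <cstring>

int main()
{
    XrInstanceCreateInfo createInfo{XR_TYPE_INSTANCE_CREATE_INFO};
    std::strncpy(createInfo.applicationInfo.applicationName,
                 "runtime-probe", XR_MAX_APPLICATION_NAME_SIZE - 1); // made-up name
    createInfo.applicationInfo.apiVersion = XR_CURRENT_API_VERSION;

    XrInstance instance{XR_NULL_HANDLE};
    if (xrCreateInstance(&createInfo, &instance) != XR_SUCCESS) {
        std::fprintf(stderr, "No OpenXR runtime available\n");
        return 1;
    }

    // The runtime name is the only backend-specific bit the application sees.
    XrInstanceProperties properties{XR_TYPE_INSTANCE_PROPERTIES};
    xrGetInstanceProperties(instance, &properties);
    std::printf("Active runtime: %s\n", properties.runtimeName);

    xrDestroyInstance(instance);
    return 0;
}
```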
Oculus has announced its support for the OpenXR project: https://developer.oculus.com/blog/openxr-for-oculus/?locale=it_IT
For HTC devices instead, there are two possible runtimes:
- SteamVR (https://www.steamvr.com/it/), which is closed-source but largely supports OpenXR (https://store.steampowered.com/news/app/250820/view/3044967019267211914)
- Monado (https://gitlab.freedesktop.org/monado/monado and https://monado.freedesktop.org/), which is open source and supports Linux, but probably requires a bit more work. For example, it supports the external tracking devices through an external library: https://github.com/cntools/libsurvive

It would be nice to start using this new framework, allowing the use of different headsets.
cc @DanielePucci @traversaro @nunoguedelha