KhronosGroup / OpenXR-SDK-Source

Sources for OpenXR loader, basic API layers, and example code.
https://khronos.org/openxr
Apache License 2.0

OpenXR doesn't support acceleration and angularAcceleration #504

Open rmaroy opened 6 days ago

rmaroy commented 6 days ago

Hi,

Hardware providers rely more and more on OpenXR. However, OpenXR does not support acceleration and angular acceleration inputs (as raw values, not derived from velocity), which are, like position and orientation, key inputs of XR devices and are available on many headset products on the market. Linear and angular accelerations are the only vectors still available when the controllers are not visible to the headset. As such, they allow the movement to be estimated when the controllers disappear momentarily.

Would it be possible to add a feature to get the instantaneous acceleration and angular acceleration from the devices? This lack is a major issue since, at present, there is no other way to get this information.

Best Regards

Unreal Engine 5.4.4: right-hand linear and angular accelerations are (0, 0, 0). Unity 2021.3.0f1: right-controller linear and angular accelerations always return (0, 0, 0).

rpavlik commented 6 days ago

The runtime is expected to use acceleration data itself to perform tracking for the app. We do not want to delegate dead-reckoning tracking to the application because there are far too many ways to get it subtly wrong, among other reasons. So it is a conscious decision not to expose it to the application. It is definitely assumed to be used by the runtime, however. (In such a situation you'd probably expect POSITION_VALID (but not TRACKED), and ORIENTATION_TRACKED as well as ORIENTATION_VALID.)
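For readers less familiar with how a runtime signals this degraded-tracking state, the flag combination described above can be read off the `locationFlags` bitmask that `xrLocateSpace` returns. Below is a minimal Python sketch; the bit values mirror the `XrSpaceLocationFlagBits` constants in `openxr.h`, while the helper function and its return strings are illustrative, not part of any API:

```python
# Space-location flag bits, matching XrSpaceLocationFlagBits in openxr.h.
XR_SPACE_LOCATION_ORIENTATION_VALID_BIT = 0x00000001
XR_SPACE_LOCATION_POSITION_VALID_BIT = 0x00000002
XR_SPACE_LOCATION_ORIENTATION_TRACKED_BIT = 0x00000004
XR_SPACE_LOCATION_POSITION_TRACKED_BIT = 0x00000008


def describe_tracking(flags: int) -> str:
    """Classify a locationFlags bitmask the way an app might after xrLocateSpace."""
    pos_valid = bool(flags & XR_SPACE_LOCATION_POSITION_VALID_BIT)
    pos_tracked = bool(flags & XR_SPACE_LOCATION_POSITION_TRACKED_BIT)
    ori_valid = bool(flags & XR_SPACE_LOCATION_ORIENTATION_VALID_BIT)
    ori_tracked = bool(flags & XR_SPACE_LOCATION_ORIENTATION_TRACKED_BIT)
    if pos_tracked and ori_tracked:
        return "fully tracked"
    if pos_valid and not pos_tracked and ori_valid and ori_tracked:
        # The combination described above: the runtime is propagating
        # position from IMU data, so the pose is usable but degraded.
        return "position inferred by runtime"
    if ori_valid and not pos_valid:
        return "orientation only (3DoF)"
    return "not locatable"
```

For example, `describe_tracking(0x0F)` reports a fully tracked pose, while `describe_tracking(0x07)` (position valid but not tracked, orientation valid and tracked) is the dead-reckoned state discussed in this thread.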

Do you have any other use cases besides handling degraded tracking quality?

rpavlik-bot commented 3 days ago

An issue (number 2373) has been filed to correspond to this issue in the internal Khronos GitLab (Khronos members only: KHR:openxr/openxr#2373 ), to facilitate working group processes.

This GitHub issue will continue to be the main site of discussion.

rmaroy commented 2 days ago

Hi,

Please let me explain why linear and angular accelerations are, in my humble opinion, precious data:

1) First of all, headsets without external sensors for controller detection and tracking are the most popular products. For these products, as soon as a controller leaves the field of view of the tracking cameras, its position and velocity are lost; position only becomes available again when the controller is re-detected, and velocity one frame after that. Consequently, all gesture information in between is lost, and the first velocity value after re-detection is at best (0, 0, 0) and in the worst case completely "random" (which is the case on the Meta Quest 1, 2 & 3).

The only information that remains valid while the controllers are out of view is their acceleration (linear and angular). It allows the whole gesture to be reconstructed over the interval during which the controllers are undetected. This reconstruction is crucial for the following types of games (a non-exhaustive list): a) FPS games, when you want to throw a grenade; b) ball-sport games; c) games requiring recognition of large gestures; etc.

Part of this reconstruction is performed by the built-in software (or OpenXR), but it stops quickly and adds an extra delay to the estimated velocity when the controller becomes visible again.
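To make concrete both why applications want raw acceleration during a tracking gap and why the working group worries about applications "getting it subtly wrong", here is a hypothetical Python sketch of the dead-reckoning this thread is about: double-integrating acceleration samples to propagate position through an occlusion. Even a tiny constant bias in the accelerometer makes the position error grow quadratically with time, which is one reason runtimes prefer to do this themselves:

```python
def dead_reckon(p0, v0, accel_samples, dt):
    """Propagate a 1-D position and velocity by Euler-integrating
    a sequence of acceleration samples taken at interval dt."""
    p, v = p0, v0
    for a in accel_samples:
        v += a * dt  # integrate acceleration into velocity
        p += v * dt  # integrate velocity into position
    return p, v


# A controller at rest: true acceleration is zero, but the IMU reports
# a small constant bias of 0.01 m/s^2 at 100 Hz for one second.
dt, bias = 0.01, 0.01
p, v = dead_reckon(0.0, 0.0, [bias] * 100, dt)
# After only 1 s the dead-reckoned position has drifted by roughly
# 0.5 * bias * t^2 (about 5 mm) and the velocity by bias * t (10 mm/s);
# these errors compound rapidly over longer occlusions.
```

This drift is exactly what a runtime's sensor-fusion pipeline is built to bound, and what an application re-implementing the integration would have to fight on its own.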

2) For addressing motion sickness, two kinds of values matter: the velocity of the avatar relative to the scenery, which indicates the discrepancy between the eyes and the inner ear, and the headset's instantaneous linear and angular acceleration, which reflects the perturbation of the inner ear.
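As a toy illustration of the first quantity mentioned above, the visual-vestibular discrepancy could be estimated as the difference between the avatar's velocity relative to the scenery and the user's physically sensed head velocity. The function below and its name are hypothetical, not part of OpenXR or of any existing tool:

```python
import math


def vection_mismatch(avatar_velocity, head_velocity):
    """Magnitude of the difference between visually implied motion and
    physically sensed head motion (a toy comfort heuristic)."""
    return math.sqrt(sum((a - h) ** 2
                         for a, h in zip(avatar_velocity, head_velocity)))


# Smooth locomotion: the avatar glides at 2 m/s while the head is
# physically still, a classic trigger for simulator sickness.
mismatch = vection_mismatch((2.0, 0.0, 0.0), (0.0, 0.0, 0.0))
```

A comfort system might fade the periphery or switch to teleportation when such a mismatch stays high, which is why the headset's own motion data matters here.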

Therefore, acceleration is invaluable for me. Several of my projects are blocked by the "disappearance" of accelerations, since they only work on Unity up to 2020 and Unreal Engine up to 4.27. This represents years of work. I thought the "loss" of acceleration was a bug to be corrected. Now that I understand it was a design choice, I feel like I am at a dead end.

I deeply and sincerely hope you will consider including acceleration in OpenXR. Would it be possible to have news of this issue when you have reached a decision?

Best Regards, Renaud Maroy

Note: I am also a scientist in applied mathematics, working on tools for game developers that would allow them to create games that are impossible or difficult to develop right now.

rmaroy commented 2 days ago

"We do not want to delegate dead-reckoning tracking to the application because there are far too many ways to get it subtly wrong, among other reasons. So it is a conscious decision to not expose it to the application"

I understand your concern and your decision. Would it be possible not to expose accelerations by default, but to allow them to be retrieved when specifically needed, as in my case? I would be instantly unblocked.

rmaroy commented 2 days ago

My other use cases are:

1) In my sport game, I cannot throw a ball unless the controller is detected by the cameras for the whole gesture (and it rarely is), so the game I am developing is considerably degraded. With acceleration it works admirably, with a gesture as natural as in real life. Without acceleration, on more than one throw out of two the ball either falls at my feet or flies off in a random direction.
2) The same problem occurs with grenade throws in an FPS game.
3) I cannot correctly identify a "cut" gesture in my sword-combat game (I am an expert in modern-era fencing) so that the opponent can react efficiently (with a minimum of anticipation).
4) I cannot efficiently detect whether the user is experiencing motion sickness when they turn their head while moving through the virtual environment.