The Gamepad API used by WebVR exposes linear and angular velocity and acceleration (via `GamepadPose`), which are not currently included in the Gamepad integration in the WebXR spec. Issue #185 removed these velocities and accelerations for headsets, but no suitable workaround exists to get these values from the gamepad.
Including these values would be critical for any developers who want to accurately implement any form of throwing or other gamepad interaction, even without button presses.
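For reference, a minimal sketch of the WebVR-era access pattern this refers to, assuming a browser that implements the Gamepad Extensions' `GamepadPose`:

```js
// WebVR-era pattern: each tracked gamepad exposes a GamepadPose with
// velocity and acceleration fields (each a Float32Array [x, y, z],
// or null when the device doesn't report that quantity).
for (const gamepad of navigator.getGamepads()) {
  if (gamepad && gamepad.pose) {
    const { linearVelocity, angularVelocity,
            linearAcceleration, angularAcceleration } = gamepad.pose;
    // Use these for throwing physics; no WebXR equivalent exists today.
  }
}
```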
Hi @alcooper91!
If WebXR was to add back linear and angular velocity for input sources, but omit linear and angular acceleration, would your throwing and other scenarios be covered? In OpenXR, we've been finding that acceleration can be fairly noisy and differ across platforms, and most real-world physics scenarios we know of seem to be covered by velocity.
FWIW, I'm not currently aware of a specific use case that needs acceleration.
I haven't actually tried implementing a throwing mechanism myself, but I found this discussion while researching the issue, and I trust doc_ok on this:
https://www.reddit.com/r/oculus/comments/87kzs5/why_cant_games_get_throwing_right_or_at_least/dwdse80/
https://www.reddit.com/r/oculus/comments/87kzs5/why_cant_games_get_throwing_right_or_at_least/dwe2lnl/
According to that, you'd need the controller's linear and angular velocity at the time of the throw, plus the offset from the controller origin to the thrown object's center of mass, to calculate the thrown object's flight path. (Intuitively, this approach makes sense: you should be able to throw an object with just a wrist rotation, even when the controller's origin stays in one place.)
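To make that concrete, here's a small sketch of the rigid-body calculation doc_ok describes (my own illustration, not code from the linked thread): the object's release velocity is the controller's linear velocity plus the cross product of its angular velocity with the offset to the object's center of mass.

```js
// Sketch: release velocity of an object held rigidly by the controller.
// v: controller linear velocity, w: controller angular velocity,
// r: offset from controller origin to the object's center of mass.
// All are plain {x, y, z} vectors in the same reference space.
function cross(a, b) {
  return {
    x: a.y * b.z - a.z * b.y,
    y: a.z * b.x - a.x * b.z,
    z: a.x * b.y - a.y * b.x,
  };
}

function releaseVelocity(v, w, r) {
  const spin = cross(w, r);  // contribution from wrist rotation
  return { x: v.x + spin.x, y: v.y + spin.y, z: v.z + spin.z };
}
```

The object also inherits the controller's angular velocity as its in-flight tumble, which is why a pure wrist flick can still produce a convincing throw.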
From a physics point of view, linear and angular momentum are real concepts based on velocities, but there's no such thing as "accelerational momentum". It could conceivably be useful to have acceleration data for extrapolation or prediction, but that would need to be done very carefully if the data is noisy, and most likely it's better to let the platform's low-level pose calculations take that into account where appropriate.
(Please take all this with a grain of salt since I'm not an expert on this, and speak up if we're missing something here.)
Totally fine with me! Mostly just included acceleration because it looks like WebVR exposed it, but if all we need to do throwing right is velocity, then we can definitely start with just that, and re-visit acceleration if we find that that's a gap.
FWIW I definitely feel like I've seen a fair bit of noise in the acceleration bits on the WebVR pages as well.
In developing apps for Windows Mixed Reality, we've found that a big reason apps get throwing wrong is sampling the velocity from the platform at the wrong time.
Apps often do most of their spatial reasoning at the predicted future photon time for the current frame, so the rendered view will line up with where the user's head is predicted to be. However, if the user is throwing a ball, you care about where the ball was when they actually released the trigger, which is at some time in the past. If the user was throwing with a downward motion and the app uses a future photon time rather than the past release time, the object will likely be thrown downward into the ground at the wrong speed.
There's a bunch of best practices for throwing in the Windows MR docs. These best practices map into WebXR as well (see the sketch after this list):

* To reason about the object each frame while it's still in the user's hand, call `XRFrame.getPose` on the `XRFrame` returned by the current `XRSession.requestAnimationFrame` callback.
* To reason about the throw at the moment the user releases the object, call `XRFrame.getPose` on the `XRFrame` from the `frame` attribute in the `XRInputSourceEvent` payload for that input event.
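As a concrete sketch of the second bullet (assuming `refSpace` is an `XRReferenceSpace` obtained earlier, and `launchThrownObject` is a hypothetical app function):

```js
// Sample the grip pose at the historical time of the release event,
// rather than at the current frame's predicted photon time.
session.addEventListener('selectend', (event) => {
  const gripSpace = event.inputSource.gripSpace;
  if (!gripSpace) return;
  // event.frame corresponds to when the trigger was actually released,
  // which may be slightly in the past relative to the render frame.
  const pose = event.frame.getPose(gripSpace, refSpace);
  if (pose) {
    launchThrownObject(pose);  // hypothetical app function
  }
});
```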
Totally valid, I think we'd expect apps to sample the velocity from the historical data when they get that input source release event (and perhaps a note could be added to the spec indicating as much). However, there's currently no notion of a time delta exposed on either the `XRFrame` or the `XRInputSourceEvent` objects, which apps would need in order to calculate those trajectories themselves, and it feels wrong to rely on the time difference between events or on the framerate.
In discussion with klausw on Slack, I was wanting velocities more for compatibility with older getGamepads code than for accuracy. Velocity is easy enough to compute as long as you have accurate time information from the poses. Then you can compute acceleration too if you need it, decide how much to smooth out the noise, and decide how much backward/forward looking is needed for a particular situation. However, as others have pointed out above, that crucial timing detail seems to be missing.
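A minimal sketch of that differencing approach (assuming `session`, `inputSource`, and `refSpace` are already in scope); note that the `requestAnimationFrame` timestamp is a per-frame time rather than a per-pose sample time, which is exactly the missing detail:

```js
// Derive linear velocity by finite differences across frames.
let prev = null;  // { t, x, y, z } from the previous frame

function onXRFrame(t, frame) {
  const gripSpace = inputSource.gripSpace;
  const pose = gripSpace ? frame.getPose(gripSpace, refSpace) : null;
  if (pose) {
    const p = pose.transform.position;
    if (prev) {
      const dt = (t - prev.t) / 1000;  // rAF time is in milliseconds
      const velocity = {
        x: (p.x - prev.x) / dt,
        y: (p.y - prev.y) / dt,
        z: (p.z - prev.z) / dt,
      };
      // Raw differences are noisy; apps typically smooth over frames.
    }
    prev = { t, x: p.x, y: p.y, z: p.z };
  }
  session.requestAnimationFrame(onXRFrame);
}
session.requestAnimationFrame(onXRFrame);
```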
I think it would be much more important to put a timestamp on the XRPose than to provide the velocities. A timestamp on the frame would help, but might often be inappropriate, especially if there were different kinds of input devices with different polling rates and latencies.
However, that assumes that position is the basic information captured by the device. Many devices naturally capture acceleration: velocities and positions must be synthesized from it, rather than the other way around. It would be a shame to lose out on the capabilities of such devices.
This was discussed at TPAC. While there's general agreement that velocity data is a good thing to expose (not acceleration, for reasons that were thoroughly explored by the OpenXR group), there's not a sense that it absolutely needs to get into the spec immediately, especially given that the spec language around the math involved will need to be fairly precise. As such, this will remain in the future milestone.
Now that we've shipped WebXR for a while, maybe it's time to revisit this.
We've seen some use cases where it would be really handy to have the velocity of the controller provided by the runtime. @thetuvix, is your comment about acceleration being noisy still true?
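For concreteness, a sketch of what consuming runtime-provided velocity might look like, assuming hypothetical nullable `linearVelocity` and `angularVelocity` attributes on `XRPose` (roughly the shape discussed above), with `refSpace` and `throwObject` as stand-ins:

```js
session.addEventListener('selectend', (event) => {
  if (!event.inputSource.gripSpace) return;
  const pose = event.frame.getPose(event.inputSource.gripSpace, refSpace);
  if (pose && pose.linearVelocity && pose.angularVelocity) {
    // Runtime-supplied velocities, sampled at the event's historical time.
    throwObject(pose.transform, pose.linearVelocity, pose.angularVelocity);
  }
  // Otherwise fall back to app-side differencing, as sketched earlier.
});
```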
/agenda
Agreed that this is probably a decent point to revisit this topic!