w3c / geolocation-sensor

Geolocation Sensor
https://www.w3.org/TR/geolocation-sensor/

Geospatial positioning use cases and requirements for WebXR #12

Closed anssiko closed 5 years ago

anssiko commented 6 years ago

This issue is to discuss WebXR-inspired geospatial positioning use cases and requirements. See https://github.com/immersive-web/ideas/issues/5

Looping in @blairmacintyre for input.

blairmacintyre commented 6 years ago

Thanks, @anssiko

My interest here is in supporting geospatial AR. Right now, a user's location is obtained via the Geolocation API, and their orientation can be (crudely) obtained via the deviceorientation API.
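To make the status quo concrete, here is a rough sketch of what a page has to do today with those two separate APIs. The `alpha`-to-heading conversion shown is a simplification that assumes an absolute, earth-referenced `alpha`; real devices differ in exactly this, which is part of the problem being described.

```javascript
// Rough sketch of today's approach: convert a DeviceOrientationEvent's
// `alpha` angle (degrees, 0-360, counter-clockwise) into a compass
// heading (degrees clockwise from north). Only valid when `alpha` is
// absolute and earth-referenced, which is not guaranteed everywhere.
function alphaToHeading(alpha) {
  return (360 - alpha) % 360;
}

// In a browser, the two values come from two unrelated APIs:
// navigator.geolocation.watchPosition(pos => { /* lat/lon here */ });
// window.addEventListener('deviceorientationabsolute',
//   e => { const heading = alphaToHeading(e.alpha); });
```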

I'm pretty sure (?) that the deviceorientation API is constantly under threat of being phased out, and one of the hopes (from what I've heard) is that the sensor APIs might provide similar, or better, functionality.

Regarding geolocation for AR, a key requirement is that we can get both position AND orientation in the world, so it would be great if this API (or set of APIs) provided both. Having to do the sensor fusion in the web page has always resulted in crude results, especially on some Android phones we've tried it on. The poor responsiveness and noise of the compass (in particular) has been problematic.
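As one hypothetical illustration of the in-page "crude fusion" mentioned above: with a noisy compass, a page typically ends up hand-rolling something like an exponential low-pass filter over raw heading events. The function name and smoothing factor here are illustrative, not any API.

```javascript
// Hypothetical sketch of the smoothing a page writes itself when the
// compass is noisy: an exponential low-pass filter on the heading that
// handles the 359° -> 0° wraparound. Doing this over raw JS events is
// the kind of crude, laggy fusion described above.
function smoothHeading(prev, next, alpha = 0.1) {
  // Shortest signed angular difference, in (-180, 180].
  const diff = ((next - prev + 540) % 360) - 180;
  return (prev + alpha * diff + 360) % 360;
}
```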

In contrast, when I use something like ARKit or ARCore, the orientation I get is rock solid. ARKit even lets you (the programmer) choose to align the coordinate system with EUS if you want.

So, an ideal situation for me would be that these new APIs coordinate with the (also new and being worked on) WebXR APIs, and when possible, have the data align and leverage each other.

Consider this: if I had a geoposition with ~1-2m accuracy, but was using ARKit, I could estimate the geolocation of ARKit's origin within 1-2m, AND I could get continuous updates to my geoposition that were smooth and precise. There might even be opportunities for further fusion internally to make the exposed data more useful.
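The arithmetic behind that estimate can be sketched as follows, assuming the device's position in the AR session's local frame is expressed as meters east/north of the session origin. The function name, the east/north convention, and the equirectangular small-offset approximation are all assumptions for illustration, not part of any API.

```javascript
// Sketch of the fusion described above: given the device's geoposition
// and its position in the AR local frame (assumed here to be meters
// east/north of the session origin), estimate the origin's geoposition
// using a small-offset (equirectangular) approximation.
const EARTH_RADIUS_M = 6378137; // WGS84 equatorial radius

function estimateOriginGeo(deviceLat, deviceLon, eastM, northM) {
  const degPerMeterLat = 180 / (Math.PI * EARTH_RADIUS_M);
  const degPerMeterLon =
    degPerMeterLat / Math.cos(deviceLat * Math.PI / 180);
  // The origin sits at the device position minus the local offset.
  return {
    lat: deviceLat - northM * degPerMeterLat,
    lon: deviceLon - eastM * degPerMeterLon,
  };
}
```

The error of the origin estimate is then bounded by the geoposition accuracy plus the (much smaller) AR tracking drift, which is the point of the comment.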

anssiko commented 6 years ago

Thanks for your valuable input @blairmacintyre!

> Regarding geolocation for AR, a key requirement is that we can get both position AND orientation in the world, so it would be great if this API (or set of APIs) provided both.

The Generic Sensor-based APIs have been designed from day 1 to enable sensor fusion.

The GeolocationSensor inherits Sensor.timestamp, so timestamps across all the sensor APIs that extend the Sensor abstract base class defined in the Generic Sensor API (including the Orientation Sensor you'd like to fuse with) share the same time base and can be correlated.
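To illustrate why a shared time base matters (this is a hypothetical page-side helper, not a spec API): with all `Sensor.timestamp` values on one clock, a page can pair each orientation reading with its nearest-in-time geolocation reading directly.

```javascript
// Illustration only: with Sensor.timestamp on a common clock, pairing
// readings from two sensor streams is a simple nearest-timestamp match.
function nearestByTimestamp(readings, t) {
  let best = null;
  for (const r of readings) {
    if (best === null ||
        Math.abs(r.timestamp - t) < Math.abs(best.timestamp - t)) {
      best = r;
    }
  }
  return best;
}

// Pair each orientation sample with its closest geolocation sample.
function correlate(orientationReadings, geoReadings) {
  return orientationReadings.map(o => ({
    orientation: o,
    geo: nearestByTimestamp(geoReadings, o.timestamp),
  }));
}
```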

> Having to do the sensor fusion in the web page has always resulted in crude results, especially on some Android phones we've tried it on.

@kenchris has done Generic Sensor API fusion experimentation with encouraging results in https://github.com/intel/websensor-compass (demo).

We might do further similar experimentation with Geolocation Sensor and Orientation Sensor fusion when our Chromium implementation work commences, to figure out whether we can satisfy AR requirements.

@blairmacintyre, I'm aware you've also done quite a bit of experimentation with https://www.argonjs.io/, but I assume you were constrained by the APIs provided by the web engines and/or the underlying platform?

My hope is we can together move forward the low-level sensor APIs exposed to the web platform to allow powerful sensor fusion in JS for rapid prototyping and innovation.

blairmacintyre commented 6 years ago

This is great. I need to look more through all the sensor APIs to see what else is there, and how things like security and permissions are handled.

I'm also really interested to brainstorm how these APIs may be "coordinated" with the WebXR APIs. For example, it would be really nice if a page that has obtained permissions to use AbsoluteOrientationSensor and the WebXR Device API could have the values returned at exactly the same timestamps, and (probably more importantly) leverage the platform APIs being used for WebXR to give much better / more responsive reports from the orientation sensor.

Specifically, the VR/AR system APIs put a ton of effort into high-performance tracking, so that we can render the graphics as close to correct as possible and eliminate swim (post-rendering adjustments complete that puzzle, of course). If the browser's Orientation Sensor is only using the non-AR/VR sensors (accelerometer / gyro / magnetometer) and doing its own fusion, there is no way the estimates will match even if the timestamps do. But, if the AR/VR platform is active, it should be possible to use its orientation (potentially combined with these other sensors) to get a better estimate.

tomayac commented 5 years ago

This has moved over to https://github.com/immersive-web/geo-alignment. I'm closing this issue, https://github.com/w3c/geolocation-sensor/issues/12.