immersive-web / proposals

Initial proposals for future Immersive Web work (see README)

Proposal to Standardize Controller Interaction in VR for 2D Web Pages #14

Closed jonobr1 closed 5 years ago

jonobr1 commented 6 years ago

I initially posted this as a question over on WebXR. In testing 2D web pages on both Chrome Canary in Daydream and the Oculus Browser, I noticed that only the click event is available while not presenting the page in VR. This 2D Drawing Page has no way to respond to the controller's interaction because it only listens for mouse and touch events. Here's an example of me trying to draw something in Daydream:

(animated GIF: attempting to draw in Daydream with the controller)

I propose that browser vendors support higher-fidelity interactions with 2D web pages. It would broaden the kinds of engaging experiences worth checking out on the web in VR, and it would let web developers support the unique expression VR provides without having to worry about the complex design and technical issues that come with the third dimension. To get the ball rolling, I've come up with a couple of approaches that could help break this barrier:

  1. Spoof Existing Events: The browser vendor could synthesize pointer events (or their mouse / touch equivalents) from the controller, so pressing and drawing work on existing pages (a minimal sketch follows this list).
  2. New Events: The browser vendor could add new events for the web page to listen for, e.g. xr-pointer-enter and xr-pointer-leave.
  3. Something Else? Totally up for whatever you guys think would be best.
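
To make option 1 concrete, here is a minimal sketch of the kind of page-side code that would benefit: a drawing surface wired to Pointer Events, which already unify mouse, touch, and pen input. If a browser spoofed the controller as a pointer, a page like this would work without any changes. The `draw-canvas` id and the stroke logic are illustrative assumptions, not part of the proposal.

```ts
// Minimal sketch: a 2D drawing surface driven by Pointer Events.
// If the VR browser spoofs the controller as a pointer (option 1),
// this code picks up the controller with no page changes.
// `draw-canvas` is a hypothetical element id used for illustration.
const canvas = document.getElementById('draw-canvas') as HTMLCanvasElement;
const ctx = canvas.getContext('2d')!;
let drawing = false;

canvas.addEventListener('pointerdown', (e: PointerEvent) => {
  drawing = true;
  ctx.beginPath();
  ctx.moveTo(e.offsetX, e.offsetY);
});

canvas.addEventListener('pointermove', (e: PointerEvent) => {
  if (!drawing) return;
  ctx.lineTo(e.offsetX, e.offsetY);
  ctx.stroke();
});

canvas.addEventListener('pointerup', () => {
  drawing = false;
});
```

Option 2 would instead mean listening for new, XR-specific events like the suggested xr-pointer-enter / xr-pointer-leave, which don't exist in any browser today.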
bluemarvin commented 6 years ago

For Firefox Reality, we are currently spoofing a mouse, so hover events work in the page if they are supported.
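
As a rough illustration of what spoofing a mouse buys existing pages, a hover handler like the one below would react to the controller without any changes. The `.menu-item` selector and `highlighted` class are placeholders, not taken from the thread.

```ts
// Sketch: an ordinary hover handler that a spoofed mouse would drive.
// `.menu-item` and `highlighted` are hypothetical names for illustration.
document.querySelectorAll<HTMLElement>('.menu-item').forEach((item) => {
  item.addEventListener('mouseenter', () => item.classList.add('highlighted'));
  item.addEventListener('mouseleave', () => item.classList.remove('highlighted'));
});
```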

jonobr1 commented 6 years ago

CSS :hover effects work great on the Daydream / Oculus Browser too. I'll test the drawing demo in FF Reality this weekend. Sounds awesome!

TrevorFSmith commented 6 years ago

@jonobr1 How did your test go?

jonobr1 commented 6 years ago

Sorry for the delay on this @TrevorFSmith! I actually don't know where to get Firefox Reality. Is it publicly available? Maybe it's only for iPhone?

I didn't see it in the Google Play Store... But a small update: Oculus Browser now handles either mousemove or touchmove events, and it's really nice. I've uploaded a video here. It would be great to have some language somewhere telling browser vendors to implement mouse / touch handling in this way.

Super handy!

TrevorFSmith commented 6 years ago

@jonobr1 Firefox Reality is in pre-release development and so is not yet in app stores. If you care to side-load an APK onto one of the Android stand-alone headsets (Mirage Solo, Oculus Go, Vive Focus), you can get one of the nightly builds using the "build results" link at the bottom of the README.md.

In other exciting news, Google Chrome for Daydream is now available, so if you have a Mirage Solo you can also check out how they're handling mouse and touch events for 2D pages.

jonobr1 commented 6 years ago

I'll give it a try on my Go and I'll have to pick up a Mirage Solo! Thanks for the update.

jonobr1 commented 6 years ago

Slight update: I haven't tried the Mirage Solo, but I have tried Daydream with a Pixel. The behavior on Daydream differs from the Oculus Go in that drag events are only triggered while the controller's touchpad is firing. It's nice that it offers some functionality, but the UX is far more limiting than on the Oculus Go.

Many (already made) websites will be accessible in VR if browsers hook into the same mouse and touch event patterns the way the Oculus Go has implemented them.
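
For reference, this is roughly the event pattern being described: a page that wires the same drawing routine to both mouse and touch events, so it keeps working whether a VR browser maps the controller to mousemove or to touchmove. The element id and the dot-drawing logic are illustrative assumptions, not taken from the demo.

```ts
// Sketch: one drawing routine wired to both mouse and touch events, so the
// page responds whether the VR browser maps the controller to mouse or touch.
// `draw-canvas` is a hypothetical element id used for illustration.
const canvas = document.getElementById('draw-canvas') as HTMLCanvasElement;
const ctx = canvas.getContext('2d')!;

function drawAt(clientX: number, clientY: number): void {
  const rect = canvas.getBoundingClientRect();
  ctx.beginPath();
  ctx.arc(clientX - rect.left, clientY - rect.top, 2, 0, Math.PI * 2);
  ctx.fill();
}

canvas.addEventListener('mousemove', (e: MouseEvent) => {
  if (e.buttons !== 0) drawAt(e.clientX, e.clientY); // only while pressed
});

canvas.addEventListener('touchmove', (e: TouchEvent) => {
  e.preventDefault(); // keep the page from scrolling while drawing
  const touch = e.touches[0];
  drawAt(touch.clientX, touch.clientY);
});
```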

ryzngard commented 6 years ago

@jonobr1 have you tried how WinMR handles this? In Edge in VR, controller input should be handled as touch events. You can try it with the Mixed Reality Portal and simulated controllers / headset.

For everything (that I can think of) that isn't a mouse, touch tends to better represent how a user is interacting with the page. If a user is using an actual mouse, I think VR should just let the mouse behave as such. Even outside of Edge, 2D apps behave this way, since WinMR injects input at the platform layer rather than exposing new events. You can try this with 2D UWP apps, or even use the desktop slate and test an app that way. It would be interesting to see how this compares to other paradigms for what works and what doesn't.

jonobr1 commented 6 years ago

@ryzngard, that's awesome to hear! I haven't tried it, but I think you make a great point about using touch, because if you have more than one controller then touch events could handle both at the same time. If it's helpful, I can compile a list of drawing projects for other people to try out as well. Unfortunately, the chances that I'll be able to try this out on every type of device are low.
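
If a browser did expose each controller as its own touch point (an assumption; the thread doesn't confirm any browser does this), a page could track the strokes separately by touch identifier, roughly like this:

```ts
// Sketch: tracking multiple simultaneous touch points by identifier, so two
// controllers could each drive their own stroke *if* a browser mapped each
// controller to a distinct touch. That mapping is assumed, not confirmed here.
const canvas = document.getElementById('draw-canvas') as HTMLCanvasElement;
const ctx = canvas.getContext('2d')!;
const lastPoint = new Map<number, { x: number; y: number }>();

canvas.addEventListener('touchmove', (e: TouchEvent) => {
  e.preventDefault();
  const rect = canvas.getBoundingClientRect();
  for (const t of Array.from(e.changedTouches)) {
    const x = t.clientX - rect.left;
    const y = t.clientY - rect.top;
    const prev = lastPoint.get(t.identifier);
    if (prev) {
      ctx.beginPath();
      ctx.moveTo(prev.x, prev.y);
      ctx.lineTo(x, y);
      ctx.stroke();
    }
    lastPoint.set(t.identifier, { x, y });
  }
});

canvas.addEventListener('touchend', (e: TouchEvent) => {
  for (const t of Array.from(e.changedTouches)) lastPoint.delete(t.identifier);
});
```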

TrevorFSmith commented 5 years ago

Hey, folks. It seems like progress on this has stalled so I'm going to close it for now. If someone has a plan for making progress on this topic then let me know and we can re-open it.