This PR adds DOM Overlay API support using the approach I described in https://github.com/MozillaReality/WebXR-emulator-extension/issues/222#issuecomment-626083760: simply placing the DOM overlay elements on top of the canvas as the `screen` overlay type.
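For context, this is roughly how a page opts into DOM overlay per the WebXR DOM Overlays module; the helper and element names below are illustrative, not part of this PR.

```javascript
// Build the session init options for a DOM-overlay session.
// `overlayRoot` is the element the page wants rendered over the XR view.
function buildSessionInit(overlayRoot) {
  return {
    optionalFeatures: ['dom-overlay'],
    domOverlay: { root: overlayRoot },
  };
}

// In a browser, the page would then request the session like this:
//   const session = await navigator.xr.requestSession(
//     'immersive-ar',
//     buildSessionInit(document.getElementById('overlay'))
//   );
// With this emulation, session.domOverlayState.type reports 'screen'.
```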
This does not perfectly emulate the real behavior. Ideally, the DOM elements would be rendered inside the 3D scene and events (e.g. click events) would be fired from XR input, and in AR mode the elements would be rendered on the emulated device.
But with the current web APIs that is very difficult to do while precisely preserving rendering quality, DOM hierarchies, events, and JavaScript behavior. (The main obstacle is that there is no proper way to render DOM elements into a Canvas/Texture.)
So the approach used in this PR seems a reasonable compromise. It doesn't emulate the behavior perfectly, but it is still useful: for example, users can test cases where a DOM event listener affects the VR 3D scene, such as adding a new object when a button is clicked.
Also, on desktop it is easier to click DOM elements with a mouse than to touch them with XR controllers.
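The "DOM listener affecting the 3D scene" case above could look like the sketch below; `scene` and `makeBox` are placeholders, not code from this PR, but the click-to-add-object flow is exactly what the emulated overlay lets you exercise with a mouse.

```javascript
// Wire a DOM overlay button to the 3D scene: each click adds a new object.
// `scene` is any object with an `add` method (e.g. a THREE.Scene);
// `makeBox` is a factory producing the object to insert.
function attachAddObjectButton(button, scene, makeBox) {
  button.addEventListener('click', () => {
    scene.add(makeBox());
  });
}
```

With this PR, clicking the button on the overlaid canvas fires the ordinary DOM click event, so the listener runs just as it would on a real device.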