gabrielwebphy opened this issue 1 year ago
This is also being discussed on the three.js Discord: https://discord.com/channels/685241246557667386/1121160381214240838. Any workarounds would also be appreciated.
After a closer look I can say there is no support for the WebXR Raw Camera Access Module yet (https://immersive-web.github.io/raw-camera-access/).
Supporting this module requires changes in the renderer/XR manager. As long as these enhancements are missing, you can't use this API.
@Mugen87 it has been available on Chrome for Android for a bit: https://chromestatus.com/feature/5759984304390144
Do you have any insights on what changes need to be made to the renderer so someone can sketch a PR?
If I'm reading the spec correctly, `WebXRManager` potentially requires a new method that returns a texture representing the raw camera pixels. This texture would be created and maintained by the XR manager based on a plain `WebGLTexture` object received via the new `getCameraImage()` API.
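For reference, here is a minimal sketch of what the spec-level flow looks like, assuming a session requested with the `camera-access` feature, an XR-compatible context `gl`, and an already-created reference space `referenceSpace`:

```js
// Sketch of the Raw Camera Access flow per the spec; `session`, `gl` and
// `referenceSpace` are assumed to be set up already.
const glBinding = new XRWebGLBinding(session, gl);

function onXRFrame(time, frame) {
  const pose = frame.getViewerPose(referenceSpace);
  if (pose) {
    for (const view of pose.views) {
      if (view.camera) {
        // Returns an opaque WebGLTexture holding the raw camera pixels.
        const cameraTexture = glBinding.getCameraImage(view.camera);
        // ... this is the object the XR manager would have to wrap ...
      }
    }
  }
  session.requestAnimationFrame(onXRFrame);
}

session.requestAnimationFrame(onXRFrame);
```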
Can someone explain a few use cases where the WebXR Raw Camera Access Module is useful? The introduction section from the spec didn't clear things up for me.
> Can someone explain a few use cases where the WebXR Raw Camera Access Module is useful? The introduction section from the spec didn't clear things up for me.
Sure, the most common use case is to fill the transmission render target with the camera feed texture. Currently, transmissive objects appear black in AR.
Demo: https://twitter.com/stspanho/status/1455150311438487554
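To illustrate the idea: assuming the camera feed were already available as a three.js texture (which is exactly the missing piece this issue is about), the approach would roughly be:

```js
// Hypothetical: `cameraTexture` is a three.js texture wrapping the camera
// feed. With it set as the background, the transmission render target picks
// it up, so transmissive materials refract the real world instead of black.
scene.background = cameraTexture;

const glass = new THREE.Mesh(
  new THREE.SphereGeometry(0.1, 32, 16),
  new THREE.MeshPhysicalMaterial({ transmission: 1, roughness: 0.05 })
);
scene.add(glass);
```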
Currently it's really hard to handle a raw `WebGLTexture` object as a user in three.js, as you explained here. It would be ideal if `WebXRManager`/`WebGLRenderer` did this internally.
Thanks for the example! Agreed, ideally the XR manager (or at least some sort of library component) takes care of the texture handling.
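For context, the kind of user-land hack this currently requires looks roughly like the following. It pokes at renderer internals (`__webglTexture`/`__webglInit` are private property names), so it can break between releases:

```js
// Fragile workaround: inject an externally created WebGLTexture (e.g. the one
// from getCameraImage()) into three.js via the renderer's internal properties.
const texture = new THREE.Texture();
const texProps = renderer.properties.get(texture);
texProps.__webglTexture = glTexture; // the raw WebGLTexture from the XR binding
texProps.__webglInit = true;         // stop three.js from re-initializing it
```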
> Can someone explain a few use cases where the WebXR Raw Camera Access Module is useful? The introduction section from the spec didn't clear things up for me.
In my case, the camera access is essential for something else. It doesn't change anything for this example; it was just a showcase to demonstrate the bug. Sorry for not being clear.
It seems this issue is a bit trickier to implement than expected.
Accessing the raw camera texture from the XR binding is manageable. However, the engine assumes instances of `WebGLTexture` are created and managed inside the `WebGLTextures` module. This use case would be an exception since it creates the WebGL texture object in `WebXRManager`. So the challenge is to make `WebGLTextures` work with an external "raw" texture. This potentially requires a new texture class.
The texture returned by `getCameraImage()` is a so-called opaque texture. Special rules apply to such a texture: for example, calling `gl.deleteTexture()` on it leads to an invalid operation. Besides, the texture must be unbound and detached from all `WebGLShader` objects at the end of `requestAnimationFrame()` (not sure about the consequences of this).
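In practice, those lifetime rules would imply per-frame handling along these lines (a sketch, reusing `glBinding`, `referenceSpace` and `session` from the snippet above):

```js
// The opaque texture is only valid inside the XR frame callback, so it has
// to be re-acquired every frame and must not stay bound once the callback
// returns.
function onXRFrame(time, frame) {
  const view = frame.getViewerPose(referenceSpace).views[0];
  const cameraTexture = glBinding.getCameraImage(view.camera);

  // ... bind and sample cameraTexture while rendering this frame ...

  gl.bindTexture(gl.TEXTURE_2D, null); // unbind before the callback ends

  session.requestAnimationFrame(onXRFrame);
}
```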
I think it would help to have a code example like the ones from https://immersive-web.github.io/webxr-samples/ so it becomes clearer how to use the Raw Camera Access Module.
> I think it would help to have a code example like the ones from https://immersive-web.github.io/webxr-samples/ so it becomes clearer how to use the Raw Camera Access Module.
@Mugen87 I was able to find a discussion about it here https://github.com/immersive-web/computer-vision/issues/2
Looks like the example is on a different website (the demo is working on my phone), and this is the source code.
Apparently they get the texture and then just pass it as a uniform to the program, which samples from it?
```js
// Specify the texture to map onto the faces.

// Tell WebGL we want to affect texture unit 0
gl.activeTexture(gl.TEXTURE0);

// Bind the texture to texture unit 0
gl.bindTexture(gl.TEXTURE_2D, texture);

// Tell the shader we bound the texture to texture unit 0
gl.uniform1i(programInfo.uniformLocations.uSampler, 0);
```
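For completeness, the fragment shader side presumably looks something like this (a sketch based on the standard WebGL tutorial pattern that `programInfo`/`uSampler` come from; the sample's actual shader may differ):

```js
// Fragment shader paired with the uSampler uniform set above.
const fsSource = `
  varying highp vec2 vTextureCoord;
  uniform sampler2D uSampler;

  void main(void) {
    gl_FragColor = texture2D(uSampler, vTextureCoord);
  }
`;
```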
They also read from the texture using `readPixels` to debug.
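For anyone wanting to reproduce that debug step: a texture can be read back on the CPU by attaching it to a framebuffer first (whether this is permitted for an opaque camera texture may depend on the browser implementation):

```js
// Debug sketch: read the camera texture back via a framebuffer attachment.
// `width`/`height` are assumed to come from view.camera.width/height.
const fb = gl.createFramebuffer();
gl.bindFramebuffer(gl.FRAMEBUFFER, fb);
gl.framebufferTexture2D(gl.FRAMEBUFFER, gl.COLOR_ATTACHMENT0, gl.TEXTURE_2D, cameraTexture, 0);

const pixels = new Uint8Array(width * height * 4);
gl.readPixels(0, 0, width, height, gl.RGBA, gl.UNSIGNED_BYTE, pixels);

gl.bindFramebuffer(gl.FRAMEBUFFER, null);
gl.deleteFramebuffer(fb);
```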
:warning: Just a warning: I have discovered that this solution doesn't work in all circumstances. It all depends on the order of raw WebGL calls, and I think it only works if the camera feed is the first object/texture rendered. I am investigating why this happens now.
For those wanting a workaround, I was able to put together a custom build of three.js; however, I can't vouch that it's tested and stable. It adds a function, `renderer.xr.getCameraTexture()`, which returns a custom texture type (using the `WebGLTexture` provided by the XR session under the hood) that can be used as normal.
:warning: Because of how the camera-access module works, the texture will be black/empty if you render outside of an `XRSession`'s `requestAnimationFrame` callback.
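Usage with that build would look roughly like this (note that `getCameraTexture()` only exists in the linked fork, not in upstream three.js, and `planeMaterial` is a placeholder):

```js
// getCameraTexture() is specific to the custom build, not upstream three.js.
renderer.setAnimationLoop((time, frame) => {
  if (frame) {
    // Fetch inside the XR frame callback, otherwise the texture is black/empty.
    planeMaterial.map = renderer.xr.getCameraTexture();
  }
  renderer.render(scene, camera);
});
```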
The only downside is that it's for r151 and not packaged. I will upstream it to the latest release once I have finished a project for a client.
@Mugen87 Any chance you'd be able to advise on my changes? It seems to work quite well, but I'm not sure if I'm missing something. Would I also be able to request a review from you when I PR it to main (I need to rebase to latest)?
> Any chance you'd be able to advise on my changes?
Thanks for sharing the repo! When looking at your code, I think we should split the change into at least two PRs.
The first one should add a new texture class that enables applications to use custom WebGL texture objects. This is currently only possible by hacking the renderer. I wouldn't call the class `ExternalTexture` though. Maybe `GLTexture` (to match `GLBufferAttribute`), or even `RawTexture` or `NativeTexture`. I vote for the latter ones since I think we should avoid using `GL*` syntax in class names because of WebGPU.
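To make the shape of that first PR concrete, a minimal sketch of such a class could look like this (names and details are placeholders, not an agreed design):

```js
// Placeholder sketch, not actual three.js API: a texture that wraps an
// externally created WebGLTexture so the WebGLTextures module can skip
// upload/initialization and just bind the provided object.
class RawTexture extends THREE.Texture {

	constructor( glTexture ) {

		super();

		this.glTexture = glTexture; // the external WebGLTexture
		this.isRawTexture = true; // type flag, following three.js conventions

	}

}
```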
If this PR is merged, a second PR can be filed that adds the WebXR-related changes. I'm not sure, but I think it's not necessary to create a separate class for opaque textures. A new flag `RawTexture.opaque` should be sufficient. We could also consider moving the flag to the `Texture` class if we require the opaque state in other texture types as well.
Are you interested in giving this approach a try?
> Can someone explain a few use cases where the WebXR Raw Camera Access Module is useful? The introduction section from the spec didn't clear things up for me.
Just to bump this up and provide additional user context behind the request for the WebXR camera feed, based on our work over the past six months on AR applications for civil engineering use cases.
We have done an extensive evaluation of WebAR and native AR solutions. The two primary features missing from WebAR to make it competitive against native apps and/or getUserMedia SLAM (8th Wall) AR solutions are the following:
In summary, both of these critical commercial AR features depend on this WebXR camera feed feature. (Of course, the usual caveat applies: this resolves the Android case, but iOS remains unresolved; that is a separate issue 😊)
### Description
I have a `THREE.PlaneGeometry` that has a `VideoTexture` on it, and I position it using hit detection from WebXR just by touching the screen. No problems. But if I enable camera-access from WebXR, the texture becomes the live camera feed. I want to have the camera image while also keeping the normal video texture and 3D scene.
### Reproduction steps

### Code
```js
const videoTexture = new THREE.VideoTexture(video);

// ...

navigator.xr
  .requestSession("immersive-ar", {
    requiredFeatures: ["camera-access", "hit-test", "local"],
    optionalFeatures: ["dom-overlay"],
    domOverlay: { root: document.getElementById("overlay") },
  })
  .then(onSessionStarted, onRequestSessionError);
```
### Live example
### Screenshots
No camera-access:
Camera-access:
### Version

0.147.0

### Device

Mobile

### Browser

Chrome

### OS

Android