philpax / wgpu-openxr-example

a barebones example of how to integrate OpenXR with wgpu (Vulkan-only)

Web support #13

Open philpax opened 1 year ago

philpax commented 1 year ago

Strictly speaking, this is not possible right now, as the official linkage between WebGPU and WebXR has not landed. However, it may still be possible by copying the output from WebGPU to WebGL, and then from WebGL to WebXR. Requires more investigation.

philpax commented 1 year ago

I've opened an issue on the immersive-web proposal regarding its current status: https://github.com/immersive-web/WebXR-WebGPU-Binding/issues/5

rcelyte commented 2 months ago

It actually is possible to use WebGPU with WebXR right now, using one blit from canvas->WebXR: Demo Source

philpax commented 2 months ago

Oh, fascinating, nice work!

Ramith-D-Rodrigo commented 1 month ago

> It actually is possible to use WebGPU with WebXR right now, using one blit from canvas->WebXR: Demo Source

Hi, I'm also interested in creating WebXR (specifically AR) experiences using WebGPU, but I'm still a rookie in this domain. Can you briefly give an overview of how you used WebGPU with the WebXR API in the demo source? What are the roles of WebGPU and WebGL? I see that both have been used in the demo.

In any case, I also have my own observation on the concept here; correct me if I'm wrong. Basically, the rendering is done by WebGPU after getting the pose information from WebGL. Doesn't that mean the GPU computation for the XR operations is done by WebGL?

Also, a humble suggestion: if possible, please add comments to the source. It would be really helpful for understanding the approach you have employed.

Thanks!

rcelyte commented 1 month ago

WebGL is used strictly to forward rendered frames from WebGPU to WebXR; that's all. The core mechanism making this possible is API support for importing an HTMLCanvasElement as a blittable texture in WebGL via WebGLRenderingContext.texImage2D().
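To make that concrete, here is a minimal, hedged sketch of what such a per-frame blit could look like. This is not the demo's actual code: `webgpuCanvas`, `gl`, `blitTexture`, and `blitProgram` are assumed names for objects set up elsewhere, and the WebXR types are assumed to come from `@types/webxr`.

```ts
// Hedged sketch (not the demo's actual code): each XR animation frame, the
// WebGPU-rendered canvas is imported into WebGL as a texture and drawn into
// the XRWebGLLayer's framebuffer. Assumed setup, done elsewhere:
declare const webgpuCanvas: HTMLCanvasElement;  // canvas WebGPU renders into (both eyes side by side)
declare const gl: WebGLRenderingContext;        // context the XRWebGLLayer was created with
declare const blitTexture: WebGLTexture;        // texture the canvas gets imported into
declare const blitProgram: WebGLProgram;        // trivial fullscreen-triangle shader

function onXRFrame(_time: DOMHighResTimeStamp, frame: XRFrame): void {
  const session = frame.session;
  session.requestAnimationFrame(onXRFrame);

  const layer = session.renderState.baseLayer!;
  gl.bindFramebuffer(gl.FRAMEBUFFER, layer.framebuffer);
  gl.viewport(0, 0, layer.framebufferWidth, layer.framebufferHeight);

  // texImage2D() accepts an HTMLCanvasElement as its pixel source, which is
  // what lets the WebGPU output cross over into WebGL.
  gl.bindTexture(gl.TEXTURE_2D, blitTexture);
  gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA, gl.RGBA, gl.UNSIGNED_BYTE, webgpuCanvas);

  // Draw a fullscreen triangle sampling blitTexture into the XR framebuffer.
  gl.useProgram(blitProgram);
  gl.drawArrays(gl.TRIANGLES, 0, 3);
}
```

The only cross-API handoff is that single texImage2D() upload; everything else on the WebGL side is a plain fullscreen draw.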

> getting the pose information from WebGL

WebGL isn't involved in anything besides textures; the rest of WebXR is CPU-side. Start at the documentation for XRFrame.getViewerPose() to learn how poses in WebXR are handled.
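For reference, a minimal sketch of what that CPU-side pose query looks like; `xrRefSpace` and `readViewerPose` are assumed names for illustration, not anything from the demo.

```ts
// Minimal sketch, assuming `xrRefSpace` is an XRReferenceSpace obtained earlier
// via session.requestReferenceSpace("local") (types from @types/webxr).
declare const xrRefSpace: XRReferenceSpace;

function readViewerPose(frame: XRFrame): void {
  // Pure CPU-side WebXR API call; no WebGL or WebGPU involvement.
  const viewerPose = frame.getViewerPose(xrRefSpace);
  if (!viewerPose) return; // tracking can be temporarily lost

  for (const view of viewerPose.views) {
    // Each view carries the transform and projection your renderer needs,
    // regardless of which graphics API does the actual drawing.
    console.log(view.eye, view.transform.position, view.projectionMatrix);
  }
}
```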

> still a rookie in this domain

I would recommend MDN's guides as a starting point for learning how WebXR works and the roles of each API. As for learning WebGPU, I can't really give advice on that front: I came from a background in native graphics (GL[ES], D3D11, Vulkan, Metal) and was able to apply that existing knowledge while reading through sample code to understand how the concepts mapped.

Ramith-D-Rodrigo commented 1 month ago

Oh wow, thanks for the quick reply! I understand now. I'm currently learning WebGPU using the Dawn implementation since I love C++.

> WebGL isn't involved in anything besides textures; the rest of WebXR is CPU-side.

So you're referring to what's mentioned in the spec? If so, my bad; I should have read it more carefully.

I wanted to clarify the abstraction between WebXR and the rendering APIs, given that WebXR has some WebGL-based interfaces (e.g. XRWebGLBinding and XRWebGLLayer). Now I think that's cleared up. Also, thanks for the guidance. Much appreciated.