uhhhci / immersive-ngp

We present the first open-source VR NeRF Unity package that brings high-resolution, low-latency, 6-DOF NeRF rendering to VR. This work is based on NVIDIA's groundbreaking instant-ngp technique.

Do I have to buy a VR device to test the demo? #10

Open · YiChenCityU opened this issue 1 year ago

YiChenCityU commented 1 year ago

As above.

keli95566 commented 1 year ago

Yes, the main focus of this repository is immersive applications with a VR headset and SteamVR. Please feel free to fork this repo and build immersive desktop applications.

rocco-haro commented 1 year ago

Hi @keli95566, we are interested in following your recommendation and forking this repo to build a desktop application. Could you give us some pointers on what would have to change to make this repo work in non-VR settings? Or, perhaps a better request: some architectural documentation on how this repo currently operates?

At the moment we have a rigid tech stack where we use OpenGL and URP to perform server-side rendering and stream it to the browser. Is your repo running instant-ngp alongside Unity, or are you importing a mesh converted from the NeRF into Unity?

And as always, thank you very much for your contribution to the open source community. You're a blessing!

keli95566 commented 1 year ago

Hi @rocco-haro , thank you for the questions!

Q: "Is your repo running instant-gp alongside Unity, or are you importing the mesh converted from NeRF into Unity?" A: The repo runs on top of instant-ngp through a Unity native rendering plugin we developed. This manuscript documents how Unity native rendering works together with instant-ngp. It is actually quite simple, since instant-ngp uses OpenGL backend, we only need to pass texture handle pointers from Unity to instant-ngp for instant-ngp to render images to these Unity textures directly.

Q: "Could you give us some pointers on what would have to change to make this repo work in non-VR settings? " A: To develop for non-VR settings, feel free to refer to the StereoNerfRenderer.cpp and the unity.cu scripts. For VR application, the renderer has two render buffers to create stereoscopic images. I think for desktop applications, you could remove one render buffer, and convert the camera transform in Unity to a camera view matrix like in the StereoNerfRenderer.cpp.

It would certainly be interesting to see a desktop application built with Unity, and I hope the quick explanation helps!

rocco-haro commented 1 year ago

Fantastic explanation, thank you!