Closed yellowYuga closed 1 year ago
Hi, in our project we obtained all the necessary information from Instant-NGP directly from the command line instead of relying on the GUI. Specifically, we used the code provided in `scripts/run.py`. Within that script you can modify the render mode to suit your needs: set `testbed.render_mode = ngp.Depth`, `testbed.render_mode = ngp.AO`, or leave the default to render RGB images.
However, note that the Instant-NGP code may have changed since we used it for data generation.
PS. Yes, we used `transforms_left.json` and `transforms.json` separately for the data generation.
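The offline workflow described above (load a trained model, pick a render mode, render each camera) can be sketched roughly as follows. This is a hedged adaptation of the structure of instant-ngp's `scripts/run.py`, not the authors' exact script; the `pyngp` calls reflect one binding version and may differ in yours, and `output_name` is a hypothetical helper added here for illustration.

```python
# Hedged sketch of the offline rendering loop, adapted from the structure of
# instant-ngp's scripts/run.py. The pyngp calls are assumptions about the
# binding version in use; check your local run.py for the exact API.

def render_with_mode(snapshot_path, transforms_path, out_dir, mode_name="Depth"):
    """Load a trained snapshot and render every camera with the given mode."""
    import json
    import os
    import numpy as np
    import pyngp as ngp  # instant-ngp python bindings, built from source

    testbed = ngp.Testbed(ngp.TestbedMode.Nerf)
    testbed.load_snapshot(snapshot_path)
    # Older builds exposed ngp.Depth / ngp.AO directly; newer ones nest them
    # under ngp.RenderMode -- try the flat name first, then the nested one.
    mode = getattr(ngp, mode_name, None) or getattr(ngp.RenderMode, mode_name)
    testbed.render_mode = mode

    with open(transforms_path) as f:
        meta = json.load(f)
    os.makedirs(out_dir, exist_ok=True)
    for frame in meta["frames"]:
        cam = np.array(frame["transform_matrix"])[:3, :4]
        testbed.set_nerf_camera_matrix(cam)
        image = testbed.render(1920, 1080, 8, True)  # float HxWx4, linear
        np.save(os.path.join(out_dir, output_name(frame["file_path"], mode_name)), image)

def output_name(frame_path, mode_name):
    """Build an output filename like '0001_depth.npy' from a frame path."""
    import os
    stem = os.path.splitext(os.path.basename(frame_path))[0]
    return f"{stem}_{mode_name.lower()}.npy"
```

Saving the raw float buffer with `np.save` (instead of an 8-bit PNG) keeps the full dynamic range, which matters for depth.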
Thank you, I was able to render based on the JSON file you provided. The rendered images have some artifacts and appear slightly blurry, so I will examine the input images to instant-ngp.
Hi Fabio,
First, thanks for your excellent work!
I can get the left and right images now. However, the depth maps are only black and white if I just set `testbed.render_mode = ngp.Depth`, and I get this depth map if I use
`testbed.render_mode = ngp.Depth` together with `testbed.exposure = -3.8`.
The values range over [0, 255]; do you know how to get absolute depth?
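One caveat worth noting here: an 8-bit [0, 255] image is a tone-mapped visualization, so absolute depth generally cannot be recovered from it. A more promising route is to render the float depth buffer directly (the `linear=True` path in `scripts/run.py`) and then undo the dataset scaling. The sketch below assumes that instant-ngp's loader scales world space by the `scale` field of `transforms.json`, so dividing the rendered depth by that factor returns it to the original scene units; verify this against your loader version before trusting the numbers.

```python
import numpy as np

def ngp_depth_to_metric(depth_ngp, scale):
    """Convert a float depth buffer rendered by instant-ngp (in its internal,
    scaled coordinate system) back to the units of the original transforms.json.

    Assumption: the dataset loader multiplies world space by `scale`, so
    depth in the original units is the rendered depth divided by that factor.
    """
    depth_ngp = np.asarray(depth_ngp, dtype=np.float32)
    return depth_ngp / np.float32(scale)

# Example: a 2x2 float depth buffer from a scene loaded with scale = 0.33.
buf = np.array([[0.33, 0.66], [0.99, 1.32]], dtype=np.float32)
metric = ngp_depth_to_metric(buf, 0.33)  # depths 1, 2, 3, 4 in scene units
```

If the original poses came from COLMAP, these units are still only up to COLMAP's arbitrary reconstruction scale, so a known real-world distance is needed to make them truly metric.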
Have you solved the problem yet?
Are you referring to the `scripts/run.py` in instant-ngp? There is no `render_mode` parameter that can be set in the current code.
Thank you for this great work. However, I encountered a problem while building my own dataset. I can use your code to generate a new `transforms.json` file and train Instant-NGP successfully, but I am not sure how to export RGB and AO/depth images from the trained model. Do I need to use `transforms.json` and `transforms_left.json` separately to generate the different renders? By the way, I am using the GUI on Windows.
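For the "render from two camera-path files" part of the question above, one way to avoid the GUI is to invoke `scripts/run.py` twice, once per transforms file. The sketch below only builds the two command lines; the flag names (`--load_snapshot`, `--screenshot_transforms`, `--screenshot_dir`, `--n_steps`) are assumptions based on one version of `run.py` and should be checked against `python scripts/run.py --help`, and the snapshot/output paths are hypothetical.

```python
# Hypothetical sketch: export renders for both camera sets by invoking
# instant-ngp's scripts/run.py twice, once per transforms file. Flag names
# are assumptions about the run.py version in use; verify with --help.

def build_cmd(transforms_json, out_dir, snapshot="base.ingp"):
    """Build one run.py command line for a given camera-path file."""
    return [
        "python", "scripts/run.py",
        "--load_snapshot", snapshot,          # trained model saved earlier
        "--screenshot_transforms", transforms_json,
        "--screenshot_dir", out_dir,
        "--width", "1920", "--height", "1080",
        "--n_steps", "0",                     # no further training, render only
    ]

commands = [
    build_cmd("transforms.json", "renders_right"),
    build_cmd("transforms_left.json", "renders_left"),
]

# To actually run them (requires instant-ngp built with python bindings):
# import subprocess
# for cmd in commands:
#     subprocess.run(cmd, check=True)
```

As discussed earlier in the thread, switching to depth or AO output still requires editing `render_mode` inside the script itself, since there is no command-line flag for it.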