google-research / jax3d


How do I generate a dataset for real 360? #190

Open guswhd opened 1 year ago

guswhd commented 1 year ago

@hawkinsp First, I confirmed that training works well on the NeRF synthetic dataset, and then tried to create my own real 360 dataset and train on it. There is not much information available, so after reading https://github.com/google-research/jax3d/issues/175 I downloaded the mip-NeRF 360 dataset and trained on that instead. Training worked well with the mip-NeRF dataset, but the dataset I created myself had no `poses_bounds.npy` file; it only had a `transforms.json` file, and as expected it did not train well. In short, I am curious how the `poses_bounds.npy` file is created and what role it plays.

kim-jinuk commented 12 months ago

If you only have RGB images, COLMAP together with the repo linked below will let you create a `poses_bounds.npy` file.

https://github.com/Fyusion/LLFF/tree/master
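For context, `poses_bounds.npy` (as written by LLFF's `imgs2poses.py`) is just an N×17 NumPy array: each row is a flattened 3×5 matrix (a 3×4 camera-to-world pose plus a column of [image height, width, focal length]) followed by the near/far scene depth bounds for that image. A minimal sketch for inspecting one (the path is illustrative):

```python
import numpy as np

data = np.load("my_scene/poses_bounds.npy")  # shape (N, 17), one row per image
poses = data[:, :15].reshape(-1, 3, 5)       # 3x5 = [3x4 camera-to-world | H, W, focal]
bounds = data[:, 15:]                        # near/far scene depth bounds per image

c2w = poses[:, :, :4]   # 3x4 extrinsics used to cast rays
hwf = poses[:, :, 4]    # image height, width, focal length (the intrinsics)
print(c2w.shape, hwf[0], bounds[0])
```

This is the file the LLFF-style dataloaders read, which is likely why a dataset with only a `transforms.json` (the Blender/synthetic convention) doesn't train.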

guswhd commented 12 months ago

> If you only have RGB images, COLMAP together with the repo linked below will let you create a `poses_bounds.npy` file.
>
> https://github.com/Fyusion/LLFF/tree/master

Thank you!! Are there any data-generation tips that might help NeRF train well?

DWhettam commented 11 months ago

I'd also recommend looking at the local_colmap_and_resize.sh script from https://github.com/google-research/multinerf.

I believe a combination of that script and LLFF's imgs2poses.py is what's used to create your own dataset; a rough sketch of that pipeline is below.
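Putting the two suggestions together, here is a rough sketch, assuming the LLFF repo is installed and COLMAP is on your PATH (the path and scene layout are illustrative; `gen_poses` is the function that LLFF's imgs2poses.py script wraps):

```python
# Rough sketch: turn a folder of RGB images into poses_bounds.npy using LLFF.
from llff.poses.pose_utils import gen_poses

scenedir = "data/my_scene"  # must contain an images/ subfolder of RGB frames

# Runs COLMAP feature extraction, matching, and sparse reconstruction, then
# converts the recovered camera poses into scenedir/poses_bounds.npy.
gen_poses(scenedir, match_type="exhaustive_matcher")
```

As I understand it, multinerf's local_colmap_and_resize.sh covers the same COLMAP stage and additionally writes downsampled copies of the images (images_2, images_4, ...), which the loaders expect for lower-resolution training.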