google-research / multinerf

A Code Release for Mip-NeRF 360, Ref-NeRF, and RawNeRF
Apache License 2.0
3.6k stars 341 forks

How to train your data? #6

Open gateway opened 2 years ago

gateway commented 2 years ago

Sorry if this is in the documentation. I have a set of JPGs and ran bash scripts/local_colmap_and_resize.sh my_dataset_dir, but from here I'm stumped as to what to do next.

Also, does this train on 360 equirectangular images or video?

bmild commented 2 years ago

You should be able to run something like

python -m train \
  --gin_configs=configs/360.gin \
  --gin_bindings="Config.data_dir = 'my_dataset_dir'" \
  --gin_bindings="Config.checkpoint_dir = 'my_dataset_dir/checkpoints'" \
  --logtostderr
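Once training finishes, rendering can presumably be launched the same way. A sketch, assuming the repo's render module follows the same gin-binding pattern as train (the Config.render_dir binding here is an assumption; check the README for the exact flags):

```shell
# Render the trained model (sketch; assumes render follows the same
# gin-binding pattern as train -- verify flags against the repo README).
python -m render \
  --gin_configs=configs/360.gin \
  --gin_bindings="Config.data_dir = 'my_dataset_dir'" \
  --gin_bindings="Config.checkpoint_dir = 'my_dataset_dir/checkpoints'" \
  --gin_bindings="Config.render_dir = 'my_dataset_dir/render'" \
  --logtostderr
```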
gateway commented 2 years ago

> You should be able to run something like
>
> python -m train \
>   --gin_configs=configs/360.gin \
>   --gin_bindings="Config.data_dir = 'my_dataset_dir'" \
>   --gin_bindings="Config.checkpoint_dir = 'my_dataset_dir/checkpoints'" \
>   --logtostderr

Cool, and for just normal images? BTW, this is the 360 camera I want to test with. Can I just use the 360 video, or ? https://www.insta360.com/product/insta360-oners/1inch-360

momentmal commented 2 years ago

> Sorry if this is in the documentation, I have a set of jpgs that I have and ran bash scripts/local_colmap_and_resize.sh my_dataset_dir but from here im stumped as to what to do next?

Same here. I ran local_colmap_and_resize.sh (normal images, no 360°) and don't really know what to do next. I'm completely new to this, and the documentation loses me at the end of "Using your own data".

bmild commented 2 years ago

Hi, so far we don't support stitched 360 data, but you can use fisheye data (the direct output from one half of the 360 camera) if you pass the OPENCV_FISHEYE camera model to the COLMAP script.
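For reference, a sketch of how that invocation might look, assuming local_colmap_and_resize.sh accepts the camera model as its second positional argument (verify against the script itself):

```shell
# Run COLMAP + resizing on fisheye captures (sketch; the second positional
# argument as the camera model is an assumption -- check
# scripts/local_colmap_and_resize.sh for the actual interface).
bash scripts/local_colmap_and_resize.sh my_dataset_dir OPENCV_FISHEYE
```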

I just added more explicit instructions here for what to do after running local_colmap_and_resize.sh.