google-research / multinerf

A Code Release for Mip-NeRF 360, Ref-NeRF, and RawNeRF
Apache License 2.0

dataset problem #19

Open RuiYangQuan opened 1 year ago

RuiYangQuan commented 1 year ago

Hi, I ran into a problem when running this: python -m train --gin_configs=configs/360.gin --gin_bindings="Config.data_dir = 'my_dataset_dir'" --gin_bindings="Config.checkpoint_dir = 'my_dataset_dir/checkpoints'" --logtostderr (attached image "16-1" not shown). What should I do next? Please give me some advice, thanks!!!

StarsTesla commented 1 year ago

Did you try to use your custom data? Maybe you should use the llff dataset class instead of blender?
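For reference, the loader is selected through a gin binding. A hedged sketch of what that invocation might look like (the Config.dataset_loader field name is an assumption based on multinerf's configs.py; verify against your checkout):

```shell
python -m train \
  --gin_configs=configs/360.gin \
  --gin_bindings="Config.dataset_loader = 'llff'" \
  --gin_bindings="Config.data_dir = 'my_dataset_dir'" \
  --gin_bindings="Config.checkpoint_dir = 'my_dataset_dir/checkpoints'" \
  --logtostderr
```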

bmild commented 1 year ago

Did you successfully run COLMAP? If so, there should be a subdirectory my_dataset_dir/sparse/0/, which should prevent the code from reaching this error.

Palisand commented 1 year ago

I'm getting the same No such file or directory: 'my_dataset_dir/transforms.json' issue. I ran COLMAP via scripts/local_colmap_and_resize.sh, but sparse is empty because the command that populates it (I think) fails on my machine; the Mapper.ba_global_function_tolerance option is unrecognized. This probably has to do with the fact that I'm running an older version of COLMAP (3.5) because of issues I had getting COLMAP running on my OS (macOS Monterey).

Palisand commented 1 year ago

After installing suite-sparse I was able to upgrade to the latest version of COLMAP (3.7) and run the aforementioned command successfully. sparse/0 now exists on my system.
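The fix described above can be sketched as a shell snippet (hedged: the Homebrew formula names are assumptions; adjust to your package manager, or build COLMAP from source):

```shell
# macOS: SuiteSparse was the missing build dependency that unblocked the upgrade.
# brew install suite-sparse
# brew install colmap            # needs COLMAP 3.7+ for Mapper.ba_global_function_tolerance
#
# Re-run the repo's preprocessing script, then confirm the mapper output exists:
# bash scripts/local_colmap_and_resize.sh my_dataset_dir
if [ -d "my_dataset_dir/sparse/0" ]; then
  echo "COLMAP mapping succeeded"
else
  echo "sparse/0 missing: mapper likely failed" >&2
fi
```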

Palisand commented 1 year ago

I'm currently training multinerf. Hopefully, the steps I presented will help you get further @RuiYangQuan. Thanks @bmild for the explanation.

chinnarouge commented 1 year ago

This problem also occurs in Google Colab; is there any solution for that environment? I tried installing COLMAP 3.8, but it didn't help.