SpectacularAI / 3dgs-deblur

[ECCV2024] Gaussian Splatting on the Move: Blur and Rolling Shutter Compensation for Natural Camera Motion
https://spectacularai.github.io/3dgs-deblur/
Apache License 2.0

Cannot optimize with velocities #13

Open jerredchen opened 2 days ago

jerredchen commented 2 days ago

Hi,

Thank you for this codebase! I would like to test the method's velocity optimization by running `python train.py --dataset=synthetic-mb --case=9`. However, I'm running into the following error:

File "/home/jerred/3dgs-deblur/nerfstudio/nerfstudio/models/splatfacto.py", line 491, in after_train
    assert self.xys.absgrad is not None  # type: ignore
AttributeError: 'Tensor' object has no attribute 'absgrad'
The full error message:

```
Traceback (most recent call last):
  File "/home/jerred/miniconda3/envs/nerfstudio_copy/bin/ns-train", line 8, in <module>
    sys.exit(entrypoint())
  File "/home/jerred/3dgs-deblur/nerfstudio/nerfstudio/scripts/train.py", line 262, in entrypoint
    main(
  File "/home/jerred/3dgs-deblur/nerfstudio/nerfstudio/scripts/train.py", line 247, in main
    launch(
  File "/home/jerred/3dgs-deblur/nerfstudio/nerfstudio/scripts/train.py", line 189, in launch
    main_func(local_rank=0, world_size=world_size, config=config)
  File "/home/jerred/3dgs-deblur/nerfstudio/nerfstudio/scripts/train.py", line 100, in train_loop
    trainer.train()
  File "/home/jerred/3dgs-deblur/nerfstudio/nerfstudio/engine/trainer.py", line 264, in train
    callback.run_callback_at_location(
  File "/home/jerred/3dgs-deblur/nerfstudio/nerfstudio/engine/callbacks.py", line 115, in run_callback_at_location
    self.run_callback(step=step)
  File "/home/jerred/3dgs-deblur/nerfstudio/nerfstudio/engine/callbacks.py", line 105, in run_callback
    self.func(*self.args, **self.kwargs, step=step)
  File "/home/jerred/3dgs-deblur/nerfstudio/nerfstudio/models/splatfacto.py", line 491, in after_train
    assert self.xys.absgrad is not None  # type: ignore
AttributeError: 'Tensor' object has no attribute 'absgrad'

Traceback (most recent call last):
  File "train.py", line 353, in <module>
    process(args.input_folder, args)
  File "train.py", line 193, in process
    subprocess.check_call(cmd)
  File "/home/jerred/miniconda3/envs/nerfstudio_copy/lib/python3.8/subprocess.py", line 364, in check_call
    raise CalledProcessError(retcode, cmd)
subprocess.CalledProcessError: Command '['ns-train', 'splatfacto', '--data', 'data/inputs-processed/synthetic-mb/cozyroom', '--viewer.quit-on-train-completion', 'True', '--pipeline.model.rasterize-mode', 'antialiased', '--pipeline.model.use-scale-regularization', 'True', '--pipeline.model.num-downscales', '0', '--pipeline.model.background-color', 'auto', '--pipeline.model.cull-scale-thresh', '2.0', '--pipeline.model.optimize-eval-velocities=False', '--pipeline.model.blur-samples=10', '--max-num-iterations', '20000', '--vis=tensorboard', '--pipeline.model.rolling-shutter-compensation=False', '--pipeline.model.min-rgb-level=10', '--pipeline.model.camera-velocity-optimizer.enabled=True', '--output-dir', 'data/outputs/synthetic-mb/motion_blur-velocity_opt', 'nerfstudio-data', '--orientation-method', 'none', '--eval-mode', 'interval', '--eval-interval', '8', '--optimize-eval-cameras', 'True']' returned non-zero exit status 1.
```

To reproduce: I am running Ubuntu 22.04, Python 3.8, and CUDA 11.8 with an NVIDIA RTX 3090. I followed the README instructions for creating a conda environment with the dependencies and installing Nerfstudio.

oseiskar commented 1 day ago

Did you have a previous Nerfstudio installation in the same Conda environment, or did you start from scratch? This error can happen if the wrong version of Nerfstudio or gsplat is installed. Could you try again with a fresh Conda environment and run `./scripts/install.sh`, please?
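If a fresh environment still fails, it would also help to confirm which gsplat and nerfstudio actually get imported, since a leftover install can shadow the versions that `./scripts/install.sh` sets up. A quick sanity-check sketch (nothing repo-specific; it just assumes both packages are installed under their usual distribution names):

```python
# Print which gsplat/nerfstudio get imported and their installed versions.
# If these point at a leftover stock install rather than the ones set up by
# the install script, that mismatch could explain the missing `absgrad`
# attribute asserted on in splatfacto's after_train.
from importlib.metadata import version

import gsplat
import nerfstudio

print("gsplat     ", version("gsplat"), "->", gsplat.__file__)
print("nerfstudio ", version("nerfstudio"), "->", nerfstudio.__file__)
```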

jerredchen commented 1 day ago

Thanks for your reply. I actually did try this with a fresh conda environment (following the linked instructions) and then ran `./scripts/install.sh`; I did not run `pip install nerfstudio`.

I also noticed that velocity optimization works fine with the downloaded SAI data. Is there a difference between the synthetic and the real data that could cause this?