ucaszyp / STEPS

This is the official repository for ICRA-2023 paper "STEPS: Joint Self-supervised Nighttime Image Enhancement and Depth Estimation"
https://arxiv.org/abs/2302.01334
MIT License

RuntimeError: shape '[-1, 3]' is invalid for input of size 22400 #19

Open Larissa0829 opened 2 weeks ago

Larissa0829 commented 2 weeks ago

When I run `python3 train.py --config steps_ns --gpus 0` with the environment variable `CUDA_VISIBLE_DEVICES=0` set, I get the following error in SCI/loss.py:

File "D:\Projects\Depth Estimation\STEPS\train.py", line 94, in <module>
    main()
  File "D:\Projects\Depth Estimation\STEPS\train.py", line 90, in main
    trainer.fit(model, train_dataloaders=loader)
  File "D:\software\anaconda3\envs\STEPS\lib\site-packages\pytorch_lightning\trainer\trainer.py", line 771, in fit
    self._fit_impl, model, train_dataloaders, val_dataloaders, datamodule, ckpt_path
  File "D:\software\anaconda3\envs\STEPS\lib\site-packages\pytorch_lightning\trainer\trainer.py", line 723, in _call_and_handle_interrupt
    return trainer_fn(*args, **kwargs)
  File "D:\software\anaconda3\envs\STEPS\lib\site-packages\pytorch_lightning\trainer\trainer.py", line 811, in _fit_impl
    results = self._run(model, ckpt_path=self.ckpt_path)
  File "D:\software\anaconda3\envs\STEPS\lib\site-packages\pytorch_lightning\trainer\trainer.py", line 1236, in _run
    results = self._run_stage()
  File "D:\software\anaconda3\envs\STEPS\lib\site-packages\pytorch_lightning\trainer\trainer.py", line 1323, in _run_stage
    return self._run_train()
  File "D:\software\anaconda3\envs\STEPS\lib\site-packages\pytorch_lightning\trainer\trainer.py", line 1353, in _run_train
    self.fit_loop.run()
  File "D:\software\anaconda3\envs\STEPS\lib\site-packages\pytorch_lightning\loops\base.py", line 204, in run
    self.advance(*args, **kwargs)
  File "D:\software\anaconda3\envs\STEPS\lib\site-packages\pytorch_lightning\loops\fit_loop.py", line 269, in advance
    self._outputs = self.epoch_loop.run(self._data_fetcher)
  File "D:\software\anaconda3\envs\STEPS\lib\site-packages\pytorch_lightning\loops\base.py", line 204, in run
    self.advance(*args, **kwargs)
  File "D:\software\anaconda3\envs\STEPS\lib\site-packages\pytorch_lightning\loops\epoch\training_epoch_loop.py", line 208, in advance
    batch_output = self.batch_loop.run(batch, batch_idx)
  File "D:\software\anaconda3\envs\STEPS\lib\site-packages\pytorch_lightning\loops\base.py", line 204, in run
    self.advance(*args, **kwargs)
  File "D:\software\anaconda3\envs\STEPS\lib\site-packages\pytorch_lightning\loops\batch\training_batch_loop.py", line 90, in advance
    outputs = self.manual_loop.run(split_batch, batch_idx)
  File "D:\software\anaconda3\envs\STEPS\lib\site-packages\pytorch_lightning\loops\base.py", line 204, in run
    self.advance(*args, **kwargs)
  File "D:\software\anaconda3\envs\STEPS\lib\site-packages\pytorch_lightning\loops\optimization\manual_loop.py", line 115, in advance
    training_step_output = self.trainer._call_strategy_hook("training_step", *step_kwargs.values())
  File "D:\software\anaconda3\envs\STEPS\lib\site-packages\pytorch_lightning\trainer\trainer.py", line 1765, in _call_strategy_hook
    output = fn(*args, **kwargs)
  File "D:\software\anaconda3\envs\STEPS\lib\site-packages\pytorch_lightning\strategies\strategy.py", line 333, in training_step
    return self.model.training_step(*args, **kwargs)
  File "D:\Projects\Depth Estimation\STEPS\models\rnw.py", line 195, in training_step
    night_inputs, sci_loss_dict = self.get_sci_relight(night_inputs)
  File "D:\Projects\Depth Estimation\STEPS\models\rnw.py", line 355, in get_sci_relight
    loss, illu_list, i_k = self.S._loss(sci_gray, frame_id)  # todo: index 0 loss, index 0, -1, 1 img
  File "D:\Projects\Depth Estimation\STEPS\SCI\model.py", line 130, in _loss
    loss += self._criterion(in_list[i], i_list[i])
  File "D:\software\anaconda3\envs\STEPS\lib\site-packages\torch\nn\modules\module.py", line 889, in _call_impl
    result = self.forward(*input, **kwargs)
  File "D:\Projects\Depth Estimation\STEPS\SCI\loss.py", line 13, in forward
    Smooth_Loss = self.smooth_loss(input, illu)
  File "D:\software\anaconda3\envs\STEPS\lib\site-packages\torch\nn\modules\module.py", line 889, in _call_impl
    result = self.forward(*input, **kwargs)
  File "D:\Projects\Depth Estimation\STEPS\SCI\loss.py", line 58, in forward
    self.input = self.rgb2yCbCr(input)
  File "D:\Projects\Depth Estimation\STEPS\SCI\loss.py", line 24, in rgb2yCbCr
    im_flat = input_im.contiguous().view(-1, 3).float()
RuntimeError: shape '[-1, 3]' is invalid for input of size 22400

I noticed that the tensor passed into the loss is single-channel: in models/rnw.py, `illu_list, _, _, _, _ = self.S(sci_gray)` feeds the grayscale map into the SCI loss. Inside `rgb2yCbCr`, `view(-1, 3)` assumes a 3-channel input. With a grayscale tensor of shape (1, 1, 900, 1600) it happens to run, because its 1,440,000 elements are divisible by 3 (even though the resulting rows are not real RGB triples), but with shape (1, 1, 112, 200) the 22,400 elements are not divisible by 3, so `view(-1, 3)` raises the RuntimeError above.
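A minimal reproduction of the shape mismatch (the `expand` workaround at the end is my own assumption, not something from the repo):

```python
import torch

# rgb2yCbCr calls input_im.contiguous().view(-1, 3), which assumes a
# 3-channel tensor: it only succeeds when the element count is divisible by 3.
gray_ok = torch.rand(1, 1, 900, 1600)   # 1,440,000 elements, divisible by 3
gray_bad = torch.rand(1, 1, 112, 200)   # 22,400 elements, NOT divisible by 3

# "Works" by accident, but the rows are not real RGB triples.
flat = gray_ok.contiguous().view(-1, 3)
print(flat.shape)  # torch.Size([480000, 3])

try:
    gray_bad.contiguous().view(-1, 3)
except RuntimeError as e:
    print(e)  # shape '[-1, 3]' is invalid for input of size 22400

# One possible workaround (hypothetical): replicate the single channel to
# three channels before the colour-space conversion, then flatten channel-last.
rgb_like = gray_bad.expand(-1, 3, -1, -1)            # (1, 3, 112, 200)
flat_ok = rgb_like.permute(0, 2, 3, 1).contiguous().view(-1, 3)
print(flat_ok.shape)  # torch.Size([22400, 3])
```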

How should this be resolved? Thank you!