jantic / DeOldify

A Deep Learning based project for colorizing and restoring old images (and video!)

RuntimeError: expected scalar type Float but found Double #283

Closed: sfremerey closed this issue 3 years ago

sfremerey commented 3 years ago

I wanted to retrain a new video model using the "NoGAN" training approach. To do so, I first tried to execute the notebook ColorizeTrainingStable.ipynb, but the following error is thrown:

Traceback (most recent call last):
  File "/home/avt/.local/lib/python3.8/site-packages/fastai/data_block.py", line 591, in _check_kwargs
    try: x.apply_tfms(tfms, **kwargs)
  File "/home/avt/.local/lib/python3.8/site-packages/fastai/vision/image.py", line 123, in apply_tfms
    else: x = tfm(x)
  File "/home/avt/.local/lib/python3.8/site-packages/fastai/vision/image.py", line 518, in __call__
    return self.tfm(x, *args, **{**self.resolved, **kwargs}) if self.do_run else x
  File "/home/avt/.local/lib/python3.8/site-packages/fastai/vision/image.py", line 464, in __call__
    if args: return self.calc(*args, **kwargs)
  File "/home/avt/.local/lib/python3.8/site-packages/fastai/vision/image.py", line 469, in calc
    if self._wrap: return getattr(x, self._wrap)(self.func, *args, **kwargs)
  File "/home/avt/.local/lib/python3.8/site-packages/fastai/vision/image.py", line 183, in affine
    self.affine_mat = self.affine_mat @ m
RuntimeError: expected scalar type Float but found Double

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "ColorizeTrainingStable.py", line 130, in <module>
    data_gen = get_data(bs=bs, sz=sz, keep_pct=keep_pct)
  File "ColorizeTrainingStable.py", line 66, in get_data
    return get_colorize_data(sz=sz, bs=bs, crappy_path=path_lr, good_path=path_hr,
  File "/home/avt/DeOldify/deoldify/dataset.py", line 28, in get_colorize_data
    src.label_from_func(lambda x: good_path / x.relative_to(crappy_path))
  File "/home/avt/.local/lib/python3.8/site-packages/fastai/data_block.py", line 502, in transform
    self.train.transform(tfms[0], **kwargs)
  File "/home/avt/.local/lib/python3.8/site-packages/fastai/data_block.py", line 721, in transform
    _check_kwargs(self.x, tfms, **kwargs)
  File "/home/avt/.local/lib/python3.8/site-packages/fastai/data_block.py", line 593, in _check_kwargs
    raise Exception(f"It's not possible to apply those transforms to your dataset:\n {e}")
Exception: It's not possible to apply those transforms to your dataset:
 expected scalar type Float but found Double
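As far as I can tell, the failing call is the matrix multiply in fastai's affine transform (self.affine_mat @ m). The same error can be reproduced in plain PyTorch by multiplying a float32 matrix with a float64 one; this is just a minimal sketch, assuming that's the kind of mismatch happening inside fastai:

import torch

affine_mat = torch.eye(3, dtype=torch.float32)  # matrix held in float32 ("Float")
m = torch.eye(3, dtype=torch.float64)           # matrix created as float64 ("Double")
affine_mat = affine_mat @ m                     # raises RuntimeError: expected scalar type Float but found Double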

I installed the package versions you define in requirements.txt. Further, I am using the ImageNet dataset for training. Originally, I planned to use a different set of images, but I first wanted to test the training procedure with ImageNet.

Did I do something wrong?

jantic commented 3 years ago

Based on how you're describing the installation, it sounds like you did it manually, and I'm seeing evidence of this: your fastai version appears to differ from the required 1.0.51, because the line numbers in that stack trace don't match up with the 1.0.51 version of data_block.py I have here. I can tell you this much: the fastai version matters a lot. There were breaking changes after that version, which is why I froze it there.
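One quick sanity check is to print which fastai version your Python environment actually picks up, for example:

python -c "import fastai; print(fastai.__version__)"

If that doesn't print 1.0.51, the environment is the problem.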

The better approach here is to use the conda install, via command line in the root DeOldify folder:

conda env create -f environment.yml

Then run

conda activate deoldify

and you should be set.

That'll take care of the dependencies for you so that you don't run into a version mismatch, which is really what this looks like at this point.
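If you want to double-check afterwards, running this inside the activated environment should show the pinned 1.0.51:

conda list fastai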

sfremerey commented 3 years ago

Now it seems to work. I had indeed installed the wrong requirements: I created an environment with the python3-venv module and installed the packages listed in requirements.txt. The conda approach works properly.

Thank you for the hint!