yu4u / noise2noise

An unofficial and partial Keras implementation of "Noise2Noise: Learning Image Restoration without Clean Data"
MIT License

Deleting dataset causes error (Google Colab) #54

Closed playbyan1453 closed 1 year ago

playbyan1453 commented 3 years ago

Hi, I was training my AI on Google Colab. I have some images, a clean set and a dirty set, and I was curious about replacing my dataset with new images. Before I replaced the files it worked fine, but after I replaced the dataset this problem comes up:

Traceback (most recent call last):
  File "noise2noise/test_model.py", line 70, in <module>
    main()
  File "noise2noise/test_model.py", line 47, in main
    h, w, _ = image.shape
AttributeError: 'NoneType' object has no attribute 'shape'

I'm not disappointed with the result, and the problem was kind of easy to fix by myself: resetting the runtime made it go away. Any clues?

ghost commented 3 years ago

Check your images; they must be 24-bit depth.

playbyan1453 commented 3 years ago

Yes, after I checked, the images have 32-bit depth. Since I'm trying to make my own denoiser, let me try changing them to 24-bit depth. Edit: I still have the same problem.
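For anyone hitting the same 32-bit issue: a 32-bit image is typically RGBA, and one way to get a 24-bit image is to discard the alpha channel before feeding the model. A hedged NumPy sketch, assuming the image is already loaded as an H×W×4 array (with OpenCV you could instead load via `cv2.imread(path, cv2.IMREAD_COLOR)`, which already forces 3 channels); the helper name `drop_alpha` is mine:

```python
import numpy as np


def drop_alpha(image: np.ndarray) -> np.ndarray:
    """Convert a 32-bit (H, W, 4) RGBA/BGRA array to 24-bit (H, W, 3)
    by discarding the alpha channel; 3-channel images pass through."""
    if image.ndim == 3 and image.shape[2] == 4:
        # keep the first three color channels, make the result contiguous
        return np.ascontiguousarray(image[:, :, :3])
    return image
```

Note that dropping alpha assumes the color channels are already un-premultiplied; if the PNGs use transparency meaningfully, compositing onto a background would be the safer conversion.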