juglab / n2v

This is the implementation of Noise2Void training.

MemoryError when running denoising2D_RGB example using my own data #102

Closed fabulousfeng closed 3 years ago

fabulousfeng commented 3 years ago


I got a memory error when running the denoising2D_RGB example with my own data. My dataset consists of more than 700 images, which generated more than 1 million 64×64 patches. Then this error came up: "MemoryError: Unable to allocate 112. GiB for an array with shape (1225317, 64, 64, 6) and data type float32". The data in the example (RGB.zip) seems to contain only one image, so does that mean the example cannot handle many images?
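As a sanity check, the 112 GiB figure follows directly from the reported array shape and dtype (the last axis is 6 rather than 3, presumably because a second set of channels is stored alongside the RGB patch, but that is an assumption, not something stated in the traceback):

```python
import numpy as np

# Shape and dtype reported in the MemoryError message.
shape = (1225317, 64, 64, 6)
bytes_needed = np.prod(shape, dtype=np.int64) * np.dtype(np.float32).itemsize

# Convert to GiB (2**30 bytes) to compare with the error message.
print(bytes_needed / 2**30)  # ≈ 112.2 GiB, matching the "112. GiB" in the error
```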

tibuch commented 3 years ago

Hi @fabulousfeng,

Sorry about this inconvenience. Our current implementation is not very memory efficient for large datasets, because all training patches are loaded into memory simultaneously. That is why it tries to allocate 112 GB of memory.

There are two solutions:

  1. Implement an improved data loader that creates patches on demand and only loads as much data as required into memory. If you want to look into that, I am happy to assist.
  2. Use a subset of your 700 images, e.g. 70 randomly chosen ones, and train only on this subset.
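The first option could be sketched roughly as follows. This is not part of n2v's API, just a minimal generator that holds only one full image in memory at a time and cuts random patches from it; `load_fn` is a hypothetical callable (e.g. wrapping `imread`) that returns an `(H, W, C)` float32 array for a path:

```python
import numpy as np

def patch_generator(image_paths, load_fn, patch_size=64, batch_size=128, seed=0):
    """Yield batches of random patches, loading one image at a time.

    Hypothetical helper, not part of n2v: `load_fn(path)` must return an
    (H, W, C) float32 array. Only a single full image is ever resident,
    instead of all ~1.2 million patches at once.
    """
    rng = np.random.default_rng(seed)
    while True:
        # Pick a random image and load it on demand.
        path = image_paths[rng.integers(len(image_paths))]
        img = load_fn(path)
        h, w, c = img.shape
        batch = np.empty((batch_size, patch_size, patch_size, c), dtype=np.float32)
        for i in range(batch_size):
            # Random top-left corner for each patch.
            y = rng.integers(h - patch_size + 1)
            x = rng.integers(w - patch_size + 1)
            batch[i] = img[y:y + patch_size, x:x + patch_size]
        yield batch

# Demo with synthetic "files": a dict of arrays standing in for images on disk.
fake_disk = {f"img{i}.npy": np.random.rand(256, 256, 3).astype(np.float32)
             for i in range(3)}
gen = patch_generator(list(fake_disk), fake_disk.__getitem__)
print(next(gen).shape)  # (128, 64, 64, 3)
```

A generator like this can be handed to Keras-style `fit(...)` training loops; the trade-off is repeated disk reads per epoch instead of one large up-front allocation.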

Best wishes.

fabulousfeng commented 3 years ago

@tibuch Thanks a lot for the detailed reply! I finally decided to use the second solution and it now works.