yu4u / noise2noise

An unofficial and partial Keras implementation of "Noise2Noise: Learning Image Restoration without Clean Data"

Out of memory #34

Closed: victorca25 closed this issue 3 years ago

victorca25 commented 5 years ago

Any recommendations for handling OOM problems?

I'm using TensorFlow with a GPU (GTX 1060, 6 GB) and it runs out of memory fairly quickly at higher resolutions. I know that in some cases dividing the images into slices and stitching them back together works, but I don't know whether that's an option here.

yu4u commented 5 years ago

Yes, decreasing batch_size or image_size reduces the memory requirement, and it should work to some extent.
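
As for the slice-and-stitch idea raised in the question: if the trained model is fully convolutional and preserves spatial size (as U-Net-style denoisers typically are), tile-by-tile inference is workable. Below is a minimal sketch, not code from this repo; `model` is a trained Keras model, and the `tile`/`overlap` values are arbitrary placeholders.

```python
import numpy as np

def denoise_tiled(model, image, tile=256, overlap=32):
    """Denoise a large image by running the model on overlapping tiles
    and averaging the predictions where tiles overlap."""
    h, w, c = image.shape
    step = tile - overlap
    out = np.zeros((h, w, c), dtype=np.float32)
    weight = np.zeros((h, w, 1), dtype=np.float32)
    for y in range(0, h, step):
        for x in range(0, w, step):
            # Clamp the tile origin so it never runs past the image border.
            y0 = min(y, max(h - tile, 0))
            x0 = min(x, max(w - tile, 0))
            patch = image[y0:y0 + tile, x0:x0 + tile]
            pred = model.predict(patch[np.newaxis, ...])[0]
            ph, pw = pred.shape[:2]
            out[y0:y0 + ph, x0:x0 + pw] += pred
            weight[y0:y0 + ph, x0:x0 + pw] += 1.0
    # Average overlapping contributions; weight is >= 1 wherever covered.
    return out / np.maximum(weight, 1.0)
```

Plain averaging in the overlap regions can still leave faint seams; a weighted (e.g. feathered) blend would reduce them, but the simple version above keeps the sketch short.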

CyberLykan commented 4 years ago

> Yes, decreasing batch_size or image_size reduces the memory requirement, and it should work to some extent.

How would you change the batch_size?

yu4u commented 4 years ago

Please see the README.
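
For reference, the batch size and patch size are command-line arguments to train.py. A hedged example of the invocation, with placeholder paths; the flag names are assumed from the README, so verify them against the current version:

```
python3 train.py --image_dir <train_images> --test_dir <test_images> --image_size 64 --batch_size 4
```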