manumathewthomas / ImageDenoisingGAN

Image Denoising with Generative Adversarial Network

Memory leak #5

Open SugarShine opened 6 years ago

SugarShine commented 6 years ago

Thanks for sharing your code. When I train the model on my own dataset, I found a memory leak when running this code:

    training_batch = sess.run(tf.map_fn(lambda img: tf.image.per_image_standardization(img), training_batch))
    groundtruth_batch = sess.run(tf.map_fn(lambda img: tf.image.per_image_standardization(img), groundtruth_batch))

After some iterations, when saving checkpoints, the GraphDef grows larger than 2 GB and the program crashes. Has anyone run into this issue, and how can it be solved?
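
The likely cause is that each of these sess.run calls builds new tf.map_fn / per_image_standardization nodes inside the training loop, so the graph grows every iteration until the 2 GB GraphDef limit is exceeded when the checkpoint is saved. A minimal sketch of one workaround, assuming a TF 1.x session and a hypothetical next_batch() loader, is to build the standardization op once and feed batches through a placeholder:

```python
import numpy as np
import tensorflow as tf

# Hypothetical shapes and data loader, purely for illustration.
BATCH, H, W, C = 16, 64, 64, 3
def next_batch():
    return (np.random.rand(BATCH, H, W, C).astype(np.float32),
            np.random.rand(BATCH, H, W, C).astype(np.float32))

# Build the standardization op ONCE, before the training loop.
image_ph = tf.placeholder(tf.float32, [None, H, W, C])
standardize_op = tf.map_fn(tf.image.per_image_standardization, image_ph)

with tf.Session() as sess:
    for step in range(100):
        train_np, gt_np = next_batch()
        # Reuse the same op every iteration: no new nodes are added to the
        # graph, so the GraphDef no longer grows past the 2 GB checkpoint limit.
        train_np = sess.run(standardize_op, feed_dict={image_ph: train_np})
        gt_np = sess.run(standardize_op, feed_dict={image_ph: gt_np})
        # ... feed the standardized batches to the GAN train ops here ...
```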

manumathewthomas commented 6 years ago

Yes, there is a memory leak. I'm working on a new version, but it's not quite ready yet. In the meantime, you can try replacing the above code with your own normalization function.
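
For example, a drop-in NumPy replacement that mimics tf.image.per_image_standardization could look like the sketch below (not the author's code; it assumes batches shaped [N, H, W, C]):

```python
import numpy as np

def per_image_standardization(batch):
    """NumPy stand-in for tf.image.per_image_standardization.

    batch: float array of shape [N, H, W, C]. Each image is shifted to zero
    mean and divided by its std, with the same 1/sqrt(num_elements) floor on
    the std that the TF op uses.
    """
    batch = batch.astype(np.float32)
    n = np.prod(batch.shape[1:])                       # pixels * channels per image
    mean = batch.mean(axis=(1, 2, 3), keepdims=True)
    std = batch.std(axis=(1, 2, 3), keepdims=True)
    adjusted_std = np.maximum(std, 1.0 / np.sqrt(n))   # avoid division by ~0
    return (batch - mean) / adjusted_std

# Replaces both sess.run(tf.map_fn(...)) calls:
# training_batch = per_image_standardization(training_batch)
# groundtruth_batch = per_image_standardization(groundtruth_batch)
```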

SugarShine commented 6 years ago

@manumathewthomas Thanks so much for your reply; I have solved the memory leak issue. But when I train the model, the loss is always NaN. Could you give me some advice, or share the new version?

manumathewthomas commented 6 years ago

try reducing the learning rate
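
A minimal sketch of what that can look like in TF 1.x, with a toy loss standing in for the repo's actual generator loss (the names here are illustrative, not the author's). Clipping gradient norms is another common guard against the loss diverging to NaN:

```python
import tensorflow as tf

# Toy stand-in for the generator loss; the repo's own loss tensor would go here.
w = tf.Variable(1.0)
loss = tf.square(w)

learning_rate = 1e-5  # try 10x-100x smaller than the value that produces NaN
optimizer = tf.train.AdamOptimizer(learning_rate)

# Optionally clip gradient norms so a single bad batch cannot blow up the weights.
grads_and_vars = optimizer.compute_gradients(loss)
clipped = [(tf.clip_by_norm(g, 5.0), v) for g, v in grads_and_vars if g is not None]
train_op = optimizer.apply_gradients(clipped)
```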

houguanqun commented 6 years ago

How did you solve this problem? Can you share your fix?

codaibk commented 6 years ago

I also get loss = NaN when training on my dataset, even after reducing the learning rate. Can you give any advice?
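
Besides the learning rate, two things worth checking (a generic sketch, not specific to this repo's loss code) are NaN/Inf values in the input images and log(0) in a standard GAN loss:

```python
import numpy as np

EPS = 1e-8  # small constant to add inside log() terms of an adversarial loss

def check_batch(batch, name="batch"):
    """Sanity check: NaN/Inf pixels or a wildly wrong value range in the
    input images will produce NaN losses regardless of the learning rate."""
    assert np.all(np.isfinite(batch)), f"{name} contains NaN or Inf values"
    print(f"{name}: min={batch.min():.3f} max={batch.max():.3f} mean={batch.mean():.3f}")

# In the loss, prefer log(x + EPS) over log(x), e.g. for a vanilla GAN discriminator:
#   d_loss = -tf.reduce_mean(tf.log(d_real + EPS) + tf.log(1.0 - d_fake + EPS))
```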

wish829 commented 5 years ago

Hi, when I train the model on CPU I run into the same problem described above ("GraphDef cannot be larger than 2GB"). How did you solve it?
Also, how do you replace the dataset with your own? I don't know how to process the data.
Thank you very much. I'm a beginner and there is a lot I don't understand yet; I hope I'm not disturbing you. If you're Chinese, could I add you on WeChat?