Justin-Tan / generative-compression

TensorFlow Implementation of Generative Adversarial Networks for Extreme Learned Image Compression
MIT License
511 stars · 108 forks

16G RAM is not enough? #35

Open Avocado5818 opened 4 years ago

Avocado5818 commented 4 years ago

As the title says: I'm using 16 GB of RAM, and the monitor shows memory usage at 100%. Does anyone have the same problem? How much RAM is enough? Thanks!

Justin-Tan commented 4 years ago

The dataset is pretty large; you may want to subsample the data as an additional preprocessing step and update the config file to the new input size. Try 256 x 512 or 128 x 256 if 512 x 1024 is too large.
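A minimal sketch of that downsampling step, assuming bilinear resizing of 512 x 1024 frames to 256 x 512 (the function name and sizes here are illustrative, not taken from the repository's `data.py`):

```python
import tensorflow as tf

# Target size after subsampling; halve both dimensions as suggested above.
TARGET_H, TARGET_W = 256, 512

def downsample(image):
    """Bilinear-resize an HWC uint8 image and cast back to uint8."""
    resized = tf.image.resize(image, [TARGET_H, TARGET_W])  # returns float32
    return tf.cast(resized, tf.uint8)

# Example with a dummy 512 x 1024 RGB frame.
frame = tf.zeros([512, 1024, 3], dtype=tf.uint8)
small = downsample(frame)
print(small.shape)  # (256, 512, 3)
```

Remember to set the corresponding image-size entries in the config file to the same target dimensions, or the model's input shape will no longer match the data.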

Avocado5818 commented 4 years ago

Hello, thanks for your reply. I ran your code, but the machine's memory gradually fills up until the program is killed. Could it be TensorFlow holding the dataset in memory? When I ported the code to PyTorch the problem went away, but the performance was worse than the original.

wsxtyrdd commented 3 years ago

I ran into the same problem: memory gradually grows until it is exhausted.

wsxtyrdd commented 3 years ago

You can delete the line `dataset = dataset.cache()` in `data.py` to avoid this problem.

RandomCoins commented 3 years ago

Thanks for your answer. It really helps me.

WenBingo commented 11 months ago

> Hello, thanks for your reply. I executed your code, but the computer's memory would gradually increase to full, and finally the program will be terminated. Is it the problem of tensorflow saving the dataset? Because I changed to pytorch, it will not have this problem, but the performance is worse than the original.

Hello, I've been trying to use the code recently, but the current TensorFlow 2.x releases are not compatible with the 1.x code. Could I borrow your PyTorch version of the code? Thanks a lot! My email is 1286310571@qq.com.