Open Avocado5818 opened 4 years ago
The dataset is pretty large, so you may want to subsample the data as an additional preprocessing step and update the config file with the new input size. Try 256 x 512 or 128 x 256 if 512 x 1024 is too large.
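In case it helps, here is a minimal sketch of such a resize step (the function name and the label channel layout are assumptions, not the repo's actual code). Note the label map should use nearest-neighbour interpolation so class ids stay integral, and the config's input size must be changed to match:

```python
import tensorflow as tf

# Target size (height x width) from the suggestion above.
TARGET_H, TARGET_W = 256, 512

def resize_pair(image, label):
    """Resize an (image, label) pair to the reduced input size."""
    image = tf.image.resize(image, (TARGET_H, TARGET_W), method="bilinear")
    # Nearest-neighbour keeps label values as valid class ids.
    label = tf.image.resize(label, (TARGET_H, TARGET_W), method="nearest")
    return image, label
```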
Hello, thanks for your reply. I ran your code, but the machine's memory gradually fills up until the program is terminated. Could this be TensorFlow caching the dataset? When I switched to PyTorch the problem went away, but the performance is worse than the original.
I ran into the same problem: the memory gradually fills up completely.
You can delete the line `dataset = dataset.cache()` in data.py to avoid this problem.
Thanks for your answer. It really helped me.
Hello, I've been trying to use the code recently, but the current TensorFlow 2.x releases are not compatible with the 1.x code. Could I borrow your PyTorch version of the code? Thanks a lot! My email is 1286310571@qq.com
As the title says, I have 16 GB of RAM and the monitor shows 100% memory usage. Does anyone have the same problem? How much RAM is enough? Thanks~