Closed Jiaxin-Wen closed 3 years ago
I've hit an OOM error on a TITAN Xp 12GB GPU while running the original model (256px, non-light mode).
How much GPU memory would be enough to successfully run the code, e.g. to load the 100-epoch pre-trained model?
I did it with 8GB of memory.
Inference is possible with 12GB, but training is not.
Try 128px, non-light mode; it works on 11GB.
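For reference, if the repo's training script exposes resolution and light-mode options, the lower-memory setups suggested above could be launched roughly like this (the `main.py`, `--img_size`, and `--light` names are assumptions based on common UGATIT-style repos, so check the repo's own argument parser):

```shell
# Assumed flag names -- verify against the repo's argparse setup.

# Full model at reduced resolution: halving img_size cuts activation
# memory roughly 4x, reportedly fitting in ~11GB.
python main.py --phase train --img_size 128 --light False

# Light-mode model: smaller generator variant intended for lower-memory GPUs.
python main.py --phase train --img_size 256 --light True
```

For inference only, loading the pre-trained checkpoint with `--phase test` should need considerably less memory than training, since no optimizer state or backward-pass activations are kept.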