Closed · dingyan1478 closed this issue 3 years ago
We have tested our framework on a V100 32GB GPU. You are likely running out of memory during inference because of the large spatial dimensions of some images in your test set (e.g., 1800×1000). Try reducing the image size or doing patch-wise testing. Let me know if you face any other issues.
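The patch-wise testing suggested above can be sketched as follows. This is a minimal, framework-agnostic illustration using NumPy: the image is split into overlapping tiles, each tile is passed through the model separately (so only one tile is resident on the GPU at a time), and the outputs are stitched back together by averaging in the overlap regions. The function names (`infer_patchwise`, `model_fn`) and the patch/overlap values are illustrative assumptions, not part of the repository's API.

```python
import numpy as np

def infer_patchwise(image, model_fn, patch=512, overlap=64):
    """Run model_fn on overlapping tiles of `image` and stitch the
    results, averaging where tiles overlap. This keeps peak memory
    proportional to the patch size rather than the full image size."""
    H, W = image.shape[:2]
    stride = patch - overlap
    out = np.zeros((H, W), dtype=np.float32)
    weight = np.zeros((H, W), dtype=np.float32)
    for top in range(0, H, stride):
        for left in range(0, W, stride):
            bottom = min(top + patch, H)   # clamp tiles at the borders
            right = min(left + patch, W)
            tile = image[top:bottom, left:right]
            pred = model_fn(tile)  # placeholder for the model's forward pass
            out[top:bottom, left:right] += pred
            weight[top:bottom, left:right] += 1.0
    # every pixel is covered at least once, so weight is never zero
    return out / weight

# Example: with an identity "model", stitching reproduces the input.
image = np.random.rand(1000, 1800).astype(np.float32)
result = infer_patchwise(image, model_fn=lambda t: t)
```

In a real setup, `model_fn` would move each tile to the GPU, run the network under `torch.no_grad()` (or the framework's equivalent), and copy the prediction back to the CPU before the next tile is processed.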
Hello! Could you share the GPU memory configuration you used during training? And why does my 11 GB GPU always run out of memory during testing?