Closed Mrhard1999 closed 7 months ago
Hi, did you use the `--tile` option when running the test script? The Uformer_T_RLP_RPIM model indeed requires more than 24 GB of VRAM for inference on 1080p images, so I adopted tiling to avoid the OOM problem. With `--tile`, test.py processes the left and right halves of each image separately so the run fits in the 24 GB VRAM of your 4090.
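The left/right tiling described above can be sketched as follows. This is a minimal illustration, not the repository's actual `--tile` implementation: the `tiled_forward` helper, the `overlap` parameter, and the tensor shapes are assumptions; overlapping the halves slightly gives the model context at the seam before the padded columns are cropped away.

```python
# Hypothetical sketch of left/right tiled inference to avoid CUDA OOM.
# `model` is any image-to-image network taking and returning (N, C, H, W).
import torch

def tiled_forward(model, img, overlap=32):
    """Run `model` on the left and right halves of `img`, then
    stitch the outputs back together along the width dimension."""
    _, _, _, w = img.shape
    mid = w // 2
    # Extend each half by `overlap` columns past the seam so the
    # model sees enough context around the cut.
    left = img[..., : mid + overlap]
    right = img[..., mid - overlap :]
    with torch.no_grad():
        out_left = model(left)[..., :mid]       # crop the extra columns
        out_right = model(right)[..., overlap:]  # crop the extra columns
    return torch.cat([out_left, out_right], dim=-1)
```

Each half only ever occupies roughly half the activation memory of a full-width forward pass, which is what lets a 1080p image fit on a 24 GB card.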
I'm not using your `--tile` parameter. How do I set this parameter to test larger images? Please advise, thank you.
You only need to modify the test.sh file by appending `--tile` to the end of the command. Please check the args used in test.py.
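For example, assuming test.sh invokes test.py roughly like the line below (the checkpoint name and other flags here are assumptions; only the trailing flag matters), appending `--tile` would look like:

```shell
# Hypothetical test.sh: model name and flags are placeholders --
# the point is the trailing --tile added to the existing command.
python test.py --model Uformer_T_RLP_RPIM --tile
```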
So I put `--tile` at the end of test.sh and then run it? Am I right?
Yes.
I used a 4090 GPU to run the test with the trained model Uformer_T_RLP_RPIM.pth, but I got the error `torch.cuda.OutOfMemoryError: CUDA out of memory`. Your paper says a 3090 was used, so why does this happen? Running on smaller images works fine, but your original GTAV 500 test set raises the error. Are there any parameters I need to modify? Looking forward to your reply.