D3lik opened 5 months ago
What error messages are you getting? It seems a similar issue was discussed in another project: https://github.com/ultralytics/ultralytics/issues/1971
Try adjusting the batch size.
Adjusting the batch size doesn't work for me. I think I need to use DataParallel in PyTorch to make it work.
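For reference, this is roughly how DataParallel is typically wrapped around a model in PyTorch. It's a minimal sketch with a toy model standing in for the project's actual one, and note that DataParallel only splits each batch across GPUs while keeping a full copy of the weights on every device, so it may not help if a single replica already exceeds one GPU's memory:

```python
import torch
import torch.nn as nn

# Toy stand-in model; substitute the project's actual model here.
model = nn.Sequential(nn.Linear(1024, 1024), nn.ReLU(), nn.Linear(1024, 10))

if torch.cuda.device_count() > 1:
    # DataParallel replicates the model on every visible GPU and splits each
    # input batch across them. Each GPU still holds a full copy of the
    # weights, so this mainly helps with large batches, not with a model
    # that is too big for a single GPU.
    model = nn.DataParallel(model)
model = model.cuda()

batch = torch.randn(32, 1024).cuda()  # placeholder batch; real shapes will differ
with torch.no_grad():
    out = model(batch)
print(out.shape)
```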
@D3lik I am running on a single RTX 4090, but it gives a CUDA out of memory error. Do I need two GPUs?
No. Please check issue 2 for solutions.
Hi. I have a desktop with 2x Tesla T4s, which should be enough since they have 32 GB of VRAM in total, and other people have reported around 27 GB of VRAM usage when inferring. However, during inference only one GPU is being used, which causes a CUDA out of memory error. What parts of the code should I edit so that it can run on multiple GPUs? Thanks in advance.