Hello. I tried to lower the batch size because my GPU (GTX 750 Ti) is not powerful enough to run this demo, but I cannot find where the batch size value is set. Could you tell me where it is located?
---[Error information]---
RuntimeError: CUDA out of memory. Tried to allocate 20.00 MiB (GPU 0; 2.00 GiB total capacity; 1.15 GiB already allocated; 0 bytes free; 1.20 GiB reserved in total by PyTorch)
The batch size is already 1, the absolute minimum, when you run the demo, so there is no lower value to set. If your GPU doesn't have enough memory, I'm afraid there isn't much that can be changed here.
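That said, a couple of standard PyTorch tricks sometimes free enough memory to squeeze a model onto a 2 GiB card even at batch size 1: running inference under `torch.no_grad()` (so no activations are kept for a backward pass), optionally casting to half precision on CUDA, and falling back to CPU as a last resort. A minimal sketch, using a stand-in `torch.nn.Linear` model since the demo's actual model isn't shown here:

```python
import torch

# Hypothetical stand-in for the demo's model; substitute the real one.
model = torch.nn.Linear(8, 2)

# Pick the GPU if one is available and fall back to CPU otherwise.
device = "cuda" if torch.cuda.is_available() else "cpu"
model = model.to(device).eval()

if device == "cuda":
    # Half precision roughly halves parameter and activation memory.
    model = model.half()

x = torch.randn(1, 8, device=device)  # batch size 1, the minimum
if device == "cuda":
    x = x.half()

# no_grad() skips gradient bookkeeping, which is a large share of the
# memory reported as "reserved by PyTorch" in the OOM message.
with torch.no_grad():
    out = model(x)

print(tuple(out.shape))
```

Whether this is enough depends entirely on the model's size; if the weights alone exceed the card's 2 GiB, only the CPU fallback will work.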