Open arbiasoula opened 2 months ago
It could be that the GPU RAM is full, so first make sure no other processes are consuming GPU memory. You can check this by running `nvidia-smi` in the terminal and looking at the free memory per GPU.
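If you want to check this programmatically, one option is to query `nvidia-smi` in CSV mode and parse the result. This is only a sketch: the query flags are real `nvidia-smi` options, but the helper name and the sample output below are hypothetical.

```python
import subprocess

def parse_free_mib(nvidia_smi_output: str) -> list[int]:
    """Parse the output of:
         nvidia-smi --query-gpu=memory.free --format=csv,noheader,nounits
       into a list of free MiB, one entry per GPU."""
    return [int(line.strip()) for line in nvidia_smi_output.splitlines() if line.strip()]

def query_free_mib() -> list[int]:
    # Requires an NVIDIA driver; raises if nvidia-smi is not on PATH.
    out = subprocess.run(
        ["nvidia-smi", "--query-gpu=memory.free", "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True,
    ).stdout
    return parse_free_mib(out)

sample = "10240\n8120\n"          # hypothetical two-GPU output
print(parse_free_mib(sample))     # [10240, 8120]
```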
If the GPU RAM is free and you are still getting this message, try setting the PyTorch `max_split_size_mb` allocator option to 256. If that still doesn't work, reduce the value by 32 at a time until you reach a working size.
```python
import os

# Must be set before the first CUDA allocation (ideally before importing torch)
# or the allocator will ignore it.
os.environ["PYTORCH_CUDA_ALLOC_CONF"] = "max_split_size_mb:<enter-size-here>"
```
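The step-down search above can be sketched as a small helper. This is just an illustration of the suggested procedure, not a PyTorch API: the function name and the candidate list are my own, and in a real run you would re-launch your job with each candidate rather than loop in-process, since the setting is only read once at CUDA initialization.

```python
import os

def set_max_split_size(size_mb: int) -> str:
    """Build and apply the allocator config string for a given split size (MB)."""
    conf = f"max_split_size_mb:{size_mb}"
    # Takes effect only if set before the first CUDA allocation.
    os.environ["PYTORCH_CUDA_ALLOC_CONF"] = conf
    return conf

# Try 256 first, then step down by 32 on repeated OOM: 256, 224, ..., 32.
candidate_sizes = list(range(256, 31, -32))
print(candidate_sizes)  # [256, 224, 192, 160, 128, 96, 64, 32]
```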
BTW, you can use code blocks and syntax highlighting to make your code and errors more readable in the issue. It is better than a screenshot because it is easily searchable and copyable. See this link: https://github.com/adam-p/markdown-here/wiki/Markdown-Cheatsheet#code-and-syntax-highlighting
Hi, I got this message: ""