UOB-AI / UOB-AI.github.io

A repository to host our documentations website.
https://UOB-AI.github.io

torch.cuda.OutOfMemoryError #56

Open arbiasoula opened 2 months ago

arbiasoula commented 2 months ago

Hi, I got this error message (posted as a screenshot):

[screenshot: torch.cuda.OutOfMemoryError]

asubah commented 2 months ago

It could be that the GPU RAM is full, so you should make sure that no other processes are consuming it. You can check this by running nvidia-smi in the terminal to see the free GPU memory.
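If you want to check the free memory programmatically, one option is to parse nvidia-smi's CSV query output. This is only a sketch; the function names here are mine, not part of any library, and it assumes nvidia-smi is on the PATH:

```python
import subprocess

def parse_free_mb(csv_output):
    # Parse "--format=csv,noheader,nounits" output:
    # one integer (MiB of free memory) per line, one line per GPU.
    return [int(line) for line in csv_output.splitlines() if line.strip()]

def free_gpu_memory_mb():
    # Query free memory per GPU; requires nvidia-smi on the PATH.
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=memory.free", "--format=csv,noheader,nounits"],
        text=True,
    )
    return parse_free_mb(out)
```

If the numbers show most of the memory in use while your job is idle, some other process is holding it.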

If the GPU RAM is free and you are still getting this message, then you can try setting the PyTorch max_split_size_mb option to 256. If that still doesn't work, reduce the value by 32 until you reach a working size.

import os
# Must be set before the first CUDA allocation in the process,
# otherwise the allocator ignores it.
os.environ["PYTORCH_CUDA_ALLOC_CONF"] = "max_split_size_mb:<enter-size-here>"
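The reduce-by-32 search could be sketched as below. This is a hypothetical helper, not a PyTorch API, and note the caveat: the environment variable only takes effect if set before CUDA is first initialized, so in practice you would restart the kernel or process between attempts rather than loop inside one process as this sketch does:

```python
import os

def try_with_split_sizes(run, start_mb=256, step_mb=32, min_mb=32):
    # Retry `run` (your training/inference callable) with decreasing
    # max_split_size_mb values until it stops running out of memory.
    for size in range(start_mb, min_mb - 1, -step_mb):
        os.environ["PYTORCH_CUDA_ALLOC_CONF"] = f"max_split_size_mb:{size}"
        try:
            return run()
        # torch.cuda.OutOfMemoryError subclasses RuntimeError
        except RuntimeError as e:
            if "out of memory" not in str(e).lower():
                raise
    raise RuntimeError(f"Still out of memory at max_split_size_mb:{min_mb}")
```

If even the smallest size fails, the model or batch simply doesn't fit, and you would need a smaller batch size or a larger GPU.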

BTW, you can use code blocks with syntax highlighting to make your code and errors more readable in the issue. That is better than a screenshot because the text is searchable and copyable. See this link: https://github.com/adam-p/markdown-here/wiki/Markdown-Cheatsheet#code-and-syntax-highlighting