Leon-Sander / local_multimodal_ai_chat

GNU General Public License v3.0

torch.cuda.OutOfMemoryError: CUDA out of memory. Tried to allocate 512.00 MiB. GPU 0 has a total capacity of 1.96 GiB of which 225.06 MiB is free. #24

Open · joaquindev23 opened this issue 2 months ago

joaquindev23 commented 2 months ago

How can I solve this error?

[Screenshot: Captura desde 2024-04-18 20-52-49]

Leon-Sander commented 2 months ago

Either use a GPU with more memory, since the message tells you that your GPU does not have enough, or set gpu_layers to 0 in the config file, so the models run on the CPU instead.
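For context, gpu_layers controls how many model layers are offloaded to the GPU; with 0, inference runs entirely on the CPU and no CUDA memory is allocated. Below is a minimal sketch of that effect, assuming the model is loaded through ctransformers (which accepts a gpu_layers argument); the model repo and file names are placeholders, and the actual loader used by this app may differ:

```python
# Minimal sketch, assuming ctransformers is the backend (the repo's actual loader may differ).
# The model repo/file names below are placeholders, not necessarily the ones used by this project.
from ctransformers import AutoModelForCausalLM

llm = AutoModelForCausalLM.from_pretrained(
    "TheBloke/Mistral-7B-Instruct-v0.1-GGUF",        # placeholder GGUF model repo
    model_file="mistral-7b-instruct-v0.1.Q4_K_M.gguf",  # placeholder model file
    model_type="mistral",
    gpu_layers=0,  # 0 = no layers offloaded to the GPU, so the CUDA OOM error cannot occur
)

print(llm("Hello, how are you?"))
```

Running on CPU is slower, but it avoids the out-of-memory error entirely on GPUs with little VRAM (here roughly 2 GiB).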

joaquindev23 commented 2 months ago

Can you tell me which file I have to modify and which lines of code? Thanks.

Leon-Sander commented 2 months ago

As I already wrote, it's the config.yaml file, and if you haven't changed it, then line 11.
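If you want to double-check which value is currently set without counting lines by hand, a small sketch like the one below can help. It only assumes the key is named gpu_layers somewhere in config.yaml and that PyYAML is installed; the exact nesting in your copy may differ:

```python
# Sketch: print every gpu_layers entry found in config.yaml, wherever it is nested.
# The file layout itself is not assumed; the script just searches for the key.
import yaml

def find_key(node, key, path=""):
    """Recursively yield (path, value) pairs for every occurrence of `key`."""
    if isinstance(node, dict):
        for k, v in node.items():
            p = f"{path}.{k}" if path else k
            if k == key:
                yield p, v
            yield from find_key(v, key, p)
    elif isinstance(node, list):
        for i, v in enumerate(node):
            yield from find_key(v, key, f"{path}[{i}]")

with open("config.yaml") as f:
    config = yaml.safe_load(f)

for path, value in find_key(config, "gpu_layers"):
    print(f"{path} = {value}")  # should read 0 if you want CPU-only inference
```

After editing the value to 0, restart the app so the models are reloaded with the new setting.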