lmstudio-ai / .github


LM Studio does not detect the available VRAM #17

Closed khordoo-m closed 5 months ago

khordoo-m commented 5 months ago

Hi, I have a Windows 10 laptop with an NVIDIA RTX A2000 Laptop GPU with about 20 GB of VRAM. However, when I run LM Studio, the available VRAM is displayed as 0, as shown in the picture below. Essentially, LM Studio fails to identify the VRAM available on my laptop.

[screenshot]

yagil commented 5 months ago

A possible workaround. Could you please let me know if it works on your setup?

  1. Open LM Studio and navigate to the Chat page.
  2. Check the GPU Offload checkbox. It does not matter what value you set for "n layers".
  3. Hard-exit LM Studio.
  4. Reopen it.
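As a side note, one way to confirm that the NVIDIA driver itself reports the VRAM correctly is to query it with `nvidia-smi`. This is just a diagnostic sketch, not part of the workaround above; it assumes the NVIDIA driver and its CLI are installed and on the PATH. If the driver reports the expected total but LM Studio still shows 0, the detection issue is on LM Studio's side.

```shell
# Ask the NVIDIA driver how much VRAM it sees on each GPU.
# Guarded so the script still succeeds on machines without the NVIDIA CLI.
if command -v nvidia-smi >/dev/null 2>&1; then
  nvidia-smi --query-gpu=name,memory.total --format=csv
else
  echo "nvidia-smi not found; NVIDIA driver CLI is not installed or not on PATH"
fi
```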
khordoo-m commented 5 months ago

This workaround worked. Thanks!