lmstudio-ai / lmstudio-bug-tracker

Bug tracking for the LM Studio desktop application

Error when loading model, system falsely runs out of ram if GPU acceleration is used #73

Open zipfile6209 opened 3 months ago

zipfile6209 commented 3 months ago

Greetings. I am trying to load a model with GPU acceleration and I get this error:

```json
{
  "cause": "(Exit code: 134). Please check settings and try loading the model again. ",
  "suggestion": "",
  "data": {
    "memory": {
      "ram_capacity": "15.53 GB",
      "ram_unused": "31.53 KB"
    },
    "gpu": {
      "gpu_names": [
        "NVIDIA GeForce GTX 970M"
      ],
      "vram_recommended_capacity": "5.93 GB",
      "vram_unused": "5.88 GB"
    },
    "os": {
      "platform": "linux",
      "version": "6.10.2-arch1-1"
    },
    "app": {
      "version": "0.2.31",
      "downloadsDir": "/home/user/.cache/lm-studio/models/"
    },
    "model": {}
  },
  "title": "Error loading model."
}
```
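A side note on the `Exit code: 134` in the payload above: by Unix convention, exit codes above 128 mean the process was killed by signal `code - 128`, so 134 usually corresponds to SIGABRT (an abort, e.g. from a failed assertion or allocation), not a normal out-of-memory refusal. A minimal sketch of that decoding:

```python
import signal

# Unix convention: exit codes > 128 mean "terminated by signal (code - 128)".
exit_code = 134
sig = exit_code - 128
print(sig, signal.Signals(sig).name)  # 6 SIGABRT
```

That would be consistent with the backend crashing during GPU initialization rather than the system genuinely running out of RAM.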

However, it is false that I run out of RAM; in fact, the model loads fine when I disable GPU acceleration. I have the latest versions of the app, the NVIDIA driver, and CUDA. I should also mention that GPU acceleration works fine in ollama and jan (using the same model).