Open ZeroCool22 opened 1 week ago
+1
I fixed it by updating Ollama (start the Ollama executable, right-click the icon in the notification area, and click 'Restart to update'; or just reinstall Ollama).
I don't use Ollama at all; I don't need it since I use LMS.
I have the same error with LM Studio 0.2.27, so it seems to be an LMS problem.
```json
{
  "title": "Failed to load model",
  "cause": "llama.cpp error: 'error loading model architecture: unknown model architecture: 'deepseek2''",
  "errorData": {
    "n_ctx": 8192,
    "n_batch": 512,
    "n_gpu_layers": 31
  },
  "data": {
    "memory": {
      "ram_capacity": "31.91 GB",
      "ram_unused": "26.74 GB"
    },
    "gpu": {
      "gpu_names": ["NVIDIA GeForce GTX 1080 Ti"],
      "vram_recommended_capacity": "11.00 GB",
      "vram_unused": "9.98 GB"
    },
    "os": {
      "platform": "win32",
      "version": "10.0.19045",
      "supports_avx2": true
    },
    "app": {
      "version": "0.2.24",
      "downloadsDir": "C:\\Users\\ZeroCool22\\.cache\\lm-studio\\models"
    },
    "model": {}
  }
}
```
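For what it's worth, `unknown model architecture: 'deepseek2'` usually means the llama.cpp build bundled with that app version predates DeepSeek-V2 support, so updating the app is the fix. If you want to confirm what architecture a GGUF file actually declares, here is a minimal sketch (my own helper, assuming the standard GGUF v3 header layout, where `general.architecture` is normally the first metadata key and string values have type ID 8):

```python
import struct

# GGUF metadata value type ID for strings (per the GGUF spec)
GGUF_STRING = 8

def read_architecture(path):
    """Return the `general.architecture` value from a GGUF file, or None."""
    with open(path, "rb") as f:
        if f.read(4) != b"GGUF":
            raise ValueError("not a GGUF file")
        version, = struct.unpack("<I", f.read(4))          # header version
        n_tensors, n_kv = struct.unpack("<QQ", f.read(16)) # counts
        for _ in range(n_kv):
            key_len, = struct.unpack("<Q", f.read(8))
            key = f.read(key_len).decode("utf-8")
            vtype, = struct.unpack("<I", f.read(4))
            if vtype != GGUF_STRING:
                # Keep the sketch simple: stop at the first non-string value.
                # general.architecture is normally the very first key anyway.
                return None
            val_len, = struct.unpack("<Q", f.read(8))
            val = f.read(val_len).decode("utf-8")
            if key == "general.architecture":
                return val
    return None
```

If this prints `deepseek2` for your model file, the model is fine and the loader is simply too old to know that architecture.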