Open moas opened 3 days ago
Would a quick fix for this be defining no default model from the get-go? I just tried that and it seemed fine, but I don't have time just yet to look into it myself. With no default model and cookies deleted, it initially auto-selected Anthropic & Sonnet (new), and Ollama loaded my model list OK.
@wonderwhy-er Any thoughts?
Hi all, I also use Ollama with Qwen2.5 Coder, and after changing DEFAULT_MODEL to 'qwen2.5-code' and DEFAULT_PROVIDER to 'Ollama' I no longer get the error. So thanks 🙏
But I still have one issue/impediment: all code appears in the chat instead of in the code side panel. When I try other, non-local LLMs I don't have this issue.
Do you have the same issue?
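For reference, the change I made looks roughly like this (the file location and exact constant names are from my checkout and may differ in yours):

```typescript
// Sketch of the default-model constants, assumed to live in
// app/utils/constants.ts — adjust the path/names to match your checkout.
// Pointing the defaults at a locally available Ollama model avoids
// requesting a model the backend can't serve.
export const DEFAULT_MODEL = 'qwen2.5-code';
export const DEFAULT_PROVIDER = 'Ollama';
```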
This line https://github.com/coleam00/bolt.new-any-llm/blob/main/app/components/chat/Chat.client.tsx#L78-L80 causes a 500 error for me because I don't have that model active. I use Ollama with the Qwen2.5 Coder 14B model.
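A sketch of the kind of fallback discussed above: only use the hard-coded default if the provider actually reports it, otherwise pick the first available model. The `ModelInfo` shape and `resolveModel` name here are illustrative, not the repo's actual API:

```typescript
// Illustrative types — not the project's real interfaces.
interface ModelInfo {
  name: string;
  provider: string;
}

// Return the configured default only if it is in the list of models the
// provider (e.g. Ollama) actually has active; otherwise fall back to the
// first available model, or undefined if none are available.
function resolveModel(
  available: ModelInfo[],
  defaultModel?: string,
): string | undefined {
  if (defaultModel && available.some((m) => m.name === defaultModel)) {
    return defaultModel;
  }
  return available[0]?.name;
}
```

With a check like this, a stale DEFAULT_MODEL would degrade to whatever model is locally active instead of triggering a 500 for a model that isn't loaded.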