coleam00 / bolt.new-any-llm

Prompt, run, edit, and deploy full-stack web applications using any LLM you want!
https://bolt.new
MIT License

Bad default model #331

Open moas opened 3 days ago

moas commented 3 days ago

This line https://github.com/coleam00/bolt.new-any-llm/blob/main/app/components/chat/Chat.client.tsx#L78-L80 causes a 500 error for me, since I don't have that model active. I'm using Ollama with the Qwen2.5 Coder 14B model.
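A hard-coded default model will fail like this whenever the configured provider/model pair isn't actually available. One possible guard is to validate the default against the models the active provider reports before using it. The sketch below is a hypothetical illustration, not the actual `Chat.client.tsx` code; the names `ModelInfo`, `DEFAULT_MODEL`, `DEFAULT_PROVIDER`, and `pickInitialModel` are assumptions based on this thread:

```typescript
// Hypothetical sketch: prefer the configured default only when the
// provider actually lists it; otherwise fall back to whatever model
// is available (e.g. a local Ollama model) instead of sending a
// request the backend will answer with a 500.
// All identifiers here are assumptions, not the real codebase's API.
interface ModelInfo {
  name: string;
  provider: string;
}

const DEFAULT_MODEL = 'claude-3-5-sonnet-latest';
const DEFAULT_PROVIDER = 'Anthropic';

function pickInitialModel(available: ModelInfo[]): ModelInfo | undefined {
  // Use the default only if it is actually present in the model list.
  const preferred = available.find(
    (m) => m.name === DEFAULT_MODEL && m.provider === DEFAULT_PROVIDER,
  );
  // Fall back to the first available model; undefined if the list is empty.
  return preferred ?? available[0];
}
```

With a guard like this, a setup that only exposes an Ollama model would auto-select it rather than erroring on a missing Anthropic default.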

chrismahoney commented 3 days ago

Would a quick fix for this be defining no default model from the get-go? I just tried that and it seemed fine, but I don't have time just yet to look into it myself. With no default model and cookies deleted, it was able to initially auto-select Anthropic & Sonnet (new), and Ollama loaded my model list OK.

@wonderwhy-er Any thoughts?

ludvax commented 2 days ago

Hi all, I also use Ollama with Qwen2.5 Coder, and after changing DEFAULT_MODEL to 'qwen2.5-code' and DEFAULT_PROVIDER to 'Ollama' I no longer get the error. So thanks 🙏
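For anyone else hitting this, the edit described above presumably amounts to something like the following. The constant names mirror this thread; the actual file location and identifiers in bolt.new-any-llm may differ, so treat this as a sketch and check where the defaults live in your checkout:

```typescript
// Hypothetical sketch of the workaround: point the defaults at a
// locally served Ollama model instead of a hosted one.
// Constant names are taken from the comments above, not verified
// against the repository.
export const DEFAULT_MODEL = 'qwen2.5-code';   // model name as written in the comment above
export const DEFAULT_PROVIDER = 'Ollama';
```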

But I still have an issue/impediment: all the generated code appears in the chat rather than in the code side panel. When I try other, non-local LLMs, I don't have this issue.

Do you have the same issue?