coleam00 / bolt.new-any-llm

Prompt, run, edit, and deploy full-stack web applications using any LLM you want!
https://bolt.new
MIT License

Ollama Models not showing up #29

Open openmoto opened 3 hours ago

openmoto commented 3 hours ago

Describe the bug

Ollama models are not showing up (see screenshot).

Link to the Bolt URL that caused the error

http://192.148.0.207:5173/

Steps to reproduce

Environment:

- Ollama runs on a standalone server at http://192.168.0.22:11434 and works with OpenWebUI and other tools.
- bolt.new is installed on a development VM, with OLLAMA_API_BASE_URL set in the .env.local file.
- Start the dev server with pnpm run dev --host.
- Access the web UI from another computer and select Ollama; the Ollama models are not available to select (a quick reachability check is sketched below).
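For reference, a quick way to confirm the dev VM can actually reach that Ollama server is to query its /api/tags endpoint, which lists the locally pulled models. This is only a minimal sketch using the address from my setup (192.168.0.22:11434); run it with Node 18+ from the machine hosting bolt and adjust the URL for your environment.

```ts
// Minimal reachability check for the Ollama server described above.
// The host/port come from this report; substitute your own OLLAMA_API_BASE_URL.
const OLLAMA_API_BASE_URL = "http://192.168.0.22:11434";

async function listOllamaModels(): Promise<void> {
  // /api/tags is Ollama's endpoint for listing locally pulled models.
  const res = await fetch(`${OLLAMA_API_BASE_URL}/api/tags`);
  if (!res.ok) {
    throw new Error(`Ollama responded with ${res.status} ${res.statusText}`);
  }
  const data = (await res.json()) as { models: Array<{ name: string }> };
  console.log(data.models.map((m) => m.name));
}

listOllamaModels().catch((err) => {
  // A network error here usually means the dev VM cannot reach the Ollama host,
  // or Ollama on that machine is bound only to localhost.
  console.error("Could not reach Ollama:", err);
});
```

If this prints the model names but bolt's dropdown is still empty, the problem is likely on the bolt side rather than network connectivity.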

Expected behavior

Select Ollama from the first dropdown; the second dropdown should show all the models I've downloaded with Ollama.

Screen Recording / Screenshot

No response

Platform

Additional context

No response

CesarPetrescu commented 3 hours ago


Hello, I happen to have the same problem. Everything is working locally (I checked that Ollama is running fine with OpenWebUI), yet Ollama's models aren't showing up in my local bolt.new-any-llm instance.

saintman23 commented 1 hour ago

same here

vgcman16 commented 1 hour ago

Do you have Ollama downloaded? And is it running?