openmoto opened 3 hours ago
Hello, I happen to have the same problem. Everything works locally (I checked that Ollama is running fine with Open WebUI), yet Ollama's models aren't showing up in my local bolt.new-any-llm instance.
same here
Do you have Ollama downloaded and running?
Describe the bug
Ollama Models are not showing up
Link to the Bolt URL that caused the error
http://192.148.0.207:5173/
Steps to reproduce
Environment: Ollama on a standalone server at http://192.168.0.22:11434; it works with Open WebUI and other tools. bolt.new is installed on a development VM.

1. Set OLLAMA_API_BASE_URL in the .env.local file.
2. Start the dev server with `pnpm run dev --host`.
3. Access the web UI from another computer and select Ollama; the Ollama models are not available to select.
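For anyone comparing setups, here is a minimal sketch of the relevant `.env.local` entry plus a quick reachability check, assuming the standalone-server address from this report (adjust the host and port to your own setup):

```sh
# .env.local — point bolt.new at the remote Ollama server
# (address taken from this report; replace with your own)
OLLAMA_API_BASE_URL=http://192.168.0.22:11434
```

From the dev VM, you can verify that bolt.new's host can actually reach Ollama and list its models:

```sh
curl http://192.168.0.22:11434/api/tags
```

If this returns a JSON body with a `models` array, the server is reachable from the VM; a connection error or an empty list would explain why the model dropdown stays empty.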
Expected behavior
After selecting Ollama from the first list, the second dropdown should show all the models I've downloaded with Ollama.
Screen Recording / Screenshot
No response
Platform
Additional context
No response