coleam00 / bolt.new-any-llm

Prompt, run, edit, and deploy full-stack web applications using any LLM you want!
https://bolt.new
MIT License
4.81k stars · 1.98k forks

Docker Ubuntu Linux with External Ollama Server Not working #436

Closed · interconnectix closed this 20 hours ago

interconnectix commented 6 days ago

Describe the bug

The current repo isn't working under Docker on Ubuntu Linux 22.04 with an external Ollama server.

The following is how to reproduce it:

git clone https://github.com/coleam00/bolt.new-any-llm.git
cd bolt.new-any-llm
cp .env.example .env.local
nano .env.local    # edit the following line: OLLAMA_API_BASE_URL=http://192.168.50.84:11434
docker build . --target bolt-ai-production
docker-compose --profile production up

(When I do a curl from the Docker host machine to 192.168.50.84:11434, I get 'ollama running'.)

The site will come up, but when I select Ollama in the dropdown, none of the models I have installed are populated.
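For context, the model dropdown is filled from the Ollama server's model list, which Ollama exposes at its /api/tags REST endpoint. Below is a minimal sketch of what a healthy response looks like and how to pull out the model names; the response shape follows Ollama's public API, and the model names are invented for illustration.

```shell
# Sample of what GET /api/tags returns from a healthy Ollama server
# (shape per Ollama's REST API; the model names are made up for this example).
resp='{"models":[{"name":"llama3:latest"},{"name":"qwen2:7b"}]}'

# Extract the "name" fields without needing jq installed.
printf '%s\n' "$resp" | grep -o '"name":"[^"]*"' | sed 's/"name":"\(.*\)"/\1/'
```

Running curl http://192.168.50.84:11434/api/tags from the host should return JSON of this shape; an empty models array or a connection error would explain an empty dropdown.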

I have also tried switching RUNNING_IN_DOCKER in docker-compose.yaml between true and false.
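One way to narrow this down is to test reachability from inside the container rather than from the host. A sketch of the checks, assuming the container name bolt-bolt-ai-1 (mentioned later in this thread) and the server address from this report:

```shell
# Reachability checks, to be run on the Docker host. The container name and
# server address are taken from this thread and may differ on your machine:
#
#   curl http://192.168.50.84:11434                          # "Ollama is running"
#   docker exec bolt-bolt-ai-1 curl -s http://192.168.50.84:11434/api/tags
#
# If the first command succeeds but the second fails, the container itself
# cannot reach the server (Docker network or firewall), which would explain
# the empty model dropdown even though the host-side curl works.
OLLAMA_API_BASE_URL="http://192.168.50.84:11434"
echo "Target Ollama server: $OLLAMA_API_BASE_URL"
```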

Link to the Bolt URL that caused the error

http://192.168.50.38:5173/

Steps to reproduce

Same as the commands listed under "Describe the bug" above.

Expected behavior

A list of the installed Ollama models should be displayed and usable.

Screen Recording / Screenshot

No response

Platform

Additional context

No response

openmoto commented 6 days ago

Same issue here; I have never been able to get an external Ollama server working with any installation method.

OS: Ubuntu Linux with Docker; Ollama on an external Linux server
Browser: Chrome, Brave, Edge

I'm able to log into the container using docker exec -it bolt-bolt-ai-1 bash and check the environment variable for Ollama with printenv | grep OLLAMA_API_BASE_URL. The output shows it is set correctly (OLLAMA_API_BASE_URL=http://172.25.10.11:11434), yet bringing the container up shows it is not set (screenshot attached).
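If printenv shows the variable in an interactive shell but the app does not pick it up, one common cause is the variable being defined in .env.local without being forwarded into the service's environment by Compose. A hedged sketch of what the relevant docker-compose.yaml section might look like; the service name and keys here are assumptions for illustration, not the repo's actual file:

```yaml
services:
  bolt-ai:
    env_file:
      - .env.local                # loads OLLAMA_API_BASE_URL at container start
    environment:
      - RUNNING_IN_DOCKER=true
      - OLLAMA_API_BASE_URL=${OLLAMA_API_BASE_URL}   # explicit pass-through
```

It may be worth checking that the production profile's service actually references .env.local (or the variable itself) in one of these two ways.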

visvapravin commented 6 days ago

I'm having the same issue too.

dustinwloring1988 commented 20 hours ago

Please try the latest version.
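For anyone landing here, updating and rebuilding would look roughly like the reporter's original steps. This is a sketch only; the branch name "main" is an assumption:

```shell
# Update to the latest version and rebuild; run these on the Docker host.
# (Mirrors the build commands from the original report; "main" is assumed.)
#
#   cd bolt.new-any-llm
#   git pull origin main
#   docker build . --target bolt-ai-production
#   docker-compose --profile production up
#
REPO_DIR="bolt.new-any-llm"
echo "Rebuild $REPO_DIR after pulling the latest version"
```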