Closed: 5a9awneh closed this issue 1 week ago
You need to expose Ollama to your network and then use the private IP of the computer plus the port as the API URL. Feel free to re-open this issue or join our Discord server if you face any issues after performing the steps.
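By default, Ollama binds to 127.0.0.1, so it is unreachable from other machines and from inside containers. A minimal sketch of exposing it on a Linux host running the systemd service (the `OLLAMA_HOST` variable and `ollama` service name match a default install; `192.168.1.50` stands in for your host's private IP):

```sh
# Make Ollama listen on all interfaces instead of only loopback
sudo systemctl edit ollama.service
# In the override editor that opens, add:
#   [Service]
#   Environment="OLLAMA_HOST=0.0.0.0"

sudo systemctl daemon-reload
sudo systemctl restart ollama

# Verify from another machine or a container; /api/tags lists installed models
curl http://192.168.1.50:11434/api/tags
```

On macOS and Windows the same `OLLAMA_HOST=0.0.0.0` variable applies, set in the environment the Ollama app starts from.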
Describe the bug
The interface is slow to load, and when it does load, I receive an "Invalid connection" message. In Settings (also slow to load) it doesn't detect the installed Ollama models.

To Reproduce
Steps to reproduce the behavior:
1. In `config.toml`, set the Ollama API URL. I tried each of the following (see the check after this section):
   - `http://host.docker.internal:11434`
   - `http://localhost:11434`
   - `http://private_ip_of_computer_hosting_ollama:11434`
2. Run `docker compose up -d`
3. Go to `http://localhost:3000` to see the error

Expected behavior
It should be able to connect to Ollama and detect the installed models.
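For context on why the three URLs behave differently: inside the container, `localhost` is the container itself, not the machine running Ollama; `host.docker.internal` resolves to the host only on Docker Desktop (on Linux it needs an `extra_hosts` entry); and the private IP works once Ollama is exposed to the network. A quick way to check, assuming the UI service is named `webui` in the compose file and the image ships `curl` (both are assumptions here):

```sh
# localhost inside the container is the container itself, so this usually fails
docker compose exec webui curl -s http://localhost:11434/api/tags

# Works on Docker Desktop; on Linux add to the compose service first:
#   extra_hosts:
#     - "host.docker.internal:host-gateway"
docker compose exec webui curl -s http://host.docker.internal:11434/api/tags

# Should succeed once Ollama listens on 0.0.0.0 (IP is a placeholder)
docker compose exec webui curl -s http://192.168.1.50:11434/api/tags
```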