wandb / openui

OpenUI lets you describe UI using your imagination, then see it rendered live.
https://openui.fly.dev
Apache License 2.0

Ollama on another server (remote) #76

Closed simplesisu closed 4 months ago

simplesisu commented 5 months ago

Thanks for this amazing application, really good.

I have already an Ollama server on another pc which I connect to remotely using WireGuard (remote access to LAN).

Is it possible to connect to that server (http://xxx.xxx.xxx.xxx:11434) instead, rather than re-downloading Ollama on the current PC?

I've set up OpenUI as shown below, not via Docker:

```shell
git clone https://github.com/wandb/openui
cd openui/backend
# You probably want to do this from a virtual environment
pip install .
# This must be set to use OpenAI models, find your api key here: https://platform.openai.com/api-keys
export OPENAI_API_KEY=xxx
python -m openui
```

_**Should I input the IP address of the Ollama server in the `OPENAI_API_KEY` line?**_
vanpelt commented 4 months ago

Nice, you can set the OLLAMA_HOST environment variable like I do in the docker-compose file.
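Based on that answer, a minimal sketch of pointing OpenUI at a remote Ollama instance via the `OLLAMA_HOST` environment variable. The IP address below is a placeholder; substitute the WireGuard address of your Ollama machine:

```shell
# Point OpenUI at the remote Ollama server instead of a local install.
# 192.168.1.10 is a placeholder for your WireGuard peer's address.
export OLLAMA_HOST=http://192.168.1.10:11434

# Then launch the backend as before:
# python -m openui
echo "$OLLAMA_HOST"
```

No OpenAI key is needed for Ollama-only use unless you also want OpenAI models; `OPENAI_API_KEY` and `OLLAMA_HOST` are independent settings.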