Thanks for this amazing application, it's really good.

I already have an Ollama server on another PC, which I connect to remotely over WireGuard (remote access to my LAN). Is it possible to point OpenUI at that server (http://xxx.xxx.xxx.xxx:11434) instead of re-downloading Ollama on the current PC?

I've set up OpenUI as below (not Docker):

```
git clone https://github.com/wandb/openui
cd openui/backend
# You probably want to do this from a virtual environment
pip install .
# This must be set to use OpenAI models, find your api key here: https://platform.openai.com/api-keys
export OPENAI_API_KEY=xxx
python -m openui
```

_**Should I input the IP address of the Ollama server here, at the `export` step?**_
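For reference, a minimal sketch of what I have tried, assuming OpenUI's Ollama client honors the standard `OLLAMA_HOST` environment variable (an assumption on my part; the placeholder IP is my remote server's WireGuard address):

```shell
# Assumption: the Ollama client library used by OpenUI reads OLLAMA_HOST,
# so no local Ollama install should be needed.
export OLLAMA_HOST=http://xxx.xxx.xxx.xxx:11434
export OPENAI_API_KEY=xxx   # still required for OpenAI models
python -m openui
```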