wandb / openui

OpenUI lets you describe UI using your imagination, then see it rendered live.
https://openui.fly.dev
Apache License 2.0

How can I use Ollama models in OpenUI #87

Closed by zhjygit 3 months ago

zhjygit commented 6 months ago

My OpenUI runs in an Ubuntu 18 VMware Workstation VM at 192.168.1.169, while my Ollama and its models are on the physical host at 192.168.1.103. How can I use the Ollama models from the OpenUI instance in the VM?

zhjygit commented 6 months ago

Finally, I installed both OpenUI and Ollama on the physical host at 192.168.1.103. Ollama is running fine on port 11434 and I have pulled the llama3 and llava models; I don't use Docker anywhere in the process. I can run OpenUI itself, but I have no OpenAI API key and I don't know what to pass as OPENAI_API_KEY. When using Ollama, what is the OPENAI_API_KEY supposed to be? Does OpenUI support local models without an OPENAI_API_KEY?

I also have oneapi and open-webui on the VMware host at 192.168.1.169; maybe I can get an API key from oneapi, but how do I point OpenUI at the oneapi host 192.168.1.169?

vanpelt commented 6 months ago

No need for an API key. Just set OLLAMA_HOST and choose a model from the settings pane.
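
For the cross-host setup described above, a minimal sketch might look like this (assuming Ollama on the physical host is configured to listen on the LAN rather than only on 127.0.0.1):

```sh
# On the physical host (192.168.1.103): make Ollama accept LAN connections.
# By default Ollama binds to 127.0.0.1 only.
OLLAMA_HOST=0.0.0.0 ollama serve

# On the machine running OpenUI: point it at the remote Ollama instance.
# OpenUI reuses the same OLLAMA_HOST variable name, but here it is a
# client-side URL rather than a server bind address.
export OLLAMA_HOST=http://192.168.1.103:11434
python -m openui
```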

Some1OnLine commented 6 months ago

I do not have an OpenAI API key but do have my own Ollama instance. If I remove the OPENAI_API_KEY var and set the OLLAMA_HOST var to my Ollama URL, the container fails to start, complaining that the openai_api_key option is not set.

zhjygit commented 6 months ago

> No need for an API key. Just set OLLAMA_HOST and choose a model from the settings pane.

No no no, I don't use the OpenUI Docker image; I run OpenUI locally. If I do not set OPENAI_API_KEY, python -m openui fails with an error like: openai.OpenAIError: The api_key client option must be set either by passing api_key to the client or by setting the OPENAI_API_KEY environment variable.

ethanmorton commented 5 months ago

I don't know if you have solved your problem already, but it seems similar to this issue. The solution worked for me.

sokoow commented 5 months ago

So, if I unset OPENAI_API_KEY, I get: openai.OpenAIError: The api_key client option must be set either by passing api_key to the client or by setting the OPENAI_API_KEY environment variable

After setting OLLAMA_HOST to my localhost, I get a list of models from Ollama and can choose one, but then I get lots of errors and a 500. What is the correct way of running Ollama here?

vanpelt commented 5 months ago

@sokoow you can just set OPENAI_API_KEY to something like xxx if you don't want to use that API. If you're seeing Ollama models in the list, the application is able to list them. What are the errors you're getting when attempting to use one of the models? You should see a stack trace in the terminal where you ran the server.
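
Concretely, a sketch of the non-Docker local setup (the xxx value is just a placeholder; it never reaches OpenAI as long as you only select Ollama models):

```sh
# The OpenAI client refuses to start without some key set; a non-empty
# placeholder satisfies it when only Ollama models are used.
export OPENAI_API_KEY=xxx
export OLLAMA_HOST=http://127.0.0.1:11434
python -m openui
```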

sokoow commented 5 months ago

Indeed, I had set OPENAI_API_KEY to an empty string and got a bunch of errors; after setting it to something else, everything works fine. Thanks for the reply @vanpelt.

ghost commented 3 months ago

I am not getting any models to select after launching via Docker with docker run --rm --name openui -p 7878:7878 -e OLLAMA_HOST=http://localhost:11434 ghcr.io/wandb/openui. Has anyone run into this as well?

P.S. Running Ollama locally with 2 different models downloaded.

vanpelt commented 3 months ago

Hey Paul, if you're running Ollama on localhost you'll likely need to set OLLAMA_HOST=http://host.docker.internal:11434, because Docker is running from within a VM that has a different localhost (unless you're running on Linux).
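
Adapting the command from your comment, something like this should work (a sketch; the --add-host line is only needed on Linux, where host.docker.internal is not defined by default):

```sh
docker run --rm --name openui -p 7878:7878 \
  --add-host=host.docker.internal:host-gateway \
  -e OLLAMA_HOST=http://host.docker.internal:11434 \
  ghcr.io/wandb/openui
```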

ghost commented 3 months ago

> Hey Paul, if you're running Ollama on localhost you'll likely need to set OLLAMA_HOST=http://host.docker.internal:11434, because Docker is running from within a VM that has a different localhost (unless you're running on Linux).

Thanks for the tip! Unfortunately, it doesn't work either way... I still can't select any model. Any other suggestions?

Update: I removed the environment variable OPENAI_API_KEY=xxx and now it works. Thanks for the help :)

vanpelt commented 3 months ago

Nice! Glad that worked for you.