wandb / openui

OpenUI lets you describe UI using your imagination, then see it rendered live.
https://openui.fly.dev
Apache License 2.0

how can i use ollama models in openui #87

Open zhjygit opened 1 month ago

zhjygit commented 1 month ago

My OpenUI runs on Ubuntu 18 in VMware Workstation at 192.168.1.169; my Ollama and its models are on the physical host at 192.168.1.103. How can I use the Ollama models from the OpenUI instance inside the VMware guest?

zhjygit commented 1 month ago

Finally, I installed OpenUI and Ollama on the physical host (192.168.1.103). Ollama is running fine on port 11434, and I have pulled the llama3 and llava models. I don't use Docker anywhere in the process. As shown in the attached screenshot, I can run OpenUI, but I have no OpenAI API key and don't know what to pass as OPENAI_API_KEY. Since I'm using Ollama, where would an OPENAI_API_KEY come from? Does OpenUI support local models without an OPENAI_API_KEY?

I also have oneapi (or open-webui) on the VMware machine at 192.168.1.169. Maybe I can use an API key from oneapi, but how do I point OpenUI at the oneapi host at 192.168.1.169?

vanpelt commented 1 month ago

No need for an API key. Just set OLLAMA_HOST and choose a model from the settings pane.
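For reference, a minimal sketch of what that looks like when running the backend directly (the Ollama address below is the physical host from this thread; adjust to your setup):

```bash
# Point OpenUI at the Ollama server, then start the backend.
# If the OpenAI client complains about a missing key, see the workaround
# further down the thread (set OPENAI_API_KEY to a dummy value).
export OLLAMA_HOST=http://192.168.1.103:11434
python -m openui
```

Then pick one of the pulled models (e.g. llama3) from the settings pane in the UI.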

Some1OnLine commented 1 month ago

I do not have an OpenAI API key, but I do have my own Ollama instance. If I remove the OPENAI_API_KEY var and set the OLLAMA_HOST var to my Ollama URL, the container fails to start, complaining that the openai_api_key var is not set (or something to that effect).
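For the Docker case, a rough sketch of an invocation that avoids that startup failure; the image name ghcr.io/wandb/openui and port 7878 are assumptions here, not confirmed in this thread. The idea is to keep OPENAI_API_KEY set to a throwaway value and point OLLAMA_HOST at the host's Ollama:

```bash
# Sketch only; image name and port are assumptions.
# The dummy OPENAI_API_KEY lets the OpenAI client initialize;
# host.docker.internal reaches Ollama running on the Docker host
# (on Linux, also pass --add-host=host.docker.internal:host-gateway).
docker run --rm -p 7878:7878 \
  -e OPENAI_API_KEY=xxx \
  -e OLLAMA_HOST=http://host.docker.internal:11434 \
  ghcr.io/wandb/openui
```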

zhjygit commented 1 month ago

> No need for an API key. Just set OLLAMA_HOST and choose a model from the settings pane.

No, I don't use the OpenUI Docker image; I just run OpenUI locally. If I don't set openai_api_key, python -m openui fails with an error like: openai.OpenAIError: The api_key client option must be set either by passing api_key to the client or by setting the OPENAI_API_KEY environment variable.

ethanmorton commented 1 month ago

I don't know if you have solved your problem already, but it seems similar to this issue. The solution worked for me.

sokoow commented 3 weeks ago

So, if I unset OPENAI_API_KEY, then I get: openai.OpenAIError: The api_key client option must be set either by passing api_key to the client or by setting the OPENAI_API_KEY environment variable

After setting OLLAMA_HOST to my localhost, I get a list of models from Ollama and can choose one, but then I get lots of errors and a 500. What is the correct way of running Ollama here?

vanpelt commented 3 weeks ago

@sokoow you can just set OPENAI_API_KEY to something like xxx if you don't want to use that API. If you're seeing Ollama models in the list, the application is able to list them. What errors are you getting when attempting to use one of the models? You should see a stacktrace in the terminal where you ran the server.
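Concretely, for the non-Docker case that could look like the sketch below; the key just has to be non-empty (as the next comment notes, an empty string still triggers errors):

```bash
# A non-empty dummy key satisfies the OpenAI client's startup check;
# OLLAMA_HOST tells OpenUI where to find the local Ollama models.
export OPENAI_API_KEY=xxx
export OLLAMA_HOST=http://127.0.0.1:11434
python -m openui
```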

sokoow commented 3 weeks ago

Indeed, when I set OPENAI_API_KEY to an empty string I got a bunch of errors; after setting it to something else, everything works fine. Thanks for the reply @vanpelt