karthink / gptel

A simple LLM client for Emacs
GNU General Public License v3.0

Is it possible to interact with LLMs provided by an Open WebUI? #306

Open JonatanSahar opened 1 month ago

JonatanSahar commented 1 month ago

The models are not stored locally either, although I suspect that's not a problem. Open WebUI: https://github.com/open-webui/open-webui Thanks for all your work!

karthink commented 1 month ago

I don't understand what this software does. The installation instructions explain how to access Ollama, which gptel already does. Can you give me some more context?

JonatanSahar commented 4 weeks ago

As far as I understand, it's a system for deploying models. In my case it's running on a remote server, and I want to query that remote Ollama. I'm not sure how it handles HTTP requests.

karthink commented 4 days ago

@JonatanSahar Sorry, I remain confused about what exactly accessing Open WebUI means. If it involves running Ollama on a remote server, you can use gptel-make-ollama as explained in the README to connect to it. You will need to set :protocol to "https".
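A minimal sketch of such a backend definition, assuming the server is reachable at a hypothetical my-server.example.com and serves a model named mistral:latest (both placeholders, not from this thread):

```elisp
;; Remote Ollama backend over HTTPS -- host and model names are
;; placeholders; adjust them to match your server's setup.
(gptel-make-ollama "Ollama-remote"
  :host "my-server.example.com"   ; where the reverse proxy listens
  :protocol "https"               ; default is "http"
  :stream t
  :models '(mistral:latest))
```

Registering the backend this way makes it selectable from gptel's menu; the exact :models list must match what the remote Ollama instance actually serves.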

On the remote server you'll need a reverse proxy or some other mechanism for Ollama to receive requests from the outside world, the usual.
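One common way to do that (a sketch, not something prescribed in this thread) is an nginx reverse proxy in front of Ollama's default port 11434; the server name and certificate paths below are placeholders:

```nginx
# Hypothetical nginx reverse proxy for a remote Ollama instance.
server {
    listen 443 ssl;
    server_name my-server.example.com;          # placeholder

    ssl_certificate     /etc/ssl/certs/my-server.pem;    # placeholder
    ssl_certificate_key /etc/ssl/private/my-server.key;  # placeholder

    location / {
        proxy_pass http://127.0.0.1:11434;      # Ollama's default port
        # Ollama may reject unfamiliar Host headers; presenting the
        # local one avoids that (or configure OLLAMA_ORIGINS instead).
        proxy_set_header Host localhost:11434;
    }
}
```

With this in place, gptel's :host points at the proxy and :protocol "https", and the proxy forwards requests to the locally bound Ollama process.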