Closed: TheOneValen closed this issue 6 months ago.
Couldn't agree more!
You can use One-API as an all-in-one redirect proxy to many LLM services, including Ollama. Just change the "option" note to point at the compatible URL and use the One-API key instead. Pick whichever model you prefer as well.
Ollama now has an OpenAI-compatible API endpoint; I will try that when I get the time. Is this addon still actively developed?
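For reference, hitting Ollama's OpenAI-compatible endpoint should look roughly like this (an untested sketch; it assumes Ollama is running locally on its default port 11434 and that you have already pulled a model, with "llama3" here just a placeholder name):

```typescript
// Sketch: call Ollama's OpenAI-compatible chat endpoint directly.
// Assumes Ollama is listening on the default port 11434 and that the
// model named below has already been pulled ("llama3" is a placeholder).
async function chatWithOllama(prompt: string): Promise<string> {
  const response = await fetch("http://localhost:11434/v1/chat/completions", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      model: "llama3", // replace with a model you have pulled
      messages: [{ role: "user", content: prompt }],
    }),
  });
  const data = await response.json();
  return data.choices[0].message.content;
}

chatWithOllama("Say hello").then(console.log);
```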
I can still use it today, but it's just a simple script, nothing fancy. I'm able to use OneAPI as a bridge to reach endpoints from various API providers.
@Oaklight would you mind showing how you're using OneAPI to forward traffic to Ollama? I would love to include it in the README as well, and it sounds super cool.
Completed in #17 and now available for download.
Ollama is officially supported by OneAPI, and the OneAPI service itself is OpenAI compatible. You should see a "channel/渠道" page; Ollama appears in the dropdown menu when you add a new channel. A screenshot of my self-hosted instance is attached; the system language can be set to Chinese or English in the Docker Hub image.
After you set up the Ollama channel on OneAPI, get yourself an API key on its key/token (令牌) management page, then copy the API key and the API URL. The URL can be in "ip:port" form or a domain name.
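To sanity-check the key and URL before touching Trilium, you can call the OneAPI relay directly, since it speaks the OpenAI format. A rough sketch (the host, port, key, and model name below are placeholders for your own values):

```typescript
// Sketch: verify a OneAPI key and base URL with a single chat completion call.
// "http://192.168.1.10:3000" and "sk-xxxx" are placeholders; substitute your
// own OneAPI address and the key created on the token management page.
const ONEAPI_BASE = "http://192.168.1.10:3000/v1";
const ONEAPI_KEY = "sk-xxxx";

async function testOneApi(): Promise<void> {
  const res = await fetch(`${ONEAPI_BASE}/chat/completions`, {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${ONEAPI_KEY}`,
    },
    body: JSON.stringify({
      model: "llama3", // must match a model available through your Ollama channel
      messages: [{ role: "user", content: "ping" }],
    }),
  });
  console.log(res.status, await res.json());
}

testOneApi();
```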
Then edit the Trilium code note of Trilium-LLM: enter the service URL and the OneAPI key, and change the model name. Voilà!
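Roughly, the three values you end up editing look like this (the field names below are illustrative only, not the addon's actual option names; check the code note in your copy of Trilium-LLM for the real ones):

```typescript
// Illustrative only: the actual option names in the Trilium-LLM code note may
// differ. The point is which three values get swapped in for the OneAPI setup.
const llmOptions = {
  apiUrl: "http://192.168.1.10:3000/v1", // your OneAPI address, OpenAI-style path
  apiKey: "sk-xxxx",                     // the key from OneAPI's token page
  model: "llama3",                       // the model name served by the Ollama channel
};
```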
I would like to suggest supporting https://github.com/jmorganca/ollama for a fully self-hosted experience.