soulsands / trilium-chat

Chat plugin highly integrated with Trilium
https://soulsands.github.io/trilium-chat/
GNU Affero General Public License v3.0

Ollama Support #13

Closed · TheOneValen closed this 6 months ago

TheOneValen commented 11 months ago

I would like to suggest supporting https://github.com/jmorganca/ollama for a fully self-hosted experience.

supermayx commented 9 months ago

Can't agree more!

Oaklight commented 8 months ago

You can use One-API as an all-in-one proxy that redirects to many LLM services, including Ollama. Just change the "option" note to point at the compatible URL, use the One-API key instead, and pick whichever model you prefer.

TheOneValen commented 8 months ago

Ollama now has an OpenAI-compatible API endpoint; I will try that when I get the time. Is this addon still actively developed?
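
For reference, here is a minimal sketch of a call against Ollama's OpenAI-compatible endpoint (served on its default port 11434). The model name `llama3` and the greeting are just placeholders; use whatever model you have pulled locally.

```ts
// Quick check of Ollama's OpenAI-compatible chat endpoint (default port 11434).
const response = await fetch("http://localhost:11434/v1/chat/completions", {
  method: "POST",
  headers: {
    "Content-Type": "application/json",
    // Ollama does not validate the key, but OpenAI-style clients expect one.
    "Authorization": "Bearer ollama",
  },
  body: JSON.stringify({
    model: "llama3", // placeholder: any model you have pulled locally
    messages: [{ role: "user", content: "Hello from Trilium!" }],
  }),
});

const data = await response.json();
console.log(data.choices[0].message.content);
```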

Oaklight commented 8 months ago

I can still use it today, but it's just a script, nothing fancy. I'm able to use OneAPI as a bridge to reach different endpoints from various API providers.

perfectra1n commented 6 months ago

@Oaklight would you mind showing how you're using OneAPI to forward traffic to Ollama? I would love to include it in the README as well, and it sounds super cool.

perfectra1n commented 6 months ago

Completed in #17 and now available for download.

Oaklight commented 6 months ago

> @Oaklight would you mind showing how you're using OneAPI to forward traffic to Ollama? I would love to include it in the README as well, and it sounds super cool.

Ollama is officially supported by OneAPI, and the OneAPI service itself is OpenAI-compatible. You should see a "channel" (渠道) page; Ollama appears in the dropdown menu when you add a new channel. Here is a screenshot of my self-hosted instance (the Docker image on Docker Hub offers both Chinese and English as system languages): [screenshot of the OneAPI channel page]

After you set up the Ollama channel in OneAPI, create an API key on its token (令牌) management page, then copy the API key and the API URL. The URL can be in "ip:port" form or a domain name.
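
A quick way to sanity-check the key and URL before wiring them into the plugin is to hit OneAPI's OpenAI-compatible chat route directly. This is just a sketch: the URL, key, and model name below are placeholders for your own values.

```ts
// Verify the OneAPI key and URL against its OpenAI-compatible route.
const ONE_API_URL = "http://192.168.1.10:3000"; // placeholder: your OneAPI address or domain
const ONE_API_KEY = "sk-...";                   // placeholder: key from the token (令牌) page

const res = await fetch(`${ONE_API_URL}/v1/chat/completions`, {
  method: "POST",
  headers: {
    "Content-Type": "application/json",
    "Authorization": `Bearer ${ONE_API_KEY}`,
  },
  body: JSON.stringify({
    model: "llama3", // placeholder: a model exposed by your Ollama channel
    messages: [{ role: "user", content: "ping" }],
  }),
});

console.log(res.status, await res.json());
```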

Then edit the Trilium code page of Trilium-LLM: enter the service URL and the OneAPI API key, and change the model name. Voilà!
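
To make the last step concrete, here is a hypothetical sketch of the values involved. The key names below are illustrative only; the actual structure is defined by the plugin's own options note, so adapt them to what you see there.

```ts
// Hypothetical shape of the plugin configuration; the real option keys
// come from the plugin's options note, so adjust the names accordingly.
const chatOptions = {
  // OneAPI base URL plus the OpenAI-compatible chat path
  requestUrl: "http://192.168.1.10:3000/v1/chat/completions",
  apiKey: "sk-...", // the OneAPI key copied from the token page
  model: "llama3",  // must match a model available through the Ollama channel
};
```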