zhaobenny / bz-cogs

Interesting cogs for Red Bot
MIT License

ollama support (Local LLMs) #35

Closed: CorneliusCornbread closed this issue 7 months ago

CorneliusCornbread commented 7 months ago

Correct me if I'm wrong, but it seems the cog doesn't support locally hosted LLMs, most notably something like Ollama, or possibly something like h2ogpt.

It'd be really nice to avoid having to pay for usage of this cog by hosting locally instead.

zhaobenny commented 7 months ago

The aiuser cog does support setting an alternative OpenAI API server (as long as it is compatible with the more recent OpenAI client packages).
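
To illustrate what "compatible with the more recent OpenAI client packages" means, here is a minimal sketch: the `openai` package (v1+) accepts a `base_url`, so any server speaking the OpenAI chat-completions protocol can be substituted. The URL, key, and model name below are placeholders, not the cog's actual configuration.

```python
# Minimal sketch, assuming a local server that implements the
# OpenAI chat-completions API. URL and model name are illustrative.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:8000/v1",  # hypothetical local endpoint
    api_key="not-needed-locally",         # many local servers ignore the key
)

response = client.chat.completions.create(
    model="local-model",  # placeholder model name
    messages=[{"role": "user", "content": "Hello!"}],
)
print(response.choices[0].message.content)
```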

h2ogpt seems to provide an OpenAI-compatible proxy, so it might work out of the box, but Ollama doesn't, and probably needs something like litellm.
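
For the litellm route, a hedged sketch: run a litellm proxy in front of Ollama, then point the OpenAI client at the proxy exactly as in the sketch above. The model name, port, and startup command here are assumptions that depend on your litellm and Ollama versions.

```python
# Assumes a litellm proxy was started separately, e.g.:
#   litellm --model ollama/llama2
# (exact flags and default port vary by litellm version)
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:4000",  # litellm proxy address (illustrative)
    api_key="anything",                # litellm ignores the key by default
)

reply = client.chat.completions.create(
    model="ollama/llama2",  # model name as registered with litellm (assumption)
    messages=[{"role": "user", "content": "ping"}],
)
print(reply.choices[0].message.content)
```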