ai-shifu / ChatALL

Concurrently chat with ChatGPT, Bing Chat, Bard, Alpaca, Vicuna, Claude, ChatGLM, MOSS, 讯飞星火, 文心一言 and more, discover the best answers
https://chatall.ai
Apache License 2.0
15.24k stars · 1.64k forks

[FEAT] Ollama support #747

Open rnbwdsh opened 8 months ago

rnbwdsh commented 8 months ago

Is your feature request related to a problem? / 你想要的功能和什么问题相关?

I have LLMs hosted on my computer with Ollama, because it's basically "the docker for LLMs": installing a new model is just one command. I want to see how they compare to "professional" LLMs and to each other.

In terms of coding: add an Ollama.js in src/bots. The "hard part" is that Ollama serves a dynamic set of models rather than a fixed list.

Describe the solution you'd like. / 你想要的解决方案是什么?

A tab in the settings for Ollama where I can set the host (default: localhost) and a button to refresh the list of models to use (https://github.com/ollama/ollama/blob/main/docs/api.md#list-local-models).
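The "refresh model list" part maps directly onto the linked endpoint. A minimal sketch, assuming Ollama's documented REST API (`GET {host}/api/tags` returning `{ "models": [{ "name": ... }, ...] }`); `listOllamaModels` and `extractModelNames` are hypothetical names, not existing ChatALL code:

```javascript
const DEFAULT_HOST = "http://localhost:11434"; // Ollama's default port

// Pull the bot-usable model names out of the /api/tags payload.
function extractModelNames(tagsResponse) {
  return (tagsResponse.models || []).map((m) => m.name);
}

// Query a (possibly user-configured) host for its installed models.
async function listOllamaModels(host = DEFAULT_HOST) {
  const res = await fetch(`${host}/api/tags`);
  if (!res.ok) throw new Error(`Ollama responded with HTTP ${res.status}`);
  return extractModelNames(await res.json());
}
```

The settings tab's refresh button would call `listOllamaModels()` and use the returned names to instantiate one bot per model.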

Bonus features: an option to pull a new model. You could open https://ollama.com/library in an iframe, hook the textbox that contains `ollama run X`, and send the corresponding pull request to the API. An option to uninstall models would also be nice.
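For the pull part of the bonus feature, a hedged sketch assuming the `/api/pull` endpoint from Ollama's API docs (`buildPullRequest` is a hypothetical helper; the exact request field name should be checked against the current docs):

```javascript
// Build a fetch()-ready request for pulling a model via Ollama's API.
function buildPullRequest(modelName, host = "http://localhost:11434") {
  return {
    url: `${host}/api/pull`,
    options: {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      // "name" is the model field as documented; Ollama streams JSON
      // status lines back until the download completes.
      body: JSON.stringify({ name: modelName }),
    },
  };
}
```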

Describe alternatives you've considered. / 你考虑过的其他方案是什么?

Multiple other web UIs, but nothing fits the ChatAll use case.

Additional context / 其他信息

I found this API doc online, but I'm not sure it's fully up to date:

https://editor.swagger.io/?url=https://raw.githubusercontent.com/marscod/ollama/main/api/ollama_api_specification.json

They also claim to support the OpenAI API calling spec, so you should be able to reuse all of that code.

https://github.com/ollama/ollama/blob/main/docs/openai.md
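That compatibility layer means existing OpenAI-style request code could be pointed at Ollama with little more than a different base URL. A sketch against the `/v1/chat/completions` endpoint described in docs/openai.md (`buildChatRequest` is a hypothetical helper, not ChatALL's actual code):

```javascript
// Build a fetch()-ready chat request for Ollama's OpenAI-compatible endpoint.
function buildChatRequest(model, userText, host = "http://localhost:11434") {
  return {
    url: `${host}/v1/chat/completions`,
    options: {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({
        model, // e.g. a name returned by /api/tags, such as "llama3:8b"
        messages: [{ role: "user", content: userText }],
        stream: false,
      }),
    },
  };
}
```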

richardstevenhack commented 3 months ago

I would second this request. As the OP says, Ollama is the "docker for LLMs", primarily for local LLMs. Several GUI front-ends can already access models downloaded with Ollama through its endpoint API, and most of the rest should follow. In particular, AnythingLLM and MSTY work well with Ollama models.

Also, given that ChatALL's GUI is designed for chatting with multiple models, a look at MSTY's extensive chat branching and similar features would enhance ChatALL's capabilities considerably. I just downloaded ChatALL and am looking forward to testing it out.