yinan-c / RSSbrew

Self-hosted, easy-to-deploy RSS tool - Aggregate, filter, digest and AI summarize articles in RSS feeds.
GNU Affero General Public License v3.0
166 stars · 15 forks

Other LLM models support? #8

Open Pythonpa opened 4 months ago

Pythonpa commented 4 months ago

Hey, thanks for your hard work. Do you have plans for this project to support other LLM APIs or local models, such as Kimi, Qwen (通义千问), iFlytek Spark (讯飞星火), or Ollama?

yinan-c commented 4 months ago

Thanks for your interest. Support for other LLMs would be great, but unfortunately it's not planned at the moment, since I haven't really used them myself. Hopefully someone else can submit a PR for this!

yinan-c commented 4 months ago

As a workaround, with updates 3b1aec6 and 96a7e0b, you should be able to use OneAPI (thanks to its OpenAI-compatible API) together with a customised model name.

Felix2yu commented 1 month ago

I tested Ollama and it works. At the moment, every subscription requires the model name to be filled in manually. If possible, I'd like to be able to add custom models globally in the configuration, so a model only needs to be picked from the dropdown. Being able to set a default model would be even better.

te-chan2 commented 1 month ago

Hi! I encountered a model support problem while using Azure OpenAI.

How about using LiteLLM for model support? In my case, by specifying 'azure/gpt-4o' as the model name and setting the environment variables correctly, I was able to use Azure OpenAI with this project as well.

NOTE:

There seem to be models that cannot specify response_format: {type: "json"}.
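The suggestion above can be sketched as follows. This is a minimal illustration, not RSSbrew's actual code: `build_completion_kwargs` is a hypothetical helper, and the `AZURE_*` environment variable names follow LiteLLM's Azure provider convention (values here are placeholders):

```python
import os

# LiteLLM-style Azure OpenAI configuration (placeholder values).
os.environ.setdefault("AZURE_API_KEY", "<your-azure-openai-key>")
os.environ.setdefault("AZURE_API_BASE", "https://<your-resource>.openai.azure.com/")
os.environ.setdefault("AZURE_API_VERSION", "2024-02-15-preview")

def build_completion_kwargs(deployment: str, prompt: str) -> dict:
    """Build kwargs for a LiteLLM-style completion call.

    The "azure/<deployment_name>" model prefix is what routes the
    request to Azure OpenAI instead of the default OpenAI endpoint.
    """
    return {
        "model": f"azure/{deployment}",
        "messages": [{"role": "user", "content": prompt}],
    }

kwargs = build_completion_kwargs("gpt-4o", "Summarize this article.")
# With litellm installed, you would then call:
#     from litellm import completion
#     response = completion(**kwargs)
print(kwargs["model"])  # azure/gpt-4o
```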

yinan-c commented 3 weeks ago

> Hi! I encountered a model support problem while using Azure OpenAI.
>
> How about using LiteLLM for model support? In my case, by specifying 'azure/gpt-4o' as the model name and setting the environment variables correctly, I was able to use Azure OpenAI with this project as well.

Thanks. I will check out the repo when I have time. In the meantime, since I am not familiar with Azure OpenAI: do you need to set the base_url to call the Azure OpenAI models?

> NOTE:
>
> There seem to be models that cannot specify response_format: {type: "json"}.

I think it will be fine. I have set up a fallback option that calls the AI model without JSON mode.
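A fallback of that kind can be sketched as below. The names are illustrative, not RSSbrew's actual implementation: the idea is simply to try the OpenAI-style `response_format` first and retry the same call without it if the model or provider rejects the parameter:

```python
def complete_with_json_fallback(call_model, messages):
    """Try JSON mode first; fall back to a plain call if unsupported.

    `call_model` is any callable with an OpenAI-like signature
    (messages, **kwargs) -> response.
    """
    try:
        # First attempt: request strict JSON output.
        return call_model(messages, response_format={"type": "json_object"})
    except Exception:
        # Some models cannot accept response_format; retry without it.
        return call_model(messages)

# Example with a stub "model" that rejects response_format:
def stub_model(messages, **kwargs):
    if "response_format" in kwargs:
        raise ValueError("response_format not supported")
    return "plain-text summary"

print(complete_with_json_fallback(stub_model, []))  # plain-text summary
```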

te-chan2 commented 3 weeks ago

> do you need to set the base_url to call the azure openai models?

Yes. Azure OpenAI requires "base_url", "api_key", "api_version", and "deployment_name" (this is useful for specifying the deployment location).

Using LiteLLM, these variables can be configured via environment variables.
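For reference, a minimal sketch of that environment configuration, using the variable names from LiteLLM's Azure provider convention (values are placeholders):

```shell
# Placeholder values; substitute your own Azure OpenAI resource details.
export AZURE_API_KEY="<your-azure-openai-key>"
export AZURE_API_BASE="https://<your-resource>.openai.azure.com/"
export AZURE_API_VERSION="2024-02-15-preview"
# The deployment is then selected via the model name, e.g. "azure/<deployment_name>".
```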