Open Pythonpa opened 4 months ago
Thanks for your interest. Other LLMs are great, but support is unfortunately not planned at the moment, since I haven't really used them myself. Hopefully someone else can submit a PR for this!
As a workaround, with updates 3b1aec6 and 96a7e0b, you should be able to use OneAPI (since it exposes an OpenAI-compatible API) together with a customised model name.
Tested that Ollama works fine. Right now every subscription requires filling in the model manually; if possible, it would be great to configure custom models globally so they can simply be picked from the dropdown, and being able to set a default model would be even better.
Hi! I encountered a model support problem while using Azure OpenAI.
How about using LiteLLM for model support? In my case, by specifying 'azure/gpt-4o' as the model name and setting the environment variables correctly, I was able to use Azure OpenAI with this project as well.
NOTE:
There seem to be models that cannot specify response_format: {type: "json"}.
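For reference, the setup described above can be sketched as follows. This is a hypothetical example, not this project's actual code: the environment variable names follow LiteLLM's convention for its Azure provider, and all values are placeholders.

```python
# Hypothetical sketch of the environment LiteLLM's Azure provider reads.
# Variable names follow LiteLLM's convention; values are placeholders.
import os

os.environ["AZURE_API_KEY"] = "your-azure-api-key"
os.environ["AZURE_API_BASE"] = "https://your-resource.openai.azure.com"
os.environ["AZURE_API_VERSION"] = "2024-02-15-preview"

# With these set, the model is selected as "azure/<deployment_name>", e.g.:
model = "azure/gpt-4o"
```

LiteLLM routes the request to Azure based on the `azure/` prefix, so the rest of the calling code can stay provider-agnostic.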
> Hi! I encountered a model support problem while using Azure OpenAI.
> How about using LiteLLM for model support? In my case, by specifying 'azure/gpt-4o' as the model name and setting the environment variables correctly, I was able to use Azure OpenAI with this project as well.
Thanks. I will check out the repo when I have time. In the meantime, I am not familiar with Azure OpenAI: do you need to set the base_url to call the Azure OpenAI models?
> NOTE:
> There seem to be models that cannot specify response_format: {type: "json"}.
I think it will be fine. I have set up a fallback option to call the AI model without JSON mode.
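The fallback idea can be sketched like this. `call_model` is a hypothetical stand-in for the real API client, not this project's actual function:

```python
# Minimal sketch of a JSON-mode fallback: try strict JSON output first,
# and retry without response_format if the model rejects it.
# call_model is a hypothetical stand-in for the real API client.
def complete_with_fallback(call_model, messages):
    try:
        # Prefer strict JSON output when the model supports it.
        return call_model(messages, response_format={"type": "json_object"})
    except Exception:
        # Some models reject response_format; fall back to a plain completion.
        return call_model(messages, response_format=None)
```

Without JSON mode the response may need more defensive parsing, but the request itself succeeds on models that reject `response_format`.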
> do you need to set the base_url to call the azure openai models?
Yes. Azure OpenAI requires "base_url", "api_key", "api_version", and "deployment_name" (the last is useful for specifying the deployment location).
Using LiteLLM, these variables can be configured via environment variables.
Hey, thanks for your hard work. Do you have a plan for this project to support other LLM APIs or local models, such as Kimi, Tongyi Qianwen (通义千问), iFlytek Spark (讯飞星火), or Ollama?