Open falense opened 8 months ago
How to use the Azure AI API in this plugin is also a question I care about a great deal. I hope someone can provide a good solution.
And/or, why not provide a way to set a custom OpenAI base URL? We could then leverage LiteLLM to proxy OpenAI API calls to Azure OpenAI, for example. This question is a cross-reference to #130.
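For reference, a LiteLLM proxy that translates OpenAI-style calls to Azure OpenAI is configured roughly like the sketch below. All names (resource, deployment, API version) are placeholders, not values from this thread:

```yaml
# config.yaml — minimal LiteLLM proxy sketch, assuming an Azure deployment
model_list:
  - model_name: gpt-3.5-turbo            # the model name clients will request
    litellm_params:
      model: azure/my-deployment-name    # azure/<deployment-id> on your resource
      api_base: https://my-resource.openai.azure.com
      api_key: "os.environ/AZURE_API_KEY"  # read from the environment
      api_version: "2023-05-15"
```

Running `litellm --config config.yaml` then exposes an OpenAI-compatible endpoint (by default on `http://localhost:4000`) that the plugin could target as a custom base URL.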
I encountered the same issue.
Hi @r0mdau, do you use LiteLLM Proxy? Can we hop on a call so I can learn how you use LiteLLM?
Link to my cal for your convenience: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat?month=2024-02
I am trying to use the plugin with an Azure AI API endpoint rather than the OpenAI endpoint. There seem to be a couple of key differences that make this not work:

- Authentication uses an `api-key` header rather than a `Bearer` token for the request.
- An `api-version` query parameter must be specified, and the model is selected via a deployment in the URL path: `POST https://{your-resource-name}.openai.azure.com/openai/deployments/{deployment-id}/completions?api-version={api-version}`
More info here: https://learn.microsoft.com/en-us/azure/ai-services/openai/reference
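To make the differences concrete, here is a sketch comparing the two request shapes (illustrative values only, no network calls; the Azure URL and header follow the reference linked above):

```python
def openai_request(api_key: str, model: str) -> dict:
    """Request shape for the standard OpenAI API: Bearer token auth."""
    return {
        "url": "https://api.openai.com/v1/completions",
        "headers": {
            "Authorization": f"Bearer {api_key}",  # Bearer token
            "Content-Type": "application/json",
        },
        "body": {"model": model, "prompt": "Hello"},
    }


def azure_request(api_key: str, resource: str, deployment: str,
                  api_version: str) -> dict:
    """Request shape for Azure OpenAI: api-key header + api-version param."""
    return {
        "url": (
            f"https://{resource}.openai.azure.com/openai/deployments/"
            f"{deployment}/completions?api-version={api_version}"
        ),
        "headers": {
            "api-key": api_key,  # NOT a Bearer token
            "Content-Type": "application/json",
        },
        # No "model" field: the deployment in the URL selects the model.
        "body": {"prompt": "Hello"},
    }
```

So a toggle would need to swap the auth header, append the `api-version` query parameter, and build the deployment-based URL.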
Would it be hard to add a toggle for Azure vs OpenAI API?