Open tuntunhuang opened 11 months ago
I completely agree. Allowing users to customize the API base URL, rather than hard-coding it to "https://api.openai.com/v1/chat/completions", would be extremely helpful in the current situation. Students living outside the service area of OpenAI would benefit from this.
+1. Can't wait to use Azure API.
Yes, I think this is very important. I have built my own API, and I'd like to be able to point the plugin at my own endpoint instead.
@gaojunyang666 thanks for the feedback
This is already partially implemented in v2.1 for use with localhost APIs. A custom non-localhost endpoint, assuming that it uses the OpenAI API format, can be implemented rather quickly, but so far this hasn't been requested by anyone helping with beta testing the early release.
For non-OpenAI formats, there will be the ability to contribute additional formats to the module I created for implementing these models. This module will be released as an open-source NPM module very soon.
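For context, pointing an OpenAI-format request at a custom base URL is mostly a matter of swapping out the hard-coded host. A minimal sketch of what that could look like (all names here are illustrative placeholders, not the plugin's or module's actual API):

```typescript
// Sketch: build an OpenAI-format chat-completions request against an
// arbitrary base URL (e.g. a localhost server or self-hosted API).
// `chatCompletionsRequest` is a hypothetical helper for illustration.
function chatCompletionsRequest(baseUrl: string, apiKey: string, body: object) {
  return {
    // Strip a trailing slash so we don't end up with "//chat/completions"
    url: `${baseUrl.replace(/\/$/, "")}/chat/completions`,
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      "Authorization": `Bearer ${apiKey}`,
    },
    body: JSON.stringify(body),
  };
}
```

With this shape, "https://api.openai.com/v1" is just one possible value of `baseUrl` rather than a constant baked into the request code.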
🌴
This is almost there using Custom API (OpenAI format), except that it sends the API key in an Authorization: Bearer xxxx header. If we could override the auth header, I could make this work with Azure OpenAI.
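Since Azure OpenAI expects the key in an api-key header rather than Authorization: Bearer, the override amounts to moving the key between headers. A hedged sketch (the function name is hypothetical, not something the plugin exposes):

```typescript
// Sketch: rewrite OpenAI-style auth headers into Azure OpenAI's format.
// Azure expects `api-key: <key>` instead of `Authorization: Bearer <key>`.
// `toAzureHeaders` is an illustrative name for this adapter step.
function toAzureHeaders(headers: Record<string, string>): Record<string, string> {
  const out: Record<string, string> = {};
  for (const [name, value] of Object.entries(headers)) {
    if (name.toLowerCase() === "authorization" && value.startsWith("Bearer ")) {
      out["api-key"] = value.slice("Bearer ".length); // move the key
    } else {
      out[name] = value; // pass everything else through unchanged
    }
  }
  return out;
}
```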
Hey @ChrisRomp
This should be a pretty simple adapter, something similar to what's described here https://github.com/brianpetro/jsbrains/issues/1#issuecomment-2071271220
I don't currently have an Azure account for the necessary testing. But, maybe someone will see this and be able to handle building and testing the adapter 😊
🌴
I can (eventually) take a crack at it. I just want to test this in my Obsidian workflow today. If it works well for me then I'll try to make some time for this.
But the gist, for anyone else, is that it should work almost exactly like an OpenAI endpoint, except you need to provide a couple of parameters in the URL (see docs). E.g.:
POST https://{your-resource-name}.openai.azure.com/openai/deployments/{deployment-id}/chat/completions?api-version={api-version}
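Assembling that URL from its pieces could look like the sketch below (a hypothetical helper; the resource, deployment, and api-version values are placeholders you'd substitute from your own Azure setup):

```typescript
// Sketch: build the Azure OpenAI chat-completions URL from its parts,
// matching the pattern in the Azure docs:
// https://{resource}.openai.azure.com/openai/deployments/{deployment}/chat/completions?api-version={version}
function azureChatUrl(resource: string, deployment: string, apiVersion: string): string {
  return `https://${resource}.openai.azure.com/openai/deployments/` +
    `${encodeURIComponent(deployment)}/chat/completions` +
    `?api-version=${encodeURIComponent(apiVersion)}`;
}
```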
For anyone wanting to use this on Azure today, I've put Azure API Management in front of AOAI, and I'm using policy to compare a key to a named value in APIM. Then I use Azure RBAC auth to the AOAI service. You could optionally extract the Bearer value and pass that into an api-key header to AOAI instead of storing it as a secret in APIM. Disable requiring a subscription key on the API as well.
<policies>
    <inbound>
        <base />
        <cors allow-credentials="false" terminate-unmatched-request="false">
            <allowed-origins>
                <origin>*</origin>
            </allowed-origins>
            <allowed-methods preflight-result-max-age="600">
                <method>POST</method>
            </allowed-methods>
            <allowed-headers>
                <header>authorization</header>
                <header>content-type</header>
            </allowed-headers>
        </cors>
        <check-header name="Authorization" failed-check-httpcode="401" failed-check-error-message="Unauthorized">
            <value>Bearer {{aoai-custom-key}}</value>
        </check-header>
        <authentication-managed-identity resource="https://cognitiveservices.azure.com" output-token-variable-name="msi-access-token" ignore-error="false" />
        <set-header name="Authorization" exists-action="override">
            <value>@("Bearer " + (string)context.Variables["msi-access-token"])</value>
        </set-header>
    </inbound>
    <backend>
        <base />
    </backend>
    <outbound>
        <base />
    </outbound>
    <on-error>
        <base />
    </on-error>
</policies>
You can import the Swagger spec for AOAI into APIM from the Azure docs: https://learn.microsoft.com/en-us/azure/ai-services/openai/reference
@brianpetro I could also give you short term access to an Azure OpenAI instance.
Hi Brian, thank you very much for developing this amazing plugin. Would you update it to support the Azure OpenAI API?