paul-gauthier / aider

aider is AI pair programming in your terminal
https://aider.chat/
Apache License 2.0

Azure Open AI Behind APIM (Custom Auth)? #1653

Open bakes82 opened 1 week ago

bakes82 commented 1 week ago

Issue

The best practice for using Azure OpenAI in any organization is to put it behind an API Management (APIM) gateway. Microsoft limits the number of instances an organization can have, and you don't want to expose the master API keys to end users; you also get other benefits from the APIM layer. APIM still just passes through the base information Azure OpenAI needs; we would only need to authenticate slightly differently, and the chat/endpoint payloads would be the same. Where would the best entry point be to set up this kind of custom authentication? In our use case we authenticate with APIM using app registrations, and we also need to set some custom headers for logging before APIM passes the request through to Azure OpenAI.

Version and model info

No response

fry69 commented 1 week ago

Thank you for filing this issue.

aider uses LiteLLM to communicate with LLM endpoints. I am not sure how additional header fields for authentication can be passed currently.

Maybe related:

#1590

https://github.com/BerriAI/litellm/issues/4833

bakes82 commented 1 week ago

@fry69 It looks like LiteLLM already supports the auth method I would need to use, and per the linked issue, headers are being passed through correctly.

https://docs.litellm.ai/docs/providers/azure

I would need to implement the "Azure AD Token Refresh" example they have; the Azure SDK supports app registrations. Ref: https://learn.microsoft.com/en-us/python/api/overview/azure/identity-readme?view=azure-python (Service principal with secret).

And per the issue they are just doing this:

extra_headers={
    "Authorization": "my-bad-key",
    "Ocp-Apim-Subscription-Key": "hello-world-testing",
},
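Put together, the LiteLLM call that those docs and the linked issue describe would look roughly like the sketch below. The gateway URL, API version, deployment name and header values are placeholders, and the Azure AD token is assumed to already be available in the environment:

# Sketch only: URL, API version, deployment and header values are placeholders.
import os
import litellm

response = litellm.completion(
    model="azure/gpt-4o",                            # "azure/<deployment name>"
    api_base="https://my-apim.example.com/openai",   # APIM gateway URL
    api_version="2024-02-15-preview",
    azure_ad_token=os.environ["AZURE_AD_TOKEN"],     # token from the app registration
    extra_headers={
        "Ocp-Apim-Subscription-Key": "my-subscription-key",
    },
    messages=[{"role": "user", "content": "ping"}],
)
print(response.choices[0].message.content)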

So I guess my question is: where do I configure these pieces?

fry69 commented 1 week ago

So I guess my question is: where do I configure these pieces?

No clue. Does anything in here look suitable to pass these headers?

-> https://aider.chat/docs/config/adv-model-settings.html

bakes82 commented 1 week ago

The send_completion method takes extra headers. I don't see where LiteLLM gets configured in the code; I would just make my changes in that configuration method for my own use, since I assume that's the "easiest" approach. The code obviously isn't set up to handle this use case, but LiteLLM appears to support it.

paul-gauthier commented 1 week ago

You can create a model settings yaml file, and include something like:

  extra_headers:
    anthropic-beta: prompt-caching-2024-07-31

See the https://aider.chat/docs/config/adv-model-settings.html page. I've updated it with more examples.
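Applied to this issue, a settings entry for an APIM-fronted Azure deployment might look roughly like the sketch below; the deployment name is illustrative and the Ocp-Apim-Subscription-Key header is taken from the litellm issue linked above:

- name: azure/gpt-4o
  extra_headers:
    Ocp-Apim-Subscription-Key: my-subscription-key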

bakes82 commented 1 week ago

@paul-gauthier I still need to override the auth so it uses the client ID/secret flow I linked above, which LiteLLM supports.

paul-gauthier commented 1 week ago

Set azure_ad_token = accessToken from step 3 or set os.environ['AZURE_AD_TOKEN']

Have you tried setting that environment variable?

bakes82 commented 1 week ago

Setting AZURE_AD_TOKEN in a .env file is working, and the call is hitting the APIM instance; the auth is also working (ideally I would rather set a client ID/secret and have it do the auth).
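For reference, the relevant .env entries look roughly like this sketch (the gateway URL and API version are placeholders, assuming the standard Azure variables that LiteLLM reads):

AZURE_API_BASE=https://my-apim.example.com/openai
AZURE_API_VERSION=2024-02-15-preview
AZURE_AD_TOKEN=<token obtained from the app registration>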

I'm getting an error with the extra headers though; they don't seem to be working, as APIM is telling me a required header is not set:

BadRequestError: litellm.BadRequestError: AzureException - Error code: 400 - {'statusCode': 400, 'message': 'mkl-User-Name is a required header'}

bakes82 commented 1 week ago

My bad, the name in the settings file needed to be azure/gpt-4o (or whatever model name is passed in) so that it matches. It now at least passes auth, and the request is hitting APIM and being sent to Azure OpenAI. Now to figure out why I'm getting a 500 error: "Expression evaluation failed. The message body is not a valid JSON. Unexpected character encountered while parsing value: d. Path '', line 0, position 0."

wissne commented 4 days ago

LiteLLM supports identity authentication (client_id, tenant_id, secret), but there is no way to pass those values to it with the current aider configuration or parameters.
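Until aider exposes this, one possible stopgap (a sketch, not an aider feature) is to mint the token from the service principal with azure-identity and hand it to aider through the AZURE_AD_TOKEN variable that worked above; the tenant/client values and model name are placeholders, and the token is not refreshed when it expires:

# Workaround sketch: fetch a token for the app registration, then launch aider with it.
import os
import subprocess
from azure.identity import ClientSecretCredential

credential = ClientSecretCredential(
    tenant_id=os.environ["AZURE_TENANT_ID"],
    client_id=os.environ["AZURE_CLIENT_ID"],
    client_secret=os.environ["AZURE_CLIENT_SECRET"],
)
# Azure OpenAI uses the Cognitive Services scope.
token = credential.get_token("https://cognitiveservices.azure.com/.default")

env = dict(os.environ, AZURE_AD_TOKEN=token.token)
subprocess.run(["aider", "--model", "azure/gpt-4o"], env=env)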