MichaelBetser opened 10 months ago
@MichaelBetser can you please give us more details?
Using Azure AD auth with the Python openai package is described here: https://learn.microsoft.com/en-us/azure/ai-services/openai/how-to/migration?tabs=python%2Cdalle-fix#authentication

However, if I try to use autogen with this config:

```python
token_provider = get_bearer_token_provider(
    DefaultAzureCredential(), "https://my-resource.openai.azure.com/.default"
)
config_list = [
    {
        "model": "gpt-4-32k",
        "azure_ad_token_provider": token_provider,
        "azure_endpoint": "https://my-resource.openai.azure.com",
        "api_type": "azure",
        "api_version": "2023-07-01-preview",
    }
]
```

it does not work. I also tried variations on this: replacing `azure_endpoint` with `base_url`, and replacing `azure_ad_token_provider` with `api_key` and feeding it the Azure AD token (which is not an API key). So my question is: do you support Azure AD auth, and if yes, how do I use it?
I just found that there is an `AzureOpenAI` client in openai: https://learn.microsoft.com/en-us/azure/ai-services/openai/how-to/chatgpt?tabs=python-new&pivots=programming-language-chat-completions This client must have been added recently. We can switch to using it instead of processing Azure endpoints inside the `OpenAIWrapper` of autogen. @olgavrou for awareness. For your use case, though: does it work if you set `"api_type"` to `"azure_ad"` and use `"api_key"` and `"base_url"`? I remember seeing someone else use `"azure_ad"` and it worked.
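For reference, the suggested combination would look roughly like this. This is a sketch only: the token value and endpoint are placeholders, and whether autogen accepts `azure_ad` this way is exactly what is being asked in this thread.

```python
# Sketch of the suggested config: pass the Azure AD access token in the
# "api_key" field and set "api_type" to "azure_ad". The token string and
# endpoint below are placeholders, not real values.
azure_ad_token = "<access-token-from-DefaultAzureCredential>"  # placeholder

config_list = [
    {
        "model": "gpt-4-32k",              # should match the deployment name
        "api_key": azure_ad_token,         # AD token instead of an API key
        "base_url": "https://my-resource.openai.azure.com",
        "api_type": "azure_ad",
        "api_version": "2023-07-01-preview",
    }
]
```

Note that a raw token expires, so even if this works it would need periodic refreshing, unlike a token provider callable.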
Hi @sonichi, I have tried the suggested combination, and I get a "model not found" error, even though the model exists and I can query it directly through the OpenAI Azure API.
Not sure if @kevin666aa or others are making a PR to use `AzureOpenAI`. If not, my suggestion is to make sure the "model" value matches the deployment name in your endpoint.
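To illustrate the deployment-name point: on Azure OpenAI, requests are routed by the name you gave the deployment, which may differ from the underlying model name. A minimal sketch with hypothetical names:

```python
# On Azure OpenAI, requests are routed by deployment name, not model name.
# If a gpt-4-32k model was deployed under the name "my-gpt4-32k-deployment"
# (hypothetical), the config must use the deployment name, or the service
# responds with a "model not found"-style error.
config = {
    "model": "my-gpt4-32k-deployment",  # deployment name, not "gpt-4-32k"
    "base_url": "https://my-resource.openai.azure.com",
    "api_type": "azure",
    "api_version": "2023-07-01-preview",
}
```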
I just made a draft PR to use `AzureOpenAI` in #868. @MichaelBetser, can you check out the branch to see if it works?
Could this be bumped? This still doesn't support token-based authentication. We want to avoid API keys as much as possible. Here's an example from the oai client that could be used in #868. @yiranwu0, could you update your PR to factor this in?
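For concreteness, token-based auth in the openai package works by passing a zero-argument callable that returns a fresh bearer token; `AzureOpenAI` accepts it as `azure_ad_token_provider`. The sketch below uses a stand-in provider so the shape of the interface is clear; in production the callable would come from `azure.identity.get_bearer_token_provider(DefaultAzureCredential(), ...)`, which is assumed here and not executed:

```python
# AzureOpenAI accepts azure_ad_token_provider: a zero-arg callable returning
# a current bearer token, called by the client before requests. A real
# provider would be azure.identity.get_bearer_token_provider(
#     DefaultAzureCredential(), "https://cognitiveservices.azure.com/.default");
# the function below is a stand-in for illustration only.
def token_provider() -> str:
    # Stand-in: a real provider refreshes and returns an Azure AD token.
    return "<azure-ad-access-token>"

client_kwargs = {
    "azure_endpoint": "https://my-resource.openai.azure.com",
    "api_version": "2023-07-01-preview",
    "azure_ad_token_provider": token_provider,  # callable, not a string
}

# With the openai package installed, this would become:
# from openai import AzureOpenAI
# client = AzureOpenAI(**client_kwargs)
```

The advantage over passing a raw token as `api_key` is that the provider is re-invoked as needed, so expired tokens are refreshed transparently.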
Hi everyone,
Is there a way to use Azure OpenAI and Autogen without having the API key? In my company, we only receive information for access via the token. Here are the details we have:
- API_VERSION: '....'
- AZURE_OPENAI_ENDPOINT: '....'
- MODEL_NAME: '....'
- URL_TOKEN: '....'

Any guidance on how to proceed with this setup would be greatly appreciated.
Thanks!
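One possible shape for such a setup is to fetch the token from the internal URL and then build the autogen config from it. This is a sketch under stated assumptions: the response format of URL_TOKEN is unknown here (a JSON body with an `access_token` field is assumed), and passing the token as `api_key` with `api_type` `"azure_ad"` is one of the combinations discussed in this thread, not a guaranteed-supported path.

```python
import json
import urllib.request


def fetch_token(token_url: str) -> str:
    # Assumption: the token endpoint returns JSON like
    # {"access_token": "..."}. Adjust to whatever your endpoint returns.
    with urllib.request.urlopen(token_url) as resp:
        return json.load(resp)["access_token"]


def build_config(token: str, endpoint: str, model: str, api_version: str) -> dict:
    # The token is passed as "api_key" with api_type "azure_ad" -- one of
    # the combinations discussed in this thread, not a guaranteed API.
    return {
        "model": model,
        "api_key": token,
        "base_url": endpoint,
        "api_type": "azure_ad",
        "api_version": api_version,
    }


# Usage (network call not executed here):
# token = fetch_token(URL_TOKEN)
# config_list = [build_config(token, AZURE_OPENAI_ENDPOINT, MODEL_NAME, API_VERSION)]
```

Keep in mind that a token fetched once will expire; a long-running agent would need to re-fetch it, which is why a token-provider callable is the preferable interface.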
> Could this be bumped? This still doesn't support token-based authentication. We want to avoid API keys as much as possible. Here's an example from the oai client that could be used in #868. @yiranwu0, could you update your PR to factor this in?
It is still not working, right? @yiranwu0, is someone working on this issue, or is there at least a roadmap for it?
Hello @cesarofuchi, can you take a look at this PR: #2879? Does it solve the problem?
My GPT model instance uses an Azure AD auth scheme. The API is slightly different for Azure AD, and the autogen docs do not say whether it is supported or how to use it. Is this supported?
Thanks.