Azure / azure-sdk-for-net

[BUG] AzureOpenAIClient does not pass API key down to OpenAI library properly #44488

Open godefroi opened 5 months ago

godefroi commented 5 months ago

Library name and version

Azure.AI.OpenAI 2.0.0-beta.1

Describe the bug

Using the OpenAI library against a "serverless" Azure ML endpoint (i.e. https://whatever-whatever-whatever-serverless.whatever.inference.ai.azure.com) works as expected when the constructor is passed an API key:

var client = new OpenAI.Chat.ChatClient("model/deploymentname", _apiKey, new OpenAIClientOptions() { Endpoint = _endpointUri });

var completion = client.CompleteChat([new UserChatMessage("What model are you?")]);

foreach (var c in completion.Value.Content) {
    Console.WriteLine(c.Text);
}

However, when using the Azure OpenAI library, the API key is not passed and the request fails:

var client = new AzureOpenAIClient(_endpointUri, _apiKey, new AzureOpenAIClientOptions(AzureOpenAIClientOptions.ServiceVersion.V2024_04_01_Preview));

var completion = client.CompleteChat([new UserChatMessage("What model are you?")]);

foreach (var c in completion.Value.Content) {
    Console.WriteLine(c.Text);
}

This request fails with:

Unhandled exception. System.ClientModel.ClientResultException: Service request failed.
Status: 400 (Missing Authorization header)

   at Azure.AI.OpenAI.ClientPipelineExtensions.ProcessMessage(ClientPipeline pipeline, PipelineMessage message, RequestOptions options)
   at Azure.AI.OpenAI.Chat.AzureChatClient.CompleteChat(BinaryContent content, RequestOptions options)
   at OpenAI.Chat.ChatClient.CompleteChat(IEnumerable`1 messages, ChatCompletionOptions options)

Expected behavior

The API key should be passed down to the underlying OpenAI classes so that it can be included in the Authorization header

Actual behavior

The Authorization header is not set with the API key

Reproduction Steps

var client = new AzureOpenAIClient(_endpointUri, _apiKey, new AzureOpenAIClientOptions(AzureOpenAIClientOptions.ServiceVersion.V2024_04_01_Preview));

var completion = client.CompleteChat([new UserChatMessage("What model are you?")]);

foreach (var c in completion.Value.Content) {
    Console.WriteLine(c.Text);
}

This request fails with:

Unhandled exception. System.ClientModel.ClientResultException: Service request failed.
Status: 400 (Missing Authorization header)

   at Azure.AI.OpenAI.ClientPipelineExtensions.ProcessMessage(ClientPipeline pipeline, PipelineMessage message, RequestOptions options)
   at Azure.AI.OpenAI.Chat.AzureChatClient.CompleteChat(BinaryContent content, RequestOptions options)
   at OpenAI.Chat.ChatClient.CompleteChat(IEnumerable`1 messages, ChatCompletionOptions options)

Environment

.NET SDK:
 Version:           8.0.300
 Commit:            326f6e68b2
 Workload version:  8.0.300-manifests.4e5ea2d8
 MSBuild version:   17.10.4+10fbfbf2e

Runtime Environment:
 OS Name:     Windows
 OS Version:  10.0.22631
 OS Platform: Windows
 RID:         win-x64
 Base Path:   C:\Program Files\dotnet\sdk\8.0.300\

.NET workloads installed:
 [aspire]
   Installation Source: VS 17.10.34928.147
   Manifest Version:    8.0.0/8.0.100
   Manifest Path:       C:\Program Files\dotnet\sdk-manifests\8.0.100\microsoft.net.sdk.aspire\8.0.0\WorkloadManifest.json
   Install Type:        FileBased

Host:
 Version:      8.0.5
 Architecture: x64
 Commit:       087e15321b

.NET SDKs installed:
 8.0.300 [C:\Program Files\dotnet\sdk]

.NET runtimes installed:
 Microsoft.AspNetCore.App 6.0.30 [C:\Program Files\dotnet\shared\Microsoft.AspNetCore.App]
 Microsoft.AspNetCore.App 6.0.31 [C:\Program Files\dotnet\shared\Microsoft.AspNetCore.App]
 Microsoft.AspNetCore.App 7.0.19 [C:\Program Files\dotnet\shared\Microsoft.AspNetCore.App]
 Microsoft.AspNetCore.App 7.0.20 [C:\Program Files\dotnet\shared\Microsoft.AspNetCore.App]
 Microsoft.AspNetCore.App 8.0.5 [C:\Program Files\dotnet\shared\Microsoft.AspNetCore.App]
 Microsoft.NETCore.App 6.0.30 [C:\Program Files\dotnet\shared\Microsoft.NETCore.App]
 Microsoft.NETCore.App 6.0.31 [C:\Program Files\dotnet\shared\Microsoft.NETCore.App]
 Microsoft.NETCore.App 7.0.19 [C:\Program Files\dotnet\shared\Microsoft.NETCore.App]
 Microsoft.NETCore.App 7.0.20 [C:\Program Files\dotnet\shared\Microsoft.NETCore.App]
 Microsoft.NETCore.App 8.0.5 [C:\Program Files\dotnet\shared\Microsoft.NETCore.App]
 Microsoft.WindowsDesktop.App 6.0.30 [C:\Program Files\dotnet\shared\Microsoft.WindowsDesktop.App]
 Microsoft.WindowsDesktop.App 7.0.19 [C:\Program Files\dotnet\shared\Microsoft.WindowsDesktop.App]
 Microsoft.WindowsDesktop.App 7.0.20 [C:\Program Files\dotnet\shared\Microsoft.WindowsDesktop.App]
 Microsoft.WindowsDesktop.App 8.0.5 [C:\Program Files\dotnet\shared\Microsoft.WindowsDesktop.App]

Other architectures found:
 x86 [C:\Program Files (x86)\dotnet]
   registered at [HKLM\SOFTWARE\dotnet\Setup\InstalledVersions\x86\InstallLocation]

Environment variables: Not set

global.json file: Not found

Learn more: https://aka.ms/dotnet/info

Download .NET: https://aka.ms/dotnet/download

github-actions[bot] commented 5 months ago

Thanks for the feedback! We are routing this to the appropriate team for follow-up. cc @jpalvarezl @trrwilson.

trrwilson commented 5 months ago

Hello, @godefroi! The client configuration pattern is a little different from before; you need to create your ChatClient instance via the method on the AzureOpenAIClient you configured, as described in more detail in the readme:

https://github.com/Azure/azure-sdk-for-net/tree/main/sdk/openai/Azure.AI.OpenAI#authenticate-the-client

Modifying your shared code:

var azureClient = new AzureOpenAIClient(
    _endpointUri,
    _apiKey,
    new AzureOpenAIClientOptions(AzureOpenAIClientOptions.ServiceVersion.V2024_04_01_Preview));

ChatClient client = azureClient.GetChatClient("my-gpt-35-turbo-deployment");

var completion = client.CompleteChat([new UserChatMessage("What model are you?")]);

Could you please try with the above and report back?

godefroi commented 5 months ago

@trrwilson Shoot, I managed to copy/paste the wrong code. Here is the code I'm actually running:

var client = new AzureOpenAIClient(_endpointUri, _apiKey, new AzureOpenAIClientOptions(AzureOpenAIClientOptions.ServiceVersion.V2024_04_01_Preview));

var chatClient = client.GetChatClient("my-gpt-35-turbo-deployment");

var completion = chatClient.CompleteChat([new UserChatMessage("What model are you?")]);

foreach (var c in completion.Value.Content) {
    Console.WriteLine(c.Text);
}

This code fails with the "400 (Missing Authorization header)" error.

JadynWong commented 5 months ago

According to the API description in the documentation, AzureML provides an API that is compatible with OpenAI endpoints, but it does not appear to be exactly the same as Azure OpenAI: the request paths (e.g. chat/completions, embeddings) and the authentication methods differ.

I think the original OpenAI SDK should be used here, not the Azure OpenAI SDK. As you mentioned at the beginning, the OpenAI SDK works fine.

However, the documentation does mention compatibility with Azure OpenAI, so this may need clarification from Microsoft:

The API is compatible with Azure OpenAI model deployments.

Azure ML Interface API: https://learn.microsoft.com/en-us/azure/machine-learning/reference-model-inference-api?view=azureml-api-2&tabs=rest

Azure OpenAI REST API: https://learn.microsoft.com/en-us/azure/ai-services/openai/reference
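
For illustration (placeholder values; see the links above for the exact routes), a key-authenticated Azure OpenAI request and a standard OpenAI-style request differ in both path and header, roughly:

POST {endpoint}/openai/deployments/{deployment}/chat/completions?api-version=2024-04-01-preview
api-key: {key}

POST https://api.openai.com/v1/chat/completions
Authorization: Bearer {key}

The "Missing Authorization header" error above suggests the serverless endpoint expects the OpenAI-style shape, which would explain why the plain OpenAI SDK works against it.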

godefroi commented 5 months ago

The issue isn't with the API; it's with how the SDK configures the ChatClient. The Azure OpenAI SDK is based on the OpenAI SDK, and the OpenAI SDK works, while the Azure OpenAI SDK (contained in this project) fails to configure the underlying classes correctly. Thus, this is an issue with the Azure OpenAI SDK.

trrwilson commented 5 months ago

Does the non-Azure client (i.e. substituting OpenAIClient for AzureOpenAIClient) properly connect to the endpoint?

OpenAI and Azure OpenAI authenticate differently (that difference is one of the primary reasons a separate Azure client is needed), and it sounds like AzureML may be matching OpenAI rather than Azure OpenAI. Specifically, Azure OpenAI uses an api-key header for keys, while OpenAI (and, judging by that error message, AzureML) uses Authorization.

Alternatively (I believe this will work, but I haven't confirmed it), you could use a DelegatedTokenCredential to "spoof" the expected Authorization: Bearer header, e.g.:

TokenCredential authorizationCredential = DelegatedTokenCredential.Create(
    (context, cancellationToken) =>
    {
        // Treat the AzureML key as a bearer token that never expires
        return new AccessToken(Environment.GetEnvironmentVariable("AZUREML_API_KEY"), DateTimeOffset.MaxValue);
    });

AzureOpenAIClient azureClient = new(
    new Uri(Environment.GetEnvironmentVariable("AZUREML_ENDPOINT")),
    authorizationCredential,
    new AzureOpenAIClientOptions(
        AzureOpenAIClientOptions.ServiceVersion.V2024_04_01_Preview));
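
From there the usual pattern shown earlier in the thread should apply (equally unconfirmed against the serverless endpoint; the deployment/model name below is a placeholder):

ChatClient chatClient = azureClient.GetChatClient("my-serverless-model");

var completion = chatClient.CompleteChat([new UserChatMessage("What model are you?")]);

foreach (var c in completion.Value.Content) {
    Console.WriteLine(c.Text);
}
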
godefroi commented 5 months ago

Does the non-Azure client (i.e. substituting OpenAIClient for AzureOpenAIClient) properly connect to the endpoint?

Yes. OpenAIClient works as expected. Your alternate workaround with DelegatedTokenCredential, however, does not. It results in a 405 (Method Not Allowed) response.