Azure / azure-sdk-for-net

This repository is for active development of the Azure SDK for .NET. For consumers of the SDK we recommend visiting our public developer docs at https://learn.microsoft.com/dotnet/azure/ or our versioned developer docs at https://azure.github.io/azure-sdk-for-net.

[BUG] OpenAI Legacy Completion endpoint not exposed #44469

Open kescherCode opened 3 months ago

kescherCode commented 3 months ago

Library name and version

Azure.AI.OpenAI 2.0.0-beta.1

Describe the bug

Since the switch to the new OpenAI .NET library, existing deployments of the gpt-3.5-turbo-instruct model can no longer be used, because the LegacyCompletionClient is not exposed, as mentioned at https://github.com/openai/openai-dotnet/issues/34.

You should probably check whether an additional AzureLegacyCompletionClient is needed in this case.
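For context, here is a minimal sketch of the 2.0.0-beta.1 client surface (the endpoint, credential, and deployment names are placeholders): the Azure client hands out scenario clients such as ChatClient, but exposes no accessor for the legacy /completions endpoint.

using Azure.AI.OpenAI;
using Azure.Identity;
using OpenAI.Chat;

// Sketch only: placeholder endpoint and deployment name.
AzureOpenAIClient azureClient = new(
    new Uri("https://my-resource.openai.azure.com/"),
    new DefaultAzureCredential());

// Chat, embeddings, images, etc. are reachable through scenario clients...
ChatClient chatClient = azureClient.GetChatClient("my-chat-deployment");

// ...but there is no equivalent accessor for the legacy /completions endpoint,
// so a gpt-3.5-turbo-instruct deployment cannot be called through this client.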

Expected behavior

I should be able to make completions using the gpt-3.5-turbo-instruct model.
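For reference, a minimal sketch of what that call looked like against 1.0.0-beta.17 (endpoint, key, and deployment name are placeholders):

using Azure;
using Azure.AI.OpenAI;

// Sketch only: placeholder endpoint, key, and deployment name.
OpenAIClient client = new(
    new Uri("https://my-resource.openai.azure.com/"),
    new AzureKeyCredential("<api-key>"));

CompletionsOptions options = new("my-gpt-35-turbo-instruct-deployment", new[] { "Say hello." })
{
    MaxTokens = 16,
    Temperature = 0.0f,
};

Response<Completions> response = await client.GetCompletionsAsync(options);
Console.WriteLine(response.Value.Choices[0].Text);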

Actual behavior

Due to a lack of access to the legacy completions endpoint via this library and its main dependency, I cannot make completions with the gpt-3.5-turbo-instruct model.

Reproduction Steps

Try upgrading from 1.0.0-beta.17 to 2.0.0-beta.1.

Environment

any and all

github-actions[bot] commented 3 months ago

Thanks for the feedback! We are routing this to the appropriate team for follow-up. cc @jpalvarezl @trrwilson.

kescherCode commented 3 months ago

Note that this is intentionally not a feature request, because this breaks existing functionality that should still be supported.

trrwilson commented 3 months ago

Thank you, @kescherCode, both for the mention here and on openai-dotnet. Although the omission of the legacy completions surface was intentional, both in the new client and in this major version increment of the preview library, I hear you loud and clear on the migration block this imposes for /completions endpoint use. Breaking changes are very much expected, but breaking changes with no upgrade path are exceptionally painful.

We'll discuss this strategy within the product team and with OpenAI. In the interim, the prior v1.0.0-beta.17 release will continue to be a supported mechanism for using -instruct with the 2024-04-01-preview service API label, even while v2.0.0-beta.* does not (at least yet) support it.

joakimriedel commented 3 months ago

+1 for legacy completions; it is not just for the -instruct model, but also for fine-tunes of babbage-002/davinci-002. We have an important logprobs classification task using a fine-tuned babbage-002, which blocks upgrading to 2.0.0-beta.x right now.

Edit: I managed to move the code calling the legacy completions endpoint into a separate "LegacyOpenAI" project, but had to do the following, since the 2.0.0-beta version is pinned centrally via NuGet central package management for all other projects in the solution:

    <ItemGroup>
        <!-- VersionOverride bypasses the centrally pinned version for this project only; NU1605 suppresses the resulting package downgrade warning. -->
        <PackageReference Include="Azure.AI.OpenAI" VersionOverride="1.0.0-beta.17">
            <NoWarn>NU1605</NoWarn>
        </PackageReference>
    </ItemGroup>

Posting the workaround here in case it helps anyone else.

Edit2: This doesn't work; I get the following error:

System.TypeLoadException: Could not load type 'Azure.AI.OpenAI.Completions' from assembly 'Azure.AI.OpenAI, Version=2.0.0.0, Culture=neutral, PublicKeyToken=92742159e12e44c8'.

Looking to solve this right now.

Edit3: It seems there is no way to load a lower-versioned assembly into a process that already has a higher-versioned one loaded, so I had to go deep down the rabbit hole to solve this. I put the v1.0.0-beta.17 Azure.AI.OpenAI.dll into my "LegacyOpenAI" project and renamed it to Azure.AI.OpenAI.Legacy.dll so that I could load it in a new AssemblyLoadContext and call it via reflection (see below)... 🤦‍♂️

// Requires: using System.Reflection; using System.Runtime.Loader;
public async Task<IDictionary<string, float?>> GetTopLogProbsAsync(
    string prompt, string? userId, int maxTokens, float temperature,
    CancellationToken cancellationToken)
{
    AssemblyLoadContext? loadContext = null;

    try
    {
        // Create a collectible load context (isCollectible: true) so the legacy assembly can be unloaded later.
        loadContext = new AssemblyLoadContext(Guid.NewGuid().ToString(), true);

        var pathToAssembly = Path.Combine(AppDomain.CurrentDomain.BaseDirectory, @"Azure.AI.OpenAI.Legacy.dll");
        Assembly assembly = loadContext.LoadFromAssemblyPath(pathToAssembly);

        var optionsType = assembly.GetType("Azure.AI.OpenAI.CompletionsOptions")
            ?? throw new ApplicationException("Could not get Azure.AI.OpenAI.CompletionsOptions from legacy dll");

        dynamic options = Activator.CreateInstance(optionsType, _deploymentName, new[] { prompt })
            ?? throw new ApplicationException("Could not construct Azure.AI.OpenAI.CompletionsOptions from legacy dll");

        options.DeploymentName = _deploymentName;
        options.User = userId;
        options.LogProbabilityCount = 2;
        options.MaxTokens = maxTokens;
        options.Temperature = temperature;

        var clientType = assembly.GetType("Azure.AI.OpenAI.OpenAIClient")
            ?? throw new ApplicationException("Could not get Azure.AI.OpenAI.OpenAIClient from legacy dll");

        // Fall back to DefaultAzureCredential when no API key is configured.
        dynamic client = (string.IsNullOrEmpty(_openAIKey)
            ? Activator.CreateInstance(clientType, _endpoint, new Azure.Identity.DefaultAzureCredential())
            : Activator.CreateInstance(clientType, _endpoint, new Azure.AzureKeyCredential(_openAIKey)))
            ?? throw new ApplicationException("Could not construct Azure.AI.OpenAI.OpenAIClient from legacy dll");

        var response = await client.GetCompletionsAsync(options, cancellationToken);

        // TopLogProbabilities holds one dictionary of candidate tokens per generated position;
        // [0] is the top candidates (LogProbabilityCount = 2) for the first generated token.
        dynamic choice = response.Value.Choices[0];
        dynamic logprobs = choice.LogProbabilityModel.TopLogProbabilities[0];

        return logprobs;
    }
    finally
    {
        // Unload the context. Unloading is cooperative: it only completes once no live
        // references into the legacy assembly remain and the GC has collected them.
        loadContext?.Unload();
    }
}
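For comparison, this is roughly what the same call looks like when a project can reference 1.0.0-beta.17 directly, without the AssemblyLoadContext detour (a sketch, not the author's code; the field names simply mirror the snippet above):

// Direct 1.0.0-beta.17 equivalent of the reflection-based call above. Field names
// (_deploymentName, _endpoint, _openAIKey) mirror the snippet above.
public async Task<IDictionary<string, float?>> GetTopLogProbsDirectAsync(
    string prompt, string? userId, int maxTokens, float temperature,
    CancellationToken cancellationToken)
{
    var options = new Azure.AI.OpenAI.CompletionsOptions(_deploymentName, new[] { prompt })
    {
        User = userId,
        LogProbabilityCount = 2,
        MaxTokens = maxTokens,
        Temperature = temperature,
    };

    var client = string.IsNullOrEmpty(_openAIKey)
        ? new Azure.AI.OpenAI.OpenAIClient(_endpoint, new Azure.Identity.DefaultAzureCredential())
        : new Azure.AI.OpenAI.OpenAIClient(_endpoint, new Azure.AzureKeyCredential(_openAIKey));

    var response = await client.GetCompletionsAsync(options, cancellationToken);

    // One dictionary of candidate tokens per generated position; take the first position.
    return new Dictionary<string, float?>(
        response.Value.Choices[0].LogProbabilityModel.TopLogProbabilities[0]);
}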
kescherCode commented 2 months ago

@joakimriedel I suppose one further workaround could be to take the source of the last 1.0.0 SDK, rename all namespaces, and push that to a new NuGet package, so that the legacy client is available as a separate package.

All of these are hacky, however, and it is quite disappointing that they are needed at all.