openai / openai-dotnet

The official .NET library for the OpenAI API
https://www.nuget.org/packages/OpenAI
MIT License

Access to all public properties from a service point of view #88

Open rodrigovaras opened 6 days ago

rodrigovaras commented 6 days ago

Our .NET service wants to receive OpenAI REST payloads and then route each request to a different provider-specific service implementation. We were using the Azure OpenAI API (which is now deprecated for this project); that library allowed us to process every property on the chat request and also to construct chat responses. For some reason, the code generation in this project hides everything it can, targeting only the client scenario. We want to use this project, but it's almost impossible. One example:

    public partial class ChatCompletionOptions
    {
        // CUSTOM:
        // - Made internal. This value comes from a parameter on the client method.
        // - Added setter.

        /// <summary>
        /// A list of messages comprising the conversation so far. Example Python code.
        /// Please note <see cref="ChatMessage"/> is the base class. According to the scenario, a derived class of the
        /// base class might need to be assigned here, or this property needs to be casted to one of the possible
        /// derived classes. The available derived classes include <see cref="SystemChatMessage"/>,
        /// <see cref="UserChatMessage"/>, <see cref="AssistantChatMessage"/>, and <see cref="ToolChatMessage"/>.
        /// </summary>
        [CodeGenMember("Messages")]
        internal IList<ChatMessage> Messages { get; set; }
    }

Our service can't access the Messages property, but in theory we can deserialize the payload.
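
For illustration, this is roughly what we would like to be able to write on the service side. The ModelReaderWriter call is just one assumed way to rehydrate the payload, and requestJson is a placeholder for the incoming body; the point is the last part, which does not compile because Messages is internal.

    using System;
    using System.ClientModel.Primitives;
    using OpenAI.Chat;

    // Placeholder: the raw JSON body our endpoint received from a caller.
    string requestJson = "{}"; // in reality, read from the incoming HTTP request

    // Assumption: ChatCompletionOptions can be rehydrated through the
    // System.ClientModel ModelReaderWriter pattern.
    ChatCompletionOptions? chatOptions =
        ModelReaderWriter.Read<ChatCompletionOptions>(BinaryData.FromString(requestJson));

    // This is what we actually need and cannot do today:
    // 'Messages' is internal, so code outside the library cannot enumerate the conversation.
    foreach (ChatMessage message in chatOptions!.Messages) // does not compile: 'Messages' is inaccessible
    {
        // inspect each message and route the request accordingly
    }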

Bottom line: we are forced to use the Betalgo OpenAI library.

Any plan to support a service scenario?

joseharriaga commented 6 days ago

Thank you for reaching out, @rodrigovaras! I would like to ask a few clarifying questions:

  1. Are you saying that you want strongly-typed classes that map directly to the body parameters of each request? What about other types of parameters such as path and/or query parameters?
  2. Are you using the clients in the OpenAI library too, or are you only using the classes that represent requests and responses?
rodrigovaras commented 6 days ago

1) We want type-safe payloads; there is no need for path or query parameters. 2) No clients are used (for now).

We could eventually use the client-type classes for one scenario where the provider is itself an external OpenAI service to which we route the call; we currently have our own way to route the JSON payload in that scenario. For now we are not planning to use the client APIs, only the 'model' classes, to properly deserialize a JSON payload and then produce a response depending on which provider the request hits (mostly based on the model name).
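
To make that routing concrete, the composite roughly does something like the following. This is a simplified sketch: IOpenAIServiceProvider and the CanHandle check are illustrative names, not our actual implementation.

    using System;
    using System.Collections.Generic;
    using System.Linq;

    // Illustrative abstraction over the per-provider services (names are hypothetical).
    public interface IOpenAIServiceProvider
    {
        bool CanHandle(string model);
    }

    public class OpenAIServiceComposite
    {
        private readonly IReadOnlyList<IOpenAIServiceProvider> _providers;

        public OpenAIServiceComposite(IReadOnlyList<IOpenAIServiceProvider> providers)
            => _providers = providers;

        // Pick the provider that claims the model name carried in the request payload.
        public IOpenAIServiceProvider GetOpenAIServiceProvider(string model)
            => _providers.FirstOrDefault(p => p.CanHandle(model))
               ?? throw new InvalidOperationException($"No provider registered for model '{model}'.");
    }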

Here is a code snippet from our minimal API. We would love to replace the type ChatCompletionCreateRequest (from the Betalgo package) with this project's ChatCompletionOptions.

    // Open AI services support
    var chatApi = app.MapGroup("/v1/chat/completions");
    chatApi.MapPost("/", async (OpenAIServiceComposite openAIServiceComposite, HttpRequest request, ILogger<ChatApi> logger, CancellationToken cancellationToken) =>
    {

        ChatCompletionCreateRequest chatRequest;
        try
        {
            chatRequest = await request.ReadFromJsonAsync<ChatCompletionCreateRequest>(cancellationToken) ?? throw new BadHttpRequestException("JSON can't be null");
        }
        catch (Exception ex)
        {
            logger.LogError(ex, invalidOpenAIJsonMessage);
            return Results.Problem(
                statusCode: StatusCodes.Status500InternalServerError,
                title: invalidOpenAIJsonMessage,
                detail: ex.Message);
        }

        try
        {
            if (chatRequest.Stream == true)
            {
                return openAIServiceComposite.GetOpenAIServiceProvider(chatRequest.Model!).HandleStreamRequest(chatRequest, cancellationToken);
            }

            var result = await openAIServiceComposite.HandleChatCompletionRequestAsync(chatRequest, cancellationToken);
            return Results.Text(result.ToJson(), contentType: MediaTypeNames.Application.Json);
        }
        catch (Exception ex)