openai / openai-dotnet

The official .NET library for the OpenAI API
https://www.nuget.org/packages/OpenAI/AbsoluteLatest
MIT License

Cannot call CompleteChatAsync with options after ToolChatMessage in messages of type List<ChatMessage> #218

Open taihuy opened 1 week ago

taihuy commented 1 week ago

Confirm this is not an issue with the OpenAI Python Library

Confirm this is not an issue with the underlying OpenAI API

Confirm this is not an issue with Azure OpenAI

Describe the bug

I really like the idea of this code snippet, but client.CompleteChat(messages, options); throws an exception from the server without any specific error details (only 400 Bad Request is returned) if we call this method after tool calls were processed in the previous loop iteration. If we remove options and only pass the messages, i.e. CompleteChat(messages), then the call after the tool calls works fine.

do
{
    requiresAction = false;
    ChatCompletion chatCompletion = client.CompleteChat(messages, options);

    switch (chatCompletion.FinishReason)
    {
        // ...
    }
} while (requiresAction);

In some forums, I see suggestions to call CompleteChat with options the first time, then process the tool calls, and finally call CompleteChat one last time without options. In that case, how can we include the whole history? For example, when the bot needs a parameter from the user and has to ask a question before it can run the function further? In other words, how can we process tool calls as a chain (not in parallel), so that the output of the first tool can be the input of the next tool?
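For reference, the loop in the function calling example handles chaining by appending both the assistant's tool-call message and each ToolChatMessage back into messages before iterating again, so the full history (including any clarifying questions and earlier tool outputs) is always sent. A sketch along those lines, where RunTool is a hypothetical dispatcher for your own tools:

```csharp
// Sketch based on the official function calling example; RunTool is a
// hypothetical helper that executes whichever tool the model requested.
bool requiresAction;
do
{
    requiresAction = false;
    ChatCompletion chatCompletion = client.CompleteChat(messages, options);

    switch (chatCompletion.FinishReason)
    {
        case ChatFinishReason.ToolCalls:
            // Keep the assistant's tool-call request in the history.
            messages.Add(new AssistantChatMessage(chatCompletion));

            foreach (ChatToolCall toolCall in chatCompletion.ToolCalls)
            {
                // Feed each tool's output back as a ToolChatMessage. Because
                // the result is appended to `messages`, the next iteration's
                // model call sees it and can chain into another tool call.
                string toolResult = RunTool(toolCall); // hypothetical
                messages.Add(new ToolChatMessage(toolCall.Id, toolResult));
            }

            requiresAction = true; // loop again so the model can continue
            break;

        case ChatFinishReason.Stop:
            messages.Add(new AssistantChatMessage(chatCompletion));
            break;
    }
} while (requiresAction);
```

Because every iteration resends the accumulated messages list, tool calls naturally form a chain rather than requiring a separate "final" call without options.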

To Reproduce

Just run the code snippet and make sure that the model invokes at least one tool. You will see that it throws an exception.

do
{
    requiresAction = false;
    ChatCompletion chatCompletion = client.CompleteChat(messages, options);

    switch (chatCompletion.FinishReason)
    {
        // ...
    }
} while (requiresAction);

The error I have gotten:

System.ClientModel.ClientResultException: Service request failed.
      Status: 400 (Bad Request)

         at Azure.AI.OpenAI.ClientPipelineExtensions.ProcessMessageAsync(ClientPipeline pipeline, PipelineMessage message, RequestOptions options)
         at Azure.AI.OpenAI.Chat.AzureChatClient.CompleteChatAsync(BinaryContent content, RequestOptions options)
         at OpenAI.Chat.ChatClient.CompleteChatAsync(IEnumerable`1 messages, ChatCompletionOptions options, CancellationToken cancellationToken)

Code snippets

No response

OS

Windows

.NET version

8.0.6

Library version

2.0.0-beta.2

joseharriaga commented 1 week ago

Thank you for reaching out, @taihuy ! This snippet is from our function calling example linked below, correct? 🔗 https://github.com/openai/openai-dotnet/blob/main/examples/Chat/Example03_FunctionCalling.cs

I just ran the example using the latest version of the library, and it works as expected. Did you make any modifications to the code? If you could share an end-to-end repro, that would be very helpful!

taihuy commented 1 week ago

Hi @joseharriaga,

Thanks very much for your quick answer. I really appreciate it; it is even more helpful when working with this kind of innovative technology that is still in beta. I figured out why it happened with my code: it was because I used a data source in addition to tools in my ChatCompletionOptions. When I removed the data source, the code worked fine, just like the example.

var searchDataSource = new AzureSearchChatDataSource()
{
    Endpoint = new Uri(searchEndpoint),
    IndexName = searchIndexName,
    Authentication = DataSourceAuthentication.FromApiKey(searchApiKey),
};

var options = new ChatCompletionOptions
{
    Tools = { _projectsInCompanyTool, _timeEntriesInProjectTool, _timeEntriesInCompanyTool, _usersInCompanyTool },
    ToolChoice = ChatToolChoice.Auto
};

#pragma warning disable AOAI001 // Type is for evaluation purposes only and is subject to change or removal in future updates.
// options.AddDataSource(searchDataSource);
#pragma warning restore AOAI001

I got the following warning: "'Azure.AI.OpenAI.AzureChatCompletionOptionsExtensions.AddDataSource(OpenAI.Chat.ChatCompletionOptions, Azure.AI.OpenAI.Chat.AzureChatDataSource)' is for evaluation purposes only and is subject to change or removal in future updates." However, I need both tools and Azure AI Search in my bot. Is there any way I can have both?

dmytrostruk commented 1 week ago

@joseharriaga I have the same issue when using AzureSearchChatDataSource and function calling together. The response is 400 (Bad Request): "Invalid chat message detected: message content must be string."

I noticed that it's related to AssistantChatMessage serialization: it works differently depending on the content parameter.

I tried it with the following test code snippet:

var messages = new List<ChatMessage>
{
    new AssistantChatMessage(toolCalls: [], content: null),
    new AssistantChatMessage(toolCalls: [], content: string.Empty),
    new AssistantChatMessage(toolCalls: [], content: "test")
};

var result = await client.CompleteChatAsync(messages);

This is what the request looks like:

{
   "messages":[
      {
         "role":"assistant"
      },
      {
         "role":"assistant",
         "content":[
            {
               "type":"text",
               "text":""
            }
         ]
      },
      {
         "role":"assistant",
         "content":"test"
      }
   ],
   "model":"gpt-4o"
}
  1. When content is null, the content property won't be present in the request.
  2. When content is an empty string, the content property in the request will be an array of objects with "type": "text" and "text": "".
  3. When content is a non-empty string, the content property in the request will be a string.

It looks like Azure OpenAI with a data source only works when the content property is a string, not an array, and the property should exist in the request body even when it's null (in which case it should be an empty string). I'm not sure where exactly this fix should be applied, but I'm wondering whether the serialization logic on the OpenAI SDK side should be updated so that the empty-string case sends "content":"" instead of "content":[{"type":"text", "text": ""}].
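If it helps in triaging, the three serialization shapes can be inspected locally without calling the service. Assuming the message types implement IJsonModel&lt;T&gt; as the beta SDK's generated models do, ModelReaderWriter can dump each variant for comparison against the request bodies above (a sketch, not a confirmed diagnostic path):

```csharp
using System.ClientModel.Primitives;
using OpenAI.Chat;

// Sketch: serialize each AssistantChatMessage variant locally to compare
// the wire formats described above, without hitting the service.
var variants = new AssistantChatMessage[]
{
    new AssistantChatMessage(toolCalls: [], content: null),
    new AssistantChatMessage(toolCalls: [], content: string.Empty),
    new AssistantChatMessage(toolCalls: [], content: "test"),
};

foreach (AssistantChatMessage message in variants)
{
    BinaryData json = ModelReaderWriter.Write(message);
    // Compare each dump with the request bodies shown above.
    Console.WriteLine(json.ToString());
}
```

Note the exact local output may differ slightly from the final request body (the client may post-process empty tool-call lists), but it should make the null / empty-string / non-empty-string divergence easy to reproduce in a unit test.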