Open taihuy opened 1 week ago
Thank you for reaching out, @taihuy ! This snippet is from our function calling example linked below, correct? 🔗 https://github.com/openai/openai-dotnet/blob/main/examples/Chat/Example03_FunctionCalling.cs
I just ran the example using the latest version of the library, and it works as expected. Did you make any modifications to the code? If you could share an end-to-end repro, that would be very helpful!
Hi @joseharriaga,
Thanks very much for your quick answer. I really appreciate it. It is even more helpful when working with this kind of innovative technology that is still in beta. I figured out why it happened with my code: it was because I used a data source in addition to tools in my ChatCompletionOptions. When I removed the data source, the code worked well, just like the example.
var searchDataSource = new AzureSearchChatDataSource()
{
    Endpoint = new Uri(searchEndpoint),
    IndexName = searchIndexName,
    Authentication = DataSourceAuthentication.FromApiKey(searchApiKey),
};

var options = new ChatCompletionOptions
{
    Tools = { _projectsInCompanyTool, _timeEntriesInProjectTool, _timeEntriesInCompanyTool, _usersInCompanyTool },
    ToolChoice = ChatToolChoice.Auto
};

#pragma warning disable AOAI001 // Type is for evaluation purposes only and is subject to change or removal in future updates.
// options.AddDataSource(searchDataSource);
#pragma warning restore AOAI001
I got the following warning: "'Azure.AI.OpenAI.AzureChatCompletionOptionsExtensions.AddDataSource(OpenAI.Chat.ChatCompletionOptions, Azure.AI.OpenAI.Chat.AzureChatDataSource)' is for evaluation purposes only and is subject to change or removal in future updates." However, I need both tools and the Azure Search service in my bot. Is there any way that I can have both?
@joseharriaga I have the same issue when using AzureSearchChatDataSource and function calling together. The response is 400 (Bad Request): "Invalid chat message detected: message content must be string". I noticed that it's related to AssistantChatMessage serialization - it works differently depending on the content parameter.
I tried it with the following test code snippet:
var messages = new List<ChatMessage>
{
    new AssistantChatMessage(toolCalls: [], content: null),
    new AssistantChatMessage(toolCalls: [], content: string.Empty),
    new AssistantChatMessage(toolCalls: [], content: "test")
};

var result = await client.CompleteChatAsync(messages);
This is what the request looks like:
{
    "messages": [
        {
            "role": "assistant"
        },
        {
            "role": "assistant",
            "content": [
                {
                    "type": "text",
                    "text": ""
                }
            ]
        },
        {
            "role": "assistant",
            "content": "test"
        }
    ],
    "model": "gpt-4o"
}
- content is null - the content property won't be present in the request.
- content is an empty string - the content property in the request will be an array of objects with "type": "text" and "text": "".
- content is a non-empty string - the content property in the request will be a string.

It looks like Azure OpenAI with a data source works only when the content property is a string, not an array, and it should exist in the request body even when it's null (in this case it should be empty). I'm not sure where exactly this fix should be applied, but I'm wondering whether the serialization logic on the OpenAI SDK side should be updated for the empty-string case to send "content":"" or whether it should remain as "content":[{"type":"text", "text": ""}]?
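If it helps with debugging, the per-message serialization can be inspected locally without sending any request. This is only a sketch: it assumes ChatMessage implements IPersistableModel<ChatMessage> via System.ClientModel (which recent openai-dotnet versions do), so ModelReaderWriter can render each message to JSON:

```csharp
using System;
using System.ClientModel.Primitives;
using OpenAI.Chat;

// Sketch: serialize each AssistantChatMessage variant to JSON to observe
// how the `content` property is emitted, without calling the service.
ChatMessage[] samples =
[
    new AssistantChatMessage(toolCalls: [], content: null),
    new AssistantChatMessage(toolCalls: [], content: string.Empty),
    new AssistantChatMessage(toolCalls: [], content: "test"),
];

foreach (ChatMessage message in samples)
{
    // ModelReaderWriter.Write returns the wire-format JSON as BinaryData.
    BinaryData json = ModelReaderWriter.Write(message);
    Console.WriteLine(json.ToString());
}
```

Printing the three payloads side by side makes the null / empty-string / non-empty-string difference easy to compare against what the Azure endpoint accepts.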
Confirm this is not an issue with the OpenAI Python Library
Confirm this is not an issue with the underlying OpenAI API
Confirm this is not an issue with Azure OpenAI
Describe the bug
I really like the idea of this code snippet, but it seems that client.CompleteChat(messages, options); throws an exception from the server without any specific error (only 400 Bad Request is returned) if we call this method after the tool calls were processed in the previous loop. If we remove the options and only keep the messages, i.e. CompleteChat(messages), then the call after the tool calls works fine.
In some forums, I have seen the suggestion to use CompleteChat with options for the first call, then go through the tools before calling CompleteChat for the last time without options. In that case, how can we include the whole history? For example, when the bot needs a parameter from the user and has to ask a question before it can run the function further? In other words, how can we process tool calls as a chain (not in parallel), so that the output from the first tool can be the input for the next tool?
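For reference, the linked Example03_FunctionCalling.cs handles this by looping: each tool result is appended to the conversation, and CompleteChat is called again with the same messages and options, so the whole history is preserved and one tool's output can feed the model's next tool call. A minimal sketch of that loop (HandleToolCall is a hypothetical helper that dispatches to your own functions; it is not part of the SDK):

```csharp
// Sketch of the function-calling loop from the openai-dotnet examples.
// `client` is a ChatClient, `messages` is the running conversation history,
// and `options` carries the tool definitions.
bool requiresAction;
do
{
    requiresAction = false;
    ChatCompletion completion = client.CompleteChat(messages, options);

    if (completion.FinishReason == ChatFinishReason.ToolCalls)
    {
        // Record the assistant's tool-call turn in the history.
        messages.Add(new AssistantChatMessage(completion));

        // Resolve each tool call and append its result. On the next
        // iteration the model sees these outputs, so it can chain
        // further tool calls based on earlier results.
        foreach (ChatToolCall toolCall in completion.ToolCalls)
        {
            string toolResult = HandleToolCall(toolCall); // hypothetical dispatcher
            messages.Add(new ToolChatMessage(toolCall.Id, toolResult));
        }

        requiresAction = true;
    }
    else
    {
        // Final answer: record it and exit the loop.
        messages.Add(new AssistantChatMessage(completion));
    }
} while (requiresAction);
```

Because the same messages list is passed on every iteration, asking the user a clarifying question mid-chain is just another assistant turn in that history; the open question in this thread is why the same loop fails once a data source is attached to the options.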
To Reproduce
Just run the code snippet and make sure that the model invokes at least one tool. You will see that it throws an exception.
The error I have gotten:
Code snippets
No response
OS
Windows
.NET version
8.0.6
Library version
2.0.0-beta.2