openai / openai-dotnet

The official .NET library for the OpenAI API
https://www.nuget.org/packages/OpenAI
MIT License

Difficulty Using Function Calling in OpenAI Batch API #245

Open mehrdad-tat opened 2 weeks ago

mehrdad-tat commented 2 weeks ago

Confirm this is a feature request for the .NET library and not the underlying OpenAI API

Describe the feature or improvement you are requesting

I am trying to implement function calling with the OpenAI Batch API, as mentioned in the documentation here. The documentation states that function calling is supported in the Chat Completions API, the Assistants API, and the Batch API. However, I am having trouble getting it to work with the Batch API.

Despite following the guidelines provided, I have not found a way to use function calling within the Batch API. If anyone has experience or insights regarding this functionality, I would greatly appreciate your assistance.


Additional context

I have reviewed the examples provided for function calling in the Chat Completions API and Assistants API, but I haven't found similar examples for the Batch API. My goal is to implement a conversational assistant that can process multiple requests in a batch while utilizing function calling to retrieve and respond with real-time data.

I am particularly interested in understanding the specific requirements or limitations when using function calling with the Batch API. If there are any differences in implementation compared to the other APIs, that information would be very helpful.
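For context, the shape of input I am assuming is the JSONL format described for the Batch API, where each line wraps a normal Chat Completions request body, so a tools array would go inside that body just as it would for a direct call. Below is a minimal sketch of building one such line; the model name, tool definition, and file name are placeholder values, not something confirmed to work:

```csharp
using System;
using System.IO;
using System.Text.Json;

// Sketch: build one JSONL line for a Batch API input file.
// The "body" is the same payload a direct Chat Completions call would take,
// including a "tools" array describing a callable function.
var requestLine = new
{
    custom_id = "request-1",
    method = "POST",
    url = "/v1/chat/completions",
    body = new
    {
        model = "gpt-4o-mini", // placeholder model name
        messages = new object[]
        {
            new { role = "user", content = "What is the weather in Paris right now?" }
        },
        tools = new object[]
        {
            new
            {
                type = "function",
                function = new
                {
                    name = "get_current_weather", // hypothetical tool
                    description = "Get the current weather for a city.",
                    parameters = new
                    {
                        type = "object",
                        properties = new { city = new { type = "string" } },
                        required = new[] { "city" }
                    }
                }
            }
        }
    }
};

// Each request becomes one line of the .jsonl file uploaded to the Batch API.
File.AppendAllText("batch_input.jsonl", JsonSerializer.Serialize(requestLine) + Environment.NewLine);
```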

HavenDV commented 1 week ago

I think the Batch API here is more about delayed execution of a large volume of requests (completed within 24 hours, at a discount) than about the fast parallel processing you seem to have in mind, which other APIs handle. Beyond that, it is just the same series of messages, which can contain function calls and the results of those calls, so I don't see any difference in how function calling is used.
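In other words, when a batch response comes back with a tool call, you would run the tool locally and then submit a follow-up request (or a second batch) that carries the whole conversation, including the assistant's tool call and the tool result, as the message history. A rough sketch of such a follow-up line, where the tool call id and the weather payload are made-up example values:

```csharp
using System;
using System.IO;
using System.Text.Json;

// Sketch of the follow-up turn: the assistant's tool call from the first batch
// output and the locally produced tool result are appended to the message list,
// exactly as they would be in a regular Chat Completions conversation.
var followUpLine = new
{
    custom_id = "request-1-followup",
    method = "POST",
    url = "/v1/chat/completions",
    body = new
    {
        model = "gpt-4o-mini", // placeholder model name
        messages = new object[]
        {
            new { role = "user", content = "What is the weather in Paris right now?" },
            new
            {
                role = "assistant",
                content = (string?)null,
                tool_calls = new object[]
                {
                    new
                    {
                        id = "call_abc123", // example id copied from the batch output
                        type = "function",
                        function = new { name = "get_current_weather", arguments = "{\"city\":\"Paris\"}" }
                    }
                }
            },
            new
            {
                role = "tool",
                tool_call_id = "call_abc123",
                content = "{\"temperature_c\":18,\"conditions\":\"partly cloudy\"}" // example tool result
            }
        }
    }
};

File.AppendAllText("batch_followup.jsonl", JsonSerializer.Serialize(followUpLine) + Environment.NewLine);
```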