Confirm this is a feature request for the Python library and not the underlying OpenAI API.
[X] This is a feature request for the Python library
Describe the feature or improvement you're requesting
Why the current implementation of Assistants function calling is cumbersome
I feel the implementation of function calls in Python is quite cumbersome with the Assistants API. If I understand the documentation correctly, the current approach requires developers to handle threads, runs, and tool outputs manually. Each step, from starting conversations and defining tools to gathering and submitting outputs, involves multiple function calls. This complexity makes the process error-prone and less intuitive for developers.
To just get started with an assistant (no function calls yet), we need to define four things:
- OpenAI client
- Assistant
- Thread
- Message
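To make the setup cost concrete, here is a sketch of those four steps as I understand them from the docs (the assistant name, instructions, model, and message content are placeholders, not values from the library):

```python
from openai import OpenAI

client = OpenAI()  # 1. the client (reads OPENAI_API_KEY from the environment)

assistant = client.beta.assistants.create(  # 2. the assistant
    name="Math Tutor",
    instructions="You answer math questions.",
    model="gpt-4o",
)

thread = client.beta.threads.create()  # 3. the thread

message = client.beta.threads.messages.create(  # 4. the first message
    thread_id=thread.id,
    role="user",
    content="What is 2 + 2?",
)
```

That is four separate objects and three API calls before a single run has even started.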
To enable function calls, we additionally need to wrap the function results in yet another construct, `tool_outputs_stream`. I feel there could be a better design out there, so I asked ChatGPT what a better implementation could look like.
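To illustrate the manual dispatch-and-collect step that developers currently write by hand, here is a simulated version of that loop. The `required_action` payload below is a hand-built dict that mimics the shape of the tool calls a run returns; in real code it would come from the Run object, and the resulting list would then be submitted back through the runs API:

```python
import json

def get_weather(city: str) -> str:
    # Placeholder tool implementation for the example.
    return f"Sunny in {city}"

TOOLS = {"get_weather": get_weather}

# Simplified stand-in for what a run hands back when it requires tool outputs.
required_action = {
    "tool_calls": [
        {"id": "call_1",
         "function": {"name": "get_weather",
                      "arguments": json.dumps({"city": "Paris"})}},
    ]
}

# Today the developer must do this dispatch and collection by hand:
tool_outputs = []
for call in required_action["tool_calls"]:
    fn = TOOLS[call["function"]["name"]]
    args = json.loads(call["function"]["arguments"])
    tool_outputs.append({"tool_call_id": call["id"], "output": fn(**args)})

print(tool_outputs)
# → [{'tool_call_id': 'call_1', 'output': 'Sunny in Paris'}]
```

Every application that uses function calls ends up re-implementing some version of this loop.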
Desired improvement
I suggest introducing a unified "Assistant with Function Call Capability" object. This single object would manage all aspects of function-based interactions, offering one interface for setting up an assistant, starting conversations, handling messages, collecting function outputs, and submitting results.
- Simplified API usage: By centralizing interactions within a single object, the new design reduces setup complexity and minimizes errors.
- Encapsulation: The assistant object internally manages tool calls and states, eliminating the need for manual thread and run management.
- Maintainability: The modular design is easy to extend and modify.
- Better abstraction: It abstracts away technical complexities, letting developers focus on writing function logic instead of managing the intricate mechanics of the SDK.
This improvement would provide a cleaner, more developer-friendly API, making it easier to integrate function calls and boosting productivity in OpenAI-based applications.
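The encapsulation point above can be sketched as follows. Everything here is hypothetical (the class name, `register`, and `handle_tool_calls` are invented for illustration and are not part of the SDK); only the tool-call payload shape mirrors the real API, and the network plumbing is omitted:

```python
import json
from typing import Callable, Dict, List

class AssistantWithFunctionCalls:
    """Hypothetical wrapper that would own the assistant, thread,
    and run plumbing internally."""

    def __init__(self) -> None:
        self._tools: Dict[str, Callable] = {}

    def register(self, fn: Callable) -> Callable:
        # Decorator: expose a plain Python function as a tool by name.
        self._tools[fn.__name__] = fn
        return fn

    def handle_tool_calls(self, tool_calls: List[dict]) -> List[dict]:
        # Dispatch each call and package the outputs internally, so the
        # caller never builds `tool_outputs` dicts by hand.
        outputs = []
        for call in tool_calls:
            fn = self._tools[call["function"]["name"]]
            args = json.loads(call["function"]["arguments"])
            outputs.append({"tool_call_id": call["id"], "output": str(fn(**args))})
        return outputs

assistant = AssistantWithFunctionCalls()

@assistant.register
def add(a: int, b: int) -> int:
    return a + b

# Simulated tool calls, shaped like the API payload:
calls = [{"id": "c1", "function": {"name": "add",
                                   "arguments": json.dumps({"a": 2, "b": 2})}}]
print(assistant.handle_tool_calls(calls))
# → [{'tool_call_id': 'c1', 'output': '4'}]
```

From the developer's point of view, the entire function-calling surface shrinks to "register a function, start a conversation".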
I have asked ChatGPT to generate a mockup `AssistantWithFunctionCalls`. Please let me know if there is interest in bringing this feature in.
Additional context
No response