Closed mohamedcherifmo closed 3 months ago
Hi @mohamedcherifmo, this scenario is possible. Check this example: https://github.com/vercel/ai/blob/main/examples/next-openai/app/function-calling/page.tsx
To use the new Assistants API, you can use useAssistant. This implementation is experimental, as some features such as streaming are not yet available in the OpenAI API:
https://github.com/vercel/ai/blob/main/examples/next-openai/app/assistant/page.tsx
Thanks @tgonzales. So I presume you mean I would need to invoke the assistants from the client, correct?
My function call would fall back to the client for the tools, and the client would then have a functionCallHandler that in turn uses useAssistant?
Did I get that right?
I tried to do that, but I'm not sure how to invoke useAssistant's submitMessage now, given that submitMessage relies on a DOM event (its first line is e.preventDefault).
@mohamedcherifmo good point re the event, see https://github.com/vercel/ai/pull/776
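A sketch of the idea behind that workaround (a simplification, not the SDK's actual implementation): submitMessage only touches the event via preventDefault, so an object with a no-op preventDefault can stand in for a real form event. The submitMessage below is a hypothetical stand-in, not the hook's real code.

```typescript
// Minimal shape of the event that submitMessage actually uses.
type FormLikeEvent = { preventDefault: () => void };

// Hypothetical stand-in for useAssistant's submitMessage.
let sent: string | null = null;
function submitMessage(e: FormLikeEvent, message: string) {
  e.preventDefault(); // the line that made a DOM event seem required
  sent = message; // the real hook would POST to the assistant route here
}

// Invoke it programmatically, with no DOM node involved:
submitMessage({ preventDefault: () => {} }, "hello");
```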
Does it support streaming?
Please check out our new tool calling with AI SDK Core and useChat: https://sdk.vercel.ai/docs/ai-sdk-ui/chatbot-with-tool-calling#example-server-side-tool-execution-with-roundtrips
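The linked roundtrip pattern can be modeled in a self-contained way. This is only a sketch of the control flow: fakeModel and toolImpls are stand-ins I've invented for illustration; the real implementation uses streamText on the server and useChat with maxToolRoundtrips on the client.

```typescript
type Message = { role: "user" | "assistant" | "tool"; content: string };

// Fake model: requests the weather tool once, then answers from its result.
function fakeModel(history: Message[]): { toolCall?: string; text?: string } {
  const hasToolResult = history.some((m) => m.role === "tool");
  return hasToolResult ? { text: "It is sunny." } : { toolCall: "getWeather" };
}

// Server-side tool implementations (hypothetical).
const toolImpls: Record<string, () => string> = {
  getWeather: () => "sunny",
};

// Loop: call the model; if it asks for a tool, execute it server-side,
// append the result, and call the model again, up to maxRoundtrips.
function runWithRoundtrips(user: string, maxRoundtrips = 5): Message[] {
  const history: Message[] = [{ role: "user", content: user }];
  for (let i = 0; i < maxRoundtrips; i++) {
    const step = fakeModel(history);
    if (step.text !== undefined) {
      history.push({ role: "assistant", content: step.text });
      return history; // final text answer reached
    }
    if (step.toolCall) {
      history.push({ role: "tool", content: toolImpls[step.toolCall]() });
    }
  }
  return history;
}
```

So a single user message can trigger a tool execution and still end in a normal assistant reply, all in one request/response cycle.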
Feature Description
Provide the ability for useChat to output responses from chats, tools, and assistants all in one go.
Use Case
To fully provide "agent" capabilities, I imagine a scenario where I have a useChat that can either respond directly or use a tool. The tool itself can be a function call or can invoke an assistant.
So the idea is: the user chats with the bot. Once a message is received, a function/tool call determines the best-fit function, if one is available. These functions are then either invoked directly in code, or they go through a step where they invoke an assistant to do the task on their behalf.
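The routing described above can be sketched as a small dispatch table. Everything here is hypothetical (the tool names, the assistantId, and the return strings are placeholders); in a real app the assistant branch would call the OpenAI Assistants API rather than return a label.

```typescript
// A tool either runs directly in code or delegates to an assistant.
type ToolHandler =
  | { kind: "direct"; run: (args: Record<string, unknown>) => string }
  | { kind: "assistant"; assistantId: string };

// Hypothetical registry of tools the model can pick from.
const tools: Record<string, ToolHandler> = {
  getWeather: { kind: "direct", run: (a) => `sunny in ${String(a.city)}` },
  bookTrip: { kind: "assistant", assistantId: "asst_123" }, // placeholder id
};

function routeToolCall(name: string, args: Record<string, unknown>): string {
  const tool = tools[name];
  if (!tool) return `no tool named ${name}`;
  if (tool.kind === "direct") return tool.run(args);
  // Real code would create a run against the Assistants API here.
  return `delegated to assistant ${tool.assistantId}`;
}
```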
Additional context
No response