filippozanfini opened this issue 4 weeks ago
Can you reproduce this with just the code from inside your tool? (i.e., without the AI SDK)?
In the code that I see, `resp` will always be truthy, so `resp.json()` will always execute.
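To illustrate the point above: once `fetch` resolves, the response object itself is always truthy, so a guard has to look at `response.ok` (or the status code) rather than the object. A minimal sketch, not code from the thread (`safeJson` is an illustrative name):

```typescript
// Sketch: a Response object is always truthy once fetch resolves,
// so `if (resp)` never catches HTTP errors. Check resp.ok instead.
async function safeJson(resp: Response): Promise<unknown | null> {
  if (!resp.ok) {
    // HTTP-level failure (4xx/5xx): don't try to parse the body as JSON
    return null;
  }
  return resp.json();
}
```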
@lgrammel I tried to make the call inside a useEffect and everything works correctly. The data is returned
Can you provide more context, i.e. are you using streamText or generateText? Is this from the frontend with useChat or in the backend?
@lgrammel I have a similar issue that I posted in the discussions. Willing to provide more context if needed: https://github.com/vercel/ai/discussions/3350
`streamText` with `useChat`
Most likely a timing issue with in-flight tool calls. Will investigate.
Fixed in ai@3.4.25
@lgrammel Unfortunately I am still seeing this issue after updating to the latest. It will make the call and I can see the data response but the `result` never makes it back.
https://github.com/vercel/ai/discussions/3350
Hm, are you disabling your submit button in the frontend while the backend response is arriving? This might happen when you send again from the frontend before the backend response has fully arrived.
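The failure mode described here (a second submit firing while the previous response is still streaming) can be sketched generically; `SubmitGuard` is a hypothetical helper for illustration, not part of the AI SDK — in a `useChat` frontend the equivalent is disabling the button while `isLoading` is true:

```typescript
// Hypothetical sketch: refuse to send a new message while one is in flight.
class SubmitGuard {
  private inFlight = false;

  // Returns true if the send was allowed, false if one is still pending.
  trySubmit(send: () => void): boolean {
    if (this.inFlight) return false;
    this.inFlight = true;
    send();
    return true;
  }

  // Call when the backend response has fully arrived.
  complete(): void {
    this.inFlight = false;
  }
}
```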
I am not sending again before it completes. It just seems to happen for me with any async tools I attempt to use that make a call. Here is an example where I simply just pass the data through so I can format it in a title component off the tool name and it works fine:
```ts
export const titleTool = tool({
  description: 'Provide a title for the current step',
  parameters: z.object({
    title: z.string(),
  }),
  execute: async ({ title }) => {
    return {
      title,
    };
  },
});
```
Thank you for the help on this! 💯
I just double-checked with the tool call reference example, injecting a 5 second delay into the server-side tool call. It works as expected. Can you confirm the version of the ai packages that you are using? Is your tool call client or server side?
https://github.com/vercel/ai/tree/main/examples/next-openai/app/api/use-chat-tools
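The delay injection mentioned above can be reproduced without the SDK; a sketch of wrapping a tool's async work with an artificial pause to provoke timing issues (function names are illustrative, not from the reference example):

```typescript
// Sketch: add an artificial delay before running an async tool body,
// useful for reproducing timing issues with in-flight tool calls.
const delay = (ms: number): Promise<void> =>
  new Promise((resolve) => setTimeout(resolve, ms));

async function executeWithDelay<T>(fn: () => Promise<T>, ms: number): Promise<T> {
  await delay(ms);
  return fn();
}
```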
The call is server side and not from the client. I just attempted again and it logs the `formattedData` in the tool but then it looks like it is never streamed back to the `messages` in `useChat`. I attempted again and it just hangs. I try to send the next message to the agent and that is when it sends the error. It looks like it just never completes the flow.
```ts
const { messages, input, handleInputChange, handleSubmit, isLoading, error, append } = useChat({
  keepLastMessageOnError: true,
  api: '/api/chat/route',
});
```
current packages:

```json
"@ai-sdk/google": "^0.0.55",
"@ai-sdk/google-vertex": "^0.0.42",
"@ai-sdk/react": "^0.0.68",
"ai": "^3.4.27",
```
The tool with the API call:
```ts
export const patientBasicInfoTool = tool({
  description: 'Fetch basic information for a patient using their ID',
  parameters: z.object({
    patientId: z.string().describe('The ID of the patient to fetch basic information for'),
  }),
  execute: async ({ patientId }) => {
    try {
      // Use the internal API endpoint for fetching patient basic info
      const response = await axios.get(`http://localhost:3000/api/patient/${patientId}`);
      console.log('patientBasicInfoTool response', response);
      const patientData = response.data;
      // Extract and format the basic information
      const formattedData = {
        ...
      };
      // I CAN SEE THIS LOGGED AFTER THE CALL 👇
      console.log('formattedData --->', formattedData);
      // IT NEVER SENDS RESULTS BACK ?? 👇
      return {
        ...formattedData,
        message: 'Here is the basic information for the patient',
      };
    } catch (error) {
      console.error('Error fetching patient basic information:', error);
      return {
        message: 'Failed to fetch patient basic information. Are you sure the patient ID is correct?',
      };
    }
  },
});
```
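The formatting step elided above (`...`) presumably derives fields such as `initials` from the raw API data; a hypothetical sketch of one such derivation, matching the `name`/`initials` pair that shows up in the logs later in this thread:

```typescript
// Hypothetical helper for the elided formatting step: derive initials
// from a full name, e.g. 'Mario764 Weston546 Medhurst46' -> 'MWM'.
function toInitials(name: string): string {
  return name
    .split(/\s+/)
    .filter((part) => part.length > 0)
    .map((part) => part[0].toUpperCase())
    .join('');
}
```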
my route:
```ts
export default async function POST(req: NextRequest) {
  try {
    const { messages } = await req.json();
    const result = await streamText({
      model: google('gemini-1.5-pro', { structuredOutputs: false }),
      system: system,
      messages: convertToCoreMessages(messages),
      tools: {
        title: titleTool,
        fetchPatientBasicInfo: patientBasicInfoTool,
      },
      maxSteps: 100,
    });
    return result.toDataStreamResponse();
  } catch (error) {
    console.error('Error in chat API:', error);
    return new Response(error, { status: 500 });
  }
}
```
(all mock data)
@daynewright can you try using `onChunk` on `streamText` to log the chunks? I'm wondering if the tool result shows up there.
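`onChunk` is essentially a tap on the stream; the same debugging idea can be sketched generically for any async-iterable stream of chunks (this is an illustration of the concept, not the SDK's internals):

```typescript
// Generic sketch: pass chunks through unchanged while logging each one,
// similar in spirit to streamText's onChunk callback.
async function* tapChunks<T>(
  stream: AsyncIterable<T>,
  log: (chunk: T) => void,
): AsyncGenerator<T> {
  for await (const chunk of stream) {
    log(chunk); // observe the chunk (e.g. console.log it) ...
    yield chunk; // ... then forward it downstream untouched
  }
}
```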
Yep! It is making it to the `onChunk`:

```
formattedData ---> {
  id: '03f12f9e-fd3e-b845-502a-3a12511d9e48',
  name: 'Mario764 Weston546 Medhurst46',
  initials: 'MWM',
  status: 'Existing Patient',
  sex: 'Male',
  dob: '2024-01-16',
  age: 0,
  address: '600 Denesik Alley Suite 34, Smith Mills, MA 00000',
  phone: '555-398-3801',
  race: 'White',
  ethnicity: 'Hispanic or Latino',
  language: 'English (United States)',
  maritalStatus: 'Never Married'
}
```
Tool result:

```json
{
  "type": "tool-result",
  "toolCallId": "nNKzU2j",
  "toolName": "fetchPatientBasicInfo",
  "args": {
    "patientId": "03f12f9e-fd3e-b845-502a-3a12511d9e48"
  },
  "result": {
    "id": "03f12f9e-fd3e-b845-502a-3a12511d9e48",
    "name": "Mario764 Weston546 Medhurst46",
    "initials": "MWM",
    "status": "Existing Patient",
    "sex": "Male",
    "dob": "2024-01-16",
    "age": 0,
    "address": "600 Denesik Alley Suite 34, Smith Mills, MA 00000",
    "phone": "555-398-3801",
    "race": "White",
    "ethnicity": "Hispanic or Latino",
    "language": "English (United States)",
    "maritalStatus": "Never Married",
    "message": "Here is the basic information for the patient"
  }
}
```
Okay, then next we need to check whether it makes it into the stream and arrives client-side. You can check this in the Chrome network tab, see screenshot:
It is. It is showing in the response.
Hm, that's strange. What is showing up in your UI? Is the `toolInvocation` on the last two assistant messages containing the result? (Trying to figure out where in the whole client/server roundtrip loop it's failing.)
It doesn't seem to be getting to the `useChat` hook. I am not even seeing the last `toolInvocation` with a `result`. I am not sure if it is how I am structuring the response back from the `tool`. The `messages` object never gets the data from what I can tell.
You seem to have three assistant steps on the backend. There might be an issue with how useChat handles this. I'll investigate - it might be 1-2 days until I can take a closer look.
no problem! I appreciate the help. If I can provide more details or test different scenarios just let me know.
When I was trying to reproduce, I noticed that you have error parts (starting with `3:`). Can you send the error messages to the client (they get suppressed by default for security reasons): https://sdk.vercel.ai/docs/ai-sdk-ui/chatbot#error-messages
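The linked docs describe forwarding error messages to the client; the core of that is turning an unknown thrown value into a safe string. A sketch of such an extractor (the function name is illustrative, and this is the general pattern rather than the SDK's exact implementation):

```typescript
// Sketch: normalize an unknown thrown value into a client-safe string,
// the kind of function you would supply when enabling error forwarding
// (see the linked docs for the actual option).
function errorMessage(error: unknown): string {
  if (error == null) return 'unknown error';
  if (typeof error === 'string') return error;
  if (error instanceof Error) return error.message;
  return JSON.stringify(error);
}
```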
@lgrammel same here after updating to latest version 😞
I am not sure of the impact of this, but I was able to get it to partially work.
I modified my prompt to simplify it and then moved to a different LLM. I was using `gemini-1.5-pro` and it always failed. Moved to `gemini-1.5-flash` and it worked better. Also, adjusting my prompt to be less complex helped. No idea why these changes impacted things to allow some to succeed, but I am still dealing with things just getting "lost".
I attempted to add the error messaging to the stream and did not get anything when it failed. 🤷‍♂️
I feel the debugging/visibility of tool call parse failures is not good, leading to issues that seem unrelated but aren't.
What would have helped you identify that this is in fact an LLM / tool call issue earlier?
@lgrammel any update on this?
Description

If I make a fetch call to the APIs inside the tool, the tool for some reason incorrectly executes `await resp.json()`, returning an empty object.

Code example

No response

Additional context

No response