Closed — Brayan233 closed this issue 3 months ago
You have increased the temperature setting, leading to more random output, and you are using a model that's weaker for tool calls (`gpt-4o`) in JSON mode. This is expected to happen, because the model might produce invalid JSON output in such a setting.
I tried this example with `gpt-3.5-turbo` and a temperature of 0, and I'm still receiving `AI_JSONParseError`. Even the example copy/pasted from https://sdk.vercel.ai/examples/next-app/basics/generating-object produces something similar. Is this expected?
Please try without `mode: 'json'` (then it uses tool calling with the OpenAI provider, which is often more robust). For better tool-calling results, check out the prompt engineering with tools tips.
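For reference, a minimal sketch of what dropping `mode: 'json'` looks like with `generateObject` — the joke schema and route shape here are illustrative assumptions, not code from this thread:

```typescript
// Hypothetical sketch: generateObject WITHOUT mode: 'json', so the
// AI SDK's OpenAI provider falls back to tool calling under the hood.
import { generateObject } from "ai";
import { openai } from "@ai-sdk/openai";
import { z } from "zod";

export async function POST() {
  const { object } = await generateObject({
    model: openai("gpt-3.5-turbo"),
    // note: no `mode: 'json'` here — the provider picks tool calling
    schema: z.object({
      setup: z.string(),
      punchline: z.string(),
    }),
    prompt: "Tell me a good joke.",
  });
  return Response.json(object);
}
```

Since this calls the OpenAI API, it needs `OPENAI_API_KEY` set and can't be verified offline; treat it as a starting point rather than a confirmed fix.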
Thanks for the reply. No difference for me, even if I use `generateText`, for example. The error message is odd, too:

```
cause: JSONParseError [AI_JSONParseError]: JSON parsing failed: Text: �webpack/lib/util/registerExternalSerializer�webpack-sources/RawSource�__webpack_require__.r(__webpack_exports__);
```

Clearly something is off given that it's parsing webpack output 😅. All I'm trying is:
```ts
// imports...
export async function GET(request: NextRequest) {
  const { text } = await generateText({
    model: openai("gpt-3.5-turbo"),
    prompt: "tell me a good joke.",
    temperature: 0,
  });
  console.log({ text });
  return new NextResponse();
}
```
I'm simply running this code within an API route. I must be doing something incorrectly. Just not sure what.
Problem solved ✅

Solution: changing the API route from a `GET` to a `POST`.

I'm guessing there were weird caching issues from marking the route as a `GET` that go away when switching to a `POST` (the Next.js App Router can statically cache `GET` Route Handlers by default, which would explain stale output). Hope this helps anyone who stumbles across this issue.
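If anyone wants to keep the route as a `GET`, a hedged alternative sketch: the `dynamic` route segment config is standard Next.js, but that it resolves this particular error is my assumption based on the caching theory above:

```typescript
// app/api/joke/route.ts (hypothetical path)
// Keep the handler as GET, but opt the Route Handler out of
// Next.js static caching via the route segment config option.
export const dynamic = "force-dynamic";

export async function GET() {
  // ...call generateText here as in the snippet above...
  return new Response("ok");
}
```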
Actual error:

The function returns an "unterminated string" error at position 166 in the JSON data.
Possible Cause

The model may include characters such as quotation marks or parentheses that are not correctly escaped, leading to parsing errors. This could be an issue with how the model's output is formatted or escaped before it is parsed as JSON.
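This failure mode is easy to reproduce directly — a minimal illustration using plain `JSON.parse` (not the SDK's parser) with an unescaped inner quotation mark:

```javascript
// A model response where an inner quotation mark was not escaped.
const bad = '{"joke": "She said "hello" to me"}';

let parseError = null;
try {
  JSON.parse(bad);
} catch (err) {
  parseError = err; // SyntaxError pointing at the stray quote
}
console.log(parseError instanceof SyntaxError); // true

// The same payload with the inner quotes properly escaped parses fine.
const good = '{"joke": "She said \\"hello\\" to me"}';
console.log(JSON.parse(good).joke); // She said "hello" to me
```

This is why tool-calling mode often fares better: the provider returns arguments as structured data rather than free-form text that must be re-parsed.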