vercel / ai

Build AI-powered applications with React, Svelte, Vue, and Solid
https://sdk.vercel.ai/docs

JSON Parsing Error in generateObject Function with Unterminated String #1921

Closed — Brayan233 closed this issue 3 months ago

Brayan233 commented 3 months ago

Description

Steps to Reproduce

  1. Set up the environment with Node.js and necessary dependencies.
  2. Use the generateObject function with the demo code from https://sdk.vercel.ai/docs/reference/ai-sdk-core/generate-object#generateobject
  3. Run the script.

Code example

import { openai } from '@ai-sdk/openai';
import { generateObject } from 'ai';
import { z } from 'zod';
import dotenv from 'dotenv';

dotenv.config();

const { object } = await generateObject({
  model: openai('gpt-4o'),
  mode: "json",
  max_tokens: 1000000000,
  temperature: 0.5,
  schema: z.object({
    recipe: z.object({
      name: z.string(),
      ingredients: z.array(z.string()),
      steps: z.array(z.string()),
    }),
  }),
  prompt: 'Generate a lasagna recipe.',
  log: true,
});

console.log(JSON.stringify(object, null, 2));

Additional context

Actual error:

The function returns an unterminated string error at position 166 in the JSON data:

JSONParseError [AI_JSONParseError]: JSON parsing failed: Text: {
  "recipe": {
    "name": "Classic Lasagna",
    "ingredients": [
      "1 pound ground beef",
      "1 onion, chopped",
      "2 cloves garlic, minced",
      "1 (.
Error message: Unterminated string in JSON at position 166

Possible Cause

The model might include characters like quotation marks or parentheses that are not correctly escaped, leading to parsing errors. This could be an issue with how the model's output is formatted or escaped before parsing into JSON.
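The failure above can be reproduced with plain JSON.parse on output that is cut off mid-string, which is what a truncated or unescaped model response looks like. A minimal sketch (the literal below is an abridged, single-line version of the model output shown in the error, not the exact bytes):

```typescript
// Sketch: JSON.parse fails on model output that ends inside an open string,
// mirroring the AI_JSONParseError surfaced by generateObject.
const truncated =
  '{"recipe": {"name": "Classic Lasagna", "ingredients": ["1 pound ground beef", "1 (';

let parseError = '';
try {
  JSON.parse(truncated);
} catch (err) {
  // Recent Node versions report something like "Unterminated string in JSON
  // at position N"; older versions report "Unexpected end of JSON input".
  parseError = err instanceof Error ? err.message : String(err);
}

console.log(parseError);
```

This shows the error originates from incomplete JSON text itself, before any zod schema validation runs.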

lgrammel commented 3 months ago

You have increased the temperature setting, which makes the output more random, and you are using a model that's weaker at tool calls (gpt-4o) in JSON mode. In that configuration, the model producing invalid JSON output is expected.

pruett commented 3 months ago

I tried this example with gpt-3.5-turbo and a temperature of 0, and I'm still receiving AI_JSONParseError. Even the example copied verbatim from https://sdk.vercel.ai/examples/next-app/basics/generating-object produces something similar.

Is this expected?

lgrammel commented 3 months ago

Please try without mode: json (then it uses tool calling w/ the OpenAI provider, which is often more robust). For better tool calling results, check out the prompt engineering with tools tips.

pruett commented 3 months ago

Thanks for the reply. No difference for me, even if I use generateText, for example. The error message is odd, too:

  cause: JSONParseError [AI_JSONParseError]: JSON parsing failed: Text: �webpack/lib/util/registerExternalSerializer�webpack-sources/RawSource�__webpack_require__.r(__webpack_exports__);

Clearly something is off given that it's parsing webpack 😅. All I'm trying is:

export async function GET(request: NextRequest) {
  // imports...

  const { text } = await generateText({
    model: openai("gpt-3.5-turbo"),
    prompt: "tell me a good joke.",
    temperature: 0,
  });

  console.log({ text });

  return new NextResponse();
}

I'm simply running this code within an API route. I must be doing something incorrectly. Just not sure what.

pruett commented 3 months ago

Problem solved ✅

Solution: Changing the API route from a GET to a POST

I'm guessing there were weird caching issues from marking the route as a GET that go away when switching to a POST. Hope this helps anyone who might stumble across this issue.
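If the GET behavior really was caching-related, a possible alternative (untested here, and assuming the Next.js App Router) is to keep the GET handler and opt the route out of static caching with a route segment config:

```typescript
// Hypothetical alternative to switching the handler to POST: force the
// route to run on every request instead of being statically cached
// (Next.js App Router route segment config).
export const dynamic = 'force-dynamic';
```

This is a sketch of the standard Next.js opt-out, not something verified against this specific issue.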