vercel / ai

Build AI-powered applications with React, Svelte, Vue, and Solid
https://sdk.vercel.ai/docs

`useObject` doesn't work properly when the LLM throws an error #2795

Open Andriy-Kulak opened 2 months ago

Andriy-Kulak commented 2 months ago

Description

There are 2 issues with useObject when using it in conjunction with streamObject on the Next.js API layer.

  1. If the LLM (e.g. GPT-4o) throws an error such as: "This model's maximum context length is 128000 tokens. However, your messages resulted in 154265 tokens (154107 in the messages, 158 in the functions). Please reduce the length of the messages or functions.", streamObject will throw this error on the server, but on the front-end we just receive an empty error object.

In the front-end example below, error becomes an empty object that doesn't carry this message, so I don't know what the back-end error is.

  const { object, submit, isLoading, error } = useObject({
    api: "/api/use-object",
    schema: reportCardSchema,
  });
  2. When I try to stop the loading state by calling stop() during an error state with the code below, nothing happens:
  useEffect(() => {
    if (error) {
      toast.error(JSON.stringify(error));
      stop();
    }
  }, [error]);

Basically, this functionality is difficult to use in error states, and there are no examples in the docs on how to deal with them.
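Until the SDK forwards server errors to useObject directly, one workaround is to wrap the streamObject call in the route handler with try/catch and return the error message in the body with a non-200 status, so the front-end at least has something to inspect. This is a sketch under assumptions, not SDK-provided behavior; `handle` and `runStream` are hypothetical names, where `runStream` stands in for the real `streamObject(...).toTextStreamResponse()` call:

```typescript
// Workaround sketch (assumption, not part of the AI SDK): catch errors thrown
// by the streaming call and surface the message to the client instead of
// letting the front-end receive an empty error object.
async function handle(runStream: () => Promise<Response>): Promise<Response> {
  try {
    // In the real route handler this would be something like:
    //   (await streamObject({ model, schema: reportCardSchema, prompt }))
    //     .toTextStreamResponse()
    return await runStream();
  } catch (e) {
    // Errors raised before streaming starts (e.g. the context-length error
    // quoted above) land here; return the message so the client can show it.
    const message = e instanceof Error ? e.message : "Unknown error";
    return new Response(message, { status: 500 });
  }
}
```

Note that this only helps for errors thrown before the stream begins; errors that occur mid-stream would still need support from the SDK itself.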

Code example

ai npm version 3.3.9. You can use the API example from the docs and trigger an LLM error, e.g. by going over the max context tokens: https://sdk.vercel.ai/examples/next-pages/basics/streaming-object-generation

Additional context

No response

lgrammel commented 2 months ago

https://github.com/vercel/ai/pull/2877 should help with some of this