anthropics / anthropic-sdk-typescript

Access to Anthropic's safety-first language model APIs
https://www.npmjs.com/package/@anthropic-ai/sdk
MIT License

"Error: [object Object]" during message streaming when error is via an SSE (cause/detail not accessible) #346

Closed paulcalcraft closed 3 days ago

paulcalcraft commented 5 months ago

When hitting an error during async iteration of an anthropic.messages.create() stream, the exception raised doesn't carry any detail; the associated error object just has e.cause.message set to "[object Object]".

An example error SSE received during streaming:

{event: 'error', data: '{"type":"error","error":{"type":"overloaded_error","message":"Overloaded"}              }', raw: Array(2)}
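For reference, the data field of that event is valid JSON despite the trailing padding, so the error type is fully recoverable at the point the SDK receives it (a quick sketch using plain JSON.parse, not SDK code):

```javascript
// The SSE `data` payload from above parses cleanly; whitespace between
// JSON tokens (here, before the closing brace) is legal.
const sse = {
  event: 'error',
  data: '{"type":"error","error":{"type":"overloaded_error","message":"Overloaded"}              }',
};

const errJSON = JSON.parse(sse.data);
console.log(errJSON.error.type);    // "overloaded_error"
console.log(errJSON.error.message); // "Overloaded"
```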

The error SSE is then thrown using APIError.generate here: https://github.com/anthropics/anthropic-sdk-typescript/blob/ad92b0d536508954ee8b6c83e82bca30eefeb298/src/streaming.ts#L95

The errJSON is correctly passed to generate, but because status isn't set (this is an SSE, not an HTTP response), the SDK falls back to castToError() and raises an APIConnectionError with no other info: https://github.com/anthropics/anthropic-sdk-typescript/blob/ad92b0d536508954ee8b6c83e82bca30eefeb298/src/error.ts#L52

castToError just returns new Error(errJSON): https://github.com/anthropics/anthropic-sdk-typescript/blob/ad92b0d536508954ee8b6c83e82bca30eefeb298/src/core.ts#L977

But errJSON is a plain object without a useful toString, so the resulting cause Error has the message "[object Object]" and no other properties. This means you can't inspect or handle the underlying error when catching it during async iteration.
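The core of the problem can be reproduced in isolation (a minimal sketch, not SDK code): passing a plain object to the Error constructor coerces it to a string via Object.prototype.toString, discarding every property.

```javascript
// What castToError effectively does with the SSE payload:
const errJSON = { type: 'error', error: { type: 'overloaded_error', message: 'Overloaded' } };

const cause = new Error(errJSON); // message is String(errJSON)
console.log(cause.message);       // "[object Object]"

// Serializing first would preserve the payload instead:
const better = new Error(JSON.stringify(errJSON));
console.log(better.message);      // JSON string carrying the full error detail
```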

An example error:

anthropic stream error Error: Connection error.
    at APIError.generate (\node_modules\@anthropic-ai\sdk\error.mjs:32:20)
    at Stream.iterator (\node_modules\@anthropic-ai\sdk\streaming.mjs:73:40)
    at process.processTicksAndRejections (\lib\internal\process\task_queues.js:95:5)
    at async [Symbol.asyncIterator] (\src\routes\api\oracle\+server.js:205:38)
...
    at async initiateResponse (/src/routes/api/oracle/+server.js:1410:17) {status: undefined, headers: undefined, error: undefined, cause: Error: [object Object]
    at castToError (fil…/node_modules/@anthr…, stack: 'Error: Connection error.
    at APIError.gene…/src/routes/api/oracle/+server.js:1410:17)', …}

And where I'm catching it:

try {
    for await (const messageStreamEvent of response) {
        // ...
    }
} catch (e) {
    console.error("anthropic stream error", e) // e.cause.message == "[object Object]"
}
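As a defensive workaround in that catch block, callers can attempt to recover structured detail from e.cause.message, which may be a JSON string (if the SDK ever serializes the payload) or the unhelpful "[object Object]". A sketch (the helper name is mine, not part of the SDK):

```javascript
// Best-effort extraction of the underlying API error detail from an
// APIConnectionError's cause. Returns the parsed object, or null when
// the message is not JSON (e.g. "[object Object]").
function extractCauseDetail(e) {
  try {
    return JSON.parse(e?.cause?.message ?? '');
  } catch {
    return null; // message was not JSON; nothing more to recover
  }
}

// Usage inside the catch block above:
// catch (e) {
//   console.error("anthropic stream error", e, extractCauseDetail(e)?.error?.type)
// }
```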

Would it be possible to format the error correctly so that callers can at least identify the error type by inspecting .cause on the APIConnectionError?

Thanks for any help. Also happy to submit a PR if there's agreement on the best way to surface the error detail in the APIConnectionError object.

rattrayalex commented 5 months ago

Thanks for the report, we'll take a look!

ryanblock commented 4 months ago

Just a heads up, we have also been able to replicate this issue. This is running within a Lambda; the error occurs after a few hundred tokens. An example prompt which seems to replicate this for us is: Please send the first 10 paragraphs of Alice's Adventures in Wonderland by Lewis Carroll (which is in the public domain).

APIConnectionError: Connection error.
    at Function.generate (file:///var/task/node_modules/@anthropic-ai/sdk/error.mjs:32:20)
    at Stream.iterator (file:///var/task/node_modules/@anthropic-ai/sdk/streaming.mjs:52:40)
    ... 2 lines matching cause stack trace ...
    at async MessageStream._createMessage (file:///var/task/node_modules/@anthropic-ai/sdk/lib/MessageStream.mjs:113:26) {
  status: undefined,
  headers: undefined,
  error: undefined,
  cause: Error: [object Object]
      at castToError (file:///var/task/node_modules/@anthropic-ai/sdk/core.mjs:682:12)
      at Function.generate (file:///var/task/node_modules/@anthropic-ai/sdk/error.mjs:32:52)
      at Stream.iterator (file:///var/task/node_modules/@anthropic-ai/sdk/streaming.mjs:52:40)
      at runMicrotasks (<anonymous>)
      at processTicksAndRejections (node:internal/process/task_queues:96:5)
      at async MessageStream._createMessage (file:///var/task/node_modules/@anthropic-ai/sdk/lib/MessageStream.mjs:113:26)
}
beeirl commented 4 months ago

Running into the exact same issue here when running it on Vercel using Vercel AI SDK.

rattrayalex commented 4 months ago

FWIW, my guess is that this is due to Vercel timing out your handler, but I agree the error message being hard to read makes this worse. @RobertCraigie care to ticket?

paulcalcraft commented 4 months ago

Thanks! I'm on a Pro plan with Vercel with 5-minute timeouts, so I don't think that's actually the case for me.

ryanblock commented 4 months ago

@rattrayalex fwiw, and as I mentioned above, we have seen this error in plain old AWS Lambda, and have observed that it is not related to Lambda timeouts. (Just for my own edification, what's the relationship here with @stainless-api?)

rattrayalex commented 3 months ago

Gotcha, that's helpful. We'll try to look into this, but a repro script would be very helpful. Can anyone share one?

what's the relationship here with https://github.com/stainless-api?

I work at Stainless, which Anthropic uses to build their SDKs.

greg84 commented 3 weeks ago

I am seeing this too.

I'm running a NextJS app locally. Just casually chatting with my app, it throws this error maybe every 5–10 requests. The app has been working fine with Together AI's API (via the OpenAI SDK) using Llama 3 and 3.1 over the last few months. Since swapping over to Anthropic I'm seeing this intermittent issue.

This is the output when the error is thrown:


APIConnectionError: Connection error.
    at APIError.generate (file:///Users/path/to/app/node_modules/@anthropic-ai/sdk/error.mjs:33:20)
    at Stream.iterator (file:///Users/path/to/app/node_modules/@anthropic-ai/sdk/streaming.mjs:52:40)
    ... 11 lines matching cause stack trace ...
    at async handleRequest (/Users/path/to/app/node_modules/next/dist/server/lib/router-server.js:353:24)
    at async requestHandlerImpl (/Users/path/to/app/node_modules/next/dist/server/lib/router-server.js:377:13) {
  status: undefined,
  headers: undefined,
  request_id: undefined,
  error: undefined,
  cause: Error: [object Object]
      at castToError (file:///Users/path/to/app/node_modules/@anthropic-ai/sdk/core.mjs:695:12)
      at APIError.generate (file:///Users/path/to/app/node_modules/@anthropic-ai/sdk/error.mjs:33:52)
      at Stream.iterator (file:///Users/path/to/app/node_modules/@anthropic-ai/sdk/streaming.mjs:52:40)
      at process.processTicksAndRejections (node:internal/process/task_queues:95:5)
      at async handler (webpack-internal:///(api)/./pages/api/chat.ts:269:30)
      at async K (/Users/path/to/app/node_modules/next/dist/compiled/next-server/pages-api.runtime.dev.js:21:2946)
      at async U.render (/Users/path/to/app/node_modules/next/dist/compiled/next-server/pages-api.runtime.dev.js:21:3827)
      at async DevServer.runApi (/Users/path/to/app/node_modules/next/dist/server/next-server.js:554:9)
      at async NextNodeServer.handleCatchallRenderRequest (/Users/path/to/app/node_modules/next/dist/server/next-server.js:266:37)
      at async DevServer.handleRequestImpl (/Users/path/to/app/node_modules/next/dist/server/base-server.js:791:17)
      at async /Users/path/to/app/node_modules/next/dist/server/dev/next-dev-server.js:331:20
      at async Span.traceAsyncFn (/Users/path/to/app/node_modules/next/dist/trace/trace.js:151:20)
      at async DevServer.handleRequest (/Users/path/to/app/node_modules/next/dist/server/dev/next-dev-server.js:328:24)
      at async invokeRender (/Users/path/to/app/node_modules/next/dist/server/lib/router-server.js:174:21)
      at async handleRequest (/Users/path/to/app/node_modules/next/dist/server/lib/router-server.js:353:24)
}

greg84 commented 3 weeks ago

One option would be to serialize the object as JSON for use as the error message. Not an ideal fix, but at least we'd be able to see what the error is.
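For illustration, a hypothetical patched castToError along those lines (structure assumed; not the actual SDK source):

```javascript
// Sketch of a castToError that preserves the SSE error payload instead
// of collapsing it to "[object Object]".
function castToError(err) {
  if (err instanceof Error) return err;
  if (typeof err === 'object' && err !== null) {
    try {
      // Keep the structured detail available via the message.
      return new Error(JSON.stringify(err));
    } catch {
      // Fall through for circular structures etc.
    }
  }
  return new Error(String(err));
}
```

With this, the cause on an APIConnectionError would carry the serialized payload, so a caller could JSON.parse(e.cause.message) and read error.type.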

jbergs-dsit commented 2 weeks ago

We're also seeing this issue (using .messages.stream()) – is this still on the roadmap to be fixed?

rattrayalex commented 2 weeks ago

@greg84 @jbergs-dsit (or anyone else on this thread) could you please provide a codesandbox or similar which reproduces the error?

ryanblock commented 2 weeks ago

It should replicate for you using this repo (see comment above): https://github.com/beginner-corp/claude-begin-demo

Note: you don't need to deploy to Begin to replicate; just run the local sandbox with npm start.

rattrayalex commented 2 weeks ago

Thank you @ryanblock, we'll take a look soon!

greg84 commented 2 weeks ago

I have not been able to consistently reproduce this. It happens when the API returns an error to a streaming response. We have seen it during times of instability, where the API was returning 500 or overloaded errors.

Please read the original comment from paulcalcraft; it describes exactly what is happening. We just need to extract some useful detail from errJSON before the error is thrown.

rattrayalex commented 6 days ago

EDIT: we're working on a fix for this internally.

ryanblock commented 6 days ago

@rattrayalex that appears to be a private repo?

jbergs-dsit commented 3 days ago

~@RobertCraigie can you elaborate a bit on how the error has been fixed? Is e.g. the Connection Error not occurring anymore, or is the error it throws now processable by the castToError function?~ EDIT: answered by commit reference

RobertCraigie commented 3 days ago

Sorry, it looks like this was closed prematurely, before this commit was pushed.

RobertCraigie commented 3 days ago

This fix was released in v0.27.3, really sorry for the delay here!