vercel / ai-chatbot

A full-featured, hackable Next.js AI chatbot built by Vercel
https://chat.vercel.ai

Connection closed error on Vercel #291

Open · angelhodar opened this issue 3 months ago

angelhodar commented 3 months ago

Hi! I forked the repo to test the new AI SDK 3.0, and when deployed to Vercel it was crashing, showing a client-side Application error after a few seconds of streaming the chat response, while locally it was working fine.

Looking at the browser console, it seems it was a Connection closed error from the server, so I assumed it was a serverless function timeout. I noticed that the runtime for chat/[id]/page.tsx is no longer edge, as it was in the previous chatbot version. I added these 2 lines (the second one optional) to the page and now it's no longer timing out:

export const runtime = 'edge'
export const preferredRegion = 'home'
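
For reference, here is a minimal sketch of how these exports sit at the top of the page file; the path and the component body are illustrative assumptions, not the repo's actual code:

// app/(chat)/chat/[id]/page.tsx (illustrative sketch only)
export const runtime = 'edge'          // run this route on the Edge runtime
export const preferredRegion = 'home'  // optional: keep it in the project's default region

export default async function ChatPage({ params }: { params: { id: string } }) {
  // ...load the chat for params.id and render it (omitted)...
  return <div>Chat {params.id}</div>
}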

Any reason why it was removed in this new version? I would like confirmation just in case I am missing something. Thanks in advance!!

nikohann commented 3 months ago

The duration limit for Vercel functions on the Hobby plan may be the cause.

https://vercel.com/docs/functions/limitations

athrael-soju commented 3 months ago

@angelhodar Did you have a response that streamed longer than 10 seconds?

angelhodar commented 3 months ago

> @angelhodar Did you have a response that streamed longer than 10 seconds?

I didn't measure it, but yes, more or less. The moment I switched to the edge runtime the problem was solved, and since it wasn't failing locally I assume the function timeout was the issue, but since nobody else has mentioned it in the issues I am a bit confused...

TimchaStudio commented 3 months ago

I have the same question

AireWong commented 3 months ago

I also have the same question

AireWong commented 3 months ago

Has anyone solved it? Website: https://oai-chatbot-ui.vercel.app

Ask 5 questions quickly and the error will appear.


angelhodar commented 3 months ago

@AireWong Have you tried adding the 2 lines of code from my first message to the chat/[id]/page.tsx route?

AireWong commented 3 months ago

@angelhodar I still encounter the same issue even when deploying through your repository.

Ask multiple questions quickly and a connection error will occur. My website: https://oai-chat.vercel.app/ The KV database instance is located in Singapore. My fork repository: https://github.com/AireWong/ai-chatbot

angelhodar commented 3 months ago

@AireWong Oh, I have tried it now and it's giving the connection error again, lol. It was working perfectly a few days ago... Maybe someone from Vercel can give us some guidance?

AireWong commented 3 months ago

Many users have encountered the same issue. Can any genius solve it?

AmmarByFar commented 3 months ago

Feels like I've tried everything: adding the 'edge' runtime, the maxDuration setting, nothing is working. I even upgraded to Pro for a longer execution time and still can't seem to run the streaming chat for more than 15-20 seconds...

feliche93 commented 3 months ago

I am also currently running into the same issue 😬 For me it only happens when I set it to:

export const runtime = 'edge'
export const preferredRegion = 'fra1'

Leaving it as a serverless function, it all works as expected 😬

AireWong commented 3 months ago

@feliche93 this doesn't work for me.

https://github.com/AireWong/ai-chatbot/commit/bcc4aa71405f7dfa9c5b6dc35d8127b62087918c

olivermontes commented 2 months ago

I tried this in my case and it works. Long LLM responses were limited to 15s on my Vercel Pro account.

vercel.json (at the project root):

{
  "functions": {
    "app/[domain]/chat/[id]/page.tsx": {
      "maxDuration": 60
    }
  }
}
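
Note that the key under "functions" has to match your own route file; the app/[domain]/... path above is specific to that project's layout. If you prefer to keep the setting next to the route itself, the App Router's segment config export does the same thing, roughly:

// In the page (or route handler) that does the streaming: sketch only
export const maxDuration = 60 // seconds, capped by what your plan allows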


eb379 commented 2 months ago

If the issue isn't intermittent for you, I recommend double-checking that you have credits on your OpenAI account, as per the comment here: https://github.com/vercel/ai-chatbot/issues/309#issuecomment-2072048781

It took a couple of minutes before the billing settings updated once I had loaded credit there.

angelhodar commented 2 months ago

> If the issue isn't intermittent for you, I recommend double-checking that you have credits on your OpenAI account, as per the comment here: https://github.com/vercel/ai-chatbot/issues/309#issuecomment-2072048781
>
> It took a couple of minutes before the billing settings updated once I had loaded credit there.

Finally, I have moved to Groq with the Llama 3 model and it goes so fast that there is no opportunity to fail hahaha. I don't think billing was the problem.

themataleao commented 2 months ago

If you do not have a Pro plan, the timeout is limited to 10s.

What worked for me is setting the following exports (I bought the Pro account):

export const runtime = 'edge'
export const preferredRegion = 'home'
export const maxDuration = 300

in app/(chat)/chat/[id]/page.tsx and app/(chat)/page.tsx.

You can add the exports to other routes if needed.
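
In particular, the route that actually streams the model response is a natural place for them. A minimal sketch, assuming an App Router handler built on the AI SDK's OpenAIStream / StreamingTextResponse helpers; the path, model, and handler body are assumptions, not the repo's exact code:

// app/api/chat/route.ts (path assumed)
import OpenAI from 'openai'
import { OpenAIStream, StreamingTextResponse } from 'ai'

export const runtime = 'edge'  // Edge runtime avoids the serverless timeout entirely
export const maxDuration = 60  // seconds; applies when the route runs as a serverless function

const openai = new OpenAI({ apiKey: process.env.OPENAI_API_KEY })

export async function POST(req: Request) {
  const { messages } = await req.json()

  // Request a streamed completion and pipe it straight back to the client.
  const response = await openai.chat.completions.create({
    model: 'gpt-3.5-turbo',
    stream: true,
    messages,
  })

  return new StreamingTextResponse(OpenAIStream(response))
}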

As @angelhodar pointed out, reducing inference time is always a good idea.

TimchaStudio commented 2 months ago

The old version does not have this issue: https://next.oaiui.com

angelhodar commented 1 month ago

Hey everyone, Vercel has increased the Hobby plan function duration limit to 60s, so this issue should be solved: https://twitter.com/vercel_changes/status/1788600830649639092?t=wcRiZtxoWsuch6zD_EVMsw&s=19

Can you verify?