FlowiseAI / Flowise

Drag & drop UI to build your customized LLM flow
https://flowiseai.com
Apache License 2.0

[BUG] Limit number of retries #3226

Open IgorMilavec opened 4 hours ago

IgorMilavec commented 4 hours ago

Describe the bug

The published chat keeps calling /api/v1/prediction/ indefinitely when the call (or response parsing) fails.

To Reproduce

Make the backend fail (shut down the container, ...)

Expected behavior

A retry policy with a limited number of retries and exponential backoff should be implemented.
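
Something along these lines would cap the retries and back off between attempts instead of hammering the backend. This is only a minimal sketch of the idea, not Flowise's actual code; `callPredictionWithRetry`, `MAX_RETRIES`, and `BASE_DELAY_MS` are hypothetical names:

```typescript
// Sketch: call the prediction endpoint with a bounded number of retries
// and exponential backoff. Names and limits are illustrative only.
const MAX_RETRIES = 3
const BASE_DELAY_MS = 500

async function callPredictionWithRetry(predictionUrl: string, body: unknown): Promise<unknown> {
    for (let attempt = 0; attempt <= MAX_RETRIES; attempt++) {
        try {
            const response = await fetch(predictionUrl, {
                method: 'POST',
                headers: { 'Content-Type': 'application/json' },
                body: JSON.stringify(body)
            })
            if (!response.ok) throw new Error(`HTTP ${response.status}`)
            return await response.json()
        } catch (err) {
            // Give up after the last allowed attempt instead of retrying forever
            if (attempt === MAX_RETRIES) throw err
            // Exponential backoff: 500ms, 1000ms, 2000ms, ...
            const delay = BASE_DELAY_MS * 2 ** attempt
            await new Promise((resolve) => setTimeout(resolve, delay))
        }
    }
    throw new Error('Retry loop exited unexpectedly')
}
```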

Screenshots

Flow

Setup

Additional context

hahl9000 commented 3 hours ago

I had the same issue; it cost me some money given the tokens used to reply to the null message.