Is your feature request related to a problem?
Streaming requests typically time out around the 5-minute mark on Node.js, even though the load balancer timeout is 2 hours. This causes unnecessary HTTP requests and the potential for lost or missing messages.
The root of the issue is that streaming is implemented as a long-lived HTTP request whose body is sent as newline-delimited JSON, with a new line emitted every time the node receives a message. When there is a gap of more than 5 minutes between messages, the HTTP agent gives up and disconnects, and our ApiClient then automatically reconnects.
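For illustration, consuming such a stream with fetch looks roughly like the sketch below; the endpoint and payload are hypothetical placeholders, not the actual xmtp-js ApiClient code.

```ts
// Illustrative sketch only: read a long-lived HTTP response as
// newline-delimited JSON, yielding one parsed message per line.
const response = await fetch("https://example.com/subscribe", {
  method: "POST",
});

const reader = response.body!.getReader();
const decoder = new TextDecoder();
let buffered = "";

while (true) {
  const { done, value } = await reader.read();
  if (done) break;
  buffered += decoder.decode(value, { stream: true });

  let newline: number;
  while ((newline = buffered.indexOf("\n")) !== -1) {
    const line = buffered.slice(0, newline).trim();
    buffered = buffered.slice(newline + 1);
    if (line) {
      // Each non-empty line is one JSON-encoded message from the node.
      console.log("received", JSON.parse(line));
    }
  }
}
// If more than 5 minutes pass between messages, the read() above fails
// because the HTTP agent times out the idle body, even though the load
// balancer would keep the connection open for up to 2 hours.
```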
Describe the solution to the problem
The following global config change fixes the issue in Node.
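A minimal sketch of such a change, assuming the 5-minute cutoff comes from undici (the HTTP client behind Node's built-in fetch), whose default `headersTimeout` and `bodyTimeout` are 300 seconds:

```ts
// Sketch of a global workaround, assuming undici backs fetch in this
// environment. A timeout of 0 disables undici's 300-second (5-minute)
// defaults, so idle streaming responses are no longer dropped.
import { Agent, setGlobalDispatcher } from "undici";

setGlobalDispatcher(
  new Agent({
    headersTimeout: 0, // time allowed to receive response headers
    bodyTimeout: 0, // time allowed between body chunks (the streaming case)
  })
);
```

The downside is that this changes timeout behavior for every request in the process, which is why a per-request approach is preferable.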
It is possible to set an agent on a per-request basis by defining the `dispatcher` option as part of the fetch options, so that we don't have to modify the user's global HTTP config. We should automatically apply this option when the user is running `xmtp-js` in a Node.js environment, as sketched below.
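A minimal sketch of the per-request approach; the endpoint is a placeholder, not the actual ApiClient subscribe URL:

```ts
// Sketch: apply an undici Agent to a single streaming request via the
// `dispatcher` fetch option, leaving the user's global dispatcher untouched.
import { Agent, fetch } from "undici";

const streamingDispatcher = new Agent({
  headersTimeout: 0,
  bodyTimeout: 0, // allow arbitrarily long gaps between streamed messages
});

// Placeholder endpoint; the real subscribe URL lives in the ApiClient.
const response = await fetch("https://example.com/subscribe", {
  method: "POST",
  dispatcher: streamingDispatcher,
});
```

Node's built-in fetch is the same undici implementation, so it should accept `dispatcher` as well (the standard TypeScript fetch types don't declare it, so a cast or undici's own fetch may be needed). A runtime check such as `typeof process !== "undefined" && Boolean(process.versions?.node)` could gate applying the option to Node.js environments only.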
Describe the use cases for the feature
Any application that uses streaming for long-lived sessions
Additional details
No response