aws / aws-sdk-js-v3

Modularized AWS SDK for JavaScript.
Apache License 2.0

aws-sdk/client-kinesis consuming too much memory under load #6292

Closed. Guru3101 closed this issue 2 months ago

Guru3101 commented 2 months ago

Checkboxes for prior research

Describe the bug

In one of our apps we push our platform feeds to Kinesis. Under some load (3000 to 3500 requests/min at peak) the heap consumption spikes sharply and gradually, and the app crashes. We took a heap dump while memory was spiking, and it looks like the Kinesis SDK is not releasing ClientRequest objects and is holding them in memory, at least when a large volume comes in. Attaching the heap screenshot for reference.

[heap dump screenshot: HEAP]

Around 13K such ClientRequest objects were sitting in memory when we took the heap dump. Here is one such memory spike on our container:

[container memory usage screenshot: MEMORYUSAGE]

Here is a quick snippet of our script:

const { KinesisClient, PutRecordCommand } = require('@aws-sdk/client-kinesis');
const { NodeHttpHandler } = require("@smithy/node-http-handler");
const https = require('https');

// Initialization: keep-alive HTTPS agent with up to 200 concurrent sockets
let kinesis = new KinesisClient({
    region: 'us-east-1',
    retryMode: 'standard',
    requestHandler: new NodeHttpHandler({
        httpsAgent: new https.Agent({
            keepAlive: true,
            maxSockets: 200
        })
    })
});

// pushing a single record to Kinesis
await kinesis.send(new PutRecordCommand(params));

This app consumes feeds from Kafka at roughly 100 req/sec and pushes them to Kinesis, so to match that rate I configured the HTTPS agent with maxSockets: 200 and keepAlive: true to reuse connections.
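For context (not a confirmed fix for this issue): Node's https.Agent queues any request beyond maxSockets, and each queued request keeps its ClientRequest object and payload alive, so if PutRecordCommand calls are issued faster than they complete, that queue and the heap grow together. Below is a minimal sketch of applying backpressure by capping in-flight sends; the sendToKinesis wrapper, the MAX_IN_FLIGHT name, and the limit of 200 are illustrative and not part of the original script.

const { KinesisClient, PutRecordCommand } = require('@aws-sdk/client-kinesis');

const kinesis = new KinesisClient({ region: 'us-east-1' });

// Illustrative cap matching the agent's maxSockets, so no more
// ClientRequest objects exist than can actually be serviced.
const MAX_IN_FLIGHT = 200;
let inFlight = 0;
const waiters = [];

async function sendToKinesis(params) {
    // Wait for a free slot before creating the request object.
    while (inFlight >= MAX_IN_FLIGHT) {
        await new Promise((resolve) => waiters.push(resolve));
    }
    inFlight++;
    try {
        return await kinesis.send(new PutRecordCommand(params));
    } finally {
        inFlight--;
        const next = waiters.shift();
        if (next) next(); // wake one queued caller
    }
}

With a cap like this, slow or throttled Kinesis responses stall the Kafka consumer instead of piling unassigned ClientRequest objects onto the heap.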

SDK version number

@aws-sdk/client-kinesis@3.606.0

Which JavaScript Runtime is this issue in?

Node.js

Details of the browser/Node.js/ReactNative version

node v18.20.4

Reproduction Steps

const { KinesisClient, PutRecordCommand } = require('@aws-sdk/client-kinesis');
const { NodeHttpHandler } = require("@smithy/node-http-handler");
const https = require('https');

// Initialization: keep-alive HTTPS agent with up to 200 concurrent sockets
let kinesis = new KinesisClient({
    region: 'us-east-1',
    retryMode: 'standard',
    requestHandler: new NodeHttpHandler({
        httpsAgent: new https.Agent({
            keepAlive: true,
            maxSockets: 200
        })
    })
});

// pushing a single record to Kinesis
await kinesis.send(new PutRecordCommand(params));
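While reproducing, the documented agent.sockets and agent.requests properties of Node's https.Agent can be logged to check whether requests are piling up behind the socket limit. This is only a diagnostic sketch: the 5-second interval is arbitrary, and agent must be the same instance passed to NodeHttpHandler above.

const https = require('https');

// Same agent instance that is handed to NodeHttpHandler.
const agent = new https.Agent({ keepAlive: true, maxSockets: 200 });

// Log open sockets vs. requests still waiting for a socket; a steadily
// growing queued count would match the ClientRequest buildup in the heap dump.
setInterval(() => {
    const count = (byHost) =>
        Object.values(byHost).reduce((sum, list) => sum + list.length, 0);
    console.log(`open sockets: ${count(agent.sockets)}, queued requests: ${count(agent.requests)}`);
}, 5000);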

Observed Behavior

App crashes due to heavy heap usage


Expected Behavior

Memory consumption to be stable

Possible Solution

No response

Additional Information/Context

No response

aBurmeseDev commented 2 months ago

Hi @Guru3101 - thanks for reaching out. I apologize for the inconvenience you're experiencing.

To better understand and troubleshoot the problem, it would be helpful if you could provide us with more details about your environment and workflow. Specifically, please share the following information:

Having a clear understanding of your setup and the frequency of the issue will greatly assist us in diagnosing the root cause and providing an effective solution. Please provide as much relevant information as possible to help us investigate further.

github-actions[bot] commented 2 months ago

This issue has not received a response in 1 week. If you still think there is a problem, please leave a comment to prevent the issue from automatically closing.

github-actions[bot] commented 1 month ago

This thread has been automatically locked since there has not been any recent activity after it was closed. Please open a new issue for related bugs and link to relevant comments in this thread.