mwaarna closed this issue 9 years ago.
It may just be the bottleneck of the single connection per broker that kafka-node uses, which cannot keep up with the number of incoming requests. You can start several server processes load-balanced behind a proxy, then keep adding processes if you need more TPS.
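A minimal sketch of that multi-process approach using Node's built-in cluster module, assuming the HTTP server and producer logic live in a hypothetical ./app.js (the proxy in front is not shown):

var cluster = require('cluster');
var numWorkers = require('os').cpus().length;

if (cluster.isMaster) {
  // master: fork one worker per CPU; each worker gets its own kafka-node client
  for (var i = 0; i < numWorkers; i++) {
    cluster.fork();
  }
  cluster.on('exit', function (worker) {
    console.log('worker %d died, restarting', worker.process.pid);
    cluster.fork();
  });
} else {
  // worker: start the actual server/producer (hypothetical entry point)
  require('./app');
}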
@haio can probably tell the exact reason why kafka-node would consume so much memory.
Just had a look at your code; that can't explain 1 GB+ of memory being used by kafka-node. I made a sample test that writes data to an 8-partition topic using HighLevelProducer, and the memory used is always less than 90 MB. The code:
var kafka = require('kafka-node');
var HighLevelProducer = kafka.HighLevelProducer;
var Client = kafka.Client;

var client = new Client();
var topic = 'test-topic-2';
var count = 100 * 1024, rets = 0;
var producer = new HighLevelProducer(client);
// ~4 KB payload, read once and reused for every send
var data = require('fs').readFileSync('4k.txt').toString();

producer.on('ready', function () {
  // queue one send every 5 ms
  setInterval(function () {
    send(data);
  }, 5);
});

function send(message) {
  producer.send([
    { topic: topic, messages: [message] }
  ], function (err, data) {
    if (err) console.log(err);
    if (rets % 1000 === 0) console.log('sent %d', rets);
    if (++rets === count) process.exit();
  });
}
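For anyone comparing numbers, heap and RSS during a run like this can be sampled with Node's process.memoryUsage(); a small addition to the test (not part of the original) would be:

// log resident set size and heap usage in MB every 5 seconds
setInterval(function () {
  var mem = process.memoryUsage();
  console.log('rss: %d MB, heapUsed: %d MB',
    Math.round(mem.rss / 1048576),
    Math.round(mem.heapUsed / 1048576));
}, 5000);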
The issue was not in Kafka.
Having the same problem. Can you explain what the issue was?
@mwaarna Where was the issue?
Same here, @mwaarna what was the issue?
What kind of memory usage is everyone seeing with kafka-node?
I am sending 4 KB messages in bulk (100k+) to kafka-node and I am seeing RAM usage climb rapidly to 1 GB+.
Testing on Ubuntu 12.04 and Windows 7 64-bit, both running Node v0.12.0.
Kafka version 0.8.1.1
Sample Node.js code to simulate:
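(The original snippet is not included here. A minimal sketch of such an app, assuming an HTTP endpoint that forwards each ~4 KB request body to Kafka via kafka-node's Producer, with a hypothetical topic name and port:)

var http = require('http');
var kafka = require('kafka-node');

var client = new kafka.Client();        // defaults to ZooKeeper on localhost:2181
var producer = new kafka.Producer(client);
var topic = 'test';                     // hypothetical topic name

producer.on('ready', function () {
  http.createServer(function (req, res) {
    var body = '';
    req.on('data', function (chunk) { body += chunk; });
    req.on('end', function () {
      // forward the request body (~4 KB) to Kafka
      producer.send([{ topic: topic, messages: [body] }], function (err) {
        res.end(err ? 'error' : 'ok');
      });
    });
  }).listen(3000);                      // hypothetical port
});

producer.on('error', function (err) {
  console.log(err);
});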
Simple Classic ASP to hammer the Node app: