SOHU-Co / kafka-node

Node.js client for Apache Kafka 0.8 and later.

EventEmitter memory leak when the producer is offline. #1231

Open fungiboletus opened 5 years ago

fungiboletus commented 5 years ago

Sample

Run the following code without a Kafka broker running.

const kafka = require('kafka-node');

// Create a client and producer while no broker is listening.
const kafkaClient = new kafka.KafkaClient();
const kafkaProducer = new kafka.Producer(kafkaClient);
kafkaProducer.on('ready', () => {
  console.log('ready');
}).on('error', (err) => {
  console.error(err);
});

// Attempt one send per second; each send queues up while the broker is offline.
let i = 0;
setInterval(() => {
  i += 1;
  kafkaProducer.send([{
    topic: 'abcd',
    messages: [`abcd ${i}`],
  }], (error, data) => {
    console.log(error, data);
  });
}, 1000);

Wait a bit and you should see this warning:

MaxListenersExceededWarning: Possible EventEmitter memory leak detected. 11 localhost:9092-ready listeners added. Use emitter.setMaxListeners() to increase limit.

aikar commented 5 years ago

This is not a problem nor a leak. EventEmitter just has a conservative default limit on how many listeners it expects a single event to need, but the way kafka-node uses the emitter this is a perfectly valid case.

The only way to avoid this false warning is to stop using event emitters for the queue and use an array of pending requests.
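
For illustration, here is a minimal sketch of that pattern written against the public kafka-node API (the names pendingSends and sendWhenReady are invented for this example; this is not how the library is implemented internally). A single 'ready' listener drains an array of queued requests, so the listener count stays constant no matter how many sends pile up while offline:

const kafka = require('kafka-node');

const client = new kafka.KafkaClient();
const producer = new kafka.Producer(client);

// Queued requests waiting for the broker, instead of one listener each.
const pendingSends = [];
let ready = false;

// One 'ready' listener total, regardless of how many sends are pending.
producer.on('ready', () => {
  ready = true;
  while (pendingSends.length > 0) {
    const { payloads, cb } = pendingSends.shift();
    producer.send(payloads, cb);
  }
});

function sendWhenReady(payloads, cb) {
  if (ready) {
    producer.send(payloads, cb);
  } else {
    // Remember the request; no new listener is attached.
    pendingSends.push({ payloads, cb });
  }
}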

fungiboletus commented 5 years ago

It looks like kafka-node registers an event listener per .send() call while the broker is not connected, so the developer needs to set the max listeners higher than the number of .send() calls that can happen before the timeout fires. So maybe not a leak, but it could be a lot of listeners.
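
In the meantime, the limit can be raised with the standard emitter.setMaxListeners() that the warning itself suggests. Assuming the per-send listeners end up on the KafkaClient instance (the warning names a localhost:9092-ready event, which looks client-side), a sketch:

const kafka = require('kafka-node');

const kafkaClient = new kafka.KafkaClient();
const kafkaProducer = new kafka.Producer(kafkaClient);

// setMaxListeners() is core Node.js EventEmitter API. A finite value keeps
// some leak detection; 0 removes the limit entirely.
kafkaClient.setMaxListeners(100);

Note this only silences the warning; the listeners still accumulate until the broker comes back or the request timeout fires.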