Hi,
we encountered an issue with your package: the private method createConsumer in kafka-pubsub.js does not handle the node-rdkafka Stream API correctly. This leads to messages from the Kafka broker being missed; in our case, the consumer stopped consuming after 3 events were received.
Current code:
stream.consumer.on('data', (message) => {
  let parsedMessage = JSON.parse(message.value.toString())
  ...
})
This accesses the Standard API through the Stream API object without manually configuring flowing/non-flowing mode (see the examples at https://www.npmjs.com/package/node-rdkafka). One solution is to use only the Stream API (example below).
Fix:
stream.on('data', (message) => {
  let parsedMessage = JSON.parse(message.value.toString())
  ...
})
This fix got it working for us, so we'd be glad to see it included in the next release. Thanks!
This fix also resolved our issue with restarting or re-initializing the kafka pubsub, so thanks @cubase for this; hope we get a new release with it soon. Cheers!