Blizzard / node-rdkafka

Node.js bindings for librdkafka
MIT License

Docker - Producer Messages Never Reach Topic #1007

Closed CoinCoderBuffalo closed 1 year ago

CoinCoderBuffalo commented 1 year ago

Error

This works fine locally, but when I run it from Docker I get this error for every message sent to the topic:

kafka1  | [2023-03-23 03:19:20,470] WARN [SocketServer listenerType=ZK_BROKER, nodeId=1] Unexpected error from /172.18.0.1 (channelId=172.18.0.3:9092-172.18.0.1:48226-100); closing connection (org.apache.kafka.common.network.Selector)
kafka1  | org.apache.kafka.common.network.InvalidReceiveException: Invalid receive (size = 1195725856 larger than 104857600)
kafka1  |   at org.apache.kafka.common.network.NetworkReceive.readFrom(NetworkReceive.java:105)
kafka1  |   at org.apache.kafka.common.network.KafkaChannel.receive(KafkaChannel.java:452)
kafka1  |   at org.apache.kafka.common.network.KafkaChannel.read(KafkaChannel.java:402)
kafka1  |   at org.apache.kafka.common.network.Selector.attemptRead(Selector.java:674)
kafka1  |   at org.apache.kafka.common.network.Selector.pollSelectionKeys(Selector.java:576)
kafka1  |   at org.apache.kafka.common.network.Selector.poll(Selector.java:481)
kafka1  |   at kafka.network.Processor.poll(SocketServer.scala:1144)
kafka1  |   at kafka.network.Processor.run(SocketServer.scala:1047)
kafka1  |   at java.base/java.lang.Thread.run(Thread.java:829)


Steps to Reproduce

Dockerfile

FROM node:alpine

RUN apk update
RUN apk --no-cache add \
      bash \
      g++ \
      ca-certificates \
      lz4-dev \
      musl-dev \
      cyrus-sasl-dev \
      openssl-dev \
      make \
      python3

RUN apk add --no-cache --virtual .build-deps gcc zlib-dev libc-dev bsd-compat-headers py-setuptools bash

ENV NODE_ENV development

RUN mkdir -p /usr/local/app

WORKDIR /usr/local/app

COPY package.json .

COPY .yalc/ ./.yalc/
COPY src/ ./src/
COPY test/ ./test/

RUN yarn

CMD ["yarn", "start"]

node-rdkafka Configuration Settings

const producerConfig = {
  debug: 'broker,topic,msg',
  'client.id': CLIENT_ID,
  'metadata.broker.list': config.kafkaBrokers,
  'message.max.bytes': 10000000,
  'compression.codec': 'gzip',
  'retry.backoff.ms': 200,
  'message.send.max.retries': 5,
  'socket.keepalive.enable': true,
  'queue.buffering.max.messages': 100000,
  'queue.buffering.max.ms': 1000,
  'batch.num.messages': 1000000,
  dr_cb: false,
}
const producer = new Kafka.Producer(producerConfig)
producer.setPollInterval(50)
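One side note on the config above: with dr_cb: false the client never reports per-message outcomes, so failed sends can go unnoticed. A minimal sketch (the topic name is hypothetical) of enabling delivery reports and error events, which makes "messages silently never arrive" visible on the client side:

```javascript
const Kafka = require('node-rdkafka');

// Same settings as above, but with delivery reports enabled so every
// produced message gets an explicit success/failure callback.
const producer = new Kafka.Producer({ ...producerConfig, dr_cb: true });
producer.setPollInterval(50);

producer.on('ready', () => {
  // topic name 'my-topic' is hypothetical
  producer.produce('my-topic', null, Buffer.from('hello'), null, Date.now());
});

producer.on('delivery-report', (err, report) => {
  if (err) console.error('delivery failed:', err);
  else console.log(`delivered to ${report.topic}[${report.partition}]`);
});

producer.on('event.error', (err) => console.error('producer error:', err));

producer.connect();
```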

Additional context

I am not using SSL. The messages I'm sending are about 2MB in size, which shouldn't be an issue.

CoinCoderBuffalo commented 1 year ago

Found the issue. My Docker app was pointing to the broker on port 9092 when it should have been pointing to port 29092.

So there was never an issue with this library — it was just an issue with the way I had configured the Docker app.

KAFKA_ADVERTISED_LISTENERS: INTERNAL://kafka1:19092,EXTERNAL://kafka1:9092,DOCKER://host.docker.internal:29092
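For anyone hitting the same thing: each advertised listener has to correspond to a listener the broker actually binds, and containerized clients must connect via the address that resolves from inside their network. A hedged compose-style sketch around the line above (the KAFKA_LISTENERS, protocol map, and inter-broker settings are assumptions, not taken from this setup):

```yaml
environment:
  KAFKA_LISTENERS: INTERNAL://0.0.0.0:19092,EXTERNAL://0.0.0.0:9092,DOCKER://0.0.0.0:29092
  KAFKA_ADVERTISED_LISTENERS: INTERNAL://kafka1:19092,EXTERNAL://kafka1:9092,DOCKER://host.docker.internal:29092
  KAFKA_LISTENER_SECURITY_PROTOCOL_MAP: INTERNAL:PLAINTEXT,EXTERNAL:PLAINTEXT,DOCKER:PLAINTEXT
  KAFKA_INTER_BROKER_LISTENER_NAME: INTERNAL
```

A client on the host's Docker network that dials a port whose advertised address it can't actually reach (or that another service already occupies) will see exactly the kind of framing errors shown in the broker log above.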