linkedin / Burrow

Kafka Consumer Lag Checking
Apache License 2.0

InvalidReceiveException: Invalid receive (size = 369295616 larger than 524288) #435

Open · shubhamvasaikar opened 6 years ago

shubhamvasaikar commented 6 years ago

I have a single-broker Kafka and Zookeeper setup, configured to use GSSAPI together with PLAIN. I am getting the following warning in the Kafka logs when I start Burrow:

org.apache.kafka.common.network.InvalidReceiveException: Invalid receive (size = 369295616 larger than 524288)
    at org.apache.kafka.common.network.NetworkReceive.readFromReadableChannel(NetworkReceive.java:132)
    at org.apache.kafka.common.network.NetworkReceive.readFrom(NetworkReceive.java:93)
    at org.apache.kafka.common.security.authenticator.SaslServerAuthenticator.authenticate(SaslServerAuthenticator.java:248)
    at org.apache.kafka.common.network.KafkaChannel.prepare(KafkaChannel.java:81)
    at org.apache.kafka.common.network.Selector.pollSelectionKeys(Selector.java:460)
    at org.apache.kafka.common.network.Selector.poll(Selector.java:398)
    at kafka.network.Processor.poll(SocketServer.scala:535)
    at kafka.network.Processor.run(SocketServer.scala:452)
    at java.lang.Thread.run(Thread.java:748)

I am also getting this error in burrow.log:

{"level":"info","ts":1533065406.9009063,"msg":"starting evaluations","type":"coordinator","name":"notifier"}
{"level":"error","ts":1533065407.5871475,"msg":"failed to start client","type":"module","coordinator":"cluster","class":"kafka","name":"local","error":"kafka: client has run out of available brokers to talk to (Is your cluster reachable?)"}

I have the following Burrow configuration:

[general]
pidfile="burrow.pid"
stdout-logfile="burrow.out"
access-control-allow-origin="*"

[logging]
filename="logs/burrow.log"
level="info"
maxsize=100
maxbackups=30
maxage=10
use-localtime=true
use-compression=false

[zookeeper]
servers=[ "ak.example.com:2181" ]
timeout=6
root-path="/opt/kafka/data"

[client-profile.test]
client-id="burrow-test"
kafka-version="1.0.0"
sasl="mysasl"
tls="mytls"

[tls.mytls]
noverify=true

[sasl.mysasl]
username="burrow"
password="burrow"
handshake-first=true

[cluster.local]
class-name="kafka"
servers=[ "ak.example.com:9092" ]
client-profile="test"

[consumer.local]
class-name="kafka"
cluster="local"
servers=[ "ak.example.com:9092" ]
client-profile="test"
group-blacklist="^(console-consumer-|python-kafka-consumer-|quick-).*$"
group-whitelist=""

[consumer.local_zk]
class-name="kafka_zk"
cluster="local"
servers=[ "ak.example.com:2181" ]
zookeeper-path="/opt/kafka/data"
zookeeper-timeout=30
group-blacklist="^(console-consumer-|python-kafka-consumer-|quick-).*$"
group-whitelist=""

[httpserver.default]
address=":80"

[storage.default]
class-name="inmemory"
workers=20
intervals=15
expire-group=604800
min-distance=1
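
Aside: Burrow talks to Kafka through the Go Sarama client, so the same connection settings can be exercised outside Burrow. Below is a minimal sketch mirroring [client-profile.test] above; the import path and version constant are assumptions for a Sarama build of that era, so adjust them to your setup:

package main

import (
	"crypto/tls"
	"fmt"

	"github.com/Shopify/sarama"
)

func main() {
	// Mirror [client-profile.test] from the Burrow config above.
	cfg := sarama.NewConfig()
	cfg.ClientID = "burrow-test"
	cfg.Version = sarama.V1_0_0_0
	cfg.Net.TLS.Enable = true                                  // tls="mytls"
	cfg.Net.TLS.Config = &tls.Config{InsecureSkipVerify: true} // noverify=true
	cfg.Net.SASL.Enable = true
	cfg.Net.SASL.Handshake = true // handshake-first=true
	cfg.Net.SASL.User = "burrow"
	cfg.Net.SASL.Password = "burrow"

	client, err := sarama.NewClient([]string{"ak.example.com:9092"}, cfg)
	if err != nil {
		// Against a SASL_PLAINTEXT listener this fails the same way Burrow does:
		// "kafka: client has run out of available brokers to talk to"
		fmt.Println("connect failed:", err)
		return
	}
	defer client.Close()
	fmt.Println("connected to", len(client.Brokers()), "broker(s)")
}

If this standalone client fails with the same "run out of available brokers" error, the problem is in the connection settings rather than in Burrow itself.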

I have also added this part to my JAAS config:

org.apache.kafka.common.security.plain.PlainLoginModule required
  username="burrow"
  password="burrow";
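
The report does not say which JAAS context this snippet lives in. If it is the broker's, a broker-side JAAS file normally wraps its login modules in a KafkaServer context, and for the PLAIN mechanism the broker validates clients against user_<name> entries in that context. A sketch under that assumption (the keytab path and principal are placeholders, not taken from the original report):

KafkaServer {
  com.sun.security.auth.module.Krb5LoginModule required
    useKeyTab=true
    storeKey=true
    keyTab="/etc/security/keytabs/kafka.keytab"
    principal="kafka/ak.example.com@EXAMPLE.COM";

  org.apache.kafka.common.security.plain.PlainLoginModule required
    username="burrow"
    password="burrow"
    user_burrow="burrow";
};

Without a user_burrow entry, the broker has no way to check the PLAIN credentials Burrow presents.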

Finally, this is what my server.properties looks like:

listeners=SASL_PLAINTEXT://ak.example.com:9092
security.inter.broker.protocol=SASL_PLAINTEXT
sasl.mechanism.inter.broker.protocol=GSSAPI
sasl.enabled.mechanisms=GSSAPI,PLAIN
sasl.kerberos.service.name=kafka
advertised.listeners=SASL_PLAINTEXT://ak.example.com:9092
allow.everyone.if.no.acl.found=true
principal.to.local.class=kafka.security.auth.KerberosPrincipalToLocal
security.protocol=SASL_PLAINTEXT
super.users=user:kafka,kafkausr
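
Note that every listener here is SASL_PLAINTEXT. If the intent is for a TLS-enabled client profile like tls="mytls" above to connect, the broker would also need a SASL_SSL listener with keystore settings. A sketch under that assumption (the port, paths, and passwords are placeholders):

listeners=SASL_PLAINTEXT://ak.example.com:9092,SASL_SSL://ak.example.com:9093
advertised.listeners=SASL_PLAINTEXT://ak.example.com:9092,SASL_SSL://ak.example.com:9093
ssl.keystore.location=/etc/kafka/ssl/kafka.server.keystore.jks
ssl.keystore.password=changeit
ssl.key.password=changeit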

Sammy2005 commented 4 years ago

Same issue here. Any solution yet?

akku16 commented 4 years ago

I am also facing the same issue. Any solutions for this? @shubhamvasaikar @Sammy2005

twz999 commented 3 years ago

This problem shows up after it has been running for a while.

twz999 commented 3 years ago

This problem shows up after it has been running for a while, and then one of the Kafka nodes dies.

ralyodio commented 2 years ago

Anyone figure it out?

dhilgarth commented 4 months ago

So, this happens when the Kafka broker is configured for a PLAINTEXT connection but the client tries to establish an SSL connection. I also got the number 369295617 with a more recent client, probably because it advertises a different SSL/TLS version.
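
For reference, the reported number is consistent with this explanation: Kafka frames every request with a 4-byte big-endian length prefix, and a TLS ClientHello begins with the bytes 0x16 0x03 0x01 (handshake record type, then the record-layer version). Read as a length, those bytes produce exactly the values from the error messages; a minimal sketch of the arithmetic:

package main

import "fmt"

func main() {
	// Kafka reads the first 4 bytes of a request as a big-endian int32 length.
	// A TLS ClientHello starts 0x16 0x03 0x01 ..., so a plaintext listener
	// interprets the handshake as an absurdly large frame:
	fmt.Println(int32(0x16)<<24 | int32(0x03)<<16 | int32(0x01)<<8 | 0x00) // 369295616
	fmt.Println(int32(0x16)<<24 | int32(0x03)<<16 | int32(0x01)<<8 | 0x01) // 369295617
}

Applied to the original report: [client-profile.test] enables tls="mytls" while the broker's only listener is SASL_PLAINTEXT, so Burrow sends a TLS ClientHello to a plaintext port. Removing the tls line from the client profile (or adding a SASL_SSL listener on the broker, as sketched earlier) should clear both the broker warning and Burrow's "run out of available brokers" error.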