linkedin / Burrow

Kafka Consumer Lag Checking
Apache License 2.0

Burrow SASL_SSL configs #761

Open ashishvashisht1 opened 2 years ago

ashishvashisht1 commented 2 years ago

Hello, I am trying to configure Burrow to connect to our Kafka cluster, which is Kerberized and uses SASL_SSL. Is there any sample config/example I could follow for adding the specific settings to burrow.toml?

Thanks, Ashish

ashishvashisht1 commented 2 years ago

I added a few SASL_SSL params:

```toml
[sasl.SASL_SSL]
#username=kafka
security_protocol="SASL_SSL"
sasl_mechanism="GSSAPI"
ssl_cafile="truststore.pem"
handshake-first=false
```

Getting this error:

```
{"level":"error","ts":1658961613.1148155,"msg":"failed to start client","type":"module","coordinator":"cluster","class":"kafka","name":"local","error":"kafka: invalid configuration (Net.SASL.User must not be empty when SASL is enabled)"}
```

gklp commented 2 years ago

Hi @ashishvashisht1, you can check this comment.

https://github.com/linkedin/Burrow/issues/374#issuecomment-397780277

ashishvashisht1 commented 2 years ago

Thanks @gklp. It seems I have those configs (broadly) and am still not able to connect.

All Burrow configs are below; still getting errors:

"level":"error","ts":1658860339.0624013,"msg":"failed to start client","type":"module","coordinator":"cluster","class":"kafka","name":"local","error":"kafka: client has run out of available brokers to talk to (Is your cluster reachable?)"}
{"level":"info","ts":1658860339.0624447,"msg":"stopping","type":"coordinator","name":"notifier"}
{"level":"info","ts":1658860339.062451,"msg":"shutdown","type":"coordinator","name":"httpserver"}
{"level":"info","ts":1658860339.0624893,"msg":"stopping","type":"coordinator","name":"evaluator"}
{"level":"info","ts":1658860339.062495,"msg":"stopping","type":"module","coordinator":"evaluator","class":"caching","name":"default"}
{"level":"info","ts":1658860339.0625021,"msg":"stopping","type":"coordinator","name":"storage"}
{"level":"info","ts":1658860339.062509,"msg":"stopping","type":"module","coordinator":"storage","class":"inmemory","name":"default"}
{"level":"info","ts":1658860339.0625546,"msg":"stopping","type":"coordinator","name":"zookeeper"}
{"level":"info","ts":1658860339.0648248,"msg":"recv loop terminated: err=EOF","type":"coordinator","name":"zookeeper"}
{"level":"info","ts":1658860339.0648563,"msg":"send loop terminated: err=<nil>","type":"coordinator","name":"zookeeper"}

```toml
[general]
pidfile="burrow.pid"
stdout-logfile="burrow.out"
access-control-allow-origin="mysite.example.com"

[logging]
filename="logs/burrow.log"
level="debug"
maxsize=100
maxbackups=30
maxage=10
use-localtime=true
use-compression=true

[zookeeper]
servers=[ "HOST1:2181", "HOST2:2181", "HOST2:2181" ]
timeout=6
root-path="/burrow"

[client-profile.test]
client-id="burrow-test"
kafka-version="0.10.0"
sasl="SASL_SSL"
tls="kafka-certs"

[tls.kafka-certs]
certfile="truststore.jks"
keyfile="keystore.jks"
cafile="rootca.pem"
noverify=true

[sasl.SASL_SSL]
username=kafka
security_protocol="SASL_SSL"
sasl_mechanism="GSSAPI"
ssl_cafile="/truststore.pem"
handshake-first=false

[cluster.local]
class-name="kafka"
servers=[ "HOST1:9093", "HOST2:9093", "HOST2:9093" ]
client-profile="test"
topic-refresh=120
offset-refresh=30
groups-reaper-refresh=0

[consumer.local]
class-name="kafka"
cluster="local"
servers=[ "HOST1:9093", "HOST2:9093", "HOST2:9093" ]
client-profile="test"
group-denylist="^(console-consumer-|python-kafka-consumer-|quick-).*$"
group-allowlist=""

[consumer.local_zk]
class-name="kafka_zk"
cluster="local"
servers=[ "HOST1:2181", "HOST2:2181", "HOST2:2181" ]
zookeeper-path="/kafka-cluster"
zookeeper-timeout=30
group-denylist="^(console-consumer-|python-kafka-consumer-|quick-).*$"
group-allowlist=""

[httpserver.default]
address=":8000"

[storage.default]
class-name="inmemory"
workers=20
intervals=15
expire-group=604800
min-distance=1
```
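One thing that stands out in the `[tls.kafka-certs]` section: Burrow is written in Go, and Go's TLS stack loads PEM-encoded files, not Java keystores, so `truststore.jks` and `keystore.jks` would not load as-is. A sketch of the same profile using hypothetical PEM exports of those keystores:

```toml
[tls.kafka-certs]
certfile="client-cert.pem"   # hypothetical PEM export of keystore.jks
keyfile="client-key.pem"     # likewise, the private key in PEM form
cafile="rootca.pem"
noverify=true
```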

gklp commented 2 years ago

I guess the documentation is missing some points. There might be one more configuration option; I've seen it in the code.

https://github.com/linkedin/Burrow/blob/be40f44509e48462a8b5420a57f4f40cd6839921/core/internal/helpers/sarama.go#L121

```toml
[sasl.SASL_SSL]
username=kafka
security_protocol="SASL_SSL"
sasl_mechanism="GSSAPI"  # should be "mechanism", and the two options are SCRAM-SHA-256 or SCRAM-SHA-512; you can see it in the code
ssl_cafile="/truststore.pem"
handshake-first=false
```
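Applying that comment, a corrected section might look like the sketch below. Assumptions: only the keys actually read by the linked sarama.go (`username`, `password`, `mechanism`, `handshake-first`) take effect, the extra `security_protocol`/`sasl_mechanism`/`ssl_cafile` keys are simply ignored, and the password value is a placeholder:

```toml
[sasl.SASL_SSL]
username="kafka"
password="CHANGEME"        # placeholder; required once SASL is enabled
mechanism="SCRAM-SHA-512"  # or "SCRAM-SHA-256", per the linked code
handshake-first=true
```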

ashishvashisht1 commented 2 years ago

Well, new error now; it seems enabling SASL requires a username and password:

```
Net.SASL.User must not be empty when SASL is enabled
Net.SASL.Password must not be empty when SASL is enabled
```

We don't use a plain username/password; we connect via service principals that are specifically granted roles.

Not sure if we are the only ones doing it; I assume SASL_SSL is the default protocol used by everyone.

gklp commented 2 years ago

@ashishvashisht1 did you try it without the sasl part? Maybe you just need the tls config.

ashishvashisht1 commented 2 years ago

@gklp, I tried that as well and am still getting errors. I'm not sure if I mentioned it clearly: we do have to use Kerberos auth, and I declare an explicit jaas.conf and run kinit prior to running Burrow.

Configs:

```toml
[client-profile.test]
client-id="burrow-test"
kafka-version="0.10.0"
#sasl="SASL_SSL"
tls="kafka-certs"
```

{"level":"debug","ts":1659116098.7474747,"msg":"Successful SASL handshake. Available mechanisms: [SCRAM-SHA-512 GSSAPI SCRAM-SHA-256]","name":"sarama"} {"level":"debug","ts":1659116098.7477207,"msg":"Failed to read response header while authenticating with SASL to broker HOST1:9093: EOF","name":"sarama"} {"level":"debug","ts":1659116098.7477582,"msg":"Closed connection to broker HOST1:9093","name":"sarama"} {"level":"debug","ts":1659116098.7477732,"msg":"client/metadata got error from broker -1 while fetching metadata: EOF","name":"sarama"} {"level":"debug","ts":1659116098.7477832,"msg":"client/metadata fetching metadata for all topics from broker