bitnami/vms (Bitnami VMs)
https://bitnami.com

[Apache Kafka] SSL Handshake Unable to Authenticate #1502

Closed: misogare closed this issue 5 months ago

misogare commented 5 months ago

Platform

Virtual Machine

bndiagnostic ID

0a5e285b-677a-ae9a-3349-1e5aebf5d0fd

bndiagnostic output

[Connectivity] Server ports 22, 80 and/or 443 are not publicly accessible

bndiagnostic was not useful. Could you please tell us why?

Well, these ports are not relevant in my case.

Describe your issue as much as you can

The issue is that I cannot create a topic, produce messages, or consume messages; nothing works. When I try, for example with this command:

/opt/bitnami/kafka/bin/kafka-topics.sh --create --bootstrap-server SERVER-IP:9092 --replication-factor 1 --partitions 1 --topic test

it times out:

Error while executing topic command : Timed out waiting for a node assignment. Call: createTopics
[2022-01-17 01:46:59,753] ERROR org.apache.kafka.common.errors.TimeoutException: Timed out waiting for a node assignment. Call: createTopics (org.apache.kafka.tools.TopicCommand)

In server.log under /opt/bitnami/kafka/logs/ I see a bunch of lines like this:

[2020-04-30 14:48:14,955] INFO [SocketServer brokerId=0] Failed authentication with /127.0.0.1 (Unexpected Kafka request of type METADATA during SASL handshake.) (org.apache.kafka.common.network.Selector)

Setting export KAFKA_OPTS="-Djava.security.auth.login.config=/opt/bitnami/kafka/config/kafka_jaas.conf" also won't work, since the new Bitnami Kafka VM does not ship that file; the JAAS configuration lives in server.properties instead, and everything there looks fine to me. I did not change anything in server.properties or producer.properties: I only installed the image and ran the command above, and got the error (the network is working fine).
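For context, a minimal sketch of what that older kafka_jaas.conf file would roughly contain for the PLAIN mechanism (the usernames mirror the server.properties below; the passwords here are placeholders):

KafkaServer {
  org.apache.kafka.common.security.plain.PlainLoginModule required
  username="inter_broker_user"
  password="CHANGE_ME"
  user_inter_broker_user="CHANGE_ME"
  user_admin="CHANGE_ME";
};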

server.properties

listeners=INTERNAL://:9092,CONTROLLER://:9093
advertised.listeners=INTERNAL://:9092
listener.security.protocol.map=INTERNAL:SASL_PLAINTEXT,CONTROLLER:SASL_PLAINTEXT
sasl.enabled.mechanism=PLAIN,SCRAM-SHA-256,SCRAM-SHA-512
sasl.mechanism.controller.protocol=PLAIN
sasl.mechanism.inter.broker.protocol=PLAIN
security.protocol=SALS_PLAINTEXT
sasl.mechanism=PLAIN
sasl.enabled.mechanism=PLAIN
listener.name.internal.plain.sasl.jaas.config=org.apache.kafka.common.security.plain.PlainLoginModule required username="inter_broker_user" password="***" user_admin="**" user_inter_broker_user="**";
listener.name.internal.scram-sha-256.sasl.jaas.config=org.apache.kafka.common.security.scram.ScramLoginModule required username="inter_broker_user" password="**";
listener.name.internal.scram-sha-512.sasl.jaas.config=org.apache.kafka.common.security.scram.ScramLoginModule required username="inter_broker_user" password="**";
listener.name.controller.sasl.enabled.mechanisms=PLAIN
listener.name.controller.plain.sasl.jaas.config=org.apache.kafka.common.security.plain.PlainLogModule required username="controller_user" password="**" ;

matusa-bleik commented 5 months ago

I had the same problem here. You need to authenticate with SASL. If you don't know your password, you can get it by connecting to your VM:

Go to this reference and read the "Clients" section: https://docs.confluent.io/platform/current/kafka/authentication_sasl/authentication_sasl_plain.html

You need to create a client.properties file with the SASL authentication settings and pass it to the CLI with "--command-config", something like the sketch below (you could also use the consumer.properties file from /kafka/config/, but I don't know if it works for all cases).
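A minimal sketch, assuming the PLAIN mechanism and the admin user defined in the server.properties above (the password is a placeholder; use the real credentials configured on your broker). client.properties:

security.protocol=SASL_PLAINTEXT
sasl.mechanism=PLAIN
sasl.jaas.config=org.apache.kafka.common.security.plain.PlainLoginModule required username="admin" password="CHANGE_ME";

Then pass it to the CLI tools, for example:

/opt/bitnami/kafka/bin/kafka-topics.sh --create --bootstrap-server SERVER-IP:9092 --replication-factor 1 --partitions 1 --topic test --command-config client.properties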

By the way, you will need to use this SASL authentication for every client you connect with.