ansible-middleware / amq_streams

Enable Broker Authentication using SASL mechanism #43

Closed rpelisse closed 1 year ago

rpelisse commented 1 year ago

Supersedes https://github.com/ansible-middleware/amq_streams/pull/35

rpelisse commented 1 year ago

@rmarting I've looked into this. The issue is that Connect can't connect to ZooKeeper and the brokers:

[2023-07-06 07:45:32,750] ERROR Stopping due to error (org.apache.kafka.connect.cli.ConnectStandalone:126)
org.apache.kafka.connect.errors.ConnectException: Failed to connect to and describe Kafka cluster. Check worker's broker connection and security properties.
        at org.apache.kafka.connect.util.ConnectUtils.lookupKafkaClusterId(ConnectUtils.java:77)
        at org.apache.kafka.connect.util.ConnectUtils.lookupKafkaClusterId(ConnectUtils.java:58)
        at org.apache.kafka.connect.cli.ConnectStandalone.main(ConnectStandalone.java:82)
Caused by: java.util.concurrent.ExecutionException: org.apache.kafka.common.errors.IllegalSaslStateException: Unexpected handshake request with client mechanism SCRAM-SHA-512, enabled mechanisms are []
        at java.base/java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:395)
        at java.base/java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1999)
        at org.apache.kafka.common.internals.KafkaFutureImpl.get(KafkaFutureImpl.java:165)
        at org.apache.kafka.connect.util.ConnectUtils.lookupKafkaClusterId(ConnectUtils.java:71)
        ... 2 more
Caused by: org.apache.kafka.common.errors.IllegalSaslStateException: Unexpected handshake request with client mechanism SCRAM-SHA-512, enable

Do you have an idea on what's going wrong here?

rmarting commented 1 year ago

I think it is related to the authentication mechanism used by Kafka Connect against the Kafka broker. If I am not wrong, the issue is that Kafka Connect is using sasl.mechanism=SCRAM-SHA-512 to authenticate, while the Kafka broker is not configured with that mechanism.

Please review how you are configuring the Kafka broker; Kafka Connect must use the same configuration.

If the property amq_streams_connect_broker_auth_scram_enabled is enabled, Kafka Connect will use the SCRAM-SHA-512 mechanism to connect to the broker; otherwise PLAIN will be used.
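
For context, this is roughly what the broker-side server.properties would have to contain for the SCRAM-SHA-512 handshake to be accepted. This is only an illustrative sketch of standard Kafka broker settings, not taken from the collection's templates; the listener layout and credentials are placeholders:

listeners=SASL_PLAINTEXT://0.0.0.0:9092
security.inter.broker.protocol=SASL_PLAINTEXT
sasl.mechanism.inter.broker.protocol=SCRAM-SHA-512
# "enabled mechanisms are []" in the stack trace indicates this list is empty on the broker listener
sasl.enabled.mechanisms=SCRAM-SHA-512
listener.name.sasl_plaintext.scram-sha-512.sasl.jaas.config=org.apache.kafka.common.security.scram.ScramLoginModule required \
                 username="broker" \
                 password="PLEASE_CHANGEME_IAMNOTGOOD_FOR_PRODUCTION";

(The SCRAM credentials themselves also have to exist on the cluster, e.g. created with kafka-configs.sh, before any client can authenticate with them.)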

Can you share the playbook you use to deploy the Kafka broker (with authentication) and the Kafka Connect cluster?

rpelisse commented 1 year ago

@rmarting I'm testing your PR! So it's your setup :)

It runs playbooks/playbook.yml and uses the defaults from the two other roles (zk and brokers) to set up the rest of the cluster.

rmarting commented 1 year ago

Ok! Let me review my original PR with the default playbook, as it seems that Kafka Connect is failing authentication... so maybe there is something wrong! Thanks for the heads up!

rmarting commented 1 year ago

One last question: could you share here the content of the connect-standalone.properties file of the deployed Kafka Connect cluster? Thanks.

rpelisse commented 1 year ago

It's generated from the template in the collection: roles/amq_streams_connect/templates/connect-standalone.properties.j2

I'm running the PR locally, so I can post the resulting file here.

rpelisse commented 1 year ago

And here is the resulting conf:

# Ansible managed

# Licensed to the Apache Software Foundation (ASF) under one or more
# contributor license agreements.  See the NOTICE file distributed with
# this work for additional information regarding copyright ownership.
# The ASF licenses this file to You under the Apache License, Version 2.0
# (the "License"); you may not use this file except in compliance with
# the License.  You may obtain a copy of the License at
#
#    http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.

# These are defaults. This file just demonstrates how to override some settings.
bootstrap.servers=localhost:9092

security.protocol=SASL_PLAINTEXT
sasl.mechanism=SCRAM-SHA-512
sasl.jaas.config=org.apache.kafka.common.security.scram.ScramLoginModule required \
                 username=broker \
                 password=PLEASE_CHANGEME_IAMNOTGOOD_FOR_PRODUCTION;

# The converters specify the format of data in Kafka and how to translate it into Connect data. Every Connect user will
# need to configure these based on the format they want their data in when loaded from or stored into Kafka
key.converter=org.apache.kafka.connect.json.JsonConverter
value.converter=org.apache.kafka.connect.json.JsonConverter
# Converter-specific settings can be passed in by prefixing the Converter's setting with the converter we want to apply
# it to
key.converter.schemas.enable=true
value.converter.schemas.enable=true

offset.storage.file.filename=/tmp/connect.offsets
# Flush much faster than normal, which is useful for testing/debugging
offset.flush.interval.ms=10000

# Set to a list of filesystem paths separated by commas (,) to enable class loading isolation for plugins
# (connectors, converters, transformations). The list should consist of top level directories that include 
# any combination of: 
# a) directories immediately containing jars with plugins and their dependencies
# b) uber-jars with plugins and their dependencies
# c) directories immediately containing the package directory structure of classes of plugins and their dependencies
# Note: symlinks will be followed to discover dependencies or plugins.
# Examples: 
# plugin.path=/usr/local/share/java,/usr/local/share/kafka/plugins,/opt/connectors,
plugin.path=/opt/kafka_2.13-3.3.2/libs/connect-file-3.3.2.jar

rmarting commented 1 year ago

By default the variable amq_streams_connect_broker_auth_enabled is false, so the following block should not be included in that file:

security.protocol=SASL_PLAINTEXT
sasl.mechanism=SCRAM-SHA-512
sasl.jaas.config=org.apache.kafka.common.security.scram.ScramLoginModule required \
                 username=broker \
                 password=PLEASE_CHANGEME_IAMNOTGOOD_FOR_PRODUCTION;

So I think the issue is that the if clause is not declared correctly and does not skip that section. I pushed the following commit to my original PR; please review and test with that branch.

https://github.com/ansible-middleware/amq_streams/pull/35/commits/e946b68a1b0bfa7b908bb97b699d861e6fca771b

The file must not include anything related to authentication, as the default values are false.
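
For illustration, the guard in connect-standalone.properties.j2 would need to look roughly like this, so that the whole SASL block is skipped when the default (false) is kept. This is only a sketch built around the variable named in this thread; the username/password variables are placeholders, and the actual fix is in the commit linked above:

{# Emit the client SASL settings only when broker authentication is enabled. #}
{% if amq_streams_connect_broker_auth_enabled | bool %}
security.protocol=SASL_PLAINTEXT
sasl.mechanism=SCRAM-SHA-512
sasl.jaas.config=org.apache.kafka.common.security.scram.ScramLoginModule required \
                 username={{ connect_sasl_username }} \
                 password={{ connect_sasl_password }};
{% endif %}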

rmarting commented 1 year ago

Thanks @rpelisse for your review and the extended contributions that closed this amazing PR. It is a huge step for the collection towards covering more complex scenarios.

Neustradamus commented 1 year ago

@rmarting, @rpelisse: Good job!

Linked to: