confluentinc / kafka-images

Confluent Docker images for Apache Kafka

kafka-connect fails to start when CONNECT_CONFIG_PROVIDERS is set #111

Open · twobeeb opened this issue 3 years ago

twobeeb commented 3 years ago

Config with CONNECT_CONFIG_PROVIDERS set:

version: '2.1'
services:
  kafka-connect:
    image: confluentinc/cp-kafka-connect:6.2.0
    user: 1000:1000
    hostname: kafka-connect
    ports:
      - "8083:8083"
    environment:
      CONNECT_REST_ADVERTISED_HOST_NAME: localhost
      CONNECT_BOOTSTRAP_SERVERS: ******.northeurope.azure.confluent.cloud:9092
      CONNECT_CONFIG_STORAGE_TOPIC: f4m.connect-k8s.connect-configs
      CONNECT_CONNECTOR_CLIENT_CONFIG_OVERRIDE_POLICY: Principal
      CONNECT_GROUP_ID: f4m.connect-k8s.kafka-connect
      CONNECT_INTERNAL_KEY_CONVERTER: org.apache.kafka.connect.json.JsonConverter
      CONNECT_INTERNAL_VALUE_CONVERTER: org.apache.kafka.connect.json.JsonConverter
      CONNECT_KEY_CONVERTER: org.apache.kafka.connect.storage.StringConverter
      CONNECT_OFFSET_STORAGE_TOPIC: f4m.connect-k8s.connect-offsets
      CONNECT_PRODUCER_SASL_MECHANISM: PLAIN
      CONNECT_PRODUCER_SECURITY_PROTOCOL: SASL_SSL
      #CONNECT_REST_EXTENSION_CLASSES: org.apache.kafka.connect.rest.basic.auth.extension.BasicAuthSecurityRestExtension
      CONNECT_REST_PORT: "8083"
      CONNECT_SASL_JAAS_CONFIG: org.apache.kafka.common.security.plain.PlainLoginModule required username="*****" password="*****";
      CONNECT_SASL_MECHANISM: PLAIN
      CONNECT_SECURITY_PROTOCOL: SASL_SSL
      CONNECT_STATUS_STORAGE_TOPIC: f4m.connect-k8s.connect-status
      CONNECT_VALUE_CONVERTER: org.apache.kafka.connect.storage.StringConverter
      CONNECT_CONFIG_PROVIDERS: 'aes'
      CONNECT_config.providers.aes.class: 'com.michelin.kafka.AES256ConfigProvider'
      CONNECT_config.providers.aes.param.key: '*****'
$ docker compose up
[+] Running 1/1
 - Container ansible_kafka-connect_1  Recreated                                                                                                                                                             2.2s 
Attaching to kafka-connect_1
kafka-connect_1  | ===> User
kafka-connect_1  | uid=1000(appuser) gid=1000(appuser) groups=1000(appuser)
kafka-connect_1  | ===> Configuring ...
kafka-connect_1  | ===> Running preflight checks ... 
kafka-connect_1  | ===> Check if Kafka is healthy ...
kafka-connect_1  | SLF4J: Class path contains multiple SLF4J bindings.
kafka-connect_1  | SLF4J: Found binding in [jar:file:/usr/share/java/cp-base-new/slf4j-log4j12-1.7.30.jar!/org/slf4j/impl/StaticLoggerBinder.class]
kafka-connect_1  | SLF4J: Found binding in [jar:file:/usr/share/java/cp-base-new/slf4j-simple-1.7.30.jar!/org/slf4j/impl/StaticLoggerBinder.class]
kafka-connect_1  | SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
kafka-connect_1  | SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
kafka-connect_1  | log4j:WARN No appenders could be found for logger (io.confluent.admin.utils.cli.KafkaReadyCommand).
kafka-connect_1  | log4j:WARN Please initialize the log4j system properly.
kafka-connect_1  | log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
kafka-connect_1 exited with code 1
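
The container exits during the "===> Check if Kafka is healthy ..." preflight step (cub kafka-ready), before Connect itself is ever launched. For reference, the image converts CONNECT_* environment variables into worker properties by stripping the prefix, lowercasing, and mapping underscores to dots, so the provider settings above should end up in /etc/kafka-connect/kafka-connect.properties roughly as follows (a sketch of the expected rendering, not a capture from the container):

config.providers=aes
config.providers.aes.class=com.michelin.kafka.AES256ConfigProvider
config.providers.aes.param.key=*****

The preflight check is pointed at this same properties file, which is how the config provider settings reach cub.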

Config with CONNECT_CONFIG_PROVIDERS removed:

$ docker compose up
[+] Running 1/1
 - Container ansible_kafka-connect_1  Recreated                                                                                                                                                             2.2s 
Attaching to kafka-connect_1
kafka-connect_1  | ===> User
kafka-connect_1  | uid=1000(appuser) gid=1000(appuser) groups=1000(appuser)
kafka-connect_1  | ===> Configuring ...
kafka-connect_1  | ===> Running preflight checks ... 
kafka-connect_1  | ===> Check if Kafka is healthy ...
kafka-connect_1  | SLF4J: Class path contains multiple SLF4J bindings.
kafka-connect_1  | SLF4J: Found binding in [jar:file:/usr/share/java/cp-base-new/slf4j-log4j12-1.7.30.jar!/org/slf4j/impl/StaticLoggerBinder.class]
kafka-connect_1  | SLF4J: Found binding in [jar:file:/usr/share/java/cp-base-new/slf4j-simple-1.7.30.jar!/org/slf4j/impl/StaticLoggerBinder.class]
kafka-connect_1  | SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
kafka-connect_1  | SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
kafka-connect_1  | log4j:WARN No appenders could be found for logger (io.confluent.admin.utils.cli.KafkaReadyCommand).
kafka-connect_1  | log4j:WARN Please initialize the log4j system properly.
kafka-connect_1  | log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
kafka-connect_1  | ===> Launching ... 
kafka-connect_1  | ===> Launching kafka-connect ...
kafka-connect_1  | [2021-10-22 14:33:57,557] INFO WorkerInfo values: 
kafka-connect_1  |      jvm.args = -Xms256M, -Xmx2G, -XX:+UseG1GC, -XX:MaxGCPauseMillis=20, -XX:InitiatingHeapOccupancyPercent=35, -XX:+ExplicitGCInvokesConcurrent, -XX:MaxInlineLevel=15, -Djava.awt.headless=true, -Dcom.sun.management.jmxremote=true, -Dcom.sun.management.jmxremote.authenticate=false, -Dcom.sun.management.jmxremote.ssl=false, -Dkafka.logs.dir=/var/log/kafka, -Dlog4j.configuration=file:/etc/kafka/connect-log4j.properties
...

Even stranger, the following config lets Kafka Connect start properly (note the upper-case value):

version: '2.1'
services:
  kafka-connect:
    image: confluentinc/cp-kafka-connect:6.2.0
    # ... identical to the first config above, except for the provider name:
    environment:
      CONNECT_CONFIG_PROVIDERS: 'AES' # Note the upper case
      CONNECT_config.providers.aes.class: 'com.michelin.kafka.AES256ConfigProvider'
      CONNECT_config.providers.aes.param.key: '*****'

Except this is useless, because the config.providers.aes.* keys no longer match the declared provider name AES (the lookup is case-sensitive).
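
A plausible explanation for why the upper-case value lets the worker come up (an inference from the observed behavior, not verified against the Kafka source): when resolving config.providers, Kafka looks up config.providers.<name>.class for each declared name, and a name with no matching .class key appears to be skipped silently rather than rejected. With the mismatched casing, nothing is ever instantiated:

config.providers=AES
# The lookup wants config.providers.AES.class; only the lower-case key below
# exists, so the provider is skipped and its class is never loaded.
config.providers.aes.class=com.michelin.kafka.AES256ConfigProvider

With the matching lower-case name, the class lookup succeeds and cub then tries to load com.michelin.kafka.AES256ConfigProvider, which is not on its classpath, hence the exit code 1.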

I went inside the container and ran the following steps:

$ sed -i 's/config\.providers=.*/config.providers=AES/g' kafka-connect.properties
$ cub kafka-ready 1 10 -b ****.northeurope.azure.confluent.cloud:9092 --config /etc/kafka-connect/kafka-connect.properties
$ echo $?
0

$ sed -i 's/config\.providers=.*/config.providers=aes/g' kafka-connect.properties
$ cub kafka-ready 1 10 -b ****.northeurope.azure.confluent.cloud:9092 --config /etc/kafka-connect/kafka-connect.properties
$ echo $?
1

This tells me that cub kafka-ready picks up configuration it should ignore (the config.providers settings), which makes the kafka-connect image impossible to use with a custom ConfigProvider.
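
One way to support that theory is to check which jars cub can actually see (a hypothetical check; the grep pattern and the provider's install path are assumptions):

$ ls /usr/share/java/cp-base-new/ | grep -i aes     # no match: cub cannot load the provider class
$ ls /usr/share/confluent-hub-components/           # the provider jar lives only on Connect's plugin path

Judging by the SLF4J bindings in the logs above, cub runs against the jars under /usr/share/java/cp-base-new, so a ConfigProvider shipped only as a Connect plugin is invisible to it.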

Thanks for your help.

twobeeb commented 3 years ago

A possible workaround for this issue is to add the ConfigProvider jar to cub's classpath. Per the cub.py code, simply declare the CUB_CLASSPATH environment variable: https://github.com/confluentinc/confluent-docker-utils/blob/master/confluent/docker_utils/cub.py#L47

CUB_CLASSPATH: "/usr/share/java/cp-base/*:/usr/share/java/cp-base-new/*:/usr/share/confluent-hub-components/*"
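
In the compose file from the issue, this would look roughly like the following (the confluent-hub-components entry assumes the provider jar is installed there; point it at wherever the jar actually lives):

    environment:
      # ... existing CONNECT_* settings ...
      CUB_CLASSPATH: '/usr/share/java/cp-base/*:/usr/share/java/cp-base-new/*:/usr/share/confluent-hub-components/*'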

sachin-badgujar commented 2 years ago

+1. I am trying to use the https://github.com/lensesio/secret-provider jar. It fails on the cub kafka-ready command in ensure. I tried the workaround above from @twobeeb and still hit the same issue. However, if I bypass ensure or cub kafka-ready, everything works!

Image: confluentinc/cp-kafka-connect:6.2.1
Jar: secret-provider-2.1.6-all.jar