Here is the Dockerfile for building kafka-connect-s3
FROM confluentinc/cp-kafka-connect-base:5.4.0

MAINTAINER bagi

RUN echo "===> Installing S3 connector ..."
RUN apt-get update
# RUN apt-get install -y confluent-kafka-connect-s3=5.4.0
RUN apt-get install -y confluent-kafka-connect-s3=${CONFLUENT_VERSION}${CONFLUENT_PLATFORM_LABEL}-${CONFLUENT_DEB_VERSION}

RUN echo "===> Cleaning up ..."
RUN apt-get clean && rm -rf /tmp/* /var/lib/apt/lists/*

# copy protobuf dependencies
COPY protobuf-java-3.11.4.jar /usr/share/java/kafka-connect-s3/
RUN rm /usr/share/java/kafka-connect-s3/protobuf-java-2.5.0.jar
COPY kafka-connect-protobuf-converter-3.0.0.jar /usr/share/java/kafka-serde-tools/

# copy protobuf event schema jar
COPY tracking.jar /usr/share/java/kafka-connect-s3/
COPY tracking.jar /usr/share/java/kafka-serde-tools/
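For completeness, the image above gets built and tagged locally so that docker-compose can reference it as bagi/kafka-connect-s3. A minimal sketch of that build step, assuming the Dockerfile and the copied jars sit in the current directory:

# build the custom Connect image; the tag must match the image name used in docker-compose.yml
docker build -t bagi/kafka-connect-s3 .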
Here is my docker-compose.yml
version: '2'
services:
  zookeeper:
    image: confluentinc/cp-zookeeper:5.4.0
    hostname: zookeeper
    container_name: zookeeper
    ports:
      - "2181:2181"
    environment:
      ZOOKEEPER_CLIENT_PORT: 2181
      ZOOKEEPER_TICK_TIME: 2000

  broker:
    image: confluentinc/cp-kafka:5.4.0
    hostname: broker
    container_name: broker
    depends_on:
      - zookeeper
    ports:
      - "29092:29092"
      - "9092:9092"
    environment:
      KAFKA_BROKER_ID: 1
      KAFKA_ZOOKEEPER_CONNECT: "zookeeper:2181"
      KAFKA_LISTENER_SECURITY_PROTOCOL_MAP: PLAINTEXT:PLAINTEXT,PLAINTEXT_HOST:PLAINTEXT
      KAFKA_ADVERTISED_LISTENERS: PLAINTEXT://broker:29092,PLAINTEXT_HOST://localhost:9092
      KAFKA_OFFSETS_TOPIC_REPLICATION_FACTOR: 1
      KAFKA_GROUP_INITIAL_REBALANCE_DELAY_MS: 0
      KAFKA_AUTO_CREATE_TOPICS_ENABLE: "true"

  connect:
    image: bagi/kafka-connect-s3
    hostname: connect
    container_name: connect
    depends_on:
      - zookeeper
      - broker
    ports:
      - "8083:8083"
    environment:
      AWS_ACCESS_KEY_ID: ****
      AWS_SECRET_ACCESS_KEY: ****
      CONNECT_BOOTSTRAP_SERVERS: "broker:29092"
      CONNECT_REST_ADVERTISED_HOST_NAME: connect
      CONNECT_REST_PORT: 8083
      CONNECT_GROUP_ID: tracking-kafka-connect-s3
      CONNECT_CONFIG_STORAGE_TOPIC: tracking-kafka-connect-s3-configs
      CONNECT_CONFIG_STORAGE_REPLICATION_FACTOR: 1
      CONNECT_OFFSET_FLUSH_INTERVAL_MS: 10000
      CONNECT_OFFSET_STORAGE_TOPIC: tracking-kafka-connect-s3-offsets
      CONNECT_OFFSET_STORAGE_REPLICATION_FACTOR: 1
      CONNECT_STATUS_STORAGE_TOPIC: tracking-kafka-connect-s3-status
      CONNECT_STATUS_STORAGE_REPLICATION_FACTOR: 1
      CONNECT_KEY_CONVERTER: org.apache.kafka.connect.storage.StringConverter
      CONNECT_VALUE_CONVERTER: com.blueapron.connect.protobuf.ProtobufConverter
      CONNECT_VALUE_CONVERTER_PROTOCLASSNAME: com.bagi.protobuf.TrackingEvent$$Event
      # CONNECT_KEY_CONVERTER: org.apache.kafka.connect.json.JsonConverter
      # CONNECT_VALUE_CONVERTER: org.apache.kafka.connect.json.JsonConverter
      CONNECT_INTERNAL_KEY_CONVERTER: org.apache.kafka.connect.json.JsonConverter
      CONNECT_INTERNAL_VALUE_CONVERTER: org.apache.kafka.connect.json.JsonConverter
      CONNECT_PLUGIN_PATH: "/usr/share/java,/usr/share/confluent-hub-components"
      CONNECT_LOG4J_LOGGERS: org.apache.zookeeper=ERROR,org.I0Itec.zkclient=ERROR,org.reflections=ERROR
      CONNECT_SCHEMA_COMPATIBILITY: None
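Before registering the connector I sanity-check that the custom image exposes the right plugins. This is just a rough check using docker-compose and the stock Connect REST API (the sleep is an arbitrary wait, not part of the setup above):

# bring the stack up and give zookeeper/broker/connect time to start
docker-compose up -d
sleep 60

# list connector plugins; io.confluent.connect.s3.S3SinkConnector should be present
# (as far as I can tell this endpoint lists connectors only on 5.4.x, so converters like
#  com.blueapron.connect.protobuf.ProtobufConverter show up in the worker log instead)
curl -s http://localhost:8083/connector-plugins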
Here is the config for kafka-connect-s3 (tracking-kafka-connect-s3.json), which I submit with the following curl command to create the connector:
curl -XPOST http://localhost:8083/connectors -H "Content-Type: application/json" -d @tracking-kafka-connect-s3.json
{ "name": "s3-sink", "config": { "connector.class": "io.confluent.connect.s3.S3SinkConnector", "key.converter": "org.apache.kafka.connect.storage.StringConverter", "value.converter": "com.blueapron.connect.protobuf.ProtobufConverter", "value.converter.protoClassName": "com.bagi.protobuf.TrackingEvent$Event", "tasks.max": "1", "topics.regex": ".*", "format.class": "io.confluent.connect.s3.format.parquet.ParquetFormat", "flush.size": "100000", "rotate.interval.ms": "900", "s3.region": "us-east-1", "s3.bucket.name": "dev-bagi", "s3.part.size": "5242880", "storage.class": "io.confluent.connect.s3.storage.S3Storage", "topics.dir": "topics", "file.delim": "_", "schema.compatibility": "NONE", "partitioner.class": "io.confluent.connect.storage.partitioner.TimeBasedPartitioner", "partition.duration.ms": "900000", "path.format": "'dt'=YYYY-MM-dd", "locale": "en-US", "timezone": "UTC", "timestamp.extractor": "RecordField", "timestamp.field": "event_ts", "timestamp.unit": "s" } }
Here are the connect logs
connect | ===> ENV Variables ...
connect | ALLOW_UNSIGNED=false
connect | AWS_ACCESS_KEY_ID=****
connect | AWS_SECRET_ACCESS_KEY=****
connect | COMPONENT=kafka-connect
connect | CONFLUENT_DEB_VERSION=1
connect | CONFLUENT_PLATFORM_LABEL=
connect | CONFLUENT_VERSION=5.4.0
connect | CONNECT_BOOTSTRAP_SERVERS=broker:29092
connect | CONNECT_CONFIG_STORAGE_REPLICATION_FACTOR=1
connect | CONNECT_CONFIG_STORAGE_TOPIC=tracking-kafka-connect-s3-configs
connect | CONNECT_GROUP_ID=tracking-kafka-connect-s3
connect | CONNECT_INTERNAL_KEY_CONVERTER=org.apache.kafka.connect.json.JsonConverter
connect | CONNECT_INTERNAL_VALUE_CONVERTER=org.apache.kafka.connect.json.JsonConverter
connect | CONNECT_KEY_CONVERTER=org.apache.kafka.connect.storage.StringConverter
connect | CONNECT_LOG4J_LOGGERS=org.apache.zookeeper=ERROR,org.I0Itec.zkclient=ERROR,org.reflections=ERROR
connect | CONNECT_OFFSET_FLUSH_INTERVAL_MS=10000
connect | CONNECT_OFFSET_STORAGE_REPLICATION_FACTOR=1
connect | CONNECT_OFFSET_STORAGE_TOPIC=tracking-kafka-connect-s3-offsets
connect | CONNECT_PLUGIN_PATH=/usr/share/java,/usr/share/confluent-hub-components
connect | CONNECT_REST_ADVERTISED_HOST_NAME=connect
connect | CONNECT_REST_PORT=8083
connect | CONNECT_SCHEMA_COMPATIBILITY=None
connect | CONNECT_STATUS_STORAGE_REPLICATION_FACTOR=1
connect | CONNECT_STATUS_STORAGE_TOPIC=tracking-kafka-connect-s3-status
connect | CONNECT_VALUE_CONVERTER=com.blueapron.connect.protobuf.ProtobufConverter
connect | CONNECT_VALUE_CONVERTER_PROTOCLASSNAME=com.bagi.protobuf.TrackingEvent$Event
connect | CONNECT_ZOOKEEPER_CONNECT=zookeeper:2181
connect | CONSUMER_KEY_CONVERTER=org.apache.kafka.connect.storage.StringConverter
connect | CONSUMER_SCHEMA_COMPATIBILITY=None
connect | CONSUMER_VALUE_CONVERTER=com.blueapron.connect.protobuf.ProtobufConverter
connect | CONSUMER_VALUE_CONVERTER_PROTOCLASSNAME=com.bagi.protobuf.TrackingEvent$Event
connect | CUB_CLASSPATH=/etc/confluent/docker/docker-utils.jar
connect | HOME=/root
connect | HOSTNAME=connect
connect | KAFKA_ADVERTISED_LISTENERS=
connect | KAFKA_VERSION=
connect | KAFKA_ZOOKEEPER_CONNECT=
connect | LANG=C.UTF-8
connect | PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin
connect | PRODUCER_KEY_CONVERTER=org.apache.kafka.connect.storage.StringConverter
connect | PRODUCER_SCHEMA_COMPATIBILITY=None
connect | PRODUCER_VALUE_CONVERTER=com.blueapron.connect.protobuf.ProtobufConverter
connect | PRODUCER_VALUE_CONVERTER_PROTOCLASSNAME=com.bagi.protobuf.TrackingEvent$Event
connect | PWD=/
connect | PYTHON_PIP_VERSION=8.1.2
connect | PYTHON_VERSION=2.7.9-1
connect | SCALA_VERSION=2.12
connect | SHLVL=1
connect | ZULU_OPENJDK_VERSION=8=8.38.0.13
connect | _=/usr/bin/env
connect | ===> User
connect | uid=0(root) gid=0(root) groups=0(root)
connect | ===> Configuring ...
connect | ===> Running preflight checks ...
connect | ===> Check if Kafka is healthy ...
connect | [main] INFO org.apache.kafka.clients.admin.AdminClientConfig - AdminClientConfig values: connect | bootstrap.servers = [broker:29092] connect | client.dns.lookup = default connect | client.id = connect | connections.max.idle.ms = 300000 connect | metadata.max.age.ms = 300000 connect | metric.reporters = [] connect | metrics.num.samples = 2 connect | metrics.recording.level = INFO connect | metrics.sample.window.ms = 30000 connect | receive.buffer.bytes = 65536 connect | reconnect.backoff.max.ms = 1000 connect | reconnect.backoff.ms = 50 connect | request.timeout.ms = 120000 connect | retries = 5 connect | retry.backoff.ms = 100 connect | sasl.client.callback.handler.class = null connect | sasl.jaas.config = null connect | sasl.kerberos.kinit.cmd = /usr/bin/kinit connect | sasl.kerberos.min.time.before.relogin = 60000 connect | sasl.kerberos.service.name = null connect | sasl.kerberos.ticket.renew.jitter = 0.05 connect | sasl.kerberos.ticket.renew.window.factor = 0.8 connect | sasl.login.callback.handler.class = null connect | sasl.login.class = null connect | sasl.login.refresh.buffer.seconds = 300 connect | sasl.login.refresh.min.period.seconds = 60 connect | sasl.login.refresh.window.factor = 0.8 connect | sasl.login.refresh.window.jitter = 0.05 connect | sasl.mechanism = GSSAPI connect | security.protocol = PLAINTEXT connect | security.providers = null connect | send.buffer.bytes = 131072 connect | ssl.cipher.suites = null connect | ssl.enabled.protocols = [TLSv1.2, TLSv1.1, TLSv1] connect | ssl.endpoint.identification.algorithm = https connect | ssl.key.password = null connect | ssl.keymanager.algorithm = SunX509 connect | ssl.keystore.location = null connect | ssl.keystore.password = null connect | ssl.keystore.type = JKS connect | ssl.protocol = TLS connect | ssl.provider = null connect | ssl.secure.random.implementation = null connect | ssl.trustmanager.algorithm = PKIX connect | ssl.truststore.location = null connect | ssl.truststore.password = null connect | ssl.truststore.type = JKS connect | connect | [main] INFO org.apache.kafka.common.utils.AppInfoParser - Kafka version: 5.4.0-ccs connect | [main] INFO org.apache.kafka.common.utils.AppInfoParser - Kafka commitId: f4201a82bea68cc7 connect | [main] INFO org.apache.kafka.common.utils.AppInfoParser - Kafka startTimeMs: 1582587236633 connect | [kafka-admin-client-thread | adminclient-1] WARN org.apache.kafka.clients.NetworkClient - [AdminClient clientId=adminclient-1] Connection to node -1 (broker/192.168.160.3:29092) could not be established. Broker may not be available. connect | [kafka-admin-client-thread | adminclient-1] WARN org.apache.kafka.clients.NetworkClient - [AdminClient clientId=adminclient-1] Connection to node -1 (broker/192.168.160.3:29092) could not be established. Broker may not be available. connect | [kafka-admin-client-thread | adminclient-1] WARN org.apache.kafka.clients.NetworkClient - [AdminClient clientId=adminclient-1] Connection to node -1 (broker/192.168.160.3:29092) could not be established. Broker may not be available. connect | [kafka-admin-client-thread | adminclient-1] WARN org.apache.kafka.clients.NetworkClient - [AdminClient clientId=adminclient-1] Connection to node -1 (broker/192.168.160.3:29092) could not be established. Broker may not be available. connect | [kafka-admin-client-thread | adminclient-1] WARN org.apache.kafka.clients.NetworkClient - [AdminClient clientId=adminclient-1] Connection to node -1 (broker/192.168.160.3:29092) could not be established. 
Broker may not be available. connect | ===> Launching ... connect | ===> Launching kafka-connect ... connect | [2020-02-24 23:34:00,242] INFO WorkerInfo values: connect | jvm.args = -Xms256M, -Xmx2G, -XX:+UseG1GC, -XX:MaxGCPauseMillis=20, -XX:InitiatingHeapOccupancyPercent=35, -XX:+ExplicitGCInvokesConcurrent, -Djava.awt.headless=true, -Dcom.sun.management.jmxremote=true, -Dcom.sun.management.jmxremote.authenticate=false, -Dcom.sun.management.jmxremote.ssl=false, -Dkafka.logs.dir=/var/log/kafka, -Dlog4j.configuration=file:/etc/kafka/connect-log4j.properties connect | jvm.spec = Azul Systems, Inc., OpenJDK 64-Bit Server VM, 1.8.0_212, 25.212-b04 connect | jvm.classpath = /etc/kafka-connect/jars/*:/usr/share/java/kafka/jersey-hk2-2.28.jar:/usr/share/java/kafka/kafka-streams-examples-5.4.0-ccs.jar:/usr/share/java/kafka/netty-common-4.1.42.Final.jar:/usr/share/java/kafka/connect-mirror-5.4.0-ccs.jar:/usr/share/java/kafka/jersey-client-2.28.jar:/usr/share/java/kafka/kafka_2.12-5.4.0-ccs-scaladoc.jar:/usr/share/java/kafka/httpclient-4.5.9.jar:/usr/share/java/kafka/jetty-continuation-9.4.20.v20190813.jar:/usr/share/java/kafka/maven-artifact-3.6.1.jar:/usr/share/java/kafka/scala-java8-compat_2.12-0.9.0.jar:/usr/share/java/kafka/connect-json-5.4.0-ccs.jar:/usr/share/java/kafka/plexus-utils-3.2.0.jar:/usr/share/java/kafka/jersey-server-2.28.jar:/usr/share/java/kafka/kafka_2.12-5.4.0-ccs-sources.jar:/usr/share/java/kafka/support-metrics-client-5.4.0-ccs.jar:/usr/share/java/kafka/kafka-log4j-appender-5.4.0-ccs.jar:/usr/share/java/kafka/kafka_2.12-5.4.0-ccs-javadoc.jar:/usr/share/java/kafka/jackson-jaxrs-json-provider-2.9.10.jar:/usr/share/java/kafka/kafka.jar:/usr/share/java/kafka/jackson-databind-2.9.10.1.jar:/usr/share/java/kafka/kafka_2.12-5.4.0-ccs-test.jar:/usr/share/java/kafka/scala-library-2.12.10.jar:/usr/share/java/kafka/jakarta.ws.rs-api-2.1.5.jar:/usr/share/java/kafka/jetty-client-9.4.20.v20190813.jar:/usr/share/java/kafka/netty-transport-native-epoll-4.1.42.Final.jar:/usr/share/java/kafka/zstd-jni-1.4.3-1.jar:/usr/share/java/kafka/connect-file-5.4.0-ccs.jar:/usr/share/java/kafka/avro-1.9.1.jar:/usr/share/java/kafka/hk2-locator-2.5.0.jar:/usr/share/java/kafka/slf4j-api-1.7.28.jar:/usr/share/java/kafka/kafka_2.12-5.4.0-ccs-test-sources.jar:/usr/share/java/kafka/commons-codec-1.11.jar:/usr/share/java/kafka/audience-annotations-0.5.0.jar:/usr/share/java/kafka/reflections-0.9.11.jar:/usr/share/java/kafka/rocksdbjni-5.18.3.jar:/usr/share/java/kafka/kafka-clients-5.4.0-ccs.jar:/usr/share/java/kafka/scala-collection-compat_2.12-2.1.2.jar:/usr/share/java/kafka/kafka_2.12-5.4.0-ccs.jar:/usr/share/java/kafka/jetty-util-9.4.20.v20190813.jar:/usr/share/java/kafka/httpmime-4.5.9.jar:/usr/share/java/kafka/jetty-io-9.4.20.v20190813.jar:/usr/share/java/kafka/jackson-jaxrs-base-2.9.10.jar:/usr/share/java/kafka/log4j-1.2.17.jar:/usr/share/java/kafka/kafka-tools-5.4.0-ccs.jar:/usr/share/java/kafka/connect-runtime-5.4.0-ccs.jar:/usr/share/java/kafka/connect-basic-auth-extension-5.4.0-ccs.jar:/usr/share/java/kafka/netty-codec-4.1.42.Final.jar:/usr/share/java/kafka/aopalliance-repackaged-2.5.0.jar:/usr/share/java/kafka/connect-mirror-client-5.4.0-ccs.jar:/usr/share/java/kafka/javax.ws.rs-api-2.1.1.jar:/usr/share/java/kafka/jersey-common-2.28.jar:/usr/share/java/kafka/httpcore-4.4.11.jar:/usr/share/java/kafka/connect-transforms-5.4.0-ccs.jar:/usr/share/java/kafka/activation-1.1.1.jar:/usr/share/java/kafka/netty-buffer-4.1.42.Final.jar:/usr/share/java/kafka/netty-transport-native-unix-common-4.1.42.Final.jar:/usr/s
hare/java/kafka/hk2-api-2.5.0.jar:/usr/share/java/kafka/netty-handler-4.1.42.Final.jar:/usr/share/java/kafka/jaxb-api-2.3.0.jar:/usr/share/java/kafka/javax.servlet-api-3.1.0.jar:/usr/share/java/kafka/javassist-3.22.0-CR2.jar:/usr/share/java/kafka/jackson-module-jaxb-annotations-2.9.10.jar:/usr/share/java/kafka/jetty-server-9.4.20.v20190813.jar:/usr/share/java/kafka/validation-api-2.0.1.Final.jar:/usr/share/java/kafka/jackson-annotations-2.9.10.jar:/usr/share/java/kafka/jackson-dataformat-csv-2.9.10.jar:/usr/share/java/kafka/jersey-container-servlet-2.28.jar:/usr/share/java/kafka/lz4-java-1.6.0.jar:/usr/share/java/kafka/connect-api-5.4.0-ccs.jar:/usr/share/java/kafka/kafka-streams-scala_2.12-5.4.0-ccs.jar:/usr/share/java/kafka/guava-20.0.jar:/usr/share/java/kafka/jetty-http-9.4.20.v20190813.jar:/usr/share/java/kafka/jetty-security-9.4.20.v20190813.jar:/usr/share/java/kafka/kafka-streams-5.4.0-ccs.jar:/usr/share/java/kafka/jopt-simple-5.0.4.jar:/usr/share/java/kafka/jackson-module-scala_2.12-2.9.10.jar:/usr/share/java/kafka/kafka-streams-test-utils-5.4.0-ccs.jar:/usr/share/java/kafka/support-metrics-common-5.4.0-ccs.jar:/usr/share/java/kafka/jackson-core-2.9.10.jar:/usr/share/java/kafka/commons-compress-1.19.jar:/usr/share/java/kafka/hk2-utils-2.5.0.jar:/usr/share/java/kafka/jakarta.annotation-api-1.3.4.jar:/usr/share/java/kafka/paranamer-2.8.jar:/usr/share/java/kafka/scala-reflect-2.12.10.jar:/usr/share/java/kafka/jersey-media-jaxb-2.28.jar:/usr/share/java/kafka/jackson-datatype-jdk8-2.9.10.jar:/usr/share/java/kafka/netty-transport-4.1.42.Final.jar:/usr/share/java/kafka/osgi-resource-locator-1.0.1.jar:/usr/share/java/kafka/netty-resolver-4.1.42.Final.jar:/usr/share/java/kafka/jakarta.inject-2.5.0.jar:/usr/share/java/kafka/slf4j-log4j12-1.7.28.jar:/usr/share/java/kafka/jersey-container-servlet-core-2.28.jar:/usr/share/java/kafka/snappy-java-1.1.7.3.jar:/usr/share/java/kafka/jetty-servlets-9.4.20.v20190813.jar:/usr/share/java/kafka/jackson-module-paranamer-2.9.10.jar:/usr/share/java/kafka/commons-lang3-3.8.1.jar:/usr/share/java/kafka/argparse4j-0.7.0.jar:/usr/share/java/kafka/zookeeper-jute-3.5.6.jar:/usr/share/java/kafka/zookeeper-3.5.6.jar:/usr/share/java/kafka/commons-logging-1.2.jar:/usr/share/java/kafka/metrics-core-2.2.0.jar:/usr/share/java/kafka/commons-cli-1.4.jar:/usr/share/java/kafka/jetty-servlet-9.4.20.v20190813.jar:/usr/share/java/kafka/scala-logging_2.12-3.9.2.jar:/usr/share/java/kafka/confluent-metrics-5.4.0-ce.jar:/usr/share/java/confluent-common/common-utils-5.4.0.jar:/usr/share/java/confluent-common/common-config-5.4.0.jar:/usr/share/java/confluent-common/build-tools-5.4.0.jar:/usr/share/java/confluent-common/slf4j-api-1.7.26.jar:/usr/share/java/confluent-common/common-metrics-5.4.0.jar:/usr/share/java/kafka-serde-tools/jackson-dataformat-yaml-2.9.10.jar:/usr/share/java/kafka-serde-tools/swagger-core-1.5.3.jar:/usr/share/java/kafka-serde-tools/jackson-databind-2.9.10.1.jar:/usr/share/java/kafka-serde-tools/kafka-connect-avro-converter-5.4.0.jar:/usr/share/java/kafka-serde-tools/avro-1.9.1.jar:/usr/share/java/kafka-serde-tools/jackson-datatype-joda-2.9.10.jar:/usr/share/java/kafka-serde-tools/kafka-avro-serializer-5.4.0.jar:/usr/share/java/kafka-serde-tools/kafka-streams-avro-serde-5.4.0.jar:/usr/share/java/kafka-serde-tools/kafka-schema-registry-client-5.4.0.jar:/usr/share/java/kafka-serde-tools/swagger-models-1.5.3.jar:/usr/share/java/kafka-serde-tools/jackson-annotations-2.9.10.jar:/usr/share/java/kafka-serde-tools/joda-time-2.7.jar:/usr/share/java/kafka-serde-tools/guava-2
0.0.jar:/usr/share/java/kafka-serde-tools/jackson-core-2.9.10.jar:/usr/share/java/kafka-serde-tools/commons-compress-1.19.jar:/usr/share/java/kafka-serde-tools/kafka-json-serializer-5.4.0.jar:/usr/share/java/kafka-serde-tools/snakeyaml-1.23.jar:/usr/share/java/kafka-serde-tools/commons-lang3-3.8.1.jar:/usr/share/java/kafka-serde-tools/swagger-annotations-1.5.22.jar:/usr/share/java/kafka-serde-tools/tracking-1.0.0.LOCAL.jar:/usr/share/java/kafka-serde-tools/kafka-connect-protobuf-converter-3.0.0.jar:/usr/share/java/kafka-serde-tools/protobuf-java-3.11.4.jar:/usr/share/java/monitoring-interceptors/monitoring-interceptors-5.4.0.jar:/usr/bin/../share/java/kafka/jersey-hk2-2.28.jar:/usr/bin/../share/java/kafka/kafka-streams-examples-5.4.0-ccs.jar:/usr/bin/../share/java/kafka/netty-common-4.1.42.Final.jar:/usr/bin/../share/java/kafka/connect-mirror-5.4.0-ccs.jar:/usr/bin/../share/java/kafka/jersey-client-2.28.jar:/usr/bin/../share/java/kafka/kafka_2.12-5.4.0-ccs-scaladoc.jar:/usr/bin/../share/java/kafka/httpclient-4.5.9.jar:/usr/bin/../share/java/kafka/jetty-continuation-9.4.20.v20190813.jar:/usr/bin/../share/java/kafka/maven-artifact-3.6.1.jar:/usr/bin/../share/java/kafka/scala-java8-compat_2.12-0.9.0.jar:/usr/bin/../share/java/kafka/connect-json-5.4.0-ccs.jar:/usr/bin/../share/java/kafka/plexus-utils-3.2.0.jar:/usr/bin/../share/java/kafka/jersey-server-2.28.jar:/usr/bin/../share/java/kafka/kafka_2.12-5.4.0-ccs-sources.jar:/usr/bin/../share/java/kafka/support-metrics-client-5.4.0-ccs.jar:/usr/bin/../share/java/kafka/kafka-log4j-appender-5.4.0-ccs.jar:/usr/bin/../share/java/kafka/kafka_2.12-5.4.0-ccs-javadoc.jar:/usr/bin/../share/java/kafka/jackson-jaxrs-json-provider-2.9.10.jar:/usr/bin/../share/java/kafka/kafka.jar:/usr/bin/../share/java/kafka/jackson-databind-2.9.10.1.jar:/usr/bin/../share/java/kafka/kafka_2.12-5.4.0-ccs-test.jar:/usr/bin/../share/java/kafka/scala-library-2.12.10.jar:/usr/bin/../share/java/kafka/jakarta.ws.rs-api-2.1.5.jar:/usr/bin/../share/java/kafka/jetty-client-9.4.20.v20190813.jar:/usr/bin/../share/java/kafka/netty-transport-native-epoll-4.1.42.Final.jar:/usr/bin/../share/java/kafka/zstd-jni-1.4.3-1.jar:/usr/bin/../share/java/kafka/connect-file-5.4.0-ccs.jar:/usr/bin/../share/java/kafka/avro-1.9.1.jar:/usr/bin/../share/java/kafka/hk2-locator-2.5.0.jar:/usr/bin/../share/java/kafka/slf4j-api-1.7.28.jar:/usr/bin/../share/java/kafka/kafka_2.12-5.4.0-ccs-test-sources.jar:/usr/bin/../share/java/kafka/commons-codec-1.11.jar:/usr/bin/../share/java/kafka/audience-annotations-0.5.0.jar:/usr/bin/../share/java/kafka/reflections-0.9.11.jar:/usr/bin/../share/java/kafka/rocksdbjni-5.18.3.jar:/usr/bin/../share/java/kafka/kafka-clients-5.4.0-ccs.jar:/usr/bin/../share/java/kafka/scala-collection-compat_2.12-2.1.2.jar:/usr/bin/../share/java/kafka/kafka_2.12-5.4.0-ccs.jar:/usr/bin/../share/java/kafka/jetty-util-9.4.20.v20190813.jar:/usr/bin/../share/java/kafka/httpmime-4.5.9.jar:/usr/bin/../share/java/kafka/jetty-io-9.4.20.v20190813.jar:/usr/bin/../share/java/kafka/jackson-jaxrs-base-2.9.10.jar:/usr/bin/../share/java/kafka/log4j-1.2.17.jar:/usr/bin/../share/java/kafka/kafka-tools-5.4.0-ccs.jar:/usr/bin/../share/java/kafka/connect-runtime-5.4.0-ccs.jar:/usr/bin/../share/java/kafka/connect-basic-auth-extension-5.4.0-ccs.jar:/usr/bin/../share/java/kafka/netty-codec-4.1.42.Final.jar:/usr/bin/../share/java/kafka/aopalliance-repackaged-2.5.0.jar:/usr/bin/../share/java/kafka/connect-mirror-client-5.4.0-ccs.jar:/usr/bin/../share/java/kafka/javax.ws.rs-api-2.1.1.jar:/usr/bin/../share/java/kafka/jersey
-common-2.28.jar:/usr/bin/../share/java/kafka/httpcore-4.4.11.jar:/usr/bin/../share/java/kafka/connect-transforms-5.4.0-ccs.jar:/usr/bin/../share/java/kafka/activation-1.1.1.jar:/usr/bin/../share/java/kafka/netty-buffer-4.1.42.Final.jar:/usr/bin/../share/java/kafka/netty-transport-native-unix-common-4.1.42.Final.jar:/usr/bin/../share/java/kafka/hk2-api-2.5.0.jar:/usr/bin/../share/java/kafka/netty-handler-4.1.42.Final.jar:/usr/bin/../share/java/kafka/jaxb-api-2.3.0.jar:/usr/bin/../share/java/kafka/javax.servlet-api-3.1.0.jar:/usr/bin/../share/java/kafka/javassist-3.22.0-CR2.jar:/usr/bin/../share/java/kafka/jackson-module-jaxb-annotations-2.9.10.jar:/usr/bin/../share/java/kafka/jetty-server-9.4.20.v20190813.jar:/usr/bin/../share/java/kafka/validation-api-2.0.1.Final.jar:/usr/bin/../share/java/kafka/jackson-annotations-2.9.10.jar:/usr/bin/../share/java/kafka/jackson-dataformat-csv-2.9.10.jar:/usr/bin/../share/java/kafka/jersey-container-servlet-2.28.jar:/usr/bin/../share/java/kafka/lz4-java-1.6.0.jar:/usr/bin/../share/java/kafka/connect-api-5.4.0-ccs.jar:/usr/bin/../share/java/kafka/kafka-streams-scala_2.12-5.4.0-ccs.jar:/usr/bin/../share/java/kafka/guava-20.0.jar:/usr/bin/../share/java/kafka/jetty-http-9.4.20.v20190813.jar:/usr/bin/../share/java/kafka/jetty-security-9.4.20.v20190813.jar:/usr/bin/../share/java/kafka/kafka-streams-5.4.0-ccs.jar:/usr/bin/../share/java/kafka/jopt-simple-5.0.4.jar:/usr/bin/../share/java/kafka/jackson-module-scala_2.12-2.9.10.jar:/usr/bin/../share/java/kafka/kafka-streams-test-utils-5.4.0-ccs.jar:/usr/bin/../share/java/kafka/support-metrics-common-5.4.0-ccs.jar:/usr/bin/../share/java/kafka/jackson-core-2.9.10.jar:/usr/bin/../share/java/kafka/commons-compress-1.19.jar:/usr/bin/../share/java/kafka/hk2-utils-2.5.0.jar:/usr/bin/../share/java/kafka/jakarta.annotation-api-1.3.4.jar:/usr/bin/../share/java/kafka/paranamer-2.8.jar:/usr/bin/../share/java/kafka/scala-reflect-2.12.10.jar:/usr/bin/../share/java/kafka/jersey-media-jaxb-2.28.jar:/usr/bin/../share/java/kafka/jackson-datatype-jdk8-2.9.10.jar:/usr/bin/../share/java/kafka/netty-transport-4.1.42.Final.jar:/usr/bin/../share/java/kafka/osgi-resource-locator-1.0.1.jar:/usr/bin/../share/java/kafka/netty-resolver-4.1.42.Final.jar:/usr/bin/../share/java/kafka/jakarta.inject-2.5.0.jar:/usr/bin/../share/java/kafka/slf4j-log4j12-1.7.28.jar:/usr/bin/../share/java/kafka/jersey-container-servlet-core-2.28.jar:/usr/bin/../share/java/kafka/snappy-java-1.1.7.3.jar:/usr/bin/../share/java/kafka/jetty-servlets-9.4.20.v20190813.jar:/usr/bin/../share/java/kafka/jackson-module-paranamer-2.9.10.jar:/usr/bin/../share/java/kafka/commons-lang3-3.8.1.jar:/usr/bin/../share/java/kafka/argparse4j-0.7.0.jar:/usr/bin/../share/java/kafka/zookeeper-jute-3.5.6.jar:/usr/bin/../share/java/kafka/zookeeper-3.5.6.jar:/usr/bin/../share/java/kafka/commons-logging-1.2.jar:/usr/bin/../share/java/kafka/metrics-core-2.2.0.jar:/usr/bin/../share/java/kafka/commons-cli-1.4.jar:/usr/bin/../share/java/kafka/jetty-servlet-9.4.20.v20190813.jar:/usr/bin/../share/java/kafka/scala-logging_2.12-3.9.2.jar:/usr/bin/../share/java/kafka/confluent-metrics-5.4.0-ce.jar:/usr/bin/../support-metrics-client/build/dependant-libs-2.12/*:/usr/bin/../support-metrics-client/build/libs/*:/usr/share/java/support-metrics-client/* connect | os.spec = Linux, amd64, 4.19.76-linuxkit connect | os.vcpus = 4 connect | (org.apache.kafka.connect.runtime.WorkerInfo) connect | [2020-02-24 23:34:00,265] INFO Scanning for plugin classes. This might take a moment ... 
(org.apache.kafka.connect.cli.ConnectDistributed) connect | [2020-02-24 23:34:00,316] INFO Loading plugin from: /usr/share/java/kafka-connect-storage-common (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader) connect | [2020-02-24 23:34:02,738] INFO Registered loader: PluginClassLoader{pluginLocation=file:/usr/share/java/kafka-connect-storage-common/} (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader) connect | [2020-02-24 23:34:02,739] INFO Added plugin 'io.confluent.connect.storage.tools.SchemaSourceConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader) connect | [2020-02-24 23:34:02,739] INFO Added plugin 'io.confluent.connect.avro.AvroConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader) connect | [2020-02-24 23:34:02,740] INFO Added plugin 'org.apache.kafka.common.config.provider.FileConfigProvider' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader) connect | [2020-02-24 23:34:02,740] INFO Added plugin 'org.apache.kafka.connect.connector.policy.AllConnectorClientConfigOverridePolicy' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader) connect | [2020-02-24 23:34:02,740] INFO Added plugin 'org.apache.kafka.connect.connector.policy.PrincipalConnectorClientConfigOverridePolicy' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader) connect | [2020-02-24 23:34:02,740] INFO Added plugin 'org.apache.kafka.connect.connector.policy.NoneConnectorClientConfigOverridePolicy' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader) connect | [2020-02-24 23:34:02,742] INFO Loading plugin from: /usr/share/java/kafka-serde-tools (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader) connect | [2020-02-24 23:34:03,015] INFO Registered loader: PluginClassLoader{pluginLocation=file:/usr/share/java/kafka-serde-tools/} (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader) connect | [2020-02-24 23:34:03,016] INFO Added plugin 'com.blueapron.connect.protobuf.ProtobufConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader) connect | [2020-02-24 23:34:03,017] INFO Loading plugin from: /usr/share/java/kafka-connect-s3 (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader) connect | [2020-02-24 23:34:05,167] INFO Registered loader: PluginClassLoader{pluginLocation=file:/usr/share/java/kafka-connect-s3/} (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader) connect | [2020-02-24 23:34:05,167] INFO Added plugin 'io.confluent.connect.s3.S3SinkConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader) connect | [2020-02-24 23:34:05,167] INFO Loading plugin from: /usr/share/java/rest-utils (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader) connect | [2020-02-24 23:34:05,618] INFO Registered loader: PluginClassLoader{pluginLocation=file:/usr/share/java/rest-utils/} (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader) connect | [2020-02-24 23:34:05,619] INFO Loading plugin from: /usr/share/java/confluent-control-center (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader) connect | [2020-02-24 23:34:07,538] INFO Registered loader: PluginClassLoader{pluginLocation=file:/usr/share/java/confluent-control-center/} (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader) connect | [2020-02-24 23:34:07,538] INFO Added plugin 'org.apache.kafka.connect.tools.MockSinkConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader) connect | 
[2020-02-24 23:34:07,538] INFO Added plugin 'org.apache.kafka.connect.tools.VerifiableSinkConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader) connect | [2020-02-24 23:34:07,538] INFO Added plugin 'org.apache.kafka.connect.tools.SchemaSourceConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader) connect | [2020-02-24 23:34:07,538] INFO Added plugin 'org.apache.kafka.connect.tools.MockSourceConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader) connect | [2020-02-24 23:34:07,539] INFO Added plugin 'org.apache.kafka.connect.tools.MockConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader) connect | [2020-02-24 23:34:07,539] INFO Added plugin 'org.apache.kafka.connect.tools.VerifiableSourceConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader) connect | [2020-02-24 23:34:07,539] INFO Added plugin 'org.apache.kafka.connect.converters.IntegerConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader) connect | [2020-02-24 23:34:07,539] INFO Added plugin 'org.apache.kafka.connect.converters.DoubleConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader) connect | [2020-02-24 23:34:07,539] INFO Added plugin 'org.apache.kafka.connect.converters.LongConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader) connect | [2020-02-24 23:34:07,539] INFO Added plugin 'org.apache.kafka.connect.json.JsonConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader) connect | [2020-02-24 23:34:07,539] INFO Added plugin 'org.apache.kafka.connect.converters.ShortConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader) connect | [2020-02-24 23:34:07,539] INFO Added plugin 'org.apache.kafka.connect.converters.ByteArrayConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader) connect | [2020-02-24 23:34:07,540] INFO Added plugin 'org.apache.kafka.connect.converters.FloatConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader) connect | [2020-02-24 23:34:07,540] INFO Added plugin 'org.apache.kafka.connect.storage.StringConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader) connect | [2020-02-24 23:34:07,540] INFO Added plugin 'org.apache.kafka.connect.storage.SimpleHeaderConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader) connect | [2020-02-24 23:34:07,540] INFO Added plugin 'org.apache.kafka.connect.transforms.SetSchemaMetadata$Value' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader) connect | [2020-02-24 23:34:07,540] INFO Added plugin 'org.apache.kafka.connect.transforms.Flatten$Key' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader) connect | [2020-02-24 23:34:07,541] INFO Added plugin 'org.apache.kafka.connect.transforms.MaskField$Value' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader) connect | [2020-02-24 23:34:07,541] INFO Added plugin 'org.apache.kafka.connect.transforms.ExtractField$Key' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader) connect | [2020-02-24 23:34:07,541] INFO Added plugin 'org.apache.kafka.connect.transforms.Cast$Key' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader) connect | [2020-02-24 23:34:07,541] INFO Added plugin 'org.apache.kafka.connect.transforms.ReplaceField$Value' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader) connect | [2020-02-24 23:34:07,541] INFO Added plugin 
'org.apache.kafka.connect.transforms.TimestampRouter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader) connect | [2020-02-24 23:34:07,542] INFO Added plugin 'org.apache.kafka.connect.transforms.TimestampConverter$Key' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader) connect | [2020-02-24 23:34:07,542] INFO Added plugin 'org.apache.kafka.connect.transforms.ReplaceField$Key' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader) connect | [2020-02-24 23:34:07,542] INFO Added plugin 'org.apache.kafka.connect.transforms.HoistField$Value' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader) connect | [2020-02-24 23:34:07,542] INFO Added plugin 'org.apache.kafka.connect.transforms.HoistField$Key' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader) connect | [2020-02-24 23:34:07,543] INFO Added plugin 'org.apache.kafka.connect.transforms.RegexRouter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader) connect | [2020-02-24 23:34:07,543] INFO Added plugin 'org.apache.kafka.connect.transforms.TimestampConverter$Value' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader) connect | [2020-02-24 23:34:07,543] INFO Added plugin 'org.apache.kafka.connect.transforms.MaskField$Key' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader) connect | [2020-02-24 23:34:07,543] INFO Added plugin 'org.apache.kafka.connect.transforms.Cast$Value' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader) connect | [2020-02-24 23:34:07,543] INFO Added plugin 'org.apache.kafka.connect.transforms.InsertField$Key' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader) connect | [2020-02-24 23:34:07,543] INFO Added plugin 'org.apache.kafka.connect.transforms.ValueToKey' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader) connect | [2020-02-24 23:34:07,544] INFO Added plugin 'org.apache.kafka.connect.transforms.ExtractField$Value' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader) connect | [2020-02-24 23:34:07,544] INFO Added plugin 'org.apache.kafka.connect.transforms.InsertField$Value' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader) connect | [2020-02-24 23:34:07,544] INFO Added plugin 'org.apache.kafka.connect.transforms.SetSchemaMetadata$Key' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader) connect | [2020-02-24 23:34:07,544] INFO Added plugin 'org.apache.kafka.connect.transforms.Flatten$Value' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader) connect | [2020-02-24 23:34:07,545] INFO Added plugin 'io.confluent.kafka.secretregistry.client.config.provider.SecretConfigProvider' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader) connect | [2020-02-24 23:34:07,545] INFO Added plugin 'io.confluent.connect.security.ConnectSecurityExtension' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader) connect | [2020-02-24 23:34:07,546] INFO Loading plugin from: /usr/share/java/monitoring-interceptors (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader) connect | [2020-02-24 23:34:07,811] INFO Registered loader: PluginClassLoader{pluginLocation=file:/usr/share/java/monitoring-interceptors/} (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader) connect | [2020-02-24 23:34:07,812] INFO Loading plugin from: /usr/share/java/acl (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader) connect | [2020-02-24 23:34:10,061] INFO Registered loader: 
PluginClassLoader{pluginLocation=file:/usr/share/java/acl/} (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader) connect | [2020-02-24 23:34:10,063] INFO Loading plugin from: /usr/share/java/kafka (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader) connect | [2020-02-24 23:34:11,323] INFO Registered loader: PluginClassLoader{pluginLocation=file:/usr/share/java/kafka/} (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader) connect | [2020-02-24 23:34:11,323] INFO Added plugin 'org.apache.kafka.connect.mirror.MirrorHeartbeatConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader) connect | [2020-02-24 23:34:11,323] INFO Added plugin 'org.apache.kafka.connect.file.FileStreamSinkConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader) connect | [2020-02-24 23:34:11,323] INFO Added plugin 'org.apache.kafka.connect.mirror.MirrorCheckpointConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader) connect | [2020-02-24 23:34:11,323] INFO Added plugin 'org.apache.kafka.connect.mirror.MirrorSourceConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader) connect | [2020-02-24 23:34:11,324] INFO Added plugin 'org.apache.kafka.connect.file.FileStreamSourceConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader) connect | [2020-02-24 23:34:11,324] INFO Added plugin 'org.apache.kafka.connect.rest.basic.auth.extension.BasicAuthSecurityRestExtension' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader) connect | [2020-02-24 23:34:11,325] INFO Loading plugin from: /usr/share/java/confluent-rebalancer (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader) connect | [2020-02-24 23:34:12,234] INFO Registered loader: PluginClassLoader{pluginLocation=file:/usr/share/java/confluent-rebalancer/} (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader) connect | [2020-02-24 23:34:12,235] INFO Loading plugin from: /usr/share/java/schema-registry (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader) connect | [2020-02-24 23:34:12,630] INFO Registered loader: PluginClassLoader{pluginLocation=file:/usr/share/java/schema-registry/} (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader) connect | [2020-02-24 23:34:12,630] INFO Loading plugin from: /usr/share/java/confluent-common (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader) connect | [2020-02-24 23:34:12,645] INFO Registered loader: PluginClassLoader{pluginLocation=file:/usr/share/java/confluent-common/} (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader) connect | [2020-02-24 23:34:12,646] INFO Loading plugin from: /usr/share/java/confluent-hub-client (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader) connect | [2020-02-24 23:34:12,778] INFO Registered loader: PluginClassLoader{pluginLocation=file:/usr/share/java/confluent-hub-client/} (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader) connect | [2020-02-24 23:34:14,402] INFO Registered loader: sun.misc.Launcher$AppClassLoader@764c12b6 (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader) connect | [2020-02-24 23:34:14,404] INFO Added aliases 'S3SinkConnector' and 'S3Sink' to plugin 'io.confluent.connect.s3.S3SinkConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader) connect | [2020-02-24 23:34:14,405] INFO Added aliases 'FileStreamSinkConnector' and 'FileStreamSink' to plugin 
'org.apache.kafka.connect.file.FileStreamSinkConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader) connect | [2020-02-24 23:34:14,405] INFO Added aliases 'FileStreamSourceConnector' and 'FileStreamSource' to plugin 'org.apache.kafka.connect.file.FileStreamSourceConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader) connect | [2020-02-24 23:34:14,406] INFO Added aliases 'MirrorCheckpointConnector' and 'MirrorCheckpoint' to plugin 'org.apache.kafka.connect.mirror.MirrorCheckpointConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader) connect | [2020-02-24 23:34:14,406] INFO Added aliases 'MirrorHeartbeatConnector' and 'MirrorHeartbeat' to plugin 'org.apache.kafka.connect.mirror.MirrorHeartbeatConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader) connect | [2020-02-24 23:34:14,407] INFO Added aliases 'MirrorSourceConnector' and 'MirrorSource' to plugin 'org.apache.kafka.connect.mirror.MirrorSourceConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader) connect | [2020-02-24 23:34:14,407] INFO Added aliases 'MockConnector' and 'Mock' to plugin 'org.apache.kafka.connect.tools.MockConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader) connect | [2020-02-24 23:34:14,407] INFO Added aliases 'MockSinkConnector' and 'MockSink' to plugin 'org.apache.kafka.connect.tools.MockSinkConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader) connect | [2020-02-24 23:34:14,408] INFO Added aliases 'MockSourceConnector' and 'MockSource' to plugin 'org.apache.kafka.connect.tools.MockSourceConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader) connect | [2020-02-24 23:34:14,409] INFO Added aliases 'VerifiableSinkConnector' and 'VerifiableSink' to plugin 'org.apache.kafka.connect.tools.VerifiableSinkConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader) connect | [2020-02-24 23:34:14,409] INFO Added aliases 'VerifiableSourceConnector' and 'VerifiableSource' to plugin 'org.apache.kafka.connect.tools.VerifiableSourceConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader) connect | [2020-02-24 23:34:14,410] INFO Added aliases 'ProtobufConverter' and 'Protobuf' to plugin 'com.blueapron.connect.protobuf.ProtobufConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader) connect | [2020-02-24 23:34:14,411] INFO Added aliases 'AvroConverter' and 'Avro' to plugin 'io.confluent.connect.avro.AvroConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader) connect | [2020-02-24 23:34:14,411] INFO Added aliases 'ByteArrayConverter' and 'ByteArray' to plugin 'org.apache.kafka.connect.converters.ByteArrayConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader) connect | [2020-02-24 23:34:14,411] INFO Added aliases 'DoubleConverter' and 'Double' to plugin 'org.apache.kafka.connect.converters.DoubleConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader) connect | [2020-02-24 23:34:14,412] INFO Added aliases 'FloatConverter' and 'Float' to plugin 'org.apache.kafka.connect.converters.FloatConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader) connect | [2020-02-24 23:34:14,412] INFO Added aliases 'IntegerConverter' and 'Integer' to plugin 'org.apache.kafka.connect.converters.IntegerConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader) connect | [2020-02-24 23:34:14,413] INFO Added aliases 
'LongConverter' and 'Long' to plugin 'org.apache.kafka.connect.converters.LongConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader) connect | [2020-02-24 23:34:14,413] INFO Added aliases 'ShortConverter' and 'Short' to plugin 'org.apache.kafka.connect.converters.ShortConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader) connect | [2020-02-24 23:34:14,413] INFO Added aliases 'JsonConverter' and 'Json' to plugin 'org.apache.kafka.connect.json.JsonConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader) connect | [2020-02-24 23:34:14,414] INFO Added aliases 'StringConverter' and 'String' to plugin 'org.apache.kafka.connect.storage.StringConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader) connect | [2020-02-24 23:34:14,415] INFO Added aliases 'ByteArrayConverter' and 'ByteArray' to plugin 'org.apache.kafka.connect.converters.ByteArrayConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader) connect | [2020-02-24 23:34:14,415] INFO Added aliases 'DoubleConverter' and 'Double' to plugin 'org.apache.kafka.connect.converters.DoubleConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader) connect | [2020-02-24 23:34:14,416] INFO Added aliases 'FloatConverter' and 'Float' to plugin 'org.apache.kafka.connect.converters.FloatConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader) connect | [2020-02-24 23:34:14,416] INFO Added aliases 'IntegerConverter' and 'Integer' to plugin 'org.apache.kafka.connect.converters.IntegerConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader) connect | [2020-02-24 23:34:14,417] INFO Added aliases 'LongConverter' and 'Long' to plugin 'org.apache.kafka.connect.converters.LongConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader) connect | [2020-02-24 23:34:14,417] INFO Added aliases 'ShortConverter' and 'Short' to plugin 'org.apache.kafka.connect.converters.ShortConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader) connect | [2020-02-24 23:34:14,418] INFO Added aliases 'JsonConverter' and 'Json' to plugin 'org.apache.kafka.connect.json.JsonConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader) connect | [2020-02-24 23:34:14,418] INFO Added alias 'SimpleHeaderConverter' to plugin 'org.apache.kafka.connect.storage.SimpleHeaderConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader) connect | [2020-02-24 23:34:14,419] INFO Added aliases 'StringConverter' and 'String' to plugin 'org.apache.kafka.connect.storage.StringConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader) connect | [2020-02-24 23:34:14,421] INFO Added alias 'RegexRouter' to plugin 'org.apache.kafka.connect.transforms.RegexRouter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader) connect | [2020-02-24 23:34:14,422] INFO Added alias 'TimestampRouter' to plugin 'org.apache.kafka.connect.transforms.TimestampRouter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader) connect | [2020-02-24 23:34:14,423] INFO Added alias 'ValueToKey' to plugin 'org.apache.kafka.connect.transforms.ValueToKey' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader) connect | [2020-02-24 23:34:14,423] INFO Added alias 'ConnectSecurityExtension' to plugin 'io.confluent.connect.security.ConnectSecurityExtension' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader) connect | [2020-02-24 
23:34:14,423] INFO Added alias 'BasicAuthSecurityRestExtension' to plugin 'org.apache.kafka.connect.rest.basic.auth.extension.BasicAuthSecurityRestExtension' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader) connect | [2020-02-24 23:34:14,423] INFO Added aliases 'AllConnectorClientConfigOverridePolicy' and 'All' to plugin 'org.apache.kafka.connect.connector.policy.AllConnectorClientConfigOverridePolicy' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader) connect | [2020-02-24 23:34:14,423] INFO Added aliases 'NoneConnectorClientConfigOverridePolicy' and 'None' to plugin 'org.apache.kafka.connect.connector.policy.NoneConnectorClientConfigOverridePolicy' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader) connect | [2020-02-24 23:34:14,424] INFO Added aliases 'PrincipalConnectorClientConfigOverridePolicy' and 'Principal' to plugin 'org.apache.kafka.connect.connector.policy.PrincipalConnectorClientConfigOverridePolicy' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader) connect | [2020-02-24 23:34:14,511] INFO DistributedConfig values: connect | access.control.allow.methods = connect | access.control.allow.origin = connect | admin.listeners = null connect | bootstrap.servers = [broker:29092] connect | client.dns.lookup = default connect | client.id = connect | config.providers = [] connect | config.storage.replication.factor = 1 connect | config.storage.topic = tracking-kafka-connect-s3-configs connect | connect.protocol = sessioned connect | connections.max.idle.ms = 540000 connect | connector.client.config.override.policy = None connect | group.id = tracking-kafka-connect-s3 connect | header.converter = class org.apache.kafka.connect.storage.SimpleHeaderConverter connect | heartbeat.interval.ms = 3000 connect | inter.worker.key.generation.algorithm = HmacSHA256 connect | inter.worker.key.size = null connect | inter.worker.key.ttl.ms = 3600000 connect | inter.worker.signature.algorithm = HmacSHA256 connect | inter.worker.verification.algorithms = [HmacSHA256] connect | internal.key.converter = class org.apache.kafka.connect.json.JsonConverter connect | internal.value.converter = class org.apache.kafka.connect.json.JsonConverter connect | key.converter = class org.apache.kafka.connect.storage.StringConverter connect | listeners = null connect | metadata.max.age.ms = 300000 connect | metric.reporters = [] connect | metrics.num.samples = 2 connect | metrics.recording.level = INFO connect | metrics.sample.window.ms = 30000 connect | offset.flush.interval.ms = 10000 connect | offset.flush.timeout.ms = 5000 connect | offset.storage.partitions = 25 connect | offset.storage.replication.factor = 1 connect | offset.storage.topic = tracking-kafka-connect-s3-offsets connect | plugin.path = [/usr/share/java, /usr/share/confluent-hub-components] connect | rebalance.timeout.ms = 60000 connect | receive.buffer.bytes = 32768 connect | reconnect.backoff.max.ms = 1000 connect | reconnect.backoff.ms = 50 connect | request.timeout.ms = 40000 connect | rest.advertised.host.name = connect connect | rest.advertised.listener = null connect | rest.advertised.port = null connect | rest.extension.classes = [] connect | rest.host.name = null connect | rest.port = 8083 connect | retry.backoff.ms = 100 connect | sasl.client.callback.handler.class = null connect | sasl.jaas.config = null connect | sasl.kerberos.kinit.cmd = /usr/bin/kinit connect | sasl.kerberos.min.time.before.relogin = 60000 connect | sasl.kerberos.service.name = null connect | 
sasl.kerberos.ticket.renew.jitter = 0.05 connect | sasl.kerberos.ticket.renew.window.factor = 0.8 connect | sasl.login.callback.handler.class = null connect | sasl.login.class = null connect | sasl.login.refresh.buffer.seconds = 300 connect | sasl.login.refresh.min.period.seconds = 60 connect | sasl.login.refresh.window.factor = 0.8 connect | sasl.login.refresh.window.jitter = 0.05 connect | sasl.mechanism = GSSAPI connect | scheduled.rebalance.max.delay.ms = 300000 connect | security.protocol = PLAINTEXT connect | send.buffer.bytes = 131072 connect | session.timeout.ms = 10000 connect | ssl.cipher.suites = null connect | ssl.client.auth = none connect | ssl.enabled.protocols = [TLSv1.2, TLSv1.1, TLSv1] connect | ssl.endpoint.identification.algorithm = https connect | ssl.key.password = null connect | ssl.keymanager.algorithm = SunX509 connect | ssl.keystore.location = null connect | ssl.keystore.password = null connect | ssl.keystore.type = JKS connect | ssl.protocol = TLS connect | ssl.provider = null connect | ssl.secure.random.implementation = null connect | ssl.trustmanager.algorithm = PKIX connect | ssl.truststore.location = null connect | ssl.truststore.password = null connect | ssl.truststore.type = JKS connect | status.storage.partitions = 5 connect | status.storage.replication.factor = 1 connect | status.storage.topic = tracking-kafka-connect-s3-status connect | task.shutdown.graceful.timeout.ms = 5000 connect | value.converter = class com.blueapron.connect.protobuf.ProtobufConverter connect | worker.sync.timeout.ms = 3000 connect | worker.unsync.backoff.ms = 300000 connect | (org.apache.kafka.connect.runtime.distributed.DistributedConfig) connect | [2020-02-24 23:34:14,511] INFO Worker configuration property 'internal.key.converter' is deprecated and may be removed in an upcoming release. The specified value 'org.apache.kafka.connect.json.JsonConverter' matches the default, so this property can be safely removed from the worker configuration. (org.apache.kafka.connect.runtime.WorkerConfig) connect | [2020-02-24 23:34:14,512] INFO Worker configuration property 'internal.key.converter.schemas.enable' (along with all configuration for 'internal.key.converter') is deprecated and may be removed in an upcoming release. The specified value 'false' matches the default, so this property can be safely removed from the worker configuration. (org.apache.kafka.connect.runtime.WorkerConfig) connect | [2020-02-24 23:34:14,513] INFO Worker configuration property 'internal.value.converter' is deprecated and may be removed in an upcoming release. The specified value 'org.apache.kafka.connect.json.JsonConverter' matches the default, so this property can be safely removed from the worker configuration. (org.apache.kafka.connect.runtime.WorkerConfig) connect | [2020-02-24 23:34:14,513] INFO Worker configuration property 'internal.value.converter.schemas.enable' (along with all configuration for 'internal.value.converter') is deprecated and may be removed in an upcoming release. The specified value 'false' matches the default, so this property can be safely removed from the worker configuration. 
(org.apache.kafka.connect.runtime.WorkerConfig) connect | [2020-02-24 23:34:14,517] INFO Creating Kafka admin client (org.apache.kafka.connect.util.ConnectUtils) connect | [2020-02-24 23:34:14,522] INFO AdminClientConfig values: connect | bootstrap.servers = [broker:29092] connect | client.dns.lookup = default connect | client.id = connect | connections.max.idle.ms = 300000 connect | metadata.max.age.ms = 300000 connect | metric.reporters = [] connect | metrics.num.samples = 2 connect | metrics.recording.level = INFO connect | metrics.sample.window.ms = 30000 connect | receive.buffer.bytes = 65536 connect | reconnect.backoff.max.ms = 1000 connect | reconnect.backoff.ms = 50 connect | request.timeout.ms = 120000 connect | retries = 5 connect | retry.backoff.ms = 100 connect | sasl.client.callback.handler.class = null connect | sasl.jaas.config = null connect | sasl.kerberos.kinit.cmd = /usr/bin/kinit connect | sasl.kerberos.min.time.before.relogin = 60000 connect | sasl.kerberos.service.name = null connect | sasl.kerberos.ticket.renew.jitter = 0.05 connect | sasl.kerberos.ticket.renew.window.factor = 0.8 connect | sasl.login.callback.handler.class = null connect | sasl.login.class = null connect | sasl.login.refresh.buffer.seconds = 300 connect | sasl.login.refresh.min.period.seconds = 60 connect | sasl.login.refresh.window.factor = 0.8 connect | sasl.login.refresh.window.jitter = 0.05 connect | sasl.mechanism = GSSAPI connect | security.protocol = PLAINTEXT connect | security.providers = null connect | send.buffer.bytes = 131072 connect | ssl.cipher.suites = null connect | ssl.enabled.protocols = [TLSv1.2, TLSv1.1, TLSv1] connect | ssl.endpoint.identification.algorithm = https connect | ssl.key.password = null connect | ssl.keymanager.algorithm = SunX509 connect | ssl.keystore.location = null connect | ssl.keystore.password = null connect | ssl.keystore.type = JKS connect | ssl.protocol = TLS connect | ssl.provider = null connect | ssl.secure.random.implementation = null connect | ssl.trustmanager.algorithm = PKIX connect | ssl.truststore.location = null connect | ssl.truststore.password = null connect | ssl.truststore.type = JKS connect | (org.apache.kafka.clients.admin.AdminClientConfig) connect | [2020-02-24 23:34:14,637] WARN The configuration 'log4j.loggers' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig) connect | [2020-02-24 23:34:14,637] WARN The configuration 'config.storage.topic' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig) connect | [2020-02-24 23:34:14,637] WARN The configuration 'schema.compatibility' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig) connect | [2020-02-24 23:34:14,638] WARN The configuration 'rest.advertised.host.name' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig) connect | [2020-02-24 23:34:14,638] WARN The configuration 'status.storage.topic' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig) connect | [2020-02-24 23:34:14,639] WARN The configuration 'group.id' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig) connect | [2020-02-24 23:34:14,639] WARN The configuration 'plugin.path' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig) connect | [2020-02-24 23:34:14,640] WARN The configuration 'internal.key.converter.schemas.enable' was supplied but isn't a known config. 
(org.apache.kafka.clients.admin.AdminClientConfig) connect | [2020-02-24 23:34:14,640] WARN The configuration 'value.converter.protoclassname' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig) connect | [2020-02-24 23:34:14,640] WARN The configuration 'config.storage.replication.factor' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig) connect | [2020-02-24 23:34:14,641] WARN The configuration 'offset.flush.interval.ms' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig) connect | [2020-02-24 23:34:14,641] WARN The configuration 'rest.port' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig) connect | [2020-02-24 23:34:14,642] WARN The configuration 'internal.key.converter' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig) connect | [2020-02-24 23:34:14,642] WARN The configuration 'zookeeper.connect' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig) connect | [2020-02-24 23:34:14,642] WARN The configuration 'internal.value.converter.schemas.enable' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig) connect | [2020-02-24 23:34:14,642] WARN The configuration 'status.storage.replication.factor' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig) connect | [2020-02-24 23:34:14,642] WARN The configuration 'internal.value.converter' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig) connect | [2020-02-24 23:34:14,643] WARN The configuration 'offset.storage.replication.factor' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig) connect | [2020-02-24 23:34:14,643] WARN The configuration 'offset.storage.topic' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig) connect | [2020-02-24 23:34:14,643] WARN The configuration 'value.converter' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig) connect | [2020-02-24 23:34:14,643] WARN The configuration 'key.converter' was supplied but isn't a known config. 
(org.apache.kafka.clients.admin.AdminClientConfig)
connect | [2020-02-24 23:34:14,645] INFO Kafka version: 5.4.0-ccs (org.apache.kafka.common.utils.AppInfoParser)
connect | [2020-02-24 23:34:14,645] INFO Kafka commitId: 9401111f8c4bb0ec (org.apache.kafka.common.utils.AppInfoParser)
connect | [2020-02-24 23:34:14,645] INFO Kafka startTimeMs: 1582587254644 (org.apache.kafka.common.utils.AppInfoParser)
connect | [2020-02-24 23:34:15,123] INFO Kafka cluster ID: ZeSnCBNdRh6pGQwnc-VNVQ (org.apache.kafka.connect.util.ConnectUtils)
connect | [2020-02-24 23:34:15,152] INFO Logging initialized @15929ms to org.eclipse.jetty.util.log.Slf4jLog (org.eclipse.jetty.util.log)
connect | [2020-02-24 23:34:15,252] INFO Added connector for http://:8083 (org.apache.kafka.connect.runtime.rest.RestServer)
connect | [2020-02-24 23:34:15,253] INFO Initializing REST server (org.apache.kafka.connect.runtime.rest.RestServer)
connect | [2020-02-24 23:34:15,267] INFO jetty-9.4.20.v20190813; built: 2019-08-13T21:28:18.144Z; git: 84700530e645e812b336747464d6fbbf370c9a20; jvm 1.8.0_212-b04 (org.eclipse.jetty.server.Server)
connect | [2020-02-24 23:34:15,323] INFO Started http_8083@34f392be{HTTP/1.1,[http/1.1]}{0.0.0.0:8083} (org.eclipse.jetty.server.AbstractConnector)
connect | [2020-02-24 23:34:15,324] INFO Started @16099ms (org.eclipse.jetty.server.Server)
connect | [2020-02-24 23:34:15,386] INFO Advertised URI: http://connect:8083/ (org.apache.kafka.connect.runtime.rest.RestServer)
connect | [2020-02-24 23:34:15,386] INFO REST server listening at http://192.168.160.4:8083/, advertising URL http://connect:8083/ (org.apache.kafka.connect.runtime.rest.RestServer)
connect | [2020-02-24 23:34:15,387] INFO Advertised URI: http://connect:8083/ (org.apache.kafka.connect.runtime.rest.RestServer)
connect | [2020-02-24 23:34:15,387] INFO REST admin endpoints at http://connect:8083/ (org.apache.kafka.connect.runtime.rest.RestServer)
connect | [2020-02-24 23:34:15,387] INFO Advertised URI: http://connect:8083/ (org.apache.kafka.connect.runtime.rest.RestServer)
connect | [2020-02-24 23:34:15,403] INFO Setting up None Policy for ConnectorClientConfigOverride. This will disallow any client configuration to be overridden (org.apache.kafka.connect.connector.policy.NoneConnectorClientConfigOverridePolicy)
connect | [2020-02-24 23:34:15,436] INFO Kafka version: 5.4.0-ccs (org.apache.kafka.common.utils.AppInfoParser)
connect | [2020-02-24 23:34:15,436] INFO Kafka commitId: 9401111f8c4bb0ec (org.apache.kafka.common.utils.AppInfoParser)
connect | [2020-02-24 23:34:15,436] INFO Kafka startTimeMs: 1582587255436 (org.apache.kafka.common.utils.AppInfoParser)
connect | [2020-02-24 23:34:15,514] INFO JVM Runtime does not support Modules (org.eclipse.jetty.util.TypeUtil)
connect | [2020-02-24 23:34:15,681] INFO JsonConverterConfig values:
connect | converter.type = key
connect | decimal.format = BASE64
connect | schemas.cache.size = 1000
connect | schemas.enable = false
connect | (org.apache.kafka.connect.json.JsonConverterConfig)
connect | [2020-02-24 23:34:15,684] INFO JsonConverterConfig values:
connect | converter.type = value
connect | decimal.format = BASE64
connect | schemas.cache.size = 1000
connect | schemas.enable = false
connect | (org.apache.kafka.connect.json.JsonConverterConfig)
connect | [2020-02-24 23:34:15,757] INFO Kafka version: 5.4.0-ccs (org.apache.kafka.common.utils.AppInfoParser)
connect | [2020-02-24 23:34:15,757] INFO Kafka commitId: 9401111f8c4bb0ec (org.apache.kafka.common.utils.AppInfoParser)
connect | [2020-02-24 23:34:15,758] INFO Kafka startTimeMs: 1582587255757 (org.apache.kafka.common.utils.AppInfoParser)
connect | [2020-02-24 23:34:15,763] INFO Kafka Connect distributed worker initialization took 15532ms (org.apache.kafka.connect.cli.ConnectDistributed)
connect | [2020-02-24 23:34:15,763] INFO Kafka Connect starting (org.apache.kafka.connect.runtime.Connect)
connect | [2020-02-24 23:34:15,763] INFO Initializing REST resources (org.apache.kafka.connect.runtime.rest.RestServer)
connect | [2020-02-24 23:34:15,764] INFO [Worker clientId=connect-1, groupId=tracking-kafka-connect-s3] Herder starting (org.apache.kafka.connect.runtime.distributed.DistributedHerder)
connect | [2020-02-24 23:34:15,764] INFO Worker starting (org.apache.kafka.connect.runtime.Worker)
connect | [2020-02-24 23:34:15,764] INFO Starting KafkaOffsetBackingStore (org.apache.kafka.connect.storage.KafkaOffsetBackingStore)
connect | [2020-02-24 23:34:15,764] INFO Starting KafkaBasedLog with topic tracking-kafka-connect-s3-offsets (org.apache.kafka.connect.util.KafkaBasedLog)
connect | [2020-02-24 23:34:15,764] INFO AdminClientConfig values:
connect | bootstrap.servers = [broker:29092]
connect | client.dns.lookup = default
connect | client.id =
connect | connections.max.idle.ms = 300000
connect | metadata.max.age.ms = 300000
connect | metric.reporters = []
connect | metrics.num.samples = 2
connect | metrics.recording.level = INFO
connect | metrics.sample.window.ms = 30000
connect | receive.buffer.bytes = 65536
connect | reconnect.backoff.max.ms = 1000
connect | reconnect.backoff.ms = 50
connect | request.timeout.ms = 120000
connect | retries = 5
connect | retry.backoff.ms = 100
connect | sasl.client.callback.handler.class = null
connect | sasl.jaas.config = null
connect | sasl.kerberos.kinit.cmd = /usr/bin/kinit
connect | sasl.kerberos.min.time.before.relogin = 60000
connect | sasl.kerberos.service.name = null
connect | sasl.kerberos.ticket.renew.jitter = 0.05
connect | sasl.kerberos.ticket.renew.window.factor = 0.8
connect | sasl.login.callback.handler.class = null
connect | sasl.login.class = null
connect | sasl.login.refresh.buffer.seconds = 300
connect |
sasl.login.refresh.min.period.seconds = 60 connect | sasl.login.refresh.window.factor = 0.8 connect | sasl.login.refresh.window.jitter = 0.05 connect | sasl.mechanism = GSSAPI connect | security.protocol = PLAINTEXT connect | security.providers = null connect | send.buffer.bytes = 131072 connect | ssl.cipher.suites = null connect | ssl.enabled.protocols = [TLSv1.2, TLSv1.1, TLSv1] connect | ssl.endpoint.identification.algorithm = https connect | ssl.key.password = null connect | ssl.keymanager.algorithm = SunX509 connect | ssl.keystore.location = null connect | ssl.keystore.password = null connect | ssl.keystore.type = JKS connect | ssl.protocol = TLS connect | ssl.provider = null connect | ssl.secure.random.implementation = null connect | ssl.trustmanager.algorithm = PKIX connect | ssl.truststore.location = null connect | ssl.truststore.password = null connect | ssl.truststore.type = JKS connect | (org.apache.kafka.clients.admin.AdminClientConfig) connect | [2020-02-24 23:34:15,789] WARN The configuration 'log4j.loggers' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig) connect | [2020-02-24 23:34:15,789] WARN The configuration 'config.storage.topic' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig) connect | [2020-02-24 23:34:15,789] WARN The configuration 'schema.compatibility' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig) connect | [2020-02-24 23:34:15,789] WARN The configuration 'rest.advertised.host.name' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig) connect | [2020-02-24 23:34:15,789] WARN The configuration 'status.storage.topic' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig) connect | [2020-02-24 23:34:15,790] WARN The configuration 'group.id' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig) connect | [2020-02-24 23:34:15,790] WARN The configuration 'plugin.path' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig) connect | [2020-02-24 23:34:15,790] WARN The configuration 'internal.key.converter.schemas.enable' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig) connect | [2020-02-24 23:34:15,790] WARN The configuration 'value.converter.protoclassname' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig) connect | [2020-02-24 23:34:15,790] WARN The configuration 'config.storage.replication.factor' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig) connect | [2020-02-24 23:34:15,790] WARN The configuration 'offset.flush.interval.ms' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig) connect | [2020-02-24 23:34:15,790] WARN The configuration 'rest.port' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig) connect | [2020-02-24 23:34:15,791] WARN The configuration 'internal.key.converter' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig) connect | [2020-02-24 23:34:15,791] WARN The configuration 'zookeeper.connect' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig) connect | [2020-02-24 23:34:15,791] WARN The configuration 'internal.value.converter.schemas.enable' was supplied but isn't a known config. 
(org.apache.kafka.clients.admin.AdminClientConfig) connect | [2020-02-24 23:34:15,791] WARN The configuration 'status.storage.replication.factor' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig) connect | [2020-02-24 23:34:15,792] WARN The configuration 'internal.value.converter' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig) connect | [2020-02-24 23:34:15,792] WARN The configuration 'offset.storage.replication.factor' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig) connect | [2020-02-24 23:34:15,792] WARN The configuration 'offset.storage.topic' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig) connect | [2020-02-24 23:34:15,793] WARN The configuration 'value.converter' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig) connect | [2020-02-24 23:34:15,793] WARN The configuration 'key.converter' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig) connect | [2020-02-24 23:34:15,793] INFO Kafka version: 5.4.0-ccs (org.apache.kafka.common.utils.AppInfoParser) connect | [2020-02-24 23:34:15,794] INFO Kafka commitId: 9401111f8c4bb0ec (org.apache.kafka.common.utils.AppInfoParser) connect | [2020-02-24 23:34:15,794] INFO Kafka startTimeMs: 1582587255793 (org.apache.kafka.common.utils.AppInfoParser) connect | [2020-02-24 23:34:16,042] INFO Adding admin resources to main listener (org.apache.kafka.connect.runtime.rest.RestServer) connect | [2020-02-24 23:34:16,234] INFO DefaultSessionIdManager workerName=node0 (org.eclipse.jetty.server.session) connect | [2020-02-24 23:34:16,234] INFO No SessionScavenger set, using defaults (org.eclipse.jetty.server.session) connect | [2020-02-24 23:34:16,238] INFO node0 Scavenging every 660000ms (org.eclipse.jetty.server.session) connect | [2020-02-24 23:34:16,850] INFO Created topic (name=tracking-kafka-connect-s3-offsets, numPartitions=25, replicationFactor=1, replicasAssignments=null, configs={cleanup.policy=compact}) on brokers at broker:29092 (org.apache.kafka.connect.util.TopicAdmin) connect | [2020-02-24 23:34:16,871] INFO ProducerConfig values: connect | acks = all connect | batch.size = 16384 connect | bootstrap.servers = [broker:29092] connect | buffer.memory = 33554432 connect | client.dns.lookup = default connect | client.id = connect | compression.type = none connect | connections.max.idle.ms = 540000 connect | delivery.timeout.ms = 2147483647 connect | enable.idempotence = false connect | interceptor.classes = [] connect | key.serializer = class org.apache.kafka.common.serialization.ByteArraySerializer connect | linger.ms = 0 connect | max.block.ms = 60000 connect | max.in.flight.requests.per.connection = 1 connect | max.request.size = 1048576 connect | metadata.max.age.ms = 300000 connect | metric.reporters = [] connect | metrics.num.samples = 2 connect | metrics.recording.level = INFO connect | metrics.sample.window.ms = 30000 connect | partitioner.class = class org.apache.kafka.clients.producer.internals.DefaultPartitioner connect | receive.buffer.bytes = 32768 connect | reconnect.backoff.max.ms = 1000 connect | reconnect.backoff.ms = 50 connect | request.timeout.ms = 30000 connect | retries = 2147483647 connect | retry.backoff.ms = 100 connect | sasl.client.callback.handler.class = null connect | sasl.jaas.config = null connect | sasl.kerberos.kinit.cmd = /usr/bin/kinit connect | 
sasl.kerberos.min.time.before.relogin = 60000 connect | sasl.kerberos.service.name = null connect | sasl.kerberos.ticket.renew.jitter = 0.05 connect | sasl.kerberos.ticket.renew.window.factor = 0.8 connect | sasl.login.callback.handler.class = null connect | sasl.login.class = null connect | sasl.login.refresh.buffer.seconds = 300 connect | sasl.login.refresh.min.period.seconds = 60 connect | sasl.login.refresh.window.factor = 0.8 connect | sasl.login.refresh.window.jitter = 0.05 connect | sasl.mechanism = GSSAPI connect | security.protocol = PLAINTEXT connect | security.providers = null connect | send.buffer.bytes = 131072 connect | ssl.cipher.suites = null connect | ssl.enabled.protocols = [TLSv1.2, TLSv1.1, TLSv1] connect | ssl.endpoint.identification.algorithm = https connect | ssl.key.password = null connect | ssl.keymanager.algorithm = SunX509 connect | ssl.keystore.location = null connect | ssl.keystore.password = null connect | ssl.keystore.type = JKS connect | ssl.protocol = TLS connect | ssl.provider = null connect | ssl.secure.random.implementation = null connect | ssl.trustmanager.algorithm = PKIX connect | ssl.truststore.location = null connect | ssl.truststore.password = null connect | ssl.truststore.type = JKS connect | transaction.timeout.ms = 60000 connect | transactional.id = null connect | value.serializer = class org.apache.kafka.common.serialization.ByteArraySerializer connect | (org.apache.kafka.clients.producer.ProducerConfig) connect | [2020-02-24 23:34:16,908] WARN The configuration 'log4j.loggers' was supplied but isn't a known config. (org.apache.kafka.clients.producer.ProducerConfig) connect | [2020-02-24 23:34:16,908] WARN The configuration 'group.id' was supplied but isn't a known config. (org.apache.kafka.clients.producer.ProducerConfig) connect | [2020-02-24 23:34:16,911] WARN The configuration 'plugin.path' was supplied but isn't a known config. (org.apache.kafka.clients.producer.ProducerConfig) connect | [2020-02-24 23:34:16,911] WARN The configuration 'internal.key.converter.schemas.enable' was supplied but isn't a known config. (org.apache.kafka.clients.producer.ProducerConfig) connect | [2020-02-24 23:34:16,911] WARN The configuration 'status.storage.replication.factor' was supplied but isn't a known config. (org.apache.kafka.clients.producer.ProducerConfig) connect | [2020-02-24 23:34:16,912] WARN The configuration 'offset.storage.topic' was supplied but isn't a known config. (org.apache.kafka.clients.producer.ProducerConfig) connect | [2020-02-24 23:34:16,912] WARN The configuration 'value.converter' was supplied but isn't a known config. (org.apache.kafka.clients.producer.ProducerConfig) connect | [2020-02-24 23:34:16,913] WARN The configuration 'key.converter' was supplied but isn't a known config. (org.apache.kafka.clients.producer.ProducerConfig) connect | [2020-02-24 23:34:16,913] WARN The configuration 'config.storage.topic' was supplied but isn't a known config. (org.apache.kafka.clients.producer.ProducerConfig) connect | [2020-02-24 23:34:16,913] WARN The configuration 'schema.compatibility' was supplied but isn't a known config. (org.apache.kafka.clients.producer.ProducerConfig) connect | [2020-02-24 23:34:16,914] WARN The configuration 'rest.advertised.host.name' was supplied but isn't a known config. (org.apache.kafka.clients.producer.ProducerConfig) connect | [2020-02-24 23:34:16,914] WARN The configuration 'status.storage.topic' was supplied but isn't a known config. 
(org.apache.kafka.clients.producer.ProducerConfig) connect | [2020-02-24 23:34:16,914] WARN The configuration 'value.converter.protoclassname' was supplied but isn't a known config. (org.apache.kafka.clients.producer.ProducerConfig) connect | [2020-02-24 23:34:16,915] WARN The configuration 'config.storage.replication.factor' was supplied but isn't a known config. (org.apache.kafka.clients.producer.ProducerConfig) connect | [2020-02-24 23:34:16,916] WARN The configuration 'offset.flush.interval.ms' was supplied but isn't a known config. (org.apache.kafka.clients.producer.ProducerConfig) connect | [2020-02-24 23:34:16,916] WARN The configuration 'rest.port' was supplied but isn't a known config. (org.apache.kafka.clients.producer.ProducerConfig) connect | [2020-02-24 23:34:16,916] WARN The configuration 'internal.key.converter' was supplied but isn't a known config. (org.apache.kafka.clients.producer.ProducerConfig) connect | [2020-02-24 23:34:16,916] WARN The configuration 'zookeeper.connect' was supplied but isn't a known config. (org.apache.kafka.clients.producer.ProducerConfig) connect | [2020-02-24 23:34:16,918] WARN The configuration 'internal.value.converter.schemas.enable' was supplied but isn't a known config. (org.apache.kafka.clients.producer.ProducerConfig) connect | [2020-02-24 23:34:16,918] WARN The configuration 'internal.value.converter' was supplied but isn't a known config. (org.apache.kafka.clients.producer.ProducerConfig) connect | [2020-02-24 23:34:16,919] WARN The configuration 'offset.storage.replication.factor' was supplied but isn't a known config. (org.apache.kafka.clients.producer.ProducerConfig) connect | [2020-02-24 23:34:16,921] INFO Kafka version: 5.4.0-ccs (org.apache.kafka.common.utils.AppInfoParser) connect | [2020-02-24 23:34:16,921] INFO Kafka commitId: 9401111f8c4bb0ec (org.apache.kafka.common.utils.AppInfoParser) connect | [2020-02-24 23:34:16,921] INFO Kafka startTimeMs: 1582587256919 (org.apache.kafka.common.utils.AppInfoParser) connect | [2020-02-24 23:34:16,935] INFO ConsumerConfig values: connect | allow.auto.create.topics = true connect | auto.commit.interval.ms = 5000 connect | auto.offset.reset = earliest connect | bootstrap.servers = [broker:29092] connect | check.crcs = true connect | client.dns.lookup = default connect | client.id = connect | client.rack = connect | connections.max.idle.ms = 540000 connect | default.api.timeout.ms = 60000 connect | enable.auto.commit = false connect | exclude.internal.topics = true connect | fetch.max.bytes = 52428800 connect | fetch.max.wait.ms = 500 connect | fetch.min.bytes = 1 connect | group.id = tracking-kafka-connect-s3 connect | group.instance.id = null connect | heartbeat.interval.ms = 3000 connect | interceptor.classes = [] connect | internal.leave.group.on.close = true connect | isolation.level = read_uncommitted connect | key.deserializer = class org.apache.kafka.common.serialization.ByteArrayDeserializer connect | max.partition.fetch.bytes = 1048576 connect | max.poll.interval.ms = 300000 connect | max.poll.records = 500 connect | metadata.max.age.ms = 300000 connect | metric.reporters = [] connect | metrics.num.samples = 2 connect | metrics.recording.level = INFO connect | metrics.sample.window.ms = 30000 connect | partition.assignment.strategy = [class org.apache.kafka.clients.consumer.RangeAssignor] connect | receive.buffer.bytes = 65536 connect | reconnect.backoff.max.ms = 1000 connect | reconnect.backoff.ms = 50 connect | request.timeout.ms = 30000 connect | retry.backoff.ms = 100 connect | 
sasl.client.callback.handler.class = null connect | sasl.jaas.config = null connect | sasl.kerberos.kinit.cmd = /usr/bin/kinit connect | sasl.kerberos.min.time.before.relogin = 60000 connect | sasl.kerberos.service.name = null connect | sasl.kerberos.ticket.renew.jitter = 0.05 connect | sasl.kerberos.ticket.renew.window.factor = 0.8 connect | sasl.login.callback.handler.class = null connect | sasl.login.class = null connect | sasl.login.refresh.buffer.seconds = 300 connect | sasl.login.refresh.min.period.seconds = 60 connect | sasl.login.refresh.window.factor = 0.8 connect | sasl.login.refresh.window.jitter = 0.05 connect | sasl.mechanism = GSSAPI connect | security.protocol = PLAINTEXT connect | security.providers = null connect | send.buffer.bytes = 131072 connect | session.timeout.ms = 10000 connect | ssl.cipher.suites = null connect | ssl.enabled.protocols = [TLSv1.2, TLSv1.1, TLSv1] connect | ssl.endpoint.identification.algorithm = https connect | ssl.key.password = null connect | ssl.keymanager.algorithm = SunX509 connect | ssl.keystore.location = null connect | ssl.keystore.password = null connect | ssl.keystore.type = JKS connect | ssl.protocol = TLS connect | ssl.provider = null connect | ssl.secure.random.implementation = null connect | ssl.trustmanager.algorithm = PKIX connect | ssl.truststore.location = null connect | ssl.truststore.password = null connect | ssl.truststore.type = JKS connect | value.deserializer = class org.apache.kafka.common.serialization.ByteArrayDeserializer connect | (org.apache.kafka.clients.consumer.ConsumerConfig) connect | [2020-02-24 23:34:16,980] WARN The configuration 'log4j.loggers' was supplied but isn't a known config. (org.apache.kafka.clients.consumer.ConsumerConfig) connect | [2020-02-24 23:34:16,980] WARN The configuration 'plugin.path' was supplied but isn't a known config. (org.apache.kafka.clients.consumer.ConsumerConfig) connect | [2020-02-24 23:34:16,980] WARN The configuration 'internal.key.converter.schemas.enable' was supplied but isn't a known config. (org.apache.kafka.clients.consumer.ConsumerConfig) connect | [2020-02-24 23:34:16,980] WARN The configuration 'status.storage.replication.factor' was supplied but isn't a known config. (org.apache.kafka.clients.consumer.ConsumerConfig) connect | [2020-02-24 23:34:16,981] WARN The configuration 'offset.storage.topic' was supplied but isn't a known config. (org.apache.kafka.clients.consumer.ConsumerConfig) connect | [2020-02-24 23:34:16,981] WARN The configuration 'value.converter' was supplied but isn't a known config. (org.apache.kafka.clients.consumer.ConsumerConfig) connect | [2020-02-24 23:34:16,981] WARN The configuration 'key.converter' was supplied but isn't a known config. (org.apache.kafka.clients.consumer.ConsumerConfig) connect | [2020-02-24 23:34:16,982] WARN The configuration 'config.storage.topic' was supplied but isn't a known config. (org.apache.kafka.clients.consumer.ConsumerConfig) connect | [2020-02-24 23:34:16,982] WARN The configuration 'schema.compatibility' was supplied but isn't a known config. (org.apache.kafka.clients.consumer.ConsumerConfig) connect | [2020-02-24 23:34:16,983] WARN The configuration 'rest.advertised.host.name' was supplied but isn't a known config. (org.apache.kafka.clients.consumer.ConsumerConfig) connect | [2020-02-24 23:34:16,983] WARN The configuration 'status.storage.topic' was supplied but isn't a known config. 
(org.apache.kafka.clients.consumer.ConsumerConfig) connect | [2020-02-24 23:34:16,983] WARN The configuration 'value.converter.protoclassname' was supplied but isn't a known config. (org.apache.kafka.clients.consumer.ConsumerConfig) connect | [2020-02-24 23:34:16,983] WARN The configuration 'config.storage.replication.factor' was supplied but isn't a known config. (org.apache.kafka.clients.consumer.ConsumerConfig) connect | [2020-02-24 23:34:16,983] WARN The configuration 'offset.flush.interval.ms' was supplied but isn't a known config. (org.apache.kafka.clients.consumer.ConsumerConfig) connect | [2020-02-24 23:34:16,983] WARN The configuration 'rest.port' was supplied but isn't a known config. (org.apache.kafka.clients.consumer.ConsumerConfig) connect | [2020-02-24 23:34:16,983] WARN The configuration 'internal.key.converter' was supplied but isn't a known config. (org.apache.kafka.clients.consumer.ConsumerConfig) connect | [2020-02-24 23:34:16,983] WARN The configuration 'zookeeper.connect' was supplied but isn't a known config. (org.apache.kafka.clients.consumer.ConsumerConfig) connect | [2020-02-24 23:34:16,984] WARN The configuration 'internal.value.converter.schemas.enable' was supplied but isn't a known config. (org.apache.kafka.clients.consumer.ConsumerConfig) connect | [2020-02-24 23:34:16,984] WARN The configuration 'internal.value.converter' was supplied but isn't a known config. (org.apache.kafka.clients.consumer.ConsumerConfig) connect | [2020-02-24 23:34:16,984] WARN The configuration 'offset.storage.replication.factor' was supplied but isn't a known config. (org.apache.kafka.clients.consumer.ConsumerConfig) connect | [2020-02-24 23:34:16,985] INFO Kafka version: 5.4.0-ccs (org.apache.kafka.common.utils.AppInfoParser) connect | [2020-02-24 23:34:16,985] INFO Kafka commitId: 9401111f8c4bb0ec (org.apache.kafka.common.utils.AppInfoParser) connect | [2020-02-24 23:34:16,985] INFO Kafka startTimeMs: 1582587256984 (org.apache.kafka.common.utils.AppInfoParser) connect | [2020-02-24 23:34:17,020] INFO [Producer clientId=producer-1] Cluster ID: ZeSnCBNdRh6pGQwnc-VNVQ (org.apache.kafka.clients.Metadata) connect | [2020-02-24 23:34:17,065] INFO [Consumer clientId=consumer-tracking-kafka-connect-s3-1, groupId=tracking-kafka-connect-s3] Subscribed to partition(s): tracking-kafka-connect-s3-offsets-0, tracking-kafka-connect-s3-offsets-5, tracking-kafka-connect-s3-offsets-10, tracking-kafka-connect-s3-offsets-20, tracking-kafka-connect-s3-offsets-15, tracking-kafka-connect-s3-offsets-9, tracking-kafka-connect-s3-offsets-11, tracking-kafka-connect-s3-offsets-4, tracking-kafka-connect-s3-offsets-16, tracking-kafka-connect-s3-offsets-17, tracking-kafka-connect-s3-offsets-3, tracking-kafka-connect-s3-offsets-24, tracking-kafka-connect-s3-offsets-23, tracking-kafka-connect-s3-offsets-13, tracking-kafka-connect-s3-offsets-18, tracking-kafka-connect-s3-offsets-22, tracking-kafka-connect-s3-offsets-2, tracking-kafka-connect-s3-offsets-8, tracking-kafka-connect-s3-offsets-12, tracking-kafka-connect-s3-offsets-19, tracking-kafka-connect-s3-offsets-14, tracking-kafka-connect-s3-offsets-1, tracking-kafka-connect-s3-offsets-6, tracking-kafka-connect-s3-offsets-7, tracking-kafka-connect-s3-offsets-21 (org.apache.kafka.clients.consumer.KafkaConsumer) connect | [2020-02-24 23:34:17,071] INFO [Consumer clientId=consumer-tracking-kafka-connect-s3-1, groupId=tracking-kafka-connect-s3] Seeking to EARLIEST offset of partition tracking-kafka-connect-s3-offsets-0 
(org.apache.kafka.clients.consumer.internals.SubscriptionState) connect | [2020-02-24 23:34:17,073] INFO [Consumer clientId=consumer-tracking-kafka-connect-s3-1, groupId=tracking-kafka-connect-s3] Seeking to EARLIEST offset of partition tracking-kafka-connect-s3-offsets-5 (org.apache.kafka.clients.consumer.internals.SubscriptionState) connect | [2020-02-24 23:34:17,073] INFO [Consumer clientId=consumer-tracking-kafka-connect-s3-1, groupId=tracking-kafka-connect-s3] Seeking to EARLIEST offset of partition tracking-kafka-connect-s3-offsets-10 (org.apache.kafka.clients.consumer.internals.SubscriptionState) connect | [2020-02-24 23:34:17,074] INFO [Consumer clientId=consumer-tracking-kafka-connect-s3-1, groupId=tracking-kafka-connect-s3] Seeking to EARLIEST offset of partition tracking-kafka-connect-s3-offsets-20 (org.apache.kafka.clients.consumer.internals.SubscriptionState) connect | [2020-02-24 23:34:17,074] INFO [Consumer clientId=consumer-tracking-kafka-connect-s3-1, groupId=tracking-kafka-connect-s3] Seeking to EARLIEST offset of partition tracking-kafka-connect-s3-offsets-15 (org.apache.kafka.clients.consumer.internals.SubscriptionState) connect | [2020-02-24 23:34:17,074] INFO [Consumer clientId=consumer-tracking-kafka-connect-s3-1, groupId=tracking-kafka-connect-s3] Seeking to EARLIEST offset of partition tracking-kafka-connect-s3-offsets-9 (org.apache.kafka.clients.consumer.internals.SubscriptionState) connect | [2020-02-24 23:34:17,075] INFO [Consumer clientId=consumer-tracking-kafka-connect-s3-1, groupId=tracking-kafka-connect-s3] Seeking to EARLIEST offset of partition tracking-kafka-connect-s3-offsets-11 (org.apache.kafka.clients.consumer.internals.SubscriptionState) connect | [2020-02-24 23:34:17,075] INFO [Consumer clientId=consumer-tracking-kafka-connect-s3-1, groupId=tracking-kafka-connect-s3] Seeking to EARLIEST offset of partition tracking-kafka-connect-s3-offsets-4 (org.apache.kafka.clients.consumer.internals.SubscriptionState) connect | [2020-02-24 23:34:17,075] INFO [Consumer clientId=consumer-tracking-kafka-connect-s3-1, groupId=tracking-kafka-connect-s3] Seeking to EARLIEST offset of partition tracking-kafka-connect-s3-offsets-16 (org.apache.kafka.clients.consumer.internals.SubscriptionState) connect | [2020-02-24 23:34:17,076] INFO [Consumer clientId=consumer-tracking-kafka-connect-s3-1, groupId=tracking-kafka-connect-s3] Seeking to EARLIEST offset of partition tracking-kafka-connect-s3-offsets-17 (org.apache.kafka.clients.consumer.internals.SubscriptionState) connect | [2020-02-24 23:34:17,077] INFO [Consumer clientId=consumer-tracking-kafka-connect-s3-1, groupId=tracking-kafka-connect-s3] Seeking to EARLIEST offset of partition tracking-kafka-connect-s3-offsets-3 (org.apache.kafka.clients.consumer.internals.SubscriptionState) connect | [2020-02-24 23:34:17,077] INFO [Consumer clientId=consumer-tracking-kafka-connect-s3-1, groupId=tracking-kafka-connect-s3] Seeking to EARLIEST offset of partition tracking-kafka-connect-s3-offsets-24 (org.apache.kafka.clients.consumer.internals.SubscriptionState) connect | [2020-02-24 23:34:17,077] INFO [Consumer clientId=consumer-tracking-kafka-connect-s3-1, groupId=tracking-kafka-connect-s3] Seeking to EARLIEST offset of partition tracking-kafka-connect-s3-offsets-23 (org.apache.kafka.clients.consumer.internals.SubscriptionState) connect | [2020-02-24 23:34:17,077] INFO [Consumer clientId=consumer-tracking-kafka-connect-s3-1, groupId=tracking-kafka-connect-s3] Seeking to EARLIEST offset of partition 
tracking-kafka-connect-s3-offsets-13 (org.apache.kafka.clients.consumer.internals.SubscriptionState) connect | [2020-02-24 23:34:17,077] INFO [Consumer clientId=consumer-tracking-kafka-connect-s3-1, groupId=tracking-kafka-connect-s3] Seeking to EARLIEST offset of partition tracking-kafka-connect-s3-offsets-18 (org.apache.kafka.clients.consumer.internals.SubscriptionState) connect | [2020-02-24 23:34:17,077] INFO [Consumer clientId=consumer-tracking-kafka-connect-s3-1, groupId=tracking-kafka-connect-s3] Seeking to EARLIEST offset of partition tracking-kafka-connect-s3-offsets-22 (org.apache.kafka.clients.consumer.internals.SubscriptionState) connect | [2020-02-24 23:34:17,077] INFO [Consumer clientId=consumer-tracking-kafka-connect-s3-1, groupId=tracking-kafka-connect-s3] Seeking to EARLIEST offset of partition tracking-kafka-connect-s3-offsets-2 (org.apache.kafka.clients.consumer.internals.SubscriptionState) connect | [2020-02-24 23:34:17,077] INFO [Consumer clientId=consumer-tracking-kafka-connect-s3-1, groupId=tracking-kafka-connect-s3] Seeking to EARLIEST offset of partition tracking-kafka-connect-s3-offsets-8 (org.apache.kafka.clients.consumer.internals.SubscriptionState) connect | [2020-02-24 23:34:17,077] INFO [Consumer clientId=consumer-tracking-kafka-connect-s3-1, groupId=tracking-kafka-connect-s3] Seeking to EARLIEST offset of partition tracking-kafka-connect-s3-offsets-12 (org.apache.kafka.clients.consumer.internals.SubscriptionState) connect | [2020-02-24 23:34:17,077] INFO [Consumer clientId=consumer-tracking-kafka-connect-s3-1, groupId=tracking-kafka-connect-s3] Seeking to EARLIEST offset of partition tracking-kafka-connect-s3-offsets-19 (org.apache.kafka.clients.consumer.internals.SubscriptionState) connect | [2020-02-24 23:34:17,077] INFO [Consumer clientId=consumer-tracking-kafka-connect-s3-1, groupId=tracking-kafka-connect-s3] Seeking to EARLIEST offset of partition tracking-kafka-connect-s3-offsets-14 (org.apache.kafka.clients.consumer.internals.SubscriptionState) connect | [2020-02-24 23:34:17,077] INFO [Consumer clientId=consumer-tracking-kafka-connect-s3-1, groupId=tracking-kafka-connect-s3] Seeking to EARLIEST offset of partition tracking-kafka-connect-s3-offsets-1 (org.apache.kafka.clients.consumer.internals.SubscriptionState) connect | [2020-02-24 23:34:17,078] INFO [Consumer clientId=consumer-tracking-kafka-connect-s3-1, groupId=tracking-kafka-connect-s3] Seeking to EARLIEST offset of partition tracking-kafka-connect-s3-offsets-6 (org.apache.kafka.clients.consumer.internals.SubscriptionState) connect | [2020-02-24 23:34:17,078] INFO [Consumer clientId=consumer-tracking-kafka-connect-s3-1, groupId=tracking-kafka-connect-s3] Seeking to EARLIEST offset of partition tracking-kafka-connect-s3-offsets-7 (org.apache.kafka.clients.consumer.internals.SubscriptionState) connect | [2020-02-24 23:34:17,078] INFO [Consumer clientId=consumer-tracking-kafka-connect-s3-1, groupId=tracking-kafka-connect-s3] Seeking to EARLIEST offset of partition tracking-kafka-connect-s3-offsets-21 (org.apache.kafka.clients.consumer.internals.SubscriptionState) connect | [2020-02-24 23:34:17,096] INFO [Consumer clientId=consumer-tracking-kafka-connect-s3-1, groupId=tracking-kafka-connect-s3] Cluster ID: ZeSnCBNdRh6pGQwnc-VNVQ (org.apache.kafka.clients.Metadata) connect | [2020-02-24 23:34:17,162] INFO [Consumer clientId=consumer-tracking-kafka-connect-s3-1, groupId=tracking-kafka-connect-s3] Resetting offset for partition tracking-kafka-connect-s3-offsets-24 to offset 0. 
(org.apache.kafka.clients.consumer.internals.SubscriptionState) connect | [2020-02-24 23:34:17,165] INFO [Consumer clientId=consumer-tracking-kafka-connect-s3-1, groupId=tracking-kafka-connect-s3] Resetting offset for partition tracking-kafka-connect-s3-offsets-22 to offset 0. (org.apache.kafka.clients.consumer.internals.SubscriptionState) connect | [2020-02-24 23:34:17,165] INFO [Consumer clientId=consumer-tracking-kafka-connect-s3-1, groupId=tracking-kafka-connect-s3] Resetting offset for partition tracking-kafka-connect-s3-offsets-3 to offset 0. (org.apache.kafka.clients.consumer.internals.SubscriptionState) connect | [2020-02-24 23:34:17,169] INFO [Consumer clientId=consumer-tracking-kafka-connect-s3-1, groupId=tracking-kafka-connect-s3] Resetting offset for partition tracking-kafka-connect-s3-offsets-1 to offset 0. (org.apache.kafka.clients.consumer.internals.SubscriptionState) connect | [2020-02-24 23:34:17,169] INFO [Consumer clientId=consumer-tracking-kafka-connect-s3-1, groupId=tracking-kafka-connect-s3] Resetting offset for partition tracking-kafka-connect-s3-offsets-11 to offset 0. (org.apache.kafka.clients.consumer.internals.SubscriptionState) connect | [2020-02-24 23:34:17,170] INFO [Consumer clientId=consumer-tracking-kafka-connect-s3-1, groupId=tracking-kafka-connect-s3] Resetting offset for partition tracking-kafka-connect-s3-offsets-9 to offset 0. (org.apache.kafka.clients.consumer.internals.SubscriptionState) connect | [2020-02-24 23:34:17,170] INFO [Consumer clientId=consumer-tracking-kafka-connect-s3-1, groupId=tracking-kafka-connect-s3] Resetting offset for partition tracking-kafka-connect-s3-offsets-7 to offset 0. (org.apache.kafka.clients.consumer.internals.SubscriptionState) connect | [2020-02-24 23:34:17,172] INFO [Consumer clientId=consumer-tracking-kafka-connect-s3-1, groupId=tracking-kafka-connect-s3] Resetting offset for partition tracking-kafka-connect-s3-offsets-5 to offset 0. (org.apache.kafka.clients.consumer.internals.SubscriptionState) connect | [2020-02-24 23:34:17,172] INFO [Consumer clientId=consumer-tracking-kafka-connect-s3-1, groupId=tracking-kafka-connect-s3] Resetting offset for partition tracking-kafka-connect-s3-offsets-19 to offset 0. (org.apache.kafka.clients.consumer.internals.SubscriptionState) connect | [2020-02-24 23:34:17,173] INFO [Consumer clientId=consumer-tracking-kafka-connect-s3-1, groupId=tracking-kafka-connect-s3] Resetting offset for partition tracking-kafka-connect-s3-offsets-17 to offset 0. (org.apache.kafka.clients.consumer.internals.SubscriptionState) connect | [2020-02-24 23:34:17,173] INFO [Consumer clientId=consumer-tracking-kafka-connect-s3-1, groupId=tracking-kafka-connect-s3] Resetting offset for partition tracking-kafka-connect-s3-offsets-15 to offset 0. (org.apache.kafka.clients.consumer.internals.SubscriptionState) connect | [2020-02-24 23:34:17,174] INFO [Consumer clientId=consumer-tracking-kafka-connect-s3-1, groupId=tracking-kafka-connect-s3] Resetting offset for partition tracking-kafka-connect-s3-offsets-13 to offset 0. (org.apache.kafka.clients.consumer.internals.SubscriptionState) connect | [2020-02-24 23:34:17,174] INFO [Consumer clientId=consumer-tracking-kafka-connect-s3-1, groupId=tracking-kafka-connect-s3] Resetting offset for partition tracking-kafka-connect-s3-offsets-23 to offset 0. 
(org.apache.kafka.clients.consumer.internals.SubscriptionState) connect | [2020-02-24 23:34:17,175] INFO [Consumer clientId=consumer-tracking-kafka-connect-s3-1, groupId=tracking-kafka-connect-s3] Resetting offset for partition tracking-kafka-connect-s3-offsets-21 to offset 0. (org.apache.kafka.clients.consumer.internals.SubscriptionState) connect | [2020-02-24 23:34:17,176] INFO [Consumer clientId=consumer-tracking-kafka-connect-s3-1, groupId=tracking-kafka-connect-s3] Resetting offset for partition tracking-kafka-connect-s3-offsets-4 to offset 0. (org.apache.kafka.clients.consumer.internals.SubscriptionState) connect | [2020-02-24 23:34:17,177] INFO [Consumer clientId=consumer-tracking-kafka-connect-s3-1, groupId=tracking-kafka-connect-s3] Resetting offset for partition tracking-kafka-connect-s3-offsets-2 to offset 0. (org.apache.kafka.clients.consumer.internals.SubscriptionState) connect | [2020-02-24 23:34:17,178] INFO [Consumer clientId=consumer-tracking-kafka-connect-s3-1, groupId=tracking-kafka-connect-s3] Resetting offset for partition tracking-kafka-connect-s3-offsets-0 to offset 0. (org.apache.kafka.clients.consumer.internals.SubscriptionState) connect | [2020-02-24 23:34:17,179] INFO [Consumer clientId=consumer-tracking-kafka-connect-s3-1, groupId=tracking-kafka-connect-s3] Resetting offset for partition tracking-kafka-connect-s3-offsets-12 to offset 0. (org.apache.kafka.clients.consumer.internals.SubscriptionState) connect | [2020-02-24 23:34:17,180] INFO [Consumer clientId=consumer-tracking-kafka-connect-s3-1, groupId=tracking-kafka-connect-s3] Resetting offset for partition tracking-kafka-connect-s3-offsets-10 to offset 0. (org.apache.kafka.clients.consumer.internals.SubscriptionState) connect | [2020-02-24 23:34:17,181] INFO [Consumer clientId=consumer-tracking-kafka-connect-s3-1, groupId=tracking-kafka-connect-s3] Resetting offset for partition tracking-kafka-connect-s3-offsets-8 to offset 0. (org.apache.kafka.clients.consumer.internals.SubscriptionState) connect | [2020-02-24 23:34:17,181] INFO [Consumer clientId=consumer-tracking-kafka-connect-s3-1, groupId=tracking-kafka-connect-s3] Resetting offset for partition tracking-kafka-connect-s3-offsets-6 to offset 0. (org.apache.kafka.clients.consumer.internals.SubscriptionState) connect | [2020-02-24 23:34:17,184] INFO [Consumer clientId=consumer-tracking-kafka-connect-s3-1, groupId=tracking-kafka-connect-s3] Resetting offset for partition tracking-kafka-connect-s3-offsets-20 to offset 0. (org.apache.kafka.clients.consumer.internals.SubscriptionState) connect | [2020-02-24 23:34:17,185] INFO [Consumer clientId=consumer-tracking-kafka-connect-s3-1, groupId=tracking-kafka-connect-s3] Resetting offset for partition tracking-kafka-connect-s3-offsets-18 to offset 0. (org.apache.kafka.clients.consumer.internals.SubscriptionState) connect | [2020-02-24 23:34:17,185] INFO [Consumer clientId=consumer-tracking-kafka-connect-s3-1, groupId=tracking-kafka-connect-s3] Resetting offset for partition tracking-kafka-connect-s3-offsets-16 to offset 0. (org.apache.kafka.clients.consumer.internals.SubscriptionState) connect | [2020-02-24 23:34:17,185] INFO [Consumer clientId=consumer-tracking-kafka-connect-s3-1, groupId=tracking-kafka-connect-s3] Resetting offset for partition tracking-kafka-connect-s3-offsets-14 to offset 0. 
(org.apache.kafka.clients.consumer.internals.SubscriptionState) connect | [2020-02-24 23:34:17,188] INFO Finished reading KafkaBasedLog for topic tracking-kafka-connect-s3-offsets (org.apache.kafka.connect.util.KafkaBasedLog) connect | [2020-02-24 23:34:17,188] INFO Started KafkaBasedLog for topic tracking-kafka-connect-s3-offsets (org.apache.kafka.connect.util.KafkaBasedLog) connect | [2020-02-24 23:34:17,188] INFO Finished reading offsets topic and starting KafkaOffsetBackingStore (org.apache.kafka.connect.storage.KafkaOffsetBackingStore) connect | [2020-02-24 23:34:17,193] INFO Worker started (org.apache.kafka.connect.runtime.Worker) connect | [2020-02-24 23:34:17,193] INFO Starting KafkaBasedLog with topic tracking-kafka-connect-s3-status (org.apache.kafka.connect.util.KafkaBasedLog) connect | [2020-02-24 23:34:17,194] INFO AdminClientConfig values: connect | bootstrap.servers = [broker:29092] connect | client.dns.lookup = default connect | client.id = connect | connections.max.idle.ms = 300000 connect | metadata.max.age.ms = 300000 connect | metric.reporters = [] connect | metrics.num.samples = 2 connect | metrics.recording.level = INFO connect | metrics.sample.window.ms = 30000 connect | receive.buffer.bytes = 65536 connect | reconnect.backoff.max.ms = 1000 connect | reconnect.backoff.ms = 50 connect | request.timeout.ms = 120000 connect | retries = 5 connect | retry.backoff.ms = 100 connect | sasl.client.callback.handler.class = null connect | sasl.jaas.config = null connect | sasl.kerberos.kinit.cmd = /usr/bin/kinit connect | sasl.kerberos.min.time.before.relogin = 60000 connect | sasl.kerberos.service.name = null connect | sasl.kerberos.ticket.renew.jitter = 0.05 connect | sasl.kerberos.ticket.renew.window.factor = 0.8 connect | sasl.login.callback.handler.class = null connect | sasl.login.class = null connect | sasl.login.refresh.buffer.seconds = 300 connect | sasl.login.refresh.min.period.seconds = 60 connect | sasl.login.refresh.window.factor = 0.8 connect | sasl.login.refresh.window.jitter = 0.05 connect | sasl.mechanism = GSSAPI connect | security.protocol = PLAINTEXT connect | security.providers = null connect | send.buffer.bytes = 131072 connect | ssl.cipher.suites = null connect | ssl.enabled.protocols = [TLSv1.2, TLSv1.1, TLSv1] connect | ssl.endpoint.identification.algorithm = https connect | ssl.key.password = null connect | ssl.keymanager.algorithm = SunX509 connect | ssl.keystore.location = null connect | ssl.keystore.password = null connect | ssl.keystore.type = JKS connect | ssl.protocol = TLS connect | ssl.provider = null connect | ssl.secure.random.implementation = null connect | ssl.trustmanager.algorithm = PKIX connect | ssl.truststore.location = null connect | ssl.truststore.password = null connect | ssl.truststore.type = JKS connect | (org.apache.kafka.clients.admin.AdminClientConfig) connect | [2020-02-24 23:34:17,206] WARN The configuration 'log4j.loggers' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig) connect | [2020-02-24 23:34:17,206] WARN The configuration 'config.storage.topic' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig) connect | [2020-02-24 23:34:17,206] WARN The configuration 'schema.compatibility' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig) connect | [2020-02-24 23:34:17,207] WARN The configuration 'rest.advertised.host.name' was supplied but isn't a known config. 
(org.apache.kafka.clients.admin.AdminClientConfig) connect | [2020-02-24 23:34:17,207] WARN The configuration 'status.storage.topic' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig) connect | [2020-02-24 23:34:17,207] WARN The configuration 'group.id' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig) connect | [2020-02-24 23:34:17,208] WARN The configuration 'plugin.path' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig) connect | [2020-02-24 23:34:17,208] WARN The configuration 'internal.key.converter.schemas.enable' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig) connect | [2020-02-24 23:34:17,208] WARN The configuration 'value.converter.protoclassname' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig) connect | [2020-02-24 23:34:17,209] WARN The configuration 'config.storage.replication.factor' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig) connect | [2020-02-24 23:34:17,210] WARN The configuration 'offset.flush.interval.ms' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig) connect | [2020-02-24 23:34:17,210] WARN The configuration 'rest.port' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig) connect | [2020-02-24 23:34:17,210] WARN The configuration 'internal.key.converter' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig) connect | [2020-02-24 23:34:17,212] WARN The configuration 'zookeeper.connect' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig) connect | [2020-02-24 23:34:17,216] WARN The configuration 'internal.value.converter.schemas.enable' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig) connect | [2020-02-24 23:34:17,216] WARN The configuration 'status.storage.replication.factor' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig) connect | [2020-02-24 23:34:17,217] WARN The configuration 'internal.value.converter' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig) connect | [2020-02-24 23:34:17,217] WARN The configuration 'offset.storage.replication.factor' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig) connect | [2020-02-24 23:34:17,217] WARN The configuration 'offset.storage.topic' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig) connect | [2020-02-24 23:34:17,217] WARN The configuration 'value.converter' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig) connect | [2020-02-24 23:34:17,217] WARN The configuration 'key.converter' was supplied but isn't a known config. 
(org.apache.kafka.clients.admin.AdminClientConfig)
connect | [2020-02-24 23:34:17,218] INFO Kafka version: 5.4.0-ccs (org.apache.kafka.common.utils.AppInfoParser)
connect | [2020-02-24 23:34:17,218] INFO Kafka commitId: 9401111f8c4bb0ec (org.apache.kafka.common.utils.AppInfoParser)
connect | [2020-02-24 23:34:17,219] INFO Kafka startTimeMs: 1582587257217 (org.apache.kafka.common.utils.AppInfoParser)
connect | Feb 24, 2020 11:34:17 PM org.glassfish.jersey.internal.inject.Providers checkProviderRuntime
connect | WARNING: A provider org.apache.kafka.connect.runtime.rest.resources.ConnectorsResource registered in SERVER runtime does not implement any provider interfaces applicable in the SERVER runtime. Due to constraint configuration problems the provider org.apache.kafka.connect.runtime.rest.resources.ConnectorsResource will be ignored.
connect | Feb 24, 2020 11:34:17 PM org.glassfish.jersey.internal.inject.Providers checkProviderRuntime
connect | WARNING: A provider org.apache.kafka.connect.runtime.rest.resources.ConnectorPluginsResource registered in SERVER runtime does not implement any provider interfaces applicable in the SERVER runtime. Due to constraint configuration problems the provider org.apache.kafka.connect.runtime.rest.resources.ConnectorPluginsResource will be ignored.
connect | Feb 24, 2020 11:34:17 PM org.glassfish.jersey.internal.inject.Providers checkProviderRuntime
connect | WARNING: A provider org.apache.kafka.connect.runtime.rest.resources.LoggingResource registered in SERVER runtime does not implement any provider interfaces applicable in the SERVER runtime. Due to constraint configuration problems the provider org.apache.kafka.connect.runtime.rest.resources.LoggingResource will be ignored.
connect | Feb 24, 2020 11:34:17 PM org.glassfish.jersey.internal.inject.Providers checkProviderRuntime
connect | WARNING: A provider org.apache.kafka.connect.runtime.rest.resources.RootResource registered in SERVER runtime does not implement any provider interfaces applicable in the SERVER runtime. Due to constraint configuration problems the provider org.apache.kafka.connect.runtime.rest.resources.RootResource will be ignored.
connect | [2020-02-24 23:34:17,516] INFO Created topic (name=tracking-kafka-connect-s3-status, numPartitions=5, replicationFactor=1, replicasAssignments=null, configs={cleanup.policy=compact}) on brokers at broker:29092 (org.apache.kafka.connect.util.TopicAdmin) connect | [2020-02-24 23:34:17,521] INFO ProducerConfig values: connect | acks = all connect | batch.size = 16384 connect | bootstrap.servers = [broker:29092] connect | buffer.memory = 33554432 connect | client.dns.lookup = default connect | client.id = connect | compression.type = none connect | connections.max.idle.ms = 540000 connect | delivery.timeout.ms = 120000 connect | enable.idempotence = false connect | interceptor.classes = [] connect | key.serializer = class org.apache.kafka.common.serialization.StringSerializer connect | linger.ms = 0 connect | max.block.ms = 60000 connect | max.in.flight.requests.per.connection = 1 connect | max.request.size = 1048576 connect | metadata.max.age.ms = 300000 connect | metric.reporters = [] connect | metrics.num.samples = 2 connect | metrics.recording.level = INFO connect | metrics.sample.window.ms = 30000 connect | partitioner.class = class org.apache.kafka.clients.producer.internals.DefaultPartitioner connect | receive.buffer.bytes = 32768 connect | reconnect.backoff.max.ms = 1000 connect | reconnect.backoff.ms = 50 connect | request.timeout.ms = 30000 connect | retries = 0 connect | retry.backoff.ms = 100 connect | sasl.client.callback.handler.class = null connect | sasl.jaas.config = null connect | sasl.kerberos.kinit.cmd = /usr/bin/kinit connect | sasl.kerberos.min.time.before.relogin = 60000 connect | sasl.kerberos.service.name = null connect | sasl.kerberos.ticket.renew.jitter = 0.05 connect | sasl.kerberos.ticket.renew.window.factor = 0.8 connect | sasl.login.callback.handler.class = null connect | sasl.login.class = null connect | sasl.login.refresh.buffer.seconds = 300 connect | sasl.login.refresh.min.period.seconds = 60 connect | sasl.login.refresh.window.factor = 0.8 connect | sasl.login.refresh.window.jitter = 0.05 connect | sasl.mechanism = GSSAPI connect | security.protocol = PLAINTEXT connect | security.providers = null connect | send.buffer.bytes = 131072 connect | ssl.cipher.suites = null connect | ssl.enabled.protocols = [TLSv1.2, TLSv1.1, TLSv1] connect | ssl.endpoint.identification.algorithm = https connect | ssl.key.password = null connect | ssl.keymanager.algorithm = SunX509 connect | ssl.keystore.location = null connect | ssl.keystore.password = null connect | ssl.keystore.type = JKS connect | ssl.protocol = TLS connect | ssl.provider = null connect | ssl.secure.random.implementation = null connect | ssl.trustmanager.algorithm = PKIX connect | ssl.truststore.location = null connect | ssl.truststore.password = null connect | ssl.truststore.type = JKS connect | transaction.timeout.ms = 60000 connect | transactional.id = null connect | value.serializer = class org.apache.kafka.common.serialization.ByteArraySerializer connect | (org.apache.kafka.clients.producer.ProducerConfig) connect | [2020-02-24 23:34:17,537] WARN The configuration 'log4j.loggers' was supplied but isn't a known config. (org.apache.kafka.clients.producer.ProducerConfig) connect | [2020-02-24 23:34:17,537] WARN The configuration 'group.id' was supplied but isn't a known config. (org.apache.kafka.clients.producer.ProducerConfig) connect | [2020-02-24 23:34:17,538] WARN The configuration 'plugin.path' was supplied but isn't a known config. 
(org.apache.kafka.clients.producer.ProducerConfig) connect | [2020-02-24 23:34:17,539] WARN The configuration 'internal.key.converter.schemas.enable' was supplied but isn't a known config. (org.apache.kafka.clients.producer.ProducerConfig) connect | [2020-02-24 23:34:17,539] WARN The configuration 'status.storage.replication.factor' was supplied but isn't a known config. (org.apache.kafka.clients.producer.ProducerConfig) connect | [2020-02-24 23:34:17,539] WARN The configuration 'offset.storage.topic' was supplied but isn't a known config. (org.apache.kafka.clients.producer.ProducerConfig) connect | [2020-02-24 23:34:17,539] WARN The configuration 'value.converter' was supplied but isn't a known config. (org.apache.kafka.clients.producer.ProducerConfig) connect | [2020-02-24 23:34:17,539] WARN The configuration 'key.converter' was supplied but isn't a known config. (org.apache.kafka.clients.producer.ProducerConfig) connect | [2020-02-24 23:34:17,540] WARN The configuration 'config.storage.topic' was supplied but isn't a known config. (org.apache.kafka.clients.producer.ProducerConfig) connect | [2020-02-24 23:34:17,540] WARN The configuration 'schema.compatibility' was supplied but isn't a known config. (org.apache.kafka.clients.producer.ProducerConfig) connect | [2020-02-24 23:34:17,541] WARN The configuration 'rest.advertised.host.name' was supplied but isn't a known config. (org.apache.kafka.clients.producer.ProducerConfig) connect | [2020-02-24 23:34:17,541] WARN The configuration 'status.storage.topic' was supplied but isn't a known config. (org.apache.kafka.clients.producer.ProducerConfig) connect | [2020-02-24 23:34:17,541] WARN The configuration 'value.converter.protoclassname' was supplied but isn't a known config. (org.apache.kafka.clients.producer.ProducerConfig) connect | [2020-02-24 23:34:17,541] WARN The configuration 'config.storage.replication.factor' was supplied but isn't a known config. (org.apache.kafka.clients.producer.ProducerConfig) connect | [2020-02-24 23:34:17,541] WARN The configuration 'offset.flush.interval.ms' was supplied but isn't a known config. (org.apache.kafka.clients.producer.ProducerConfig) connect | [2020-02-24 23:34:17,542] WARN The configuration 'rest.port' was supplied but isn't a known config. (org.apache.kafka.clients.producer.ProducerConfig) connect | [2020-02-24 23:34:17,542] WARN The configuration 'internal.key.converter' was supplied but isn't a known config. (org.apache.kafka.clients.producer.ProducerConfig) connect | [2020-02-24 23:34:17,542] WARN The configuration 'zookeeper.connect' was supplied but isn't a known config. (org.apache.kafka.clients.producer.ProducerConfig) connect | [2020-02-24 23:34:17,542] WARN The configuration 'internal.value.converter.schemas.enable' was supplied but isn't a known config. (org.apache.kafka.clients.producer.ProducerConfig) connect | [2020-02-24 23:34:17,542] WARN The configuration 'internal.value.converter' was supplied but isn't a known config. (org.apache.kafka.clients.producer.ProducerConfig) connect | [2020-02-24 23:34:17,543] WARN The configuration 'offset.storage.replication.factor' was supplied but isn't a known config. 
(org.apache.kafka.clients.producer.ProducerConfig) connect | [2020-02-24 23:34:17,543] INFO Kafka version: 5.4.0-ccs (org.apache.kafka.common.utils.AppInfoParser) connect | [2020-02-24 23:34:17,543] INFO Kafka commitId: 9401111f8c4bb0ec (org.apache.kafka.common.utils.AppInfoParser) connect | [2020-02-24 23:34:17,543] INFO Kafka startTimeMs: 1582587257543 (org.apache.kafka.common.utils.AppInfoParser) connect | [2020-02-24 23:34:17,549] INFO ConsumerConfig values: connect | allow.auto.create.topics = true connect | auto.commit.interval.ms = 5000 connect | auto.offset.reset = earliest connect | bootstrap.servers = [broker:29092] connect | check.crcs = true connect | client.dns.lookup = default connect | client.id = connect | client.rack = connect | connections.max.idle.ms = 540000 connect | default.api.timeout.ms = 60000 connect | enable.auto.commit = false connect | exclude.internal.topics = true connect | fetch.max.bytes = 52428800 connect | fetch.max.wait.ms = 500 connect | fetch.min.bytes = 1 connect | group.id = tracking-kafka-connect-s3 connect | group.instance.id = null connect | heartbeat.interval.ms = 3000 connect | interceptor.classes = [] connect | internal.leave.group.on.close = true connect | isolation.level = read_uncommitted connect | key.deserializer = class org.apache.kafka.common.serialization.StringDeserializer connect | max.partition.fetch.bytes = 1048576 connect | max.poll.interval.ms = 300000 connect | max.poll.records = 500 connect | metadata.max.age.ms = 300000 connect | metric.reporters = [] connect | metrics.num.samples = 2 connect | metrics.recording.level = INFO connect | metrics.sample.window.ms = 30000 connect | partition.assignment.strategy = [class org.apache.kafka.clients.consumer.RangeAssignor] connect | receive.buffer.bytes = 65536 connect | reconnect.backoff.max.ms = 1000 connect | reconnect.backoff.ms = 50 connect | request.timeout.ms = 30000 connect | retry.backoff.ms = 100 connect | sasl.client.callback.handler.class = null connect | sasl.jaas.config = null connect | sasl.kerberos.kinit.cmd = /usr/bin/kinit connect | sasl.kerberos.min.time.before.relogin = 60000 connect | sasl.kerberos.service.name = null connect | sasl.kerberos.ticket.renew.jitter = 0.05 connect | sasl.kerberos.ticket.renew.window.factor = 0.8 connect | sasl.login.callback.handler.class = null connect | sasl.login.class = null connect | sasl.login.refresh.buffer.seconds = 300 connect | sasl.login.refresh.min.period.seconds = 60 connect | sasl.login.refresh.window.factor = 0.8 connect | sasl.login.refresh.window.jitter = 0.05 connect | sasl.mechanism = GSSAPI connect | security.protocol = PLAINTEXT connect | security.providers = null connect | send.buffer.bytes = 131072 connect | session.timeout.ms = 10000 connect | ssl.cipher.suites = null connect | ssl.enabled.protocols = [TLSv1.2, TLSv1.1, TLSv1] connect | ssl.endpoint.identification.algorithm = https connect | ssl.key.password = null connect | ssl.keymanager.algorithm = SunX509 connect | ssl.keystore.location = null connect | ssl.keystore.password = null connect | ssl.keystore.type = JKS connect | ssl.protocol = TLS connect | ssl.provider = null connect | ssl.secure.random.implementation = null connect | ssl.trustmanager.algorithm = PKIX connect | ssl.truststore.location = null connect | ssl.truststore.password = null connect | ssl.truststore.type = JKS connect | value.deserializer = class org.apache.kafka.common.serialization.ByteArrayDeserializer connect | (org.apache.kafka.clients.consumer.ConsumerConfig) connect | [2020-02-24 
23:34:17,556] WARN The configuration 'log4j.loggers' was supplied but isn't a known config. (org.apache.kafka.clients.consumer.ConsumerConfig) connect | [2020-02-24 23:34:17,556] WARN The configuration 'plugin.path' was supplied but isn't a known config. (org.apache.kafka.clients.consumer.ConsumerConfig) connect | [2020-02-24 23:34:17,556] WARN The configuration 'internal.key.converter.schemas.enable' was supplied but isn't a known config. (org.apache.kafka.clients.consumer.ConsumerConfig) connect | [2020-02-24 23:34:17,556] WARN The configuration 'status.storage.replication.factor' was supplied but isn't a known config. (org.apache.kafka.clients.consumer.ConsumerConfig) connect | [2020-02-24 23:34:17,557] WARN The configuration 'offset.storage.topic' was supplied but isn't a known config. (org.apache.kafka.clients.consumer.ConsumerConfig) connect | [2020-02-24 23:34:17,557] WARN The configuration 'value.converter' was supplied but isn't a known config. (org.apache.kafka.clients.consumer.ConsumerConfig) connect | [2020-02-24 23:34:17,557] WARN The configuration 'key.converter' was supplied but isn't a known config. (org.apache.kafka.clients.consumer.ConsumerConfig) connect | [2020-02-24 23:34:17,557] WARN The configuration 'config.storage.topic' was supplied but isn't a known config. (org.apache.kafka.clients.consumer.ConsumerConfig) connect | [2020-02-24 23:34:17,557] WARN The configuration 'schema.compatibility' was supplied but isn't a known config. (org.apache.kafka.clients.consumer.ConsumerConfig) connect | [2020-02-24 23:34:17,558] WARN The configuration 'rest.advertised.host.name' was supplied but isn't a known config. (org.apache.kafka.clients.consumer.ConsumerConfig) connect | [2020-02-24 23:34:17,558] WARN The configuration 'status.storage.topic' was supplied but isn't a known config. (org.apache.kafka.clients.consumer.ConsumerConfig) connect | [2020-02-24 23:34:17,558] WARN The configuration 'value.converter.protoclassname' was supplied but isn't a known config. (org.apache.kafka.clients.consumer.ConsumerConfig) connect | [2020-02-24 23:34:17,558] WARN The configuration 'config.storage.replication.factor' was supplied but isn't a known config. (org.apache.kafka.clients.consumer.ConsumerConfig) connect | [2020-02-24 23:34:17,559] WARN The configuration 'offset.flush.interval.ms' was supplied but isn't a known config. (org.apache.kafka.clients.consumer.ConsumerConfig) connect | [2020-02-24 23:34:17,559] WARN The configuration 'rest.port' was supplied but isn't a known config. (org.apache.kafka.clients.consumer.ConsumerConfig) connect | [2020-02-24 23:34:17,559] WARN The configuration 'internal.key.converter' was supplied but isn't a known config. (org.apache.kafka.clients.consumer.ConsumerConfig) connect | [2020-02-24 23:34:17,560] WARN The configuration 'zookeeper.connect' was supplied but isn't a known config. (org.apache.kafka.clients.consumer.ConsumerConfig) connect | [2020-02-24 23:34:17,560] WARN The configuration 'internal.value.converter.schemas.enable' was supplied but isn't a known config. (org.apache.kafka.clients.consumer.ConsumerConfig) connect | [2020-02-24 23:34:17,560] WARN The configuration 'internal.value.converter' was supplied but isn't a known config. (org.apache.kafka.clients.consumer.ConsumerConfig) connect | [2020-02-24 23:34:17,560] WARN The configuration 'offset.storage.replication.factor' was supplied but isn't a known config. 
(org.apache.kafka.clients.consumer.ConsumerConfig) connect | [2020-02-24 23:34:17,561] INFO Kafka version: 5.4.0-ccs (org.apache.kafka.common.utils.AppInfoParser) connect | [2020-02-24 23:34:17,561] INFO Kafka commitId: 9401111f8c4bb0ec (org.apache.kafka.common.utils.AppInfoParser) connect | [2020-02-24 23:34:17,561] INFO Kafka startTimeMs: 1582587257561 (org.apache.kafka.common.utils.AppInfoParser) connect | [2020-02-24 23:34:17,603] INFO [Consumer clientId=consumer-tracking-kafka-connect-s3-2, groupId=tracking-kafka-connect-s3] Subscribed to partition(s): tracking-kafka-connect-s3-status-0, tracking-kafka-connect-s3-status-4, tracking-kafka-connect-s3-status-1, tracking-kafka-connect-s3-status-2, tracking-kafka-connect-s3-status-3 (org.apache.kafka.clients.consumer.KafkaConsumer) connect | [2020-02-24 23:34:17,603] INFO [Consumer clientId=consumer-tracking-kafka-connect-s3-2, groupId=tracking-kafka-connect-s3] Seeking to EARLIEST offset of partition tracking-kafka-connect-s3-status-0 (org.apache.kafka.clients.consumer.internals.SubscriptionState) connect | [2020-02-24 23:34:17,603] INFO [Consumer clientId=consumer-tracking-kafka-connect-s3-2, groupId=tracking-kafka-connect-s3] Seeking to EARLIEST offset of partition tracking-kafka-connect-s3-status-4 (org.apache.kafka.clients.consumer.internals.SubscriptionState) connect | [2020-02-24 23:34:17,603] INFO [Consumer clientId=consumer-tracking-kafka-connect-s3-2, groupId=tracking-kafka-connect-s3] Seeking to EARLIEST offset of partition tracking-kafka-connect-s3-status-1 (org.apache.kafka.clients.consumer.internals.SubscriptionState) connect | [2020-02-24 23:34:17,603] INFO [Consumer clientId=consumer-tracking-kafka-connect-s3-2, groupId=tracking-kafka-connect-s3] Seeking to EARLIEST offset of partition tracking-kafka-connect-s3-status-2 (org.apache.kafka.clients.consumer.internals.SubscriptionState) connect | [2020-02-24 23:34:17,604] INFO [Consumer clientId=consumer-tracking-kafka-connect-s3-2, groupId=tracking-kafka-connect-s3] Seeking to EARLIEST offset of partition tracking-kafka-connect-s3-status-3 (org.apache.kafka.clients.consumer.internals.SubscriptionState) connect | [2020-02-24 23:34:17,609] INFO [Consumer clientId=consumer-tracking-kafka-connect-s3-2, groupId=tracking-kafka-connect-s3] Cluster ID: ZeSnCBNdRh6pGQwnc-VNVQ (org.apache.kafka.clients.Metadata) connect | [2020-02-24 23:34:17,621] INFO [Consumer clientId=consumer-tracking-kafka-connect-s3-2, groupId=tracking-kafka-connect-s3] Resetting offset for partition tracking-kafka-connect-s3-status-0 to offset 0. (org.apache.kafka.clients.consumer.internals.SubscriptionState) connect | [2020-02-24 23:34:17,621] INFO [Consumer clientId=consumer-tracking-kafka-connect-s3-2, groupId=tracking-kafka-connect-s3] Resetting offset for partition tracking-kafka-connect-s3-status-3 to offset 0. (org.apache.kafka.clients.consumer.internals.SubscriptionState) connect | [2020-02-24 23:34:17,621] INFO [Consumer clientId=consumer-tracking-kafka-connect-s3-2, groupId=tracking-kafka-connect-s3] Resetting offset for partition tracking-kafka-connect-s3-status-4 to offset 0. (org.apache.kafka.clients.consumer.internals.SubscriptionState) connect | [2020-02-24 23:34:17,622] INFO [Consumer clientId=consumer-tracking-kafka-connect-s3-2, groupId=tracking-kafka-connect-s3] Resetting offset for partition tracking-kafka-connect-s3-status-1 to offset 0. 
(org.apache.kafka.clients.consumer.internals.SubscriptionState) connect | [2020-02-24 23:34:17,622] INFO [Consumer clientId=consumer-tracking-kafka-connect-s3-2, groupId=tracking-kafka-connect-s3] Resetting offset for partition tracking-kafka-connect-s3-status-2 to offset 0. (org.apache.kafka.clients.consumer.internals.SubscriptionState) connect | [2020-02-24 23:34:17,625] INFO Finished reading KafkaBasedLog for topic tracking-kafka-connect-s3-status (org.apache.kafka.connect.util.KafkaBasedLog) connect | [2020-02-24 23:34:17,625] INFO Started KafkaBasedLog for topic tracking-kafka-connect-s3-status (org.apache.kafka.connect.util.KafkaBasedLog) connect | [2020-02-24 23:34:17,628] INFO Starting KafkaConfigBackingStore (org.apache.kafka.connect.storage.KafkaConfigBackingStore) connect | [2020-02-24 23:34:17,628] INFO Starting KafkaBasedLog with topic tracking-kafka-connect-s3-configs (org.apache.kafka.connect.util.KafkaBasedLog) connect | [2020-02-24 23:34:17,629] INFO AdminClientConfig values: connect | bootstrap.servers = [broker:29092] connect | client.dns.lookup = default connect | client.id = connect | connections.max.idle.ms = 300000 connect | metadata.max.age.ms = 300000 connect | metric.reporters = [] connect | metrics.num.samples = 2 connect | metrics.recording.level = INFO connect | metrics.sample.window.ms = 30000 connect | receive.buffer.bytes = 65536 connect | reconnect.backoff.max.ms = 1000 connect | reconnect.backoff.ms = 50 connect | request.timeout.ms = 120000 connect | retries = 5 connect | retry.backoff.ms = 100 connect | sasl.client.callback.handler.class = null connect | sasl.jaas.config = null connect | sasl.kerberos.kinit.cmd = /usr/bin/kinit connect | sasl.kerberos.min.time.before.relogin = 60000 connect | sasl.kerberos.service.name = null connect | sasl.kerberos.ticket.renew.jitter = 0.05 connect | sasl.kerberos.ticket.renew.window.factor = 0.8 connect | sasl.login.callback.handler.class = null connect | sasl.login.class = null connect | sasl.login.refresh.buffer.seconds = 300 connect | sasl.login.refresh.min.period.seconds = 60 connect | sasl.login.refresh.window.factor = 0.8 connect | sasl.login.refresh.window.jitter = 0.05 connect | sasl.mechanism = GSSAPI connect | security.protocol = PLAINTEXT connect | security.providers = null connect | send.buffer.bytes = 131072 connect | ssl.cipher.suites = null connect | ssl.enabled.protocols = [TLSv1.2, TLSv1.1, TLSv1] connect | ssl.endpoint.identification.algorithm = https connect | ssl.key.password = null connect | ssl.keymanager.algorithm = SunX509 connect | ssl.keystore.location = null connect | ssl.keystore.password = null connect | ssl.keystore.type = JKS connect | ssl.protocol = TLS connect | ssl.provider = null connect | ssl.secure.random.implementation = null connect | ssl.trustmanager.algorithm = PKIX connect | ssl.truststore.location = null connect | ssl.truststore.password = null connect | ssl.truststore.type = JKS connect | (org.apache.kafka.clients.admin.AdminClientConfig) connect | [2020-02-24 23:34:17,637] WARN The configuration 'log4j.loggers' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig) connect | [2020-02-24 23:34:17,638] WARN The configuration 'config.storage.topic' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig) connect | [2020-02-24 23:34:17,638] WARN The configuration 'schema.compatibility' was supplied but isn't a known config. 
(org.apache.kafka.clients.admin.AdminClientConfig) connect | [2020-02-24 23:34:17,638] WARN The configuration 'rest.advertised.host.name' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig) connect | [2020-02-24 23:34:17,648] WARN The configuration 'status.storage.topic' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig) connect | [2020-02-24 23:34:17,648] WARN The configuration 'group.id' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig) connect | [2020-02-24 23:34:17,648] WARN The configuration 'plugin.path' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig) connect | [2020-02-24 23:34:17,648] WARN The configuration 'internal.key.converter.schemas.enable' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig) connect | [2020-02-24 23:34:17,648] WARN The configuration 'value.converter.protoclassname' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig) connect | [2020-02-24 23:34:17,648] WARN The configuration 'config.storage.replication.factor' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig) connect | [2020-02-24 23:34:17,648] WARN The configuration 'offset.flush.interval.ms' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig) connect | [2020-02-24 23:34:17,649] WARN The configuration 'rest.port' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig) connect | [2020-02-24 23:34:17,649] WARN The configuration 'internal.key.converter' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig) connect | [2020-02-24 23:34:17,649] WARN The configuration 'zookeeper.connect' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig) connect | [2020-02-24 23:34:17,649] WARN The configuration 'internal.value.converter.schemas.enable' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig) connect | [2020-02-24 23:34:17,649] WARN The configuration 'status.storage.replication.factor' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig) connect | [2020-02-24 23:34:17,649] WARN The configuration 'internal.value.converter' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig) connect | [2020-02-24 23:34:17,649] WARN The configuration 'offset.storage.replication.factor' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig) connect | [2020-02-24 23:34:17,649] WARN The configuration 'offset.storage.topic' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig) connect | [2020-02-24 23:34:17,649] WARN The configuration 'value.converter' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig) connect | [2020-02-24 23:34:17,649] WARN The configuration 'key.converter' was supplied but isn't a known config. 
(org.apache.kafka.clients.admin.AdminClientConfig) connect | [2020-02-24 23:34:17,650] INFO Kafka version: 5.4.0-ccs (org.apache.kafka.common.utils.AppInfoParser) connect | [2020-02-24 23:34:17,650] INFO Kafka commitId: 9401111f8c4bb0ec (org.apache.kafka.common.utils.AppInfoParser) connect | [2020-02-24 23:34:17,650] INFO Kafka startTimeMs: 1582587257649 (org.apache.kafka.common.utils.AppInfoParser) connect | [2020-02-24 23:34:17,657] INFO [Producer clientId=producer-2] Cluster ID: ZeSnCBNdRh6pGQwnc-VNVQ (org.apache.kafka.clients.Metadata) connect | [2020-02-24 23:34:17,815] INFO Created topic (name=tracking-kafka-connect-s3-configs, numPartitions=1, replicationFactor=1, replicasAssignments=null, configs={cleanup.policy=compact}) on brokers at broker:29092 (org.apache.kafka.connect.util.TopicAdmin) connect | [2020-02-24 23:34:17,823] INFO ProducerConfig values: connect | acks = all connect | batch.size = 16384 connect | bootstrap.servers = [broker:29092] connect | buffer.memory = 33554432 connect | client.dns.lookup = default connect | client.id = connect | compression.type = none connect | connections.max.idle.ms = 540000 connect | delivery.timeout.ms = 2147483647 connect | enable.idempotence = false connect | interceptor.classes = [] connect | key.serializer = class org.apache.kafka.common.serialization.StringSerializer connect | linger.ms = 0 connect | max.block.ms = 60000 connect | max.in.flight.requests.per.connection = 1 connect | max.request.size = 1048576 connect | metadata.max.age.ms = 300000 connect | metric.reporters = [] connect | metrics.num.samples = 2 connect | metrics.recording.level = INFO connect | metrics.sample.window.ms = 30000 connect | partitioner.class = class org.apache.kafka.clients.producer.internals.DefaultPartitioner connect | receive.buffer.bytes = 32768 connect | reconnect.backoff.max.ms = 1000 connect | reconnect.backoff.ms = 50 connect | request.timeout.ms = 30000 connect | retries = 2147483647 connect | retry.backoff.ms = 100 connect | sasl.client.callback.handler.class = null connect | sasl.jaas.config = null connect | sasl.kerberos.kinit.cmd = /usr/bin/kinit connect | sasl.kerberos.min.time.before.relogin = 60000 connect | sasl.kerberos.service.name = null connect | sasl.kerberos.ticket.renew.jitter = 0.05 connect | sasl.kerberos.ticket.renew.window.factor = 0.8 connect | sasl.login.callback.handler.class = null connect | sasl.login.class = null connect | sasl.login.refresh.buffer.seconds = 300 connect | sasl.login.refresh.min.period.seconds = 60 connect | sasl.login.refresh.window.factor = 0.8 connect | sasl.login.refresh.window.jitter = 0.05 connect | sasl.mechanism = GSSAPI connect | security.protocol = PLAINTEXT connect | security.providers = null connect | send.buffer.bytes = 131072 connect | ssl.cipher.suites = null connect | ssl.enabled.protocols = [TLSv1.2, TLSv1.1, TLSv1] connect | ssl.endpoint.identification.algorithm = https connect | ssl.key.password = null connect | ssl.keymanager.algorithm = SunX509 connect | ssl.keystore.location = null connect | ssl.keystore.password = null connect | ssl.keystore.type = JKS connect | ssl.protocol = TLS connect | ssl.provider = null connect | ssl.secure.random.implementation = null connect | ssl.trustmanager.algorithm = PKIX connect | ssl.truststore.location = null connect | ssl.truststore.password = null connect | ssl.truststore.type = JKS connect | transaction.timeout.ms = 60000 connect | transactional.id = null connect | value.serializer = class org.apache.kafka.common.serialization.ByteArraySerializer 
connect | (org.apache.kafka.clients.producer.ProducerConfig) connect | [2020-02-24 23:34:17,858] WARN The configuration 'log4j.loggers' was supplied but isn't a known config. (org.apache.kafka.clients.producer.ProducerConfig) connect | [2020-02-24 23:34:17,858] WARN The configuration 'group.id' was supplied but isn't a known config. (org.apache.kafka.clients.producer.ProducerConfig) connect | [2020-02-24 23:34:17,858] WARN The configuration 'plugin.path' was supplied but isn't a known config. (org.apache.kafka.clients.producer.ProducerConfig) connect | [2020-02-24 23:34:17,859] WARN The configuration 'internal.key.converter.schemas.enable' was supplied but isn't a known config. (org.apache.kafka.clients.producer.ProducerConfig) connect | [2020-02-24 23:34:17,859] WARN The configuration 'status.storage.replication.factor' was supplied but isn't a known config. (org.apache.kafka.clients.producer.ProducerConfig) connect | [2020-02-24 23:34:17,859] WARN The configuration 'offset.storage.topic' was supplied but isn't a known config. (org.apache.kafka.clients.producer.ProducerConfig) connect | [2020-02-24 23:34:17,859] WARN The configuration 'value.converter' was supplied but isn't a known config. (org.apache.kafka.clients.producer.ProducerConfig) connect | [2020-02-24 23:34:17,859] WARN The configuration 'key.converter' was supplied but isn't a known config. (org.apache.kafka.clients.producer.ProducerConfig) connect | [2020-02-24 23:34:17,859] WARN The configuration 'config.storage.topic' was supplied but isn't a known config. (org.apache.kafka.clients.producer.ProducerConfig) connect | [2020-02-24 23:34:17,859] WARN The configuration 'schema.compatibility' was supplied but isn't a known config. (org.apache.kafka.clients.producer.ProducerConfig) connect | [2020-02-24 23:34:17,859] WARN The configuration 'rest.advertised.host.name' was supplied but isn't a known config. (org.apache.kafka.clients.producer.ProducerConfig) connect | [2020-02-24 23:34:17,863] WARN The configuration 'status.storage.topic' was supplied but isn't a known config. (org.apache.kafka.clients.producer.ProducerConfig) connect | [2020-02-24 23:34:17,863] WARN The configuration 'value.converter.protoclassname' was supplied but isn't a known config. (org.apache.kafka.clients.producer.ProducerConfig) connect | [2020-02-24 23:34:17,863] WARN The configuration 'config.storage.replication.factor' was supplied but isn't a known config. (org.apache.kafka.clients.producer.ProducerConfig) connect | [2020-02-24 23:34:17,864] WARN The configuration 'offset.flush.interval.ms' was supplied but isn't a known config. (org.apache.kafka.clients.producer.ProducerConfig) connect | [2020-02-24 23:34:17,864] WARN The configuration 'rest.port' was supplied but isn't a known config. (org.apache.kafka.clients.producer.ProducerConfig) connect | [2020-02-24 23:34:17,864] WARN The configuration 'internal.key.converter' was supplied but isn't a known config. (org.apache.kafka.clients.producer.ProducerConfig) connect | [2020-02-24 23:34:17,864] WARN The configuration 'zookeeper.connect' was supplied but isn't a known config. (org.apache.kafka.clients.producer.ProducerConfig) connect | [2020-02-24 23:34:17,864] WARN The configuration 'internal.value.converter.schemas.enable' was supplied but isn't a known config. (org.apache.kafka.clients.producer.ProducerConfig) connect | [2020-02-24 23:34:17,864] WARN The configuration 'internal.value.converter' was supplied but isn't a known config. 
(org.apache.kafka.clients.producer.ProducerConfig) connect | [2020-02-24 23:34:17,864] WARN The configuration 'offset.storage.replication.factor' was supplied but isn't a known config. (org.apache.kafka.clients.producer.ProducerConfig) connect | [2020-02-24 23:34:17,864] INFO Kafka version: 5.4.0-ccs (org.apache.kafka.common.utils.AppInfoParser) connect | [2020-02-24 23:34:17,865] INFO Kafka commitId: 9401111f8c4bb0ec (org.apache.kafka.common.utils.AppInfoParser) connect | [2020-02-24 23:34:17,865] INFO Kafka startTimeMs: 1582587257864 (org.apache.kafka.common.utils.AppInfoParser) connect | [2020-02-24 23:34:17,880] INFO ConsumerConfig values: connect | allow.auto.create.topics = true connect | auto.commit.interval.ms = 5000 connect | auto.offset.reset = earliest connect | bootstrap.servers = [broker:29092] connect | check.crcs = true connect | client.dns.lookup = default connect | client.id = connect | client.rack = connect | connections.max.idle.ms = 540000 connect | default.api.timeout.ms = 60000 connect | enable.auto.commit = false connect | exclude.internal.topics = true connect | fetch.max.bytes = 52428800 connect | fetch.max.wait.ms = 500 connect | fetch.min.bytes = 1 connect | group.id = tracking-kafka-connect-s3 connect | group.instance.id = null connect | heartbeat.interval.ms = 3000 connect | interceptor.classes = [] connect | internal.leave.group.on.close = true connect | isolation.level = read_uncommitted connect | key.deserializer = class org.apache.kafka.common.serialization.StringDeserializer connect | max.partition.fetch.bytes = 1048576 connect | max.poll.interval.ms = 300000 connect | max.poll.records = 500 connect | metadata.max.age.ms = 300000 connect | metric.reporters = [] connect | metrics.num.samples = 2 connect | metrics.recording.level = INFO connect | metrics.sample.window.ms = 30000 connect | partition.assignment.strategy = [class org.apache.kafka.clients.consumer.RangeAssignor] connect | receive.buffer.bytes = 65536 connect | reconnect.backoff.max.ms = 1000 connect | reconnect.backoff.ms = 50 connect | request.timeout.ms = 30000 connect | retry.backoff.ms = 100 connect | sasl.client.callback.handler.class = null connect | sasl.jaas.config = null connect | sasl.kerberos.kinit.cmd = /usr/bin/kinit connect | sasl.kerberos.min.time.before.relogin = 60000 connect | sasl.kerberos.service.name = null connect | sasl.kerberos.ticket.renew.jitter = 0.05 connect | sasl.kerberos.ticket.renew.window.factor = 0.8 connect | sasl.login.callback.handler.class = null connect | sasl.login.class = null connect | sasl.login.refresh.buffer.seconds = 300 connect | sasl.login.refresh.min.period.seconds = 60 connect | sasl.login.refresh.window.factor = 0.8 connect | sasl.login.refresh.window.jitter = 0.05 connect | sasl.mechanism = GSSAPI connect | security.protocol = PLAINTEXT connect | security.providers = null connect | send.buffer.bytes = 131072 connect | session.timeout.ms = 10000 connect | ssl.cipher.suites = null connect | ssl.enabled.protocols = [TLSv1.2, TLSv1.1, TLSv1] connect | ssl.endpoint.identification.algorithm = https connect | ssl.key.password = null connect | ssl.keymanager.algorithm = SunX509 connect | ssl.keystore.location = null connect | ssl.keystore.password = null connect | ssl.keystore.type = JKS connect | ssl.protocol = TLS connect | ssl.provider = null connect | ssl.secure.random.implementation = null connect | ssl.trustmanager.algorithm = PKIX connect | ssl.truststore.location = null connect | ssl.truststore.password = null connect | ssl.truststore.type = 
JKS connect | value.deserializer = class org.apache.kafka.common.serialization.ByteArrayDeserializer connect | (org.apache.kafka.clients.consumer.ConsumerConfig) connect | [2020-02-24 23:34:17,929] WARN The configuration 'log4j.loggers' was supplied but isn't a known config. (org.apache.kafka.clients.consumer.ConsumerConfig) connect | [2020-02-24 23:34:17,929] WARN The configuration 'plugin.path' was supplied but isn't a known config. (org.apache.kafka.clients.consumer.ConsumerConfig) connect | [2020-02-24 23:34:17,930] WARN The configuration 'internal.key.converter.schemas.enable' was supplied but isn't a known config. (org.apache.kafka.clients.consumer.ConsumerConfig) connect | [2020-02-24 23:34:17,930] WARN The configuration 'status.storage.replication.factor' was supplied but isn't a known config. (org.apache.kafka.clients.consumer.ConsumerConfig) connect | [2020-02-24 23:34:17,930] WARN The configuration 'offset.storage.topic' was supplied but isn't a known config. (org.apache.kafka.clients.consumer.ConsumerConfig) connect | [2020-02-24 23:34:17,930] WARN The configuration 'value.converter' was supplied but isn't a known config. (org.apache.kafka.clients.consumer.ConsumerConfig) connect | [2020-02-24 23:34:17,930] WARN The configuration 'key.converter' was supplied but isn't a known config. (org.apache.kafka.clients.consumer.ConsumerConfig) connect | [2020-02-24 23:34:17,930] WARN The configuration 'config.storage.topic' was supplied but isn't a known config. (org.apache.kafka.clients.consumer.ConsumerConfig) connect | [2020-02-24 23:34:17,930] WARN The configuration 'schema.compatibility' was supplied but isn't a known config. (org.apache.kafka.clients.consumer.ConsumerConfig) connect | [2020-02-24 23:34:17,930] WARN The configuration 'rest.advertised.host.name' was supplied but isn't a known config. (org.apache.kafka.clients.consumer.ConsumerConfig) connect | [2020-02-24 23:34:17,930] WARN The configuration 'status.storage.topic' was supplied but isn't a known config. (org.apache.kafka.clients.consumer.ConsumerConfig) connect | [2020-02-24 23:34:17,930] WARN The configuration 'value.converter.protoclassname' was supplied but isn't a known config. (org.apache.kafka.clients.consumer.ConsumerConfig) connect | [2020-02-24 23:34:17,930] WARN The configuration 'config.storage.replication.factor' was supplied but isn't a known config. (org.apache.kafka.clients.consumer.ConsumerConfig) connect | [2020-02-24 23:34:17,930] WARN The configuration 'offset.flush.interval.ms' was supplied but isn't a known config. (org.apache.kafka.clients.consumer.ConsumerConfig) connect | [2020-02-24 23:34:17,930] WARN The configuration 'rest.port' was supplied but isn't a known config. (org.apache.kafka.clients.consumer.ConsumerConfig) connect | [2020-02-24 23:34:17,930] WARN The configuration 'internal.key.converter' was supplied but isn't a known config. (org.apache.kafka.clients.consumer.ConsumerConfig) connect | [2020-02-24 23:34:17,930] WARN The configuration 'zookeeper.connect' was supplied but isn't a known config. (org.apache.kafka.clients.consumer.ConsumerConfig) connect | [2020-02-24 23:34:17,930] WARN The configuration 'internal.value.converter.schemas.enable' was supplied but isn't a known config. (org.apache.kafka.clients.consumer.ConsumerConfig) connect | [2020-02-24 23:34:17,931] WARN The configuration 'internal.value.converter' was supplied but isn't a known config. 
(org.apache.kafka.clients.consumer.ConsumerConfig) connect | [2020-02-24 23:34:17,931] WARN The configuration 'offset.storage.replication.factor' was supplied but isn't a known config. (org.apache.kafka.clients.consumer.ConsumerConfig) connect | [2020-02-24 23:34:17,931] INFO Kafka version: 5.4.0-ccs (org.apache.kafka.common.utils.AppInfoParser) connect | [2020-02-24 23:34:17,931] INFO Kafka commitId: 9401111f8c4bb0ec (org.apache.kafka.common.utils.AppInfoParser) connect | [2020-02-24 23:34:17,931] INFO Kafka startTimeMs: 1582587257931 (org.apache.kafka.common.utils.AppInfoParser) connect | [2020-02-24 23:34:17,975] INFO [Producer clientId=producer-3] Cluster ID: ZeSnCBNdRh6pGQwnc-VNVQ (org.apache.kafka.clients.Metadata) connect | [2020-02-24 23:34:17,993] INFO [Consumer clientId=consumer-tracking-kafka-connect-s3-3, groupId=tracking-kafka-connect-s3] Subscribed to partition(s): tracking-kafka-connect-s3-configs-0 (org.apache.kafka.clients.consumer.KafkaConsumer) connect | [2020-02-24 23:34:17,993] INFO [Consumer clientId=consumer-tracking-kafka-connect-s3-3, groupId=tracking-kafka-connect-s3] Seeking to EARLIEST offset of partition tracking-kafka-connect-s3-configs-0 (org.apache.kafka.clients.consumer.internals.SubscriptionState) connect | [2020-02-24 23:34:18,000] INFO [Consumer clientId=consumer-tracking-kafka-connect-s3-3, groupId=tracking-kafka-connect-s3] Cluster ID: ZeSnCBNdRh6pGQwnc-VNVQ (org.apache.kafka.clients.Metadata) connect | [2020-02-24 23:34:18,017] INFO [Consumer clientId=consumer-tracking-kafka-connect-s3-3, groupId=tracking-kafka-connect-s3] Resetting offset for partition tracking-kafka-connect-s3-configs-0 to offset 0. (org.apache.kafka.clients.consumer.internals.SubscriptionState) connect | Feb 24, 2020 11:34:18 PM org.glassfish.jersey.internal.Errors logErrors connect | WARNING: The following warnings have been detected: WARNING: The (sub)resource method listLoggers in org.apache.kafka.connect.runtime.rest.resources.LoggingResource contains empty path annotation. connect | WARNING: The (sub)resource method listConnectors in org.apache.kafka.connect.runtime.rest.resources.ConnectorsResource contains empty path annotation. connect | WARNING: The (sub)resource method createConnector in org.apache.kafka.connect.runtime.rest.resources.ConnectorsResource contains empty path annotation. connect | WARNING: The (sub)resource method listConnectorPlugins in org.apache.kafka.connect.runtime.rest.resources.ConnectorPluginsResource contains empty path annotation. connect | WARNING: The (sub)resource method serverInfo in org.apache.kafka.connect.runtime.rest.resources.RootResource contains empty path annotation. 
connect |
connect | [2020-02-24 23:34:18,029] INFO Finished reading KafkaBasedLog for topic tracking-kafka-connect-s3-configs (org.apache.kafka.connect.util.KafkaBasedLog)
connect | [2020-02-24 23:34:18,029] INFO Started KafkaBasedLog for topic tracking-kafka-connect-s3-configs (org.apache.kafka.connect.util.KafkaBasedLog)
connect | [2020-02-24 23:34:18,029] INFO Started KafkaConfigBackingStore (org.apache.kafka.connect.storage.KafkaConfigBackingStore)
connect | [2020-02-24 23:34:18,029] INFO [Worker clientId=connect-1, groupId=tracking-kafka-connect-s3] Herder started (org.apache.kafka.connect.runtime.distributed.DistributedHerder)
connect | [2020-02-24 23:34:18,060] INFO Started o.e.j.s.ServletContextHandler@18578491{/,null,AVAILABLE} (org.eclipse.jetty.server.handler.ContextHandler)
connect | [2020-02-24 23:34:18,061] INFO REST resources initialized; server is started and ready to handle requests (org.apache.kafka.connect.runtime.rest.RestServer)
connect | [2020-02-24 23:34:18,061] INFO Kafka Connect started (org.apache.kafka.connect.runtime.Connect)
connect | [2020-02-24 23:34:18,064] INFO [Worker clientId=connect-1, groupId=tracking-kafka-connect-s3] Cluster ID: ZeSnCBNdRh6pGQwnc-VNVQ (org.apache.kafka.clients.Metadata)
connect | [2020-02-24 23:34:19,177] INFO [Worker clientId=connect-1, groupId=tracking-kafka-connect-s3] Discovered group coordinator broker:29092 (id: 2147483646 rack: null) (org.apache.kafka.clients.consumer.internals.AbstractCoordinator)
connect | [2020-02-24 23:34:19,181] INFO [Worker clientId=connect-1, groupId=tracking-kafka-connect-s3] Rebalance started (org.apache.kafka.connect.runtime.distributed.WorkerCoordinator)
connect | [2020-02-24 23:34:19,181] INFO [Worker clientId=connect-1, groupId=tracking-kafka-connect-s3] (Re-)joining group (org.apache.kafka.clients.consumer.internals.AbstractCoordinator)
connect | [2020-02-24 23:34:19,251] INFO [Worker clientId=connect-1, groupId=tracking-kafka-connect-s3] (Re-)joining group (org.apache.kafka.clients.consumer.internals.AbstractCoordinator)
connect | [2020-02-24 23:34:19,346] INFO [Worker clientId=connect-1, groupId=tracking-kafka-connect-s3] Successfully joined group with generation 1 (org.apache.kafka.clients.consumer.internals.AbstractCoordinator)
connect | [2020-02-24 23:34:19,347] INFO [Worker clientId=connect-1, groupId=tracking-kafka-connect-s3] Joined group at generation 1 with protocol version 2 and got assignment: Assignment{error=0, leader='connect-1-76ef6d4f-e2d8-4218-b914-4c3423f2f2f9', leaderUrl='http://connect:8083/', offset=-1, connectorIds=[], taskIds=[], revokedConnectorIds=[], revokedTaskIds=[], delay=0} with rebalance delay: 0 (org.apache.kafka.connect.runtime.distributed.DistributedHerder)
connect | [2020-02-24 23:34:19,351] INFO [Worker clientId=connect-1, groupId=tracking-kafka-connect-s3] Starting connectors and tasks using config offset -1 (org.apache.kafka.connect.runtime.distributed.DistributedHerder)
connect | [2020-02-24 23:34:19,351] INFO [Worker clientId=connect-1, groupId=tracking-kafka-connect-s3] Finished starting connectors and tasks (org.apache.kafka.connect.runtime.distributed.DistributedHerder)
connect | [2020-02-24 23:34:19,494] INFO [Worker clientId=connect-1, groupId=tracking-kafka-connect-s3] Session key updated (org.apache.kafka.connect.runtime.distributed.DistributedHerder)
connect | [2020-02-24 23:34:23,582] INFO AbstractConfig values:
connect | (org.apache.kafka.common.config.AbstractConfig)
connect | [2020-02-24 23:34:23,596] INFO [Worker clientId=connect-1, groupId=tracking-kafka-connect-s3] Connector s3-sink config updated (org.apache.kafka.connect.runtime.distributed.DistributedHerder)
connect | [2020-02-24 23:34:24,106] INFO [Worker clientId=connect-1, groupId=tracking-kafka-connect-s3] Rebalance started (org.apache.kafka.connect.runtime.distributed.WorkerCoordinator)
connect | [2020-02-24 23:34:24,106] INFO [Worker clientId=connect-1, groupId=tracking-kafka-connect-s3] (Re-)joining group (org.apache.kafka.clients.consumer.internals.AbstractCoordinator)
connect | [2020-02-24 23:34:24,129] INFO [Worker clientId=connect-1, groupId=tracking-kafka-connect-s3] Successfully joined group with generation 2 (org.apache.kafka.clients.consumer.internals.AbstractCoordinator)
connect | [2020-02-24 23:34:24,130] INFO [Worker clientId=connect-1, groupId=tracking-kafka-connect-s3] Joined group at generation 2 with protocol version 2 and got assignment: Assignment{error=0, leader='connect-1-76ef6d4f-e2d8-4218-b914-4c3423f2f2f9', leaderUrl='http://connect:8083/', offset=2, connectorIds=[s3-sink], taskIds=[], revokedConnectorIds=[], revokedTaskIds=[], delay=0} with rebalance delay: 0 (org.apache.kafka.connect.runtime.distributed.DistributedHerder)
connect | [2020-02-24 23:34:24,130] INFO [Worker clientId=connect-1, groupId=tracking-kafka-connect-s3] Starting connectors and tasks using config offset 2 (org.apache.kafka.connect.runtime.distributed.DistributedHerder)
connect | [2020-02-24 23:34:24,133] INFO [Worker clientId=connect-1, groupId=tracking-kafka-connect-s3] Starting connector s3-sink (org.apache.kafka.connect.runtime.distributed.DistributedHerder)
connect | [2020-02-24 23:34:24,143] INFO ConnectorConfig values:
connect | config.action.reload = restart
connect | connector.class = io.confluent.connect.s3.S3SinkConnector
connect | errors.log.enable = false
connect | errors.log.include.messages = false
connect | errors.retry.delay.max.ms = 60000
connect | errors.retry.timeout = 0
connect | errors.tolerance = none
connect | header.converter = null
connect | key.converter = class org.apache.kafka.connect.storage.StringConverter
connect | name = s3-sink
connect | tasks.max = 1
connect | transforms = []
connect | value.converter = class com.blueapron.connect.protobuf.ProtobufConverter
connect | (org.apache.kafka.connect.runtime.ConnectorConfig)
connect | [2020-02-24 23:34:24,146] INFO EnrichedConnectorConfig values:
connect | config.action.reload = restart
connect | connector.class = io.confluent.connect.s3.S3SinkConnector
connect | errors.log.enable = false
connect | errors.log.include.messages = false
connect | errors.retry.delay.max.ms = 60000
connect | errors.retry.timeout = 0
connect | errors.tolerance = none
connect | header.converter = null
connect | key.converter = class org.apache.kafka.connect.storage.StringConverter
connect | name = s3-sink
connect | tasks.max = 1
connect | transforms = []
connect | value.converter = class com.blueapron.connect.protobuf.ProtobufConverter
connect | (org.apache.kafka.connect.runtime.ConnectorConfig$EnrichedConnectorConfig)
connect | [2020-02-24 23:34:24,146] INFO Creating connector s3-sink of type io.confluent.connect.s3.S3SinkConnector (org.apache.kafka.connect.runtime.Worker)
connect | [2020-02-24 23:34:24,159] INFO Instantiated connector s3-sink with version 5.4.0 of type class io.confluent.connect.s3.S3SinkConnector (org.apache.kafka.connect.runtime.Worker)
connect | [2020-02-24 23:34:24,162] INFO S3SinkConnectorConfig values:
connect | avro.codec = null
connect | behavior.on.null.values = fail
connect | connect.meta.data = true
connect | enhanced.avro.schema.support = false
connect | filename.offset.zero.pad.width = 10
connect | flush.size = 100000
connect | format.bytearray.extension = .bin
connect | format.bytearray.separator = null
connect | format.class = class io.confluent.connect.s3.format.parquet.ParquetFormat
connect | parquet.codec = snappy
connect | retry.backoff.ms = 5000
connect | rotate.interval.ms = 900
connect | rotate.schedule.interval.ms = -1
connect | s3.acl.canned = null
connect | s3.bucket.name = dev-bagi
connect | s3.compression.type = none
connect | s3.credentials.provider.class = class com.amazonaws.auth.DefaultAWSCredentialsProviderChain
connect | s3.http.send.expect.continue = true
connect | s3.part.retries = 3
connect | s3.part.size = 5242880
connect | s3.proxy.password = [hidden]
connect | s3.proxy.url =
connect | s3.proxy.user = null
connect | s3.region = us-east-1
connect | s3.retry.backoff.ms = 200
connect | s3.sse.customer.key = [hidden]
connect | s3.sse.kms.key.id =
connect | s3.ssea.name =
connect | s3.wan.mode = false
connect | schema.cache.size = 1000
connect | schema.compatibility = NONE
connect | shutdown.timeout.ms = 3000
connect | (io.confluent.connect.s3.S3SinkConnectorConfig)
connect | [2020-02-24 23:34:24,164] INFO StorageCommonConfig values:
connect | directory.delim = /
connect | file.delim = _
connect | storage.class = class io.confluent.connect.s3.storage.S3Storage
connect | store.url = null
connect | topics.dir = topics
connect | (io.confluent.connect.storage.common.StorageCommonConfig)
connect | [2020-02-24 23:34:24,165] INFO PartitionerConfig values:
connect | locale = en-US
connect | partition.duration.ms = 900000
connect | partition.field.name = []
connect | partitioner.class = class io.confluent.connect.storage.partitioner.TimeBasedPartitioner
connect | path.format = 'dt'=YYYY-MM-dd
connect | timestamp.extractor = RecordField
connect | timestamp.field = event_ts
connect | timestamp.unit = s
connect | timezone = UTC
connect | (io.confluent.connect.storage.partitioner.PartitionerConfig)
connect | [2020-02-24 23:34:24,166] INFO Starting S3 connector s3-sink (io.confluent.connect.s3.S3SinkConnector)
connect | [2020-02-24 23:34:24,177] INFO Finished creating connector s3-sink (org.apache.kafka.connect.runtime.Worker)
connect | [2020-02-24 23:34:24,182] INFO SinkConnectorConfig values:
connect | config.action.reload = restart
connect | connector.class = io.confluent.connect.s3.S3SinkConnector
connect | errors.deadletterqueue.context.headers.enable = false
connect | errors.deadletterqueue.topic.name =
connect | errors.deadletterqueue.topic.replication.factor = 3
connect | errors.log.enable = false
connect | errors.log.include.messages = false
connect | errors.retry.delay.max.ms = 60000
connect | errors.retry.timeout = 0
connect | errors.tolerance = none
connect | header.converter = null
connect | key.converter = class org.apache.kafka.connect.storage.StringConverter
connect | name = s3-sink
connect | tasks.max = 1
connect | topics = []
connect | topics.regex = .*
connect | transforms = []
connect | value.converter = class com.blueapron.connect.protobuf.ProtobufConverter
connect | (org.apache.kafka.connect.runtime.SinkConnectorConfig)
connect | [2020-02-24 23:34:24,183] INFO EnrichedConnectorConfig values:
connect | config.action.reload = restart
connect | connector.class = io.confluent.connect.s3.S3SinkConnector
connect | errors.deadletterqueue.context.headers.enable = false
connect | errors.deadletterqueue.topic.name =
connect |
errors.deadletterqueue.topic.replication.factor = 3 connect | errors.log.enable = false connect | errors.log.include.messages = false connect | errors.retry.delay.max.ms = 60000 connect | errors.retry.timeout = 0 connect | errors.tolerance = none connect | header.converter = null connect | key.converter = class org.apache.kafka.connect.storage.StringConverter connect | name = s3-sink connect | tasks.max = 1 connect | topics = [] connect | topics.regex = .* connect | transforms = [] connect | value.converter = class com.blueapron.connect.protobuf.ProtobufConverter connect | (org.apache.kafka.connect.runtime.ConnectorConfig$EnrichedConnectorConfig) connect | [2020-02-24 23:34:25,138] INFO [Worker clientId=connect-1, groupId=tracking-kafka-connect-s3] Tasks [s3-sink-0] configs updated (org.apache.kafka.connect.runtime.distributed.DistributedHerder) connect | [2020-02-24 23:34:25,140] INFO [Worker clientId=connect-1, groupId=tracking-kafka-connect-s3] Finished starting connectors and tasks (org.apache.kafka.connect.runtime.distributed.DistributedHerder) connect | [2020-02-24 23:34:25,143] INFO [Worker clientId=connect-1, groupId=tracking-kafka-connect-s3] Handling task config update by restarting tasks [] (org.apache.kafka.connect.runtime.distributed.DistributedHerder) connect | [2020-02-24 23:34:25,144] INFO [Worker clientId=connect-1, groupId=tracking-kafka-connect-s3] Rebalance started (org.apache.kafka.connect.runtime.distributed.WorkerCoordinator) connect | [2020-02-24 23:34:25,145] INFO [Worker clientId=connect-1, groupId=tracking-kafka-connect-s3] (Re-)joining group (org.apache.kafka.clients.consumer.internals.AbstractCoordinator) connect | [2020-02-24 23:34:25,160] INFO [Worker clientId=connect-1, groupId=tracking-kafka-connect-s3] Successfully joined group with generation 3 (org.apache.kafka.clients.consumer.internals.AbstractCoordinator) connect | [2020-02-24 23:34:25,161] INFO [Worker clientId=connect-1, groupId=tracking-kafka-connect-s3] Joined group at generation 3 with protocol version 2 and got assignment: Assignment{error=0, leader='connect-1-76ef6d4f-e2d8-4218-b914-4c3423f2f2f9', leaderUrl='http://connect:8083/', offset=4, connectorIds=[s3-sink], taskIds=[s3-sink-0], revokedConnectorIds=[], revokedTaskIds=[], delay=0} with rebalance delay: 0 (org.apache.kafka.connect.runtime.distributed.DistributedHerder) connect | [2020-02-24 23:34:25,162] INFO [Worker clientId=connect-1, groupId=tracking-kafka-connect-s3] Starting connectors and tasks using config offset 4 (org.apache.kafka.connect.runtime.distributed.DistributedHerder) connect | [2020-02-24 23:34:25,164] INFO [Worker clientId=connect-1, groupId=tracking-kafka-connect-s3] Starting task s3-sink-0 (org.apache.kafka.connect.runtime.distributed.DistributedHerder) connect | [2020-02-24 23:34:25,164] INFO Creating task s3-sink-0 (org.apache.kafka.connect.runtime.Worker) connect | [2020-02-24 23:34:25,168] INFO ConnectorConfig values: connect | config.action.reload = restart connect | connector.class = io.confluent.connect.s3.S3SinkConnector connect | errors.log.enable = false connect | errors.log.include.messages = false connect | errors.retry.delay.max.ms = 60000 connect | errors.retry.timeout = 0 connect | errors.tolerance = none connect | header.converter = null connect | key.converter = class org.apache.kafka.connect.storage.StringConverter connect | name = s3-sink connect | tasks.max = 1 connect | transforms = [] connect | value.converter = class com.blueapron.connect.protobuf.ProtobufConverter connect | 
(org.apache.kafka.connect.runtime.ConnectorConfig) connect | [2020-02-24 23:34:25,168] INFO EnrichedConnectorConfig values: connect | config.action.reload = restart connect | connector.class = io.confluent.connect.s3.S3SinkConnector connect | errors.log.enable = false connect | errors.log.include.messages = false connect | errors.retry.delay.max.ms = 60000 connect | errors.retry.timeout = 0 connect | errors.tolerance = none connect | header.converter = null connect | key.converter = class org.apache.kafka.connect.storage.StringConverter connect | name = s3-sink connect | tasks.max = 1 connect | transforms = [] connect | value.converter = class com.blueapron.connect.protobuf.ProtobufConverter connect | (org.apache.kafka.connect.runtime.ConnectorConfig$EnrichedConnectorConfig) connect | [2020-02-24 23:34:25,172] INFO TaskConfig values: connect | task.class = class io.confluent.connect.s3.S3SinkTask connect | (org.apache.kafka.connect.runtime.TaskConfig) connect | [2020-02-24 23:34:25,174] INFO Instantiated task s3-sink-0 with version 5.4.0 of type io.confluent.connect.s3.S3SinkTask (org.apache.kafka.connect.runtime.Worker) connect | [2020-02-24 23:34:25,176] INFO StringConverterConfig values: connect | converter.encoding = UTF8 connect | converter.type = key connect | (org.apache.kafka.connect.storage.StringConverterConfig) connect | [2020-02-24 23:34:25,347] INFO Set up the key converter class org.apache.kafka.connect.storage.StringConverter for task s3-sink-0 using the connector config (org.apache.kafka.connect.runtime.Worker) connect | [2020-02-24 23:34:25,347] INFO Set up the value converter class com.blueapron.connect.protobuf.ProtobufConverter for task s3-sink-0 using the connector config (org.apache.kafka.connect.runtime.Worker) connect | [2020-02-24 23:34:25,348] INFO Set up the header converter class org.apache.kafka.connect.storage.SimpleHeaderConverter for task s3-sink-0 using the worker config (org.apache.kafka.connect.runtime.Worker) connect | [2020-02-24 23:34:25,354] INFO Initializing: org.apache.kafka.connect.runtime.TransformationChain{} (org.apache.kafka.connect.runtime.Worker) connect | [2020-02-24 23:34:25,355] INFO SinkConnectorConfig values: connect | config.action.reload = restart connect | connector.class = io.confluent.connect.s3.S3SinkConnector connect | errors.deadletterqueue.context.headers.enable = false connect | errors.deadletterqueue.topic.name = connect | errors.deadletterqueue.topic.replication.factor = 3 connect | errors.log.enable = false connect | errors.log.include.messages = false connect | errors.retry.delay.max.ms = 60000 connect | errors.retry.timeout = 0 connect | errors.tolerance = none connect | header.converter = null connect | key.converter = class org.apache.kafka.connect.storage.StringConverter connect | name = s3-sink connect | tasks.max = 1 connect | topics = [] connect | topics.regex = .* connect | transforms = [] connect | value.converter = class com.blueapron.connect.protobuf.ProtobufConverter connect | (org.apache.kafka.connect.runtime.SinkConnectorConfig) connect | [2020-02-24 23:34:25,355] INFO EnrichedConnectorConfig values: connect | config.action.reload = restart connect | connector.class = io.confluent.connect.s3.S3SinkConnector connect | errors.deadletterqueue.context.headers.enable = false connect | errors.deadletterqueue.topic.name = connect | errors.deadletterqueue.topic.replication.factor = 3 connect | errors.log.enable = false connect | errors.log.include.messages = false connect | errors.retry.delay.max.ms = 60000 connect | 
errors.retry.timeout = 0 connect | errors.tolerance = none connect | header.converter = null connect | key.converter = class org.apache.kafka.connect.storage.StringConverter connect | name = s3-sink connect | tasks.max = 1 connect | topics = [] connect | topics.regex = .* connect | transforms = [] connect | value.converter = class com.blueapron.connect.protobuf.ProtobufConverter connect | (org.apache.kafka.connect.runtime.ConnectorConfig$EnrichedConnectorConfig) connect | [2020-02-24 23:34:25,358] INFO ConsumerConfig values: connect | allow.auto.create.topics = true connect | auto.commit.interval.ms = 5000 connect | auto.offset.reset = earliest connect | bootstrap.servers = [broker:29092] connect | check.crcs = true connect | client.dns.lookup = default connect | client.id = connector-consumer-s3-sink-0 connect | client.rack = connect | connections.max.idle.ms = 540000 connect | default.api.timeout.ms = 60000 connect | enable.auto.commit = false connect | exclude.internal.topics = true connect | fetch.max.bytes = 52428800 connect | fetch.max.wait.ms = 500 connect | fetch.min.bytes = 1 connect | group.id = connect-s3-sink connect | group.instance.id = null connect | heartbeat.interval.ms = 3000 connect | interceptor.classes = [] connect | internal.leave.group.on.close = true connect | isolation.level = read_uncommitted connect | key.deserializer = class org.apache.kafka.common.serialization.ByteArrayDeserializer connect | max.partition.fetch.bytes = 1048576 connect | max.poll.interval.ms = 300000 connect | max.poll.records = 500 connect | metadata.max.age.ms = 300000 connect | metric.reporters = [] connect | metrics.num.samples = 2 connect | metrics.recording.level = INFO connect | metrics.sample.window.ms = 30000 connect | partition.assignment.strategy = [class org.apache.kafka.clients.consumer.RangeAssignor] connect | receive.buffer.bytes = 65536 connect | reconnect.backoff.max.ms = 1000 connect | reconnect.backoff.ms = 50 connect | request.timeout.ms = 30000 connect | retry.backoff.ms = 100 connect | sasl.client.callback.handler.class = null connect | sasl.jaas.config = null connect | sasl.kerberos.kinit.cmd = /usr/bin/kinit connect | sasl.kerberos.min.time.before.relogin = 60000 connect | sasl.kerberos.service.name = null connect | sasl.kerberos.ticket.renew.jitter = 0.05 connect | sasl.kerberos.ticket.renew.window.factor = 0.8 connect | sasl.login.callback.handler.class = null connect | sasl.login.class = null connect | sasl.login.refresh.buffer.seconds = 300 connect | sasl.login.refresh.min.period.seconds = 60 connect | sasl.login.refresh.window.factor = 0.8 connect | sasl.login.refresh.window.jitter = 0.05 connect | sasl.mechanism = GSSAPI connect | security.protocol = PLAINTEXT connect | security.providers = null connect | send.buffer.bytes = 131072 connect | session.timeout.ms = 10000 connect | ssl.cipher.suites = null connect | ssl.enabled.protocols = [TLSv1.2, TLSv1.1, TLSv1] connect | ssl.endpoint.identification.algorithm = https connect | ssl.key.password = null connect | ssl.keymanager.algorithm = SunX509 connect | ssl.keystore.location = null connect | ssl.keystore.password = null connect | ssl.keystore.type = JKS connect | ssl.protocol = TLS connect | ssl.provider = null connect | ssl.secure.random.implementation = null connect | ssl.trustmanager.algorithm = PKIX connect | ssl.truststore.location = null connect | ssl.truststore.password = null connect | ssl.truststore.type = JKS connect | value.deserializer = class org.apache.kafka.common.serialization.ByteArrayDeserializer 
connect | (org.apache.kafka.clients.consumer.ConsumerConfig) connect | [2020-02-24 23:34:25,367] INFO Kafka version: 5.4.0-ccs (org.apache.kafka.common.utils.AppInfoParser) connect | [2020-02-24 23:34:25,367] INFO Kafka commitId: 9401111f8c4bb0ec (org.apache.kafka.common.utils.AppInfoParser) connect | [2020-02-24 23:34:25,367] INFO Kafka startTimeMs: 1582587265367 (org.apache.kafka.common.utils.AppInfoParser) connect | [2020-02-24 23:34:25,379] INFO [Worker clientId=connect-1, groupId=tracking-kafka-connect-s3] Finished starting connectors and tasks (org.apache.kafka.connect.runtime.distributed.DistributedHerder) connect | [2020-02-24 23:34:25,383] INFO [Consumer clientId=connector-consumer-s3-sink-0, groupId=connect-s3-sink] Subscribed to pattern: '.*' (org.apache.kafka.clients.consumer.KafkaConsumer) connect | [2020-02-24 23:34:25,387] INFO S3SinkConnectorConfig values: connect | avro.codec = null connect | behavior.on.null.values = fail connect | connect.meta.data = true connect | enhanced.avro.schema.support = false connect | filename.offset.zero.pad.width = 10 connect | flush.size = 100000 connect | format.bytearray.extension = .bin connect | format.bytearray.separator = null connect | format.class = class io.confluent.connect.s3.format.parquet.ParquetFormat connect | parquet.codec = snappy connect | retry.backoff.ms = 5000 connect | rotate.interval.ms = 900 connect | rotate.schedule.interval.ms = -1 connect | s3.acl.canned = null connect | s3.bucket.name = dev-bagi connect | s3.compression.type = none connect | s3.credentials.provider.class = class com.amazonaws.auth.DefaultAWSCredentialsProviderChain connect | s3.http.send.expect.continue = true connect | s3.part.retries = 3 connect | s3.part.size = 5242880 connect | s3.proxy.password = [hidden] connect | s3.proxy.url = connect | s3.proxy.user = null connect | s3.region = us-east-1 connect | s3.retry.backoff.ms = 200 connect | s3.sse.customer.key = [hidden] connect | s3.sse.kms.key.id = connect | s3.ssea.name = connect | s3.wan.mode = false connect | schema.cache.size = 1000 connect | schema.compatibility = NONE connect | shutdown.timeout.ms = 3000 connect | (io.confluent.connect.s3.S3SinkConnectorConfig) connect | [2020-02-24 23:34:25,389] INFO StorageCommonConfig values: connect | directory.delim = / connect | file.delim = _ connect | storage.class = class io.confluent.connect.s3.storage.S3Storage connect | store.url = null connect | topics.dir = topics connect | (io.confluent.connect.storage.common.StorageCommonConfig) connect | [2020-02-24 23:34:25,394] INFO PartitionerConfig values: connect | locale = en-US connect | partition.duration.ms = 900000 connect | partition.field.name = [] connect | partitioner.class = class io.confluent.connect.storage.partitioner.TimeBasedPartitioner connect | path.format = 'dt'=YYYY-MM-dd connect | timestamp.extractor = RecordField connect | timestamp.field = event_ts connect | timestamp.unit = s connect | timezone = UTC connect | (io.confluent.connect.storage.partitioner.PartitionerConfig) connect | [2020-02-24 23:34:26,679] INFO AvroDataConfig values: connect | connect.meta.data = true connect | enhanced.avro.schema.support = false connect | schemas.cache.config = 1000 connect | (io.confluent.connect.avro.AvroDataConfig) connect | [2020-02-24 23:34:26,688] INFO Started S3 connector task with assigned partitions: [] (io.confluent.connect.s3.S3SinkTask) connect | [2020-02-24 23:34:26,689] INFO WorkerSinkTask{id=s3-sink-0} Sink task finished initialization and start 
(org.apache.kafka.connect.runtime.WorkerSinkTask)
connect | [2020-02-24 23:34:26,711] INFO [Consumer clientId=connector-consumer-s3-sink-0, groupId=connect-s3-sink] Cluster ID: ZeSnCBNdRh6pGQwnc-VNVQ (org.apache.kafka.clients.Metadata)
connect | [2020-02-24 23:34:26,712] INFO [Consumer clientId=connector-consumer-s3-sink-0, groupId=connect-s3-sink] Discovered group coordinator broker:29092 (id: 2147483646 rack: null) (org.apache.kafka.clients.consumer.internals.AbstractCoordinator)
connect | [2020-02-24 23:34:26,718] INFO [Consumer clientId=connector-consumer-s3-sink-0, groupId=connect-s3-sink] (Re-)joining group (org.apache.kafka.clients.consumer.internals.AbstractCoordinator)
connect | [2020-02-24 23:34:26,742] INFO [Consumer clientId=connector-consumer-s3-sink-0, groupId=connect-s3-sink] (Re-)joining group (org.apache.kafka.clients.consumer.internals.AbstractCoordinator)
connect | [2020-02-24 23:34:26,759] INFO [Consumer clientId=connector-consumer-s3-sink-0, groupId=connect-s3-sink] Finished assignment for group at generation 1: {connector-consumer-s3-sink-0-0f2a6c35-3605-4e67-b9c8-ee63c088ebfd=org.apache.kafka.clients.consumer.ConsumerPartitionAssignor$Assignment@5638a064} (org.apache.kafka.clients.consumer.internals.ConsumerCoordinator)
connect | [2020-02-24 23:34:26,768] INFO [Consumer clientId=connector-consumer-s3-sink-0, groupId=connect-s3-sink] Successfully joined group with generation 1 (org.apache.kafka.clients.consumer.internals.AbstractCoordinator)
connect | [2020-02-24 23:34:26,778] INFO [Consumer clientId=connector-consumer-s3-sink-0, groupId=connect-s3-sink] Adding newly assigned partitions: tracking-kafka-connect-s3-offsets-24, tracking-kafka-connect-s3-offsets-22, tracking-kafka-connect-s3-offsets-3, tracking-kafka-connect-s3-offsets-1, tracking-kafka-connect-s3-status-4, tracking-kafka-connect-s3-status-2, tracking-kafka-connect-s3-offsets-11, tracking-kafka-connect-s3-offsets-9, tracking-kafka-connect-s3-offsets-7, tracking-kafka-connect-s3-offsets-5, tracking-kafka-connect-s3-offsets-19, tracking-kafka-connect-s3-offsets-17, tracking-kafka-connect-s3-offsets-15, tracking-kafka-connect-s3-offsets-13, tracking-kafka-connect-s3-offsets-23, tracking-kafka-connect-s3-offsets-21, tracking-kafka-connect-s3-offsets-4, tracking-kafka-connect-s3-offsets-2, tracking-kafka-connect-s3-offsets-0, tracking-kafka-connect-s3-status-3, tracking-kafka-connect-s3-status-1, tracking-kafka-connect-s3-offsets-12, tracking-kafka-connect-s3-configs-0, tracking-kafka-connect-s3-status-0, tracking-kafka-connect-s3-offsets-10, tracking-kafka-connect-s3-offsets-8, tracking-kafka-connect-s3-offsets-6, tracking-kafka-connect-s3-offsets-20, tracking-kafka-connect-s3-offsets-18, tracking-kafka-connect-s3-offsets-16, tracking-kafka-connect-s3-offsets-14, __confluent.support.metrics-0 (org.apache.kafka.clients.consumer.internals.ConsumerCoordinator)
connect | [2020-02-24 23:34:26,796] INFO [Consumer clientId=connector-consumer-s3-sink-0, groupId=connect-s3-sink] Found no committed offset for partition tracking-kafka-connect-s3-offsets-24 (org.apache.kafka.clients.consumer.internals.ConsumerCoordinator)
connect | [2020-02-24 23:34:26,796] INFO [Consumer clientId=connector-consumer-s3-sink-0, groupId=connect-s3-sink] Found no committed offset for partition tracking-kafka-connect-s3-offsets-22 (org.apache.kafka.clients.consumer.internals.ConsumerCoordinator)
connect | [2020-02-24 23:34:26,796] INFO [Consumer clientId=connector-consumer-s3-sink-0, groupId=connect-s3-sink] Found no committed offset for
partition tracking-kafka-connect-s3-offsets-3 (org.apache.kafka.clients.consumer.internals.ConsumerCoordinator) connect | [2020-02-24 23:34:26,796] INFO [Consumer clientId=connector-consumer-s3-sink-0, groupId=connect-s3-sink] Found no committed offset for partition tracking-kafka-connect-s3-offsets-1 (org.apache.kafka.clients.consumer.internals.ConsumerCoordinator) connect | [2020-02-24 23:34:26,797] INFO [Consumer clientId=connector-consumer-s3-sink-0, groupId=connect-s3-sink] Found no committed offset for partition tracking-kafka-connect-s3-status-4 (org.apache.kafka.clients.consumer.internals.ConsumerCoordinator) connect | [2020-02-24 23:34:26,797] INFO [Consumer clientId=connector-consumer-s3-sink-0, groupId=connect-s3-sink] Found no committed offset for partition tracking-kafka-connect-s3-status-2 (org.apache.kafka.clients.consumer.internals.ConsumerCoordinator) connect | [2020-02-24 23:34:26,797] INFO [Consumer clientId=connector-consumer-s3-sink-0, groupId=connect-s3-sink] Found no committed offset for partition tracking-kafka-connect-s3-offsets-11 (org.apache.kafka.clients.consumer.internals.ConsumerCoordinator) connect | [2020-02-24 23:34:26,797] INFO [Consumer clientId=connector-consumer-s3-sink-0, groupId=connect-s3-sink] Found no committed offset for partition tracking-kafka-connect-s3-offsets-9 (org.apache.kafka.clients.consumer.internals.ConsumerCoordinator) connect | [2020-02-24 23:34:26,797] INFO [Consumer clientId=connector-consumer-s3-sink-0, groupId=connect-s3-sink] Found no committed offset for partition tracking-kafka-connect-s3-offsets-7 (org.apache.kafka.clients.consumer.internals.ConsumerCoordinator) connect | [2020-02-24 23:34:26,797] INFO [Consumer clientId=connector-consumer-s3-sink-0, groupId=connect-s3-sink] Found no committed offset for partition tracking-kafka-connect-s3-offsets-5 (org.apache.kafka.clients.consumer.internals.ConsumerCoordinator) connect | [2020-02-24 23:34:26,797] INFO [Consumer clientId=connector-consumer-s3-sink-0, groupId=connect-s3-sink] Found no committed offset for partition tracking-kafka-connect-s3-offsets-19 (org.apache.kafka.clients.consumer.internals.ConsumerCoordinator) connect | [2020-02-24 23:34:26,797] INFO [Consumer clientId=connector-consumer-s3-sink-0, groupId=connect-s3-sink] Found no committed offset for partition tracking-kafka-connect-s3-offsets-17 (org.apache.kafka.clients.consumer.internals.ConsumerCoordinator) connect | [2020-02-24 23:34:26,797] INFO [Consumer clientId=connector-consumer-s3-sink-0, groupId=connect-s3-sink] Found no committed offset for partition tracking-kafka-connect-s3-offsets-15 (org.apache.kafka.clients.consumer.internals.ConsumerCoordinator) connect | [2020-02-24 23:34:26,797] INFO [Consumer clientId=connector-consumer-s3-sink-0, groupId=connect-s3-sink] Found no committed offset for partition tracking-kafka-connect-s3-offsets-13 (org.apache.kafka.clients.consumer.internals.ConsumerCoordinator) connect | [2020-02-24 23:34:26,797] INFO [Consumer clientId=connector-consumer-s3-sink-0, groupId=connect-s3-sink] Found no committed offset for partition tracking-kafka-connect-s3-offsets-23 (org.apache.kafka.clients.consumer.internals.ConsumerCoordinator) connect | [2020-02-24 23:34:26,797] INFO [Consumer clientId=connector-consumer-s3-sink-0, groupId=connect-s3-sink] Found no committed offset for partition tracking-kafka-connect-s3-offsets-21 (org.apache.kafka.clients.consumer.internals.ConsumerCoordinator) connect | [2020-02-24 23:34:26,797] INFO [Consumer clientId=connector-consumer-s3-sink-0, 
groupId=connect-s3-sink] Found no committed offset for partition tracking-kafka-connect-s3-offsets-4 (org.apache.kafka.clients.consumer.internals.ConsumerCoordinator) connect | [2020-02-24 23:34:26,797] INFO [Consumer clientId=connector-consumer-s3-sink-0, groupId=connect-s3-sink] Found no committed offset for partition tracking-kafka-connect-s3-offsets-2 (org.apache.kafka.clients.consumer.internals.ConsumerCoordinator) connect | [2020-02-24 23:34:26,797] INFO [Consumer clientId=connector-consumer-s3-sink-0, groupId=connect-s3-sink] Found no committed offset for partition tracking-kafka-connect-s3-offsets-0 (org.apache.kafka.clients.consumer.internals.ConsumerCoordinator) connect | [2020-02-24 23:34:26,798] INFO [Consumer clientId=connector-consumer-s3-sink-0, groupId=connect-s3-sink] Found no committed offset for partition tracking-kafka-connect-s3-status-3 (org.apache.kafka.clients.consumer.internals.ConsumerCoordinator) connect | [2020-02-24 23:34:26,798] INFO [Consumer clientId=connector-consumer-s3-sink-0, groupId=connect-s3-sink] Found no committed offset for partition tracking-kafka-connect-s3-status-1 (org.apache.kafka.clients.consumer.internals.ConsumerCoordinator) connect | [2020-02-24 23:34:26,798] INFO [Consumer clientId=connector-consumer-s3-sink-0, groupId=connect-s3-sink] Found no committed offset for partition tracking-kafka-connect-s3-offsets-12 (org.apache.kafka.clients.consumer.internals.ConsumerCoordinator) connect | [2020-02-24 23:34:26,798] INFO [Consumer clientId=connector-consumer-s3-sink-0, groupId=connect-s3-sink] Found no committed offset for partition tracking-kafka-connect-s3-configs-0 (org.apache.kafka.clients.consumer.internals.ConsumerCoordinator) connect | [2020-02-24 23:34:26,798] INFO [Consumer clientId=connector-consumer-s3-sink-0, groupId=connect-s3-sink] Found no committed offset for partition tracking-kafka-connect-s3-status-0 (org.apache.kafka.clients.consumer.internals.ConsumerCoordinator) connect | [2020-02-24 23:34:26,798] INFO [Consumer clientId=connector-consumer-s3-sink-0, groupId=connect-s3-sink] Found no committed offset for partition tracking-kafka-connect-s3-offsets-10 (org.apache.kafka.clients.consumer.internals.ConsumerCoordinator) connect | [2020-02-24 23:34:26,798] INFO [Consumer clientId=connector-consumer-s3-sink-0, groupId=connect-s3-sink] Found no committed offset for partition tracking-kafka-connect-s3-offsets-8 (org.apache.kafka.clients.consumer.internals.ConsumerCoordinator) connect | [2020-02-24 23:34:26,799] INFO [Consumer clientId=connector-consumer-s3-sink-0, groupId=connect-s3-sink] Found no committed offset for partition tracking-kafka-connect-s3-offsets-6 (org.apache.kafka.clients.consumer.internals.ConsumerCoordinator) connect | [2020-02-24 23:34:26,799] INFO [Consumer clientId=connector-consumer-s3-sink-0, groupId=connect-s3-sink] Found no committed offset for partition tracking-kafka-connect-s3-offsets-20 (org.apache.kafka.clients.consumer.internals.ConsumerCoordinator) connect | [2020-02-24 23:34:26,799] INFO [Consumer clientId=connector-consumer-s3-sink-0, groupId=connect-s3-sink] Found no committed offset for partition tracking-kafka-connect-s3-offsets-18 (org.apache.kafka.clients.consumer.internals.ConsumerCoordinator) connect | [2020-02-24 23:34:26,799] INFO [Consumer clientId=connector-consumer-s3-sink-0, groupId=connect-s3-sink] Found no committed offset for partition tracking-kafka-connect-s3-offsets-16 (org.apache.kafka.clients.consumer.internals.ConsumerCoordinator) connect | [2020-02-24 23:34:26,800] INFO 
[Consumer clientId=connector-consumer-s3-sink-0, groupId=connect-s3-sink] Found no committed offset for partition tracking-kafka-connect-s3-offsets-14 (org.apache.kafka.clients.consumer.internals.ConsumerCoordinator) connect | [2020-02-24 23:34:26,800] INFO [Consumer clientId=connector-consumer-s3-sink-0, groupId=connect-s3-sink] Found no committed offset for partition __confluent.support.metrics-0 (org.apache.kafka.clients.consumer.internals.ConsumerCoordinator) connect | [2020-02-24 23:34:26,816] INFO [Consumer clientId=connector-consumer-s3-sink-0, groupId=connect-s3-sink] Resetting offset for partition tracking-kafka-connect-s3-offsets-24 to offset 0. (org.apache.kafka.clients.consumer.internals.SubscriptionState) connect | [2020-02-24 23:34:26,817] INFO [Consumer clientId=connector-consumer-s3-sink-0, groupId=connect-s3-sink] Resetting offset for partition tracking-kafka-connect-s3-offsets-22 to offset 0. (org.apache.kafka.clients.consumer.internals.SubscriptionState) connect | [2020-02-24 23:34:26,818] INFO [Consumer clientId=connector-consumer-s3-sink-0, groupId=connect-s3-sink] Resetting offset for partition tracking-kafka-connect-s3-offsets-3 to offset 0. (org.apache.kafka.clients.consumer.internals.SubscriptionState) connect | [2020-02-24 23:34:26,818] INFO [Consumer clientId=connector-consumer-s3-sink-0, groupId=connect-s3-sink] Resetting offset for partition tracking-kafka-connect-s3-offsets-1 to offset 0. (org.apache.kafka.clients.consumer.internals.SubscriptionState) connect | [2020-02-24 23:34:26,818] INFO [Consumer clientId=connector-consumer-s3-sink-0, groupId=connect-s3-sink] Resetting offset for partition tracking-kafka-connect-s3-status-4 to offset 0. (org.apache.kafka.clients.consumer.internals.SubscriptionState) connect | [2020-02-24 23:34:26,819] INFO [Consumer clientId=connector-consumer-s3-sink-0, groupId=connect-s3-sink] Resetting offset for partition tracking-kafka-connect-s3-status-2 to offset 0. (org.apache.kafka.clients.consumer.internals.SubscriptionState) connect | [2020-02-24 23:34:26,819] INFO [Consumer clientId=connector-consumer-s3-sink-0, groupId=connect-s3-sink] Resetting offset for partition tracking-kafka-connect-s3-offsets-11 to offset 0. (org.apache.kafka.clients.consumer.internals.SubscriptionState) connect | [2020-02-24 23:34:26,820] INFO [Consumer clientId=connector-consumer-s3-sink-0, groupId=connect-s3-sink] Resetting offset for partition tracking-kafka-connect-s3-offsets-9 to offset 0. (org.apache.kafka.clients.consumer.internals.SubscriptionState) connect | [2020-02-24 23:34:26,820] INFO [Consumer clientId=connector-consumer-s3-sink-0, groupId=connect-s3-sink] Resetting offset for partition tracking-kafka-connect-s3-offsets-7 to offset 0. (org.apache.kafka.clients.consumer.internals.SubscriptionState) connect | [2020-02-24 23:34:26,820] INFO [Consumer clientId=connector-consumer-s3-sink-0, groupId=connect-s3-sink] Resetting offset for partition tracking-kafka-connect-s3-offsets-5 to offset 0. (org.apache.kafka.clients.consumer.internals.SubscriptionState) connect | [2020-02-24 23:34:26,821] INFO [Consumer clientId=connector-consumer-s3-sink-0, groupId=connect-s3-sink] Resetting offset for partition tracking-kafka-connect-s3-offsets-19 to offset 0. (org.apache.kafka.clients.consumer.internals.SubscriptionState) connect | [2020-02-24 23:34:26,821] INFO [Consumer clientId=connector-consumer-s3-sink-0, groupId=connect-s3-sink] Resetting offset for partition tracking-kafka-connect-s3-offsets-17 to offset 0. 
(org.apache.kafka.clients.consumer.internals.SubscriptionState) connect | [2020-02-24 23:34:26,822] INFO [Consumer clientId=connector-consumer-s3-sink-0, groupId=connect-s3-sink] Resetting offset for partition tracking-kafka-connect-s3-offsets-15 to offset 0. (org.apache.kafka.clients.consumer.internals.SubscriptionState) connect | [2020-02-24 23:34:26,822] INFO [Consumer clientId=connector-consumer-s3-sink-0, groupId=connect-s3-sink] Resetting offset for partition tracking-kafka-connect-s3-offsets-13 to offset 0. (org.apache.kafka.clients.consumer.internals.SubscriptionState) connect | [2020-02-24 23:34:26,822] INFO [Consumer clientId=connector-consumer-s3-sink-0, groupId=connect-s3-sink] Resetting offset for partition tracking-kafka-connect-s3-offsets-23 to offset 0. (org.apache.kafka.clients.consumer.internals.SubscriptionState) connect | [2020-02-24 23:34:26,823] INFO [Consumer clientId=connector-consumer-s3-sink-0, groupId=connect-s3-sink] Resetting offset for partition tracking-kafka-connect-s3-offsets-21 to offset 0. (org.apache.kafka.clients.consumer.internals.SubscriptionState) connect | [2020-02-24 23:34:26,823] INFO [Consumer clientId=connector-consumer-s3-sink-0, groupId=connect-s3-sink] Resetting offset for partition tracking-kafka-connect-s3-offsets-4 to offset 0. (org.apache.kafka.clients.consumer.internals.SubscriptionState) connect | [2020-02-24 23:34:26,823] INFO [Consumer clientId=connector-consumer-s3-sink-0, groupId=connect-s3-sink] Resetting offset for partition tracking-kafka-connect-s3-offsets-2 to offset 0. (org.apache.kafka.clients.consumer.internals.SubscriptionState) connect | [2020-02-24 23:34:26,823] INFO [Consumer clientId=connector-consumer-s3-sink-0, groupId=connect-s3-sink] Resetting offset for partition tracking-kafka-connect-s3-status-3 to offset 0. (org.apache.kafka.clients.consumer.internals.SubscriptionState) connect | [2020-02-24 23:34:26,824] INFO [Consumer clientId=connector-consumer-s3-sink-0, groupId=connect-s3-sink] Resetting offset for partition tracking-kafka-connect-s3-offsets-0 to offset 0. (org.apache.kafka.clients.consumer.internals.SubscriptionState) connect | [2020-02-24 23:34:26,824] INFO [Consumer clientId=connector-consumer-s3-sink-0, groupId=connect-s3-sink] Resetting offset for partition tracking-kafka-connect-s3-status-1 to offset 0. (org.apache.kafka.clients.consumer.internals.SubscriptionState) connect | [2020-02-24 23:34:26,825] INFO [Consumer clientId=connector-consumer-s3-sink-0, groupId=connect-s3-sink] Resetting offset for partition tracking-kafka-connect-s3-offsets-12 to offset 0. (org.apache.kafka.clients.consumer.internals.SubscriptionState) connect | [2020-02-24 23:34:26,825] INFO [Consumer clientId=connector-consumer-s3-sink-0, groupId=connect-s3-sink] Resetting offset for partition tracking-kafka-connect-s3-configs-0 to offset 0. (org.apache.kafka.clients.consumer.internals.SubscriptionState) connect | [2020-02-24 23:34:26,826] INFO [Consumer clientId=connector-consumer-s3-sink-0, groupId=connect-s3-sink] Resetting offset for partition tracking-kafka-connect-s3-status-0 to offset 0. (org.apache.kafka.clients.consumer.internals.SubscriptionState) connect | [2020-02-24 23:34:26,827] INFO [Consumer clientId=connector-consumer-s3-sink-0, groupId=connect-s3-sink] Resetting offset for partition tracking-kafka-connect-s3-offsets-10 to offset 0. 
(org.apache.kafka.clients.consumer.internals.SubscriptionState) connect | [2020-02-24 23:34:26,827] INFO [Consumer clientId=connector-consumer-s3-sink-0, groupId=connect-s3-sink] Resetting offset for partition tracking-kafka-connect-s3-offsets-8 to offset 0. (org.apache.kafka.clients.consumer.internals.SubscriptionState) connect | [2020-02-24 23:34:26,827] INFO [Consumer clientId=connector-consumer-s3-sink-0, groupId=connect-s3-sink] Resetting offset for partition tracking-kafka-connect-s3-offsets-6 to offset 0. (org.apache.kafka.clients.consumer.internals.SubscriptionState) connect | [2020-02-24 23:34:26,828] INFO [Consumer clientId=connector-consumer-s3-sink-0, groupId=connect-s3-sink] Resetting offset for partition tracking-kafka-connect-s3-offsets-20 to offset 0. (org.apache.kafka.clients.consumer.internals.SubscriptionState) connect | [2020-02-24 23:34:26,828] INFO [Consumer clientId=connector-consumer-s3-sink-0, groupId=connect-s3-sink] Resetting offset for partition tracking-kafka-connect-s3-offsets-18 to offset 0. (org.apache.kafka.clients.consumer.internals.SubscriptionState) connect | [2020-02-24 23:34:26,829] INFO [Consumer clientId=connector-consumer-s3-sink-0, groupId=connect-s3-sink] Resetting offset for partition tracking-kafka-connect-s3-offsets-16 to offset 0. (org.apache.kafka.clients.consumer.internals.SubscriptionState) connect | [2020-02-24 23:34:26,829] INFO [Consumer clientId=connector-consumer-s3-sink-0, groupId=connect-s3-sink] Resetting offset for partition tracking-kafka-connect-s3-offsets-14 to offset 0. (org.apache.kafka.clients.consumer.internals.SubscriptionState) connect | [2020-02-24 23:34:26,830] INFO [Consumer clientId=connector-consumer-s3-sink-0, groupId=connect-s3-sink] Resetting offset for partition __confluent.support.metrics-0 to offset 0. 
(org.apache.kafka.clients.consumer.internals.SubscriptionState)
connect | [2020-02-24 23:34:26,887] ERROR WorkerSinkTask{id=s3-sink-0} Task threw an uncaught and unrecoverable exception (org.apache.kafka.connect.runtime.WorkerTask)
connect | org.apache.kafka.connect.errors.ConnectException: Tolerance exceeded in error handler
connect | at org.apache.kafka.connect.runtime.errors.RetryWithToleranceOperator.execAndHandleError(RetryWithToleranceOperator.java:178)
connect | at org.apache.kafka.connect.runtime.errors.RetryWithToleranceOperator.execute(RetryWithToleranceOperator.java:104)
connect | at org.apache.kafka.connect.runtime.WorkerSinkTask.convertAndTransformRecord(WorkerSinkTask.java:488)
connect | at org.apache.kafka.connect.runtime.WorkerSinkTask.convertMessages(WorkerSinkTask.java:465)
connect | at org.apache.kafka.connect.runtime.WorkerSinkTask.poll(WorkerSinkTask.java:321)
connect | at org.apache.kafka.connect.runtime.WorkerSinkTask.iteration(WorkerSinkTask.java:224)
connect | at org.apache.kafka.connect.runtime.WorkerSinkTask.execute(WorkerSinkTask.java:192)
connect | at org.apache.kafka.connect.runtime.WorkerTask.doRun(WorkerTask.java:177)
connect | at org.apache.kafka.connect.runtime.WorkerTask.run(WorkerTask.java:227)
connect | at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
connect | at java.util.concurrent.FutureTask.run(FutureTask.java:266)
connect | at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
connect | at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
connect | at java.lang.Thread.run(Thread.java:748)
connect | Caused by: org.apache.kafka.connect.errors.DataException: Invalid protobuf data
connect | at com.blueapron.connect.protobuf.ProtobufData.getMessage(ProtobufData.java:65)
connect | at com.blueapron.connect.protobuf.ProtobufData.toConnectData(ProtobufData.java:102)
connect | at com.blueapron.connect.protobuf.ProtobufConverter.toConnectData(ProtobufConverter.java:66)
connect | at org.apache.kafka.connect.storage.Converter.toConnectData(Converter.java:86)
connect | at org.apache.kafka.connect.runtime.WorkerSinkTask.lambda$convertAndTransformRecord$2(WorkerSinkTask.java:488)
connect | at org.apache.kafka.connect.runtime.errors.RetryWithToleranceOperator.execAndRetry(RetryWithToleranceOperator.java:128)
connect | at org.apache.kafka.connect.runtime.errors.RetryWithToleranceOperator.execAndHandleError(RetryWithToleranceOperator.java:162)
connect | ... 13 more
connect | Caused by: com.google.protobuf.InvalidProtocolBufferException: While parsing a protocol message, the input ended unexpectedly in the middle of a field. This could mean either that the input has been truncated or that an embedded message misreported its own length.
connect | at com.google.protobuf.InvalidProtocolBufferException.truncatedMessage(InvalidProtocolBufferException.java:84)
connect | at com.google.protobuf.CodedInputStream$ArrayDecoder.readRawLittleEndian32(CodedInputStream.java:1140)
connect | at com.google.protobuf.CodedInputStream$ArrayDecoder.readFixed32(CodedInputStream.java:777)
connect | at com.google.protobuf.UnknownFieldSet$Builder.mergeFieldFrom(UnknownFieldSet.java:554)
connect | at com.google.protobuf.UnknownFieldSet$Builder.mergeFrom(UnknownFieldSet.java:521)
connect | at com.google.protobuf.UnknownFieldSet$Builder.mergeFrom(UnknownFieldSet.java:634)
connect | at com.google.protobuf.UnknownFieldSet$Builder.mergeFrom(UnknownFieldSet.java:308)
connect | at com.google.protobuf.CodedInputStream$ArrayDecoder.readGroup(CodedInputStream.java:833)
connect | at com.google.protobuf.UnknownFieldSet$Builder.mergeFieldFrom(UnknownFieldSet.java:548)
connect | at com.google.protobuf.UnknownFieldSet$Builder.mergeFrom(UnknownFieldSet.java:521)
connect | at com.google.protobuf.UnknownFieldSet$Builder.mergeFrom(UnknownFieldSet.java:634)
connect | at com.google.protobuf.UnknownFieldSet$Builder.mergeFrom(UnknownFieldSet.java:308)
connect | at com.google.protobuf.CodedInputStream$ArrayDecoder.readGroup(CodedInputStream.java:833)
connect | at com.google.protobuf.UnknownFieldSet$Builder.mergeFieldFrom(UnknownFieldSet.java:548)
connect | at com.google.protobuf.GeneratedMessageV3.parseUnknownField(GeneratedMessageV3.java:320)
connect | at com.bagi.protobuf.TrackingEvent$Event.<init>(TrackingEvent.java:1422)
connect | at com.bagi.protobuf.TrackingEvent$Event.<init>(TrackingEvent.java:1028)
connect | at com.bagi.protobuf.TrackingEvent$Event$1.parsePartialFrom(TrackingEvent.java:9588)
connect | at com.bagi.protobuf.TrackingEvent$Event$1.parsePartialFrom(TrackingEvent.java:9582)
connect | at com.bagi.protobuf.TrackingEvent$Event$Builder.mergeFrom(TrackingEvent.java:5003)
connect | at com.bagi.protobuf.TrackingEvent$Event$Builder.mergeFrom(TrackingEvent.java:4508)
connect | at com.google.protobuf.AbstractMessage$Builder.mergeFrom(AbstractMessage.java:420)
connect | at com.google.protobuf.AbstractMessage$Builder.mergeFrom(AbstractMessage.java:317)
connect | at com.google.protobuf.AbstractMessageLite$Builder.mergeFrom(AbstractMessageLite.java:222)
connect | at com.google.protobuf.AbstractMessage$Builder.mergeFrom(AbstractMessage.java:542)
connect | at com.google.protobuf.AbstractMessage$Builder.mergeFrom(AbstractMessage.java:317)
connect | at com.google.protobuf.AbstractMessageLite$Builder.mergeFrom(AbstractMessageLite.java:214)
connect | at com.google.protobuf.AbstractMessage$Builder.mergeFrom(AbstractMessage.java:536)
connect | at com.blueapron.connect.protobuf.ProtobufData.getMessage(ProtobufData.java:63)
connect | ... 19 more
connect | [2020-02-24 23:34:26,889] ERROR WorkerSinkTask{id=s3-sink-0} Task is being killed and will not recover until manually restarted (org.apache.kafka.connect.runtime.WorkerTask)
connect | [2020-02-24 23:34:26,890] INFO [Consumer clientId=connector-consumer-s3-sink-0, groupId=connect-s3-sink] Revoke previously assigned partitions tracking-kafka-connect-s3-offsets-24, tracking-kafka-connect-s3-offsets-22, tracking-kafka-connect-s3-offsets-3, tracking-kafka-connect-s3-offsets-1, tracking-kafka-connect-s3-status-4, tracking-kafka-connect-s3-status-2, tracking-kafka-connect-s3-offsets-11, tracking-kafka-connect-s3-offsets-9, tracking-kafka-connect-s3-offsets-7, tracking-kafka-connect-s3-offsets-5, tracking-kafka-connect-s3-offsets-19, tracking-kafka-connect-s3-offsets-17, tracking-kafka-connect-s3-offsets-15, tracking-kafka-connect-s3-offsets-13, tracking-kafka-connect-s3-offsets-23, tracking-kafka-connect-s3-offsets-21, tracking-kafka-connect-s3-offsets-4, tracking-kafka-connect-s3-offsets-2, tracking-kafka-connect-s3-offsets-0, tracking-kafka-connect-s3-status-3, tracking-kafka-connect-s3-status-1, tracking-kafka-connect-s3-offsets-12, tracking-kafka-connect-s3-status-0, tracking-kafka-connect-s3-configs-0, tracking-kafka-connect-s3-offsets-10, tracking-kafka-connect-s3-offsets-8, tracking-kafka-connect-s3-offsets-6, tracking-kafka-connect-s3-offsets-20, tracking-kafka-connect-s3-offsets-18, tracking-kafka-connect-s3-offsets-16, tracking-kafka-connect-s3-offsets-14, __confluent.support.metrics-0 (org.apache.kafka.clients.consumer.internals.ConsumerCoordinator)
connect | [2020-02-24 23:34:26,891] INFO [Consumer clientId=connector-consumer-s3-sink-0, groupId=connect-s3-sink] Member connector-consumer-s3-sink-0-0f2a6c35-3605-4e67-b9c8-ee63c088ebfd sending LeaveGroup request to coordinator broker:29092 (id: 2147483646 rack: null) due to the consumer is being closed (org.apache.kafka.clients.consumer.internals.AbstractCoordinator)
C02TW0TMHTDG:kafka pbagrecha$
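As the last ERROR line says, once the task hits this it stays failed until it is restarted by hand. For anyone else debugging the same thing, the standard Connect REST endpoints can be used to inspect and restart the failed task (connector name s3-sink as in the config above); restarting alone does not help here, the task just fails again on the next record until the underlying data problem is fixed:

curl http://localhost:8083/connectors/s3-sink/status
curl -XPOST http://localhost:8083/connectors/s3-sink/tasks/0/restart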
However, the whole pipeline works if I use Kafka installed via brew and run Connect with either connect-standalone or connect-distributed.
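For reference, the brew-based run that works is just the stock standalone launcher with an equivalent worker config. This is an approximate sketch of the worker properties, not a verbatim copy; the paths and plugin directory are illustrative:

bootstrap.servers=localhost:9092
key.converter=org.apache.kafka.connect.storage.StringConverter
value.converter=com.blueapron.connect.protobuf.ProtobufConverter
value.converter.protoClassName=com.bagi.protobuf.TrackingEvent$Event
offset.storage.file.filename=/tmp/connect.offsets
plugin.path=/usr/local/share/kafka/plugins

and then it is started with the connector settings in a properties file (same keys as tracking-kafka-connect-s3.json, one key=value per line):

connect-standalone worker.properties s3-sink.properties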
Resolved it, thanks to https://github.com/confluentinc/cp-docker-images/issues/838#issuecomment-591205817
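One thing worth calling out from the logs above, independent of the linked comment: because topics.regex is ".*", the sink's consumer subscribed to pattern '.*' and was assigned the Connect internal topics (tracking-kafka-connect-s3-offsets/-status/-configs) and __confluent.support.metrics, none of which contain TrackingEvent protobuf, so the converter gets handed non-protobuf bytes regardless of whether the real data topic is fine. A sketch of a narrower config, where "tracking-events" is a placeholder for the actual data topic and the rest of tracking-kafka-connect-s3.json stays the same, would replace

"topics.regex": ".*",

with

"topics": "tracking-events",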