GoogleCloudPlatform / pubsub

This repository contains open-source projects managed by the owners of Google Cloud Pub/Sub.
Apache License 2.0
247 stars 146 forks source link

Commit of offsets timed out - Many thread exceptions #221

Open jonathanendy opened 4 years ago

jonathanendy commented 4 years ago

Hi, Suddenly we got exceptions with the connector. I'm not sure but it seems that sending data to pubsub got exceptions for all parallel threads I don't think it is Kafka issue because we have another tool that works fine but sending to kinesis. Can someone help me to pinpoint the issue?

Thanks

[2020-05-21 06:10:03,533] INFO WorkerSinkTask{id=data-output-stream-0} Committing offsets asynchronously using sequence number 1483: {data-output-stream-0=OffsetAndMetadata{offset=599325844, leaderEpoch=null, metadata=''}} (org.apache.kafka.connect.runtime.WorkerSinkTask:344)
[2020-05-21 06:11:03,623] ERROR WorkerSinkTask{id=data-output-stream-0} Offset commit failed, rewinding to last committed offsets (org.apache.kafka.connect.runtime.WorkerSinkTask:384)
java.lang.RuntimeException: java.util.concurrent.ExecutionException: com.google.api.gax.rpc.DeadlineExceededException: io.grpc.StatusRuntimeException: DEADLINE_EXCEEDED: deadline exceeded after 9999884670ns
        at com.google.pubsub.kafka.sink.CloudPubSubSinkTask.flush(CloudPubSubSinkTask.java:324)
        at org.apache.kafka.connect.sink.SinkTask.preCommit(SinkTask.java:125)
        at org.apache.kafka.connect.runtime.WorkerSinkTask.commitOffsets(WorkerSinkTask.java:378)
        at org.apache.kafka.connect.runtime.WorkerSinkTask.iteration(WorkerSinkTask.java:208)
        at org.apache.kafka.connect.runtime.WorkerSinkTask.execute(WorkerSinkTask.java:192)
        at org.apache.kafka.connect.runtime.WorkerTask.doRun(WorkerTask.java:177)
        at org.apache.kafka.connect.runtime.WorkerTask.run(WorkerTask.java:227)
        at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
        at java.util.concurrent.FutureTask.run(FutureTask.java:266)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
        at java.lang.Thread.run(Thread.java:748)
Caused by: java.util.concurrent.ExecutionException: com.google.api.gax.rpc.DeadlineExceededException: io.grpc.StatusRuntimeException: DEADLINE_EXCEEDED: deadline exceeded after 9999884670ns
        at com.google.common.util.concurrent.AbstractFuture.getDoneValue(AbstractFuture.java:500)
        at com.google.common.util.concurrent.AbstractFuture.get(AbstractFuture.java:479)
        at com.google.common.util.concurrent.AbstractFuture$TrustedFuture.get(AbstractFuture.java:76)
        at com.google.common.util.concurrent.ForwardingFuture.get(ForwardingFuture.java:62)
        at com.google.pubsub.kafka.sink.CloudPubSubSinkTask.flush(CloudPubSubSinkTask.java:322)
        ... 11 more
Caused by: com.google.api.gax.rpc.DeadlineExceededException: io.grpc.StatusRuntimeException: DEADLINE_EXCEEDED: deadline exceeded after 9999884670ns
        at com.google.api.gax.rpc.ApiExceptionFactory.createException(ApiExceptionFactory.java:51)
        at com.google.api.gax.grpc.GrpcApiExceptionFactory.create(GrpcApiExceptionFactory.java:72)
        at com.google.api.gax.grpc.GrpcApiExceptionFactory.create(GrpcApiExceptionFactory.java:60)
        at com.google.api.gax.grpc.GrpcExceptionCallable$ExceptionTransformingFuture.onFailure(GrpcExceptionCallable.java:97)
        at com.google.api.core.ApiFutures$1.onFailure(ApiFutures.java:68)
        at com.google.common.util.concurrent.Futures$4.run(Futures.java:1123)
        at com.google.common.util.concurrent.MoreExecutors$DirectExecutor.execute(MoreExecutors.java:435)
        at com.google.common.util.concurrent.AbstractFuture.executeListener(AbstractFuture.java:900)
        at com.google.common.util.concurrent.AbstractFuture.complete(AbstractFuture.java:811)
        at com.google.common.util.concurrent.AbstractFuture.setException(AbstractFuture.java:675)
        at io.grpc.stub.ClientCalls$GrpcFuture.setException(ClientCalls.java:507)
        at io.grpc.stub.ClientCalls$UnaryStreamToFuture.onClose(ClientCalls.java:482)
        at io.grpc.PartialForwardingClientCallListener.onClose(PartialForwardingClientCallListener.java:39)
        at io.grpc.ForwardingClientCallListener.onClose(ForwardingClientCallListener.java:23)
        at io.grpc.ForwardingClientCallListener$SimpleForwardingClientCallListener.onClose(ForwardingClientCallListener.java:40)
        at io.grpc.internal.CensusStatsModule$StatsClientInterceptor$1$1.onClose(CensusStatsModule.java:678)
        at io.grpc.PartialForwardingClientCallListener.onClose(PartialForwardingClientCallListener.java:39)
        at io.grpc.ForwardingClientCallListener.onClose(ForwardingClientCallListener.java:23)
        at io.grpc.ForwardingClientCallListener$SimpleForwardingClientCallListener.onClose(ForwardingClientCallListener.java:40)
        at io.grpc.internal.CensusTracingModule$TracingClientInterceptor$1$1.onClose(CensusTracingModule.java:397)
        at io.grpc.internal.ClientCallImpl.closeObserver(ClientCallImpl.java:459)
        at io.grpc.internal.ClientCallImpl.access$300(ClientCallImpl.java:63)
        at io.grpc.internal.ClientCallImpl$ClientStreamListenerImpl.close(ClientCallImpl.java:546)
        at io.grpc.internal.ClientCallImpl$ClientStreamListenerImpl.access$600(ClientCallImpl.java:467)
        at io.grpc.internal.ClientCallImpl$ClientStreamListenerImpl$1StreamClosed.runInContext(ClientCallImpl.java:584)
        at io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
        at io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
        at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
        at java.util.concurrent.FutureTask.run(FutureTask.java:266)
        at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$201(ScheduledThreadPoolExecutor.java:180)
        at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:293)
        ... 3 more
Caused by: io.grpc.StatusRuntimeException: DEADLINE_EXCEEDED: deadline exceeded after 9999884670ns
        at io.grpc.Status.asRuntimeException(Status.java:526)
        ... 23 more

and

ERROR WorkerSinkTask{id=data-output-stream-0} Commit of offsets threw an unexpected exception for sequence number 1486: null (org.apache.kafka.connect.runtime.WorkerSinkTask:259)
java.lang.RuntimeException: java.util.concurrent.ExecutionException: com.google.api.gax.rpc.DeadlineExceededException: io.grpc.StatusRuntimeException: DEADLINE_EXCEEDED: deadline exceeded after 9999859478ns
        at com.google.pubsub.kafka.sink.CloudPubSubSinkTask.flush(CloudPubSubSinkTask.java:324)
        at org.apache.kafka.connect.sink.SinkTask.preCommit(SinkTask.java:125)
        at org.apache.kafka.connect.runtime.WorkerSinkTask.commitOffsets(WorkerSinkTask.java:378)
        at org.apache.kafka.connect.runtime.WorkerSinkTask.iteration(WorkerSinkTask.java:208)
        at org.apache.kafka.connect.runtime.WorkerSinkTask.execute(WorkerSinkTask.java:192)
        at org.apache.kafka.connect.runtime.WorkerTask.doRun(WorkerTask.java:177)
        at org.apache.kafka.connect.runtime.WorkerTask.run(WorkerTask.java:227)
        at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
        at java.util.concurrent.FutureTask.run(FutureTask.java:266)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
        at java.lang.Thread.run(Thread.java:748)
Caused by: java.util.concurrent.ExecutionException: com.google.api.gax.rpc.DeadlineExceededException: io.grpc.StatusRuntimeException: DEADLINE_EXCEEDED: deadline exceeded after 9999859478ns
        at com.google.common.util.concurrent.AbstractFuture.getDoneValue(AbstractFuture.java:500)
        at com.google.common.util.concurrent.AbstractFuture.get(AbstractFuture.java:479)
        at com.google.common.util.concurrent.AbstractFuture$TrustedFuture.get(AbstractFuture.java:76)
        at com.google.common.util.concurrent.ForwardingFuture.get(ForwardingFuture.java:62)
        at com.google.pubsub.kafka.sink.CloudPubSubSinkTask.flush(CloudPubSubSinkTask.java:322)
        ... 11 more
Caused by: com.google.api.gax.rpc.DeadlineExceededException: io.grpc.StatusRuntimeException: DEADLINE_EXCEEDED: deadline exceeded after 9999859478ns
        at com.google.api.gax.rpc.ApiExceptionFactory.createException(ApiExceptionFactory.java:51)
        at com.google.api.gax.grpc.GrpcApiExceptionFactory.create(GrpcApiExceptionFactory.java:72)
        at com.google.api.gax.grpc.GrpcApiExceptionFactory.create(GrpcApiExceptionFactory.java:60)
        at com.google.api.gax.grpc.GrpcExceptionCallable$ExceptionTransformingFuture.onFailure(GrpcExceptionCallable.java:97)
        at com.google.api.core.ApiFutures$1.onFailure(ApiFutures.java:68)
        at com.google.common.util.concurrent.Futures$4.run(Futures.java:1123)
        at com.google.common.util.concurrent.MoreExecutors$DirectExecutor.execute(MoreExecutors.java:435)
        at com.google.common.util.concurrent.AbstractFuture.executeListener(AbstractFuture.java:900)
        at com.google.common.util.concurrent.AbstractFuture.complete(AbstractFuture.java:811)
        at com.google.common.util.concurrent.AbstractFuture.setException(AbstractFuture.java:675)
        at io.grpc.stub.ClientCalls$GrpcFuture.setException(ClientCalls.java:507)
        at io.grpc.stub.ClientCalls$UnaryStreamToFuture.onClose(ClientCalls.java:482)
        at io.grpc.PartialForwardingClientCallListener.onClose(PartialForwardingClientCallListener.java:39)
        at io.grpc.ForwardingClientCallListener.onClose(ForwardingClientCallListener.java:23)
        at io.grpc.ForwardingClientCallListener$SimpleForwardingClientCallListener.onClose(ForwardingClientCallListener.java:40)
        at io.grpc.internal.CensusStatsModule$StatsClientInterceptor$1$1.onClose(CensusStatsModule.java:678)
        at io.grpc.PartialForwardingClientCallListener.onClose(PartialForwardingClientCallListener.java:39)
        at io.grpc.ForwardingClientCallListener.onClose(ForwardingClientCallListener.java:23)
        at io.grpc.ForwardingClientCallListener$SimpleForwardingClientCallListener.onClose(ForwardingClientCallListener.java:40)
        at io.grpc.internal.CensusTracingModule$TracingClientInterceptor$1$1.onClose(CensusTracingModule.java:397)
        at io.grpc.internal.ClientCallImpl.closeObserver(ClientCallImpl.java:459)
        at io.grpc.internal.ClientCallImpl.access$300(ClientCallImpl.java:63)
        at io.grpc.internal.ClientCallImpl$ClientStreamListenerImpl.close(ClientCallImpl.java:546)
        at io.grpc.internal.ClientCallImpl$ClientStreamListenerImpl.access$600(ClientCallImpl.java:467)
        at io.grpc.internal.ClientCallImpl$ClientStreamListenerImpl$1StreamClosed.runInContext(ClientCallImpl.java:584)
        at io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
        at io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
        at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
        at java.util.concurrent.FutureTask.run(FutureTask.java:266)
        at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$201(ScheduledThreadPoolExecutor.java:180)
        at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:293)
        ... 3 more
kamalaboulhosn commented 4 years ago

From where are you running the connector? Is it in GCP or elsewhere? It could be an issue with the connection between wherever Kafka Connect is running and GCP. I see that the default retry settings in the connector are not set properly, so I will fix that. In the meantime, you could try setting the maxTotalTimeoutMs property to 60000.

kamalaboulhosn commented 4 years ago

Actually, sorry, I was mistaken. The default is indeed 60s. Did you by chance change the value of maxTotalTimeoutMs?

jonathanendy commented 4 years ago

I didn't touch the default, I'm adding the full logs: The architecture is a local site (on prem) Kafka with the connector on the same CPU passing the data to pubsub

#####  /opt/kafka-latest/config/connect-standalone.properties  ####
# Licensed to the Apache Software Foundation (ASF) under one or more
# contributor license agreements.  See the NOTICE file distributed with
# this work for additional information regarding copyright ownership.
# The ASF licenses this file to You under the Apache License, Version 2.0
# (the "License"); you may not use this file except in compliance with
# the License.  You may obtain a copy of the License at
#
#    http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
# These are defaults. This file just demonstrates how to override some settings.
bootstrap.servers=kafka:29092
# The converters specify the format of data in Kafka and how to translate it into Connect data. Every Connect user will
# need to configure these based on the format they want their data in when loaded from or stored into Kafka
key.converter=org.apache.kafka.connect.storage.StringConverter
value.converter=org.apache.kafka.connect.storage.StringConverter
# Converter-specific settings can be passed in by prefixing the Converter's setting with the converter we want to apply
# it to
key.converter.schemas.enable=false
value.converter.schemas.enable=false
offset.storage.file.filename=/tmp/connect.offsets
# Flush much faster than normal, which is useful for testing/debugging
offset.flush.interval.ms=10000
# Set to a list of filesystem paths separated by commas (,) to enable class loading isolation for plugins
# (connectors, converters, transformations). The list should consist of top level directories that include 
# any combination of: 
# a) directories immediately containing jars with plugins and their dependencies
# b) uber-jars with plugins and their dependencies
# c) directories immediately containing the package directory structure of classes of plugins and their dependencies
# Note: symlinks will be followed to discover dependencies or plugins.
# Examples: 
#plugin.path=/usr/local/share/java,/usr/local/share/kafka/plugins,/opt/connectors,
plugin.path=/opt/kafka-latest/connectors#####  /opt/kafka-latest/config/cps-sink-connector.properties  ####
### INFO: 
### kafka connect offical site: http://kafka.apache.org/documentation.html#connect
### GCP kafka connect: https://github.com/GoogleCloudPlatform/pubsub/tree/master/kafka-connector

# name - Unique name for the connector. Attempting to register again with the same name will fail.
name=data-output-stream

# connector.class - The Java class for the connector
connector.class=com.google.pubsub.kafka.sink.CloudPubSubSinkConnector

# tasks.max - The maximum number of tasks that should be created for this connector. 
#  The connector may create fewer tasks if it cannot achieve this level of parallelism.
tasks.max=10

# topics - A comma-separated list of topics to use as input for this connector
topics=data-output-stream

# cps.topic - The topic in Cloud Pub/Sub to publish to, e.g. "foo" for topic "/projects/bar/topics/foo".
cps.topic=data-output-stream
cps.project=fabric-data-252413
[2020-05-22 03:45:23,296] INFO Kafka Connect standalone worker initializing ... (org.apache.kafka.connect.cli.ConnectStandalone:69)
[2020-05-22 03:45:23,328] INFO WorkerInfo values: 
    jvm.args = -Xms256M, -Xmx2G, -XX:+UseG1GC, -XX:MaxGCPauseMillis=20, -XX:InitiatingHeapOccupancyPercent=35, -XX:+ExplicitGCInvokesConcurrent, -Djava.awt.headless=true, -Dcom.sun.management.jmxremote, -Dcom.sun.management.jmxremote.authenticate=false, -Dcom.sun.management.jmxremote.ssl=false, -Dkafka.logs.dir=/opt/kafka-latest/bin/../logs, -Dlog4j.configuration=file:/opt/kafka-latest/bin/../config/connect-log4j.properties
    jvm.spec = Private Build, OpenJDK 64-Bit Server VM, 1.8.0_232, 25.232-b09
    jvm.classpath = /opt/kafka-latest/bin/../libs/activation-1.1.1.jar:/opt/kafka-latest/bin/../libs/aopalliance-repackaged-2.5.0.jar:/opt/kafka-latest/bin/../libs/argparse4j-0.7.0.jar:/opt/kafka-latest/bin/../libs/audience-annotations-0.5.0.jar:/opt/kafka-latest/bin/../libs/commons-lang3-3.8.1.jar:/opt/kafka-latest/bin/../libs/connect-api-2.3.0.jar:/opt/kafka-latest/bin/../libs/connect-basic-auth-extension-2.3.0.jar:/opt/kafka-latest/bin/../libs/connect-file-2.3.0.jar:/opt/kafka-latest/bin/../libs/connect-json-2.3.0.jar:/opt/kafka-latest/bin/../libs/connect-runtime-2.3.0.jar:/opt/kafka-latest/bin/../libs/connect-transforms-2.3.0.jar:/opt/kafka-latest/bin/../libs/guava-20.0.jar:/opt/kafka-latest/bin/../libs/hk2-api-2.5.0.jar:/opt/kafka-latest/bin/../libs/hk2-locator-2.5.0.jar:/opt/kafka-latest/bin/../libs/hk2-utils-2.5.0.jar:/opt/kafka-latest/bin/../libs/jackson-annotations-2.9.9.jar:/opt/kafka-latest/bin/../libs/jackson-core-2.9.9.jar:/opt/kafka-latest/bin/../libs/jackson-databind-2.9.9.jar:/opt/kafka-latest/bin/../libs/jackson-dataformat-csv-2.9.9.jar:/opt/kafka-latest/bin/../libs/jackson-datatype-jdk8-2.9.9.jar:/opt/kafka-latest/bin/../libs/jackson-jaxrs-base-2.9.9.jar:/opt/kafka-latest/bin/../libs/jackson-jaxrs-json-provider-2.9.9.jar:/opt/kafka-latest/bin/../libs/jackson-module-jaxb-annotations-2.9.9.jar:/opt/kafka-latest/bin/../libs/jackson-module-paranamer-2.9.9.jar:/opt/kafka-latest/bin/../libs/jackson-module-scala_2.12-2.9.9.jar:/opt/kafka-latest/bin/../libs/jakarta.annotation-api-1.3.4.jar:/opt/kafka-latest/bin/../libs/jakarta.inject-2.5.0.jar:/opt/kafka-latest/bin/../libs/jakarta.ws.rs-api-2.1.5.jar:/opt/kafka-latest/bin/../libs/javassist-3.22.0-CR2.jar:/opt/kafka-latest/bin/../libs/javax.servlet-api-3.1.0.jar:/opt/kafka-latest/bin/../libs/javax.ws.rs-api-2.1.1.jar:/opt/kafka-latest/bin/../libs/jaxb-api-2.3.0.jar:/opt/kafka-latest/bin/../libs/jersey-client-2.28.jar:/opt/kafka-latest/bin/../libs/jersey-common-2.28.jar:/opt/kafka-latest/bin/../libs/jersey-container-servlet-2.28.jar:/opt/kafka-latest/bin/../libs/jersey-container-servlet-core-2.28.jar:/opt/kafka-latest/bin/../libs/jersey-hk2-2.28.jar:/opt/kafka-latest/bin/../libs/jersey-media-jaxb-2.28.jar:/opt/kafka-latest/bin/../libs/jersey-server-2.28.jar:/opt/kafka-latest/bin/../libs/jetty-client-9.4.18.v20190429.jar:/opt/kafka-latest/bin/../libs/jetty-continuation-9.4.18.v20190429.jar:/opt/kafka-latest/bin/../libs/jetty-http-9.4.18.v20190429.jar:/opt/kafka-latest/bin/../libs/jetty-io-9.4.18.v20190429.jar:/opt/kafka-latest/bin/../libs/jetty-security-9.4.18.v20190429.jar:/opt/kafka-latest/bin/../libs/jetty-server-9.4.18.v20190429.jar:/opt/kafka-latest/bin/../libs/jetty-servlet-9.4.18.v20190429.jar:/opt/kafka-latest/bin/../libs/jetty-servlets-9.4.18.v20190429.jar:/opt/kafka-latest/bin/../libs/jetty-util-9.4.18.v20190429.jar:/opt/kafka-latest/bin/../libs/jopt-simple-5.0.4.jar:/opt/kafka-latest/bin/../libs/jsr305-3.0.2.jar:/opt/kafka-latest/bin/../libs/kafka-clients-2.3.0.jar:/opt/kafka-latest/bin/../libs/kafka-log4j-appender-2.3.0.jar:/opt/kafka-latest/bin/../libs/kafka-streams-2.3.0.jar:/opt/kafka-latest/bin/../libs/kafka-streams-examples-2.3.0.jar:/opt/kafka-latest/bin/../libs/kafka-streams-scala_2.12-2.3.0.jar:/opt/kafka-latest/bin/../libs/kafka-streams-test-utils-2.3.0.jar:/opt/kafka-latest/bin/../libs/kafka-tools-2.3.0.jar:/opt/kafka-latest/bin/../libs/kafka_2.12-2.3.0-sources.jar:/opt/kafka-latest/bin/../libs/kafka_2.12-2.3.0.jar:/opt/kafka-latest/bin/../libs/log4j-1.2.17.jar:/opt/kafka-latest/bin/../libs/lz4-java-1.6.0.jar:/opt/kafka-latest/bin/../libs/maven-artifact-3.6.1.jar:/opt/kafka-latest/bin/../libs/metrics-core-2.2.0.jar:/opt/kafka-latest/bin/../libs/osgi-resource-locator-1.0.1.jar:/opt/kafka-latest/bin/../libs/paranamer-2.8.jar:/opt/kafka-latest/bin/../libs/plexus-utils-3.2.0.jar:/opt/kafka-latest/bin/../libs/reflections-0.9.11.jar:/opt/kafka-latest/bin/../libs/rocksdbjni-5.18.3.jar:/opt/kafka-latest/bin/../libs/scala-library-2.12.8.jar:/opt/kafka-latest/bin/../libs/scala-logging_2.12-3.9.0.jar:/opt/kafka-latest/bin/../libs/scala-reflect-2.12.8.jar:/opt/kafka-latest/bin/../libs/slf4j-api-1.7.26.jar:/opt/kafka-latest/bin/../libs/slf4j-log4j12-1.7.26.jar:/opt/kafka-latest/bin/../libs/snappy-java-1.1.7.3.jar:/opt/kafka-latest/bin/../libs/spotbugs-annotations-3.1.9.jar:/opt/kafka-latest/bin/../libs/validation-api-2.0.1.Final.jar:/opt/kafka-latest/bin/../libs/zkclient-0.11.jar:/opt/kafka-latest/bin/../libs/zookeeper-3.4.14.jar:/opt/kafka-latest/bin/../libs/zstd-jni-1.4.0-1.jar
    os.spec = Linux, amd64, 4.15.0-99-generic
    os.vcpus = 4
 (org.apache.kafka.connect.runtime.WorkerInfo:71)
[2020-05-22 03:45:23,354] INFO Scanning for plugin classes. This might take a moment ... (org.apache.kafka.connect.cli.ConnectStandalone:78)
[2020-05-22 03:45:23,433] INFO Loading plugin from: /opt/kafka-latest/connectors/cps-kafka-connector.jar (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:222)
[2020-05-22 03:45:26,789] INFO Registered loader: PluginClassLoader{pluginLocation=file:/opt/kafka-latest/connectors/cps-kafka-connector.jar} (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:245)
[2020-05-22 03:45:26,790] INFO Added plugin 'com.google.pubsub.kafka.sink.CloudPubSubSinkConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:174)
[2020-05-22 03:45:26,790] INFO Added plugin 'com.google.pubsub.kafka.source.CloudPubSubSourceConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:174)
[2020-05-22 03:45:26,790] INFO Added plugin 'org.apache.kafka.connect.connector.policy.AllConnectorClientConfigOverridePolicy' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:174)
[2020-05-22 03:45:26,790] INFO Added plugin 'org.apache.kafka.connect.connector.policy.PrincipalConnectorClientConfigOverridePolicy' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:174)
[2020-05-22 03:45:26,790] INFO Added plugin 'org.apache.kafka.connect.connector.policy.NoneConnectorClientConfigOverridePolicy' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:174)
[2020-05-22 03:45:29,574] INFO Registered loader: sun.misc.Launcher$AppClassLoader@764c12b6 (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:245)
[2020-05-22 03:45:29,574] INFO Added plugin 'org.apache.kafka.connect.tools.MockConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:174)
[2020-05-22 03:45:29,575] INFO Added plugin 'org.apache.kafka.connect.tools.SchemaSourceConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:174)
[2020-05-22 03:45:29,576] INFO Added plugin 'org.apache.kafka.connect.tools.VerifiableSourceConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:174)
[2020-05-22 03:45:29,578] INFO Added plugin 'org.apache.kafka.connect.tools.MockSourceConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:174)
[2020-05-22 03:45:29,579] INFO Added plugin 'org.apache.kafka.connect.file.FileStreamSourceConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:174)
[2020-05-22 03:45:29,581] INFO Added plugin 'org.apache.kafka.connect.tools.MockSinkConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:174)
[2020-05-22 03:45:29,582] INFO Added plugin 'org.apache.kafka.connect.file.FileStreamSinkConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:174)
[2020-05-22 03:45:29,583] INFO Added plugin 'org.apache.kafka.connect.tools.VerifiableSinkConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:174)
[2020-05-22 03:45:29,583] INFO Added plugin 'org.apache.kafka.connect.converters.ByteArrayConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:174)
[2020-05-22 03:45:29,583] INFO Added plugin 'org.apache.kafka.connect.storage.StringConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:174)
[2020-05-22 03:45:29,584] INFO Added plugin 'org.apache.kafka.connect.json.JsonConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:174)
[2020-05-22 03:45:29,584] INFO Added plugin 'org.apache.kafka.connect.converters.FloatConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:174)
[2020-05-22 03:45:29,584] INFO Added plugin 'org.apache.kafka.connect.converters.DoubleConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:174)
[2020-05-22 03:45:29,585] INFO Added plugin 'org.apache.kafka.connect.converters.IntegerConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:174)
[2020-05-22 03:45:29,585] INFO Added plugin 'org.apache.kafka.connect.converters.LongConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:174)
[2020-05-22 03:45:29,585] INFO Added plugin 'org.apache.kafka.connect.converters.ShortConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:174)
[2020-05-22 03:45:29,585] INFO Added plugin 'org.apache.kafka.connect.storage.SimpleHeaderConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:174)
[2020-05-22 03:45:29,586] INFO Added plugin 'org.apache.kafka.connect.transforms.MaskField$Value' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:174)
[2020-05-22 03:45:29,587] INFO Added plugin 'org.apache.kafka.connect.transforms.ValueToKey' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:174)
[2020-05-22 03:45:29,587] INFO Added plugin 'org.apache.kafka.connect.transforms.Flatten$Key' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:174)
[2020-05-22 03:45:29,591] INFO Added plugin 'org.apache.kafka.connect.transforms.HoistField$Key' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:174)
[2020-05-22 03:45:29,591] INFO Added plugin 'org.apache.kafka.connect.transforms.InsertField$Value' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:174)
[2020-05-22 03:45:29,591] INFO Added plugin 'org.apache.kafka.connect.transforms.Flatten$Value' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:174)
[2020-05-22 03:45:29,594] INFO Added plugin 'org.apache.kafka.connect.transforms.SetSchemaMetadata$Value' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:174)
[2020-05-22 03:45:29,595] INFO Added plugin 'org.apache.kafka.connect.transforms.Cast$Key' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:174)
[2020-05-22 03:45:29,596] INFO Added plugin 'org.apache.kafka.connect.transforms.ExtractField$Value' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:174)
[2020-05-22 03:45:29,596] INFO Added plugin 'org.apache.kafka.connect.transforms.RegexRouter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:174)
[2020-05-22 03:45:29,597] INFO Added plugin 'org.apache.kafka.connect.transforms.TimestampConverter$Key' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:174)
[2020-05-22 03:45:29,597] INFO Added plugin 'org.apache.kafka.connect.transforms.ReplaceField$Key' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:174)
[2020-05-22 03:45:29,599] INFO Added plugin 'org.apache.kafka.connect.transforms.InsertField$Key' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:174)
[2020-05-22 03:45:29,599] INFO Added plugin 'org.apache.kafka.connect.transforms.ReplaceField$Value' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:174)
[2020-05-22 03:45:29,600] INFO Added plugin 'org.apache.kafka.connect.transforms.MaskField$Key' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:174)
[2020-05-22 03:45:29,600] INFO Added plugin 'org.apache.kafka.connect.transforms.SetSchemaMetadata$Key' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:174)
[2020-05-22 03:45:29,601] INFO Added plugin 'org.apache.kafka.connect.transforms.TimestampConverter$Value' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:174)
[2020-05-22 03:45:29,601] INFO Added plugin 'org.apache.kafka.connect.transforms.HoistField$Value' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:174)
[2020-05-22 03:45:29,602] INFO Added plugin 'org.apache.kafka.connect.transforms.TimestampRouter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:174)
[2020-05-22 03:45:29,603] INFO Added plugin 'org.apache.kafka.connect.transforms.Cast$Value' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:174)
[2020-05-22 03:45:29,603] INFO Added plugin 'org.apache.kafka.connect.transforms.ExtractField$Key' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:174)
[2020-05-22 03:45:29,604] INFO Added plugin 'org.apache.kafka.common.config.provider.FileConfigProvider' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:174)
[2020-05-22 03:45:29,605] INFO Added plugin 'org.apache.kafka.connect.rest.basic.auth.extension.BasicAuthSecurityRestExtension' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:174)
[2020-05-22 03:45:29,607] INFO Added aliases 'CloudPubSubSinkConnector' and 'CloudPubSubSink' to plugin 'com.google.pubsub.kafka.sink.CloudPubSubSinkConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:397)
[2020-05-22 03:45:29,608] INFO Added aliases 'CloudPubSubSourceConnector' and 'CloudPubSubSource' to plugin 'com.google.pubsub.kafka.source.CloudPubSubSourceConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:397)
[2020-05-22 03:45:29,609] INFO Added aliases 'FileStreamSinkConnector' and 'FileStreamSink' to plugin 'org.apache.kafka.connect.file.FileStreamSinkConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:397)
[2020-05-22 03:45:29,610] INFO Added aliases 'FileStreamSourceConnector' and 'FileStreamSource' to plugin 'org.apache.kafka.connect.file.FileStreamSourceConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:397)
[2020-05-22 03:45:29,610] INFO Added aliases 'MockConnector' and 'Mock' to plugin 'org.apache.kafka.connect.tools.MockConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:397)
[2020-05-22 03:45:29,611] INFO Added aliases 'MockSinkConnector' and 'MockSink' to plugin 'org.apache.kafka.connect.tools.MockSinkConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:397)
[2020-05-22 03:45:29,612] INFO Added aliases 'MockSourceConnector' and 'MockSource' to plugin 'org.apache.kafka.connect.tools.MockSourceConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:397)
[2020-05-22 03:45:29,613] INFO Added aliases 'SchemaSourceConnector' and 'SchemaSource' to plugin 'org.apache.kafka.connect.tools.SchemaSourceConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:397)
[2020-05-22 03:45:29,614] INFO Added aliases 'VerifiableSinkConnector' and 'VerifiableSink' to plugin 'org.apache.kafka.connect.tools.VerifiableSinkConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:397)
[2020-05-22 03:45:29,614] INFO Added aliases 'VerifiableSourceConnector' and 'VerifiableSource' to plugin 'org.apache.kafka.connect.tools.VerifiableSourceConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:397)
[2020-05-22 03:45:29,615] INFO Added aliases 'ByteArrayConverter' and 'ByteArray' to plugin 'org.apache.kafka.connect.converters.ByteArrayConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:397)
[2020-05-22 03:45:29,616] INFO Added aliases 'DoubleConverter' and 'Double' to plugin 'org.apache.kafka.connect.converters.DoubleConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:397)
[2020-05-22 03:45:29,616] INFO Added aliases 'FloatConverter' and 'Float' to plugin 'org.apache.kafka.connect.converters.FloatConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:397)
[2020-05-22 03:45:29,617] INFO Added aliases 'IntegerConverter' and 'Integer' to plugin 'org.apache.kafka.connect.converters.IntegerConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:397)
[2020-05-22 03:45:29,618] INFO Added aliases 'LongConverter' and 'Long' to plugin 'org.apache.kafka.connect.converters.LongConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:397)
[2020-05-22 03:45:29,618] INFO Added aliases 'ShortConverter' and 'Short' to plugin 'org.apache.kafka.connect.converters.ShortConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:397)
[2020-05-22 03:45:29,619] INFO Added aliases 'JsonConverter' and 'Json' to plugin 'org.apache.kafka.connect.json.JsonConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:397)
[2020-05-22 03:45:29,620] INFO Added aliases 'StringConverter' and 'String' to plugin 'org.apache.kafka.connect.storage.StringConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:397)
[2020-05-22 03:45:29,621] INFO Added aliases 'ByteArrayConverter' and 'ByteArray' to plugin 'org.apache.kafka.connect.converters.ByteArrayConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:397)
[2020-05-22 03:45:29,622] INFO Added aliases 'DoubleConverter' and 'Double' to plugin 'org.apache.kafka.connect.converters.DoubleConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:397)
[2020-05-22 03:45:29,623] INFO Added aliases 'FloatConverter' and 'Float' to plugin 'org.apache.kafka.connect.converters.FloatConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:397)
[2020-05-22 03:45:29,623] INFO Added aliases 'IntegerConverter' and 'Integer' to plugin 'org.apache.kafka.connect.converters.IntegerConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:397)
[2020-05-22 03:45:29,624] INFO Added aliases 'LongConverter' and 'Long' to plugin 'org.apache.kafka.connect.converters.LongConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:397)
[2020-05-22 03:45:29,624] INFO Added aliases 'ShortConverter' and 'Short' to plugin 'org.apache.kafka.connect.converters.ShortConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:397)
[2020-05-22 03:45:29,625] INFO Added aliases 'JsonConverter' and 'Json' to plugin 'org.apache.kafka.connect.json.JsonConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:397)
[2020-05-22 03:45:29,626] INFO Added alias 'SimpleHeaderConverter' to plugin 'org.apache.kafka.connect.storage.SimpleHeaderConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:394)
[2020-05-22 03:45:29,627] INFO Added aliases 'StringConverter' and 'String' to plugin 'org.apache.kafka.connect.storage.StringConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:397)
[2020-05-22 03:45:29,628] INFO Added alias 'RegexRouter' to plugin 'org.apache.kafka.connect.transforms.RegexRouter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:394)
[2020-05-22 03:45:29,630] INFO Added alias 'TimestampRouter' to plugin 'org.apache.kafka.connect.transforms.TimestampRouter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:394)
[2020-05-22 03:45:29,630] INFO Added alias 'ValueToKey' to plugin 'org.apache.kafka.connect.transforms.ValueToKey' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:394)
[2020-05-22 03:45:29,631] INFO Added alias 'BasicAuthSecurityRestExtension' to plugin 'org.apache.kafka.connect.rest.basic.auth.extension.BasicAuthSecurityRestExtension' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:394)
[2020-05-22 03:45:29,632] INFO Added aliases 'AllConnectorClientConfigOverridePolicy' and 'All' to plugin 'org.apache.kafka.connect.connector.policy.AllConnectorClientConfigOverridePolicy' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:397)
[2020-05-22 03:45:29,632] INFO Added aliases 'NoneConnectorClientConfigOverridePolicy' and 'None' to plugin 'org.apache.kafka.connect.connector.policy.NoneConnectorClientConfigOverridePolicy' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:397)
[2020-05-22 03:45:29,633] INFO Added aliases 'PrincipalConnectorClientConfigOverridePolicy' and 'Principal' to plugin 'org.apache.kafka.connect.connector.policy.PrincipalConnectorClientConfigOverridePolicy' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:397)
[2020-05-22 03:45:29,644] INFO StandaloneConfig values: 
    access.control.allow.methods = 
    access.control.allow.origin = 
    bootstrap.servers = [kafka:29092]
    client.dns.lookup = default
    config.providers = []
    connector.client.config.override.policy = None
    header.converter = class org.apache.kafka.connect.storage.SimpleHeaderConverter
    internal.key.converter = class org.apache.kafka.connect.json.JsonConverter
    internal.value.converter = class org.apache.kafka.connect.json.JsonConverter
    key.converter = class org.apache.kafka.connect.storage.StringConverter
    listeners = null
    metric.reporters = []
    metrics.num.samples = 2
    metrics.recording.level = INFO
    metrics.sample.window.ms = 30000
    offset.flush.interval.ms = 10000
    offset.flush.timeout.ms = 5000
    offset.storage.file.filename = /tmp/connect.offsets
    plugin.path = [/opt/kafka-latest/connectors]
    rest.advertised.host.name = null
    rest.advertised.listener = null
    rest.advertised.port = null
    rest.extension.classes = []
    rest.host.name = null
    rest.port = 8083
    ssl.client.auth = none
    task.shutdown.graceful.timeout.ms = 5000
    value.converter = class org.apache.kafka.connect.storage.StringConverter
 (org.apache.kafka.connect.runtime.standalone.StandaloneConfig:347)
[2020-05-22 03:45:29,648] INFO Creating Kafka admin client (org.apache.kafka.connect.util.ConnectUtils:43)
[2020-05-22 03:45:29,663] INFO AdminClientConfig values: 
    bootstrap.servers = [kafka:29092]
    client.dns.lookup = default
    client.id = 
    connections.max.idle.ms = 300000
    metadata.max.age.ms = 300000
    metric.reporters = []
    metrics.num.samples = 2
    metrics.recording.level = INFO
    metrics.sample.window.ms = 30000
    receive.buffer.bytes = 65536
    reconnect.backoff.max.ms = 1000
    reconnect.backoff.ms = 50
    request.timeout.ms = 120000
    retries = 5
    retry.backoff.ms = 100
    sasl.client.callback.handler.class = null
    sasl.jaas.config = null
    sasl.kerberos.kinit.cmd = /usr/bin/kinit
    sasl.kerberos.min.time.before.relogin = 60000
    sasl.kerberos.service.name = null
    sasl.kerberos.ticket.renew.jitter = 0.05
    sasl.kerberos.ticket.renew.window.factor = 0.8
    sasl.login.callback.handler.class = null
    sasl.login.class = null
    sasl.login.refresh.buffer.seconds = 300
    sasl.login.refresh.min.period.seconds = 60
    sasl.login.refresh.window.factor = 0.8
    sasl.login.refresh.window.jitter = 0.05
    sasl.mechanism = GSSAPI
    security.protocol = PLAINTEXT
    send.buffer.bytes = 131072
    ssl.cipher.suites = null
    ssl.enabled.protocols = [TLSv1.2, TLSv1.1, TLSv1]
    ssl.endpoint.identification.algorithm = https
    ssl.key.password = null
    ssl.keymanager.algorithm = SunX509
    ssl.keystore.location = null
    ssl.keystore.password = null
    ssl.keystore.type = JKS
    ssl.protocol = TLS
    ssl.provider = null
    ssl.secure.random.implementation = null
    ssl.trustmanager.algorithm = PKIX
    ssl.truststore.location = null
    ssl.truststore.password = null
    ssl.truststore.type = JKS
 (org.apache.kafka.clients.admin.AdminClientConfig:347)
[2020-05-22 03:45:29,838] WARN The configuration 'offset.flush.interval.ms' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:355)
[2020-05-22 03:45:29,839] WARN The configuration 'key.converter.schemas.enable' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:355)
[2020-05-22 03:45:29,842] WARN The configuration 'offset.storage.file.filename' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:355)
[2020-05-22 03:45:29,843] WARN The configuration 'value.converter.schemas.enable' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:355)
[2020-05-22 03:45:29,843] WARN The configuration 'plugin.path' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:355)
[2020-05-22 03:45:29,844] WARN The configuration 'value.converter' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:355)
[2020-05-22 03:45:29,847] WARN The configuration 'key.converter' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:355)
[2020-05-22 03:45:29,848] INFO Kafka version: 2.3.0 (org.apache.kafka.common.utils.AppInfoParser:117)
[2020-05-22 03:45:29,848] INFO Kafka commitId: fc1aaa116b661c8a (org.apache.kafka.common.utils.AppInfoParser:118)
[2020-05-22 03:45:29,849] INFO Kafka startTimeMs: 1590119129847 (org.apache.kafka.common.utils.AppInfoParser:119)
[2020-05-22 03:45:30,166] INFO Kafka cluster ID: 36MPGS-bQEGQTzqv2AM-rg (org.apache.kafka.connect.util.ConnectUtils:59)
[2020-05-22 03:45:30,211] INFO Logging initialized @9179ms to org.eclipse.jetty.util.log.Slf4jLog (org.eclipse.jetty.util.log:193)
[2020-05-22 03:45:30,439] INFO Added connector for http://:8083 (org.apache.kafka.connect.runtime.rest.RestServer:122)
[2020-05-22 03:45:30,440] INFO Initializing REST server (org.apache.kafka.connect.runtime.rest.RestServer:166)
[2020-05-22 03:45:30,456] INFO jetty-9.4.18.v20190429; built: 2019-04-29T20:42:08.989Z; git: e1bc35120a6617ee3df052294e433f3a25ce7097; jvm 1.8.0_232-8u232-b09-0ubuntu1~18.04.1-b09 (org.eclipse.jetty.server.Server:370)
[2020-05-22 03:45:30,549] INFO Started http_8083@48d5f34e{HTTP/1.1,[http/1.1]}{0.0.0.0:8083} (org.eclipse.jetty.server.AbstractConnector:292)
[2020-05-22 03:45:30,550] INFO Started @9519ms (org.eclipse.jetty.server.Server:410)
[2020-05-22 03:45:30,595] INFO Advertised URI: http://172.18.0.5:8083/ (org.apache.kafka.connect.runtime.rest.RestServer:285)
[2020-05-22 03:45:30,596] INFO REST server listening at http://172.18.0.5:8083/, advertising URL http://172.18.0.5:8083/ (org.apache.kafka.connect.runtime.rest.RestServer:181)
[2020-05-22 03:45:30,597] INFO Advertised URI: http://172.18.0.5:8083/ (org.apache.kafka.connect.runtime.rest.RestServer:285)
[2020-05-22 03:45:30,599] INFO Setting up None Policy for ConnectorClientConfigOverride. This will disallow any client configuration to be overridden (org.apache.kafka.connect.connector.policy.NoneConnectorClientConfigOverridePolicy:45)
[2020-05-22 03:45:30,620] INFO Kafka version: 2.3.0 (org.apache.kafka.common.utils.AppInfoParser:117)
[2020-05-22 03:45:30,620] INFO Kafka commitId: fc1aaa116b661c8a (org.apache.kafka.common.utils.AppInfoParser:118)
[2020-05-22 03:45:30,621] INFO Kafka startTimeMs: 1590119130620 (org.apache.kafka.common.utils.AppInfoParser:119)
[2020-05-22 03:45:30,882] INFO JsonConverterConfig values: 
    converter.type = key
    schemas.cache.size = 1000
    schemas.enable = false
 (org.apache.kafka.connect.json.JsonConverterConfig:347)
[2020-05-22 03:45:30,884] INFO JsonConverterConfig values: 
    converter.type = value
    schemas.cache.size = 1000
    schemas.enable = false
 (org.apache.kafka.connect.json.JsonConverterConfig:347)
[2020-05-22 03:45:30,893] INFO Kafka Connect standalone worker initialization took 7594ms (org.apache.kafka.connect.cli.ConnectStandalone:100)
[2020-05-22 03:45:30,893] INFO Kafka Connect starting (org.apache.kafka.connect.runtime.Connect:50)
[2020-05-22 03:45:30,893] INFO Herder starting (org.apache.kafka.connect.runtime.standalone.StandaloneHerder:91)
[2020-05-22 03:45:30,894] INFO Worker starting (org.apache.kafka.connect.runtime.Worker:182)
[2020-05-22 03:45:30,894] INFO Starting FileOffsetBackingStore with file /tmp/connect.offsets (org.apache.kafka.connect.storage.FileOffsetBackingStore:58)
[2020-05-22 03:45:30,897] INFO Worker started (org.apache.kafka.connect.runtime.Worker:187)
[2020-05-22 03:45:30,897] INFO Herder started (org.apache.kafka.connect.runtime.standalone.StandaloneHerder:93)
[2020-05-22 03:45:30,897] INFO Initializing REST resources (org.apache.kafka.connect.runtime.rest.RestServer:186)
[2020-05-22 03:45:31,025] INFO DefaultSessionIdManager workerName=node0 (org.eclipse.jetty.server.session:365)
[2020-05-22 03:45:31,025] INFO No SessionScavenger set, using defaults (org.eclipse.jetty.server.session:370)
[2020-05-22 03:45:31,027] INFO node0 Scavenging every 600000ms (org.eclipse.jetty.server.session:149)
[2020-05-22 03:45:31,856] INFO Started o.e.j.s.ServletContextHandler@70fab835{/,null,AVAILABLE} (org.eclipse.jetty.server.handler.ContextHandler:855)
[2020-05-22 03:45:31,856] INFO REST resources initialized; server is started and ready to handle requests (org.apache.kafka.connect.runtime.rest.RestServer:233)
[2020-05-22 03:45:31,856] INFO Kafka Connect started (org.apache.kafka.connect.runtime.Connect:56)
[2020-05-22 03:45:31,869] INFO AbstractConfig values: 
 (org.apache.kafka.common.config.AbstractConfig:347)
[2020-05-22 03:45:31,882] INFO ConnectorConfig values: 
    config.action.reload = restart
    connector.class = com.google.pubsub.kafka.sink.CloudPubSubSinkConnector
    errors.log.enable = false
    errors.log.include.messages = false
    errors.retry.delay.max.ms = 60000
    errors.retry.timeout = 0
    errors.tolerance = none
    header.converter = null
    key.converter = null
    name = data-output-stream
    tasks.max = 10
    transforms = []
    value.converter = null
 (org.apache.kafka.connect.runtime.ConnectorConfig:347)
[2020-05-22 03:45:31,883] INFO EnrichedConnectorConfig values: 
    config.action.reload = restart
    connector.class = com.google.pubsub.kafka.sink.CloudPubSubSinkConnector
    errors.log.enable = false
    errors.log.include.messages = false
    errors.retry.delay.max.ms = 60000
    errors.retry.timeout = 0
    errors.tolerance = none
    header.converter = null
    key.converter = null
    name = data-output-stream
    tasks.max = 10
    transforms = []
    value.converter = null
 (org.apache.kafka.connect.runtime.ConnectorConfig$EnrichedConnectorConfig:347)
[2020-05-22 03:45:31,884] INFO Creating connector data-output-stream of type com.google.pubsub.kafka.sink.CloudPubSubSinkConnector (org.apache.kafka.connect.runtime.Worker:246)
[2020-05-22 03:45:31,890] INFO Instantiated connector data-output-stream with version 2.3.0 of type class com.google.pubsub.kafka.sink.CloudPubSubSinkConnector (org.apache.kafka.connect.runtime.Worker:249)
[2020-05-22 03:45:31,892] INFO Started the CloudPubSubSinkConnector. (com.google.pubsub.kafka.sink.CloudPubSubSinkConnector:66)
[2020-05-22 03:45:31,893] INFO Finished creating connector data-output-stream (org.apache.kafka.connect.runtime.Worker:268)
[2020-05-22 03:45:31,895] INFO SinkConnectorConfig values: 
    config.action.reload = restart
    connector.class = com.google.pubsub.kafka.sink.CloudPubSubSinkConnector
    errors.deadletterqueue.context.headers.enable = false
    errors.deadletterqueue.topic.name = 
    errors.deadletterqueue.topic.replication.factor = 3
    errors.log.enable = false
    errors.log.include.messages = false
    errors.retry.delay.max.ms = 60000
    errors.retry.timeout = 0
    errors.tolerance = none
    header.converter = null
    key.converter = null
    name = data-output-stream
    tasks.max = 10
    topics = [data-output-stream]
    topics.regex = 
    transforms = []
    value.converter = null
 (org.apache.kafka.connect.runtime.SinkConnectorConfig:347)
[2020-05-22 03:45:31,896] INFO EnrichedConnectorConfig values: 
    config.action.reload = restart
    connector.class = com.google.pubsub.kafka.sink.CloudPubSubSinkConnector
    errors.deadletterqueue.context.headers.enable = false
    errors.deadletterqueue.topic.name = 
    errors.deadletterqueue.topic.replication.factor = 3
    errors.log.enable = false
    errors.log.include.messages = false
    errors.retry.delay.max.ms = 60000
    errors.retry.timeout = 0
    errors.tolerance = none
    header.converter = null
    key.converter = null
    name = data-output-stream
    tasks.max = 10
    topics = [data-output-stream]
    topics.regex = 
    transforms = []
    value.converter = null
 (org.apache.kafka.connect.runtime.ConnectorConfig$EnrichedConnectorConfig:347)
[2020-05-22 03:45:31,900] INFO Creating task data-output-stream-0 (org.apache.kafka.connect.runtime.Worker:414)
[2020-05-22 03:45:31,901] INFO ConnectorConfig values: 
    config.action.reload = restart
    connector.class = com.google.pubsub.kafka.sink.CloudPubSubSinkConnector
    errors.log.enable = false
    errors.log.include.messages = false
    errors.retry.delay.max.ms = 60000
    errors.retry.timeout = 0
    errors.tolerance = none
    header.converter = null
    key.converter = null
    name = data-output-stream
    tasks.max = 10
    transforms = []
    value.converter = null
 (org.apache.kafka.connect.runtime.ConnectorConfig:347)
[2020-05-22 03:45:31,902] INFO EnrichedConnectorConfig values: 
    config.action.reload = restart
    connector.class = com.google.pubsub.kafka.sink.CloudPubSubSinkConnector
    errors.log.enable = false
    errors.log.include.messages = false
    errors.retry.delay.max.ms = 60000
    errors.retry.timeout = 0
    errors.tolerance = none
    header.converter = null
    key.converter = null
    name = data-output-stream
    tasks.max = 10
    transforms = []
    value.converter = null
 (org.apache.kafka.connect.runtime.ConnectorConfig$EnrichedConnectorConfig:347)
[2020-05-22 03:45:31,906] INFO TaskConfig values: 
    task.class = class com.google.pubsub.kafka.sink.CloudPubSubSinkTask
 (org.apache.kafka.connect.runtime.TaskConfig:347)
[2020-05-22 03:45:31,909] INFO Instantiated task data-output-stream-0 with version 2.3.0 of type com.google.pubsub.kafka.sink.CloudPubSubSinkTask (org.apache.kafka.connect.runtime.Worker:428)
[2020-05-22 03:45:31,911] INFO StringConverterConfig values: 
    converter.encoding = UTF8
    converter.type = key
 (org.apache.kafka.connect.storage.StringConverterConfig:347)
[2020-05-22 03:45:31,912] INFO Set up the key converter class org.apache.kafka.connect.storage.StringConverter for task data-output-stream-0 using the worker config (org.apache.kafka.connect.runtime.Worker:441)
[2020-05-22 03:45:31,912] INFO StringConverterConfig values: 
    converter.encoding = UTF8
    converter.type = value
 (org.apache.kafka.connect.storage.StringConverterConfig:347)
[2020-05-22 03:45:31,913] INFO Set up the value converter class org.apache.kafka.connect.storage.StringConverter for task data-output-stream-0 using the worker config (org.apache.kafka.connect.runtime.Worker:447)
[2020-05-22 03:45:31,913] INFO Set up the header converter class org.apache.kafka.connect.storage.SimpleHeaderConverter for task data-output-stream-0 using the worker config (org.apache.kafka.connect.runtime.Worker:454)
[2020-05-22 03:45:31,922] INFO Initializing: org.apache.kafka.connect.runtime.TransformationChain{} (org.apache.kafka.connect.runtime.Worker:522)
[2020-05-22 03:45:31,923] INFO SinkConnectorConfig values: 
    config.action.reload = restart
    connector.class = com.google.pubsub.kafka.sink.CloudPubSubSinkConnector
    errors.deadletterqueue.context.headers.enable = false
    errors.deadletterqueue.topic.name = 
    errors.deadletterqueue.topic.replication.factor = 3
    errors.log.enable = false
    errors.log.include.messages = false
    errors.retry.delay.max.ms = 60000
    errors.retry.timeout = 0
    errors.tolerance = none
    header.converter = null
    key.converter = null
    name = data-output-stream
    tasks.max = 10
    topics = [data-output-stream]
    topics.regex = 
    transforms = []
    value.converter = null
 (org.apache.kafka.connect.runtime.SinkConnectorConfig:347)
[2020-05-22 03:45:31,924] INFO EnrichedConnectorConfig values: 
    config.action.reload = restart
    connector.class = com.google.pubsub.kafka.sink.CloudPubSubSinkConnector
    errors.deadletterqueue.context.headers.enable = false
    errors.deadletterqueue.topic.name = 
    errors.deadletterqueue.topic.replication.factor = 3
    errors.log.enable = false
    errors.log.include.messages = false
    errors.retry.delay.max.ms = 60000
    errors.retry.timeout = 0
    errors.tolerance = none
    header.converter = null
    key.converter = null
    name = data-output-stream
    tasks.max = 10
    topics = [data-output-stream]
    topics.regex = 
    transforms = []
    value.converter = null
 (org.apache.kafka.connect.runtime.ConnectorConfig$EnrichedConnectorConfig:347)
[2020-05-22 03:45:31,941] INFO ConsumerConfig values: 
    allow.auto.create.topics = true
    auto.commit.interval.ms = 5000
    auto.offset.reset = earliest
    bootstrap.servers = [kafka:29092]
    check.crcs = true
    client.dns.lookup = default
    client.id = connector-consumer-data-output-stream-0
    client.rack = 
    connections.max.idle.ms = 540000
    default.api.timeout.ms = 60000
    enable.auto.commit = false
    exclude.internal.topics = true
    fetch.max.bytes = 52428800
    fetch.max.wait.ms = 500
    fetch.min.bytes = 1
    group.id = connect-data-output-stream
    group.instance.id = null
    heartbeat.interval.ms = 3000
    interceptor.classes = []
    internal.leave.group.on.close = true
    isolation.level = read_uncommitted
    key.deserializer = class org.apache.kafka.common.serialization.ByteArrayDeserializer
    max.partition.fetch.bytes = 1048576
    max.poll.interval.ms = 300000
    max.poll.records = 500
    metadata.max.age.ms = 300000
    metric.reporters = []
    metrics.num.samples = 2
    metrics.recording.level = INFO
    metrics.sample.window.ms = 30000
    partition.assignment.strategy = [class org.apache.kafka.clients.consumer.RangeAssignor]
    receive.buffer.bytes = 65536
    reconnect.backoff.max.ms = 1000
    reconnect.backoff.ms = 50
    request.timeout.ms = 30000
    retry.backoff.ms = 100
    sasl.client.callback.handler.class = null
    sasl.jaas.config = null
    sasl.kerberos.kinit.cmd = /usr/bin/kinit
    sasl.kerberos.min.time.before.relogin = 60000
    sasl.kerberos.service.name = null
    sasl.kerberos.ticket.renew.jitter = 0.05
    sasl.kerberos.ticket.renew.window.factor = 0.8
    sasl.login.callback.handler.class = null
    sasl.login.class = null
    sasl.login.refresh.buffer.seconds = 300
    sasl.login.refresh.min.period.seconds = 60
    sasl.login.refresh.window.factor = 0.8
    sasl.login.refresh.window.jitter = 0.05
    sasl.mechanism = GSSAPI
    security.protocol = PLAINTEXT
    send.buffer.bytes = 131072
    session.timeout.ms = 10000
    ssl.cipher.suites = null
    ssl.enabled.protocols = [TLSv1.2, TLSv1.1, TLSv1]
    ssl.endpoint.identification.algorithm = https
    ssl.key.password = null
    ssl.keymanager.algorithm = SunX509
    ssl.keystore.location = null
    ssl.keystore.password = null
    ssl.keystore.type = JKS
    ssl.protocol = TLS
    ssl.provider = null
    ssl.secure.random.implementation = null
    ssl.trustmanager.algorithm = PKIX
    ssl.truststore.location = null
    ssl.truststore.password = null
    ssl.truststore.type = JKS
    value.deserializer = class org.apache.kafka.common.serialization.ByteArrayDeserializer
 (org.apache.kafka.clients.consumer.ConsumerConfig:347)
[2020-05-22 03:45:31,994] INFO Kafka version: 2.3.0 (org.apache.kafka.common.utils.AppInfoParser:117)
[2020-05-22 03:45:31,995] INFO Kafka commitId: fc1aaa116b661c8a (org.apache.kafka.common.utils.AppInfoParser:118)
[2020-05-22 03:45:31,995] INFO Kafka startTimeMs: 1590119131994 (org.apache.kafka.common.utils.AppInfoParser:119)
[2020-05-22 03:45:32,005] INFO Creating task data-output-stream-1 (org.apache.kafka.connect.runtime.Worker:414)
[2020-05-22 03:45:32,006] INFO ConnectorConfig values: 
    config.action.reload = restart
    connector.class = com.google.pubsub.kafka.sink.CloudPubSubSinkConnector
    errors.log.enable = false
    errors.log.include.messages = false
    errors.retry.delay.max.ms = 60000
    errors.retry.timeout = 0
    errors.tolerance = none
    header.converter = null
    key.converter = null
    name = data-output-stream
    tasks.max = 10
    transforms = []
    value.converter = null
 (org.apache.kafka.connect.runtime.ConnectorConfig:347)
[2020-05-22 03:45:32,009] INFO EnrichedConnectorConfig values: 
    config.action.reload = restart
    connector.class = com.google.pubsub.kafka.sink.CloudPubSubSinkConnector
    errors.log.enable = false
    errors.log.include.messages = false
    errors.retry.delay.max.ms = 60000
    errors.retry.timeout = 0
    errors.tolerance = none
    header.converter = null
    key.converter = null
    name = data-output-stream
    tasks.max = 10
    transforms = []
    value.converter = null
 (org.apache.kafka.connect.runtime.ConnectorConfig$EnrichedConnectorConfig:347)
[2020-05-22 03:45:32,009] INFO TaskConfig values: 
    task.class = class com.google.pubsub.kafka.sink.CloudPubSubSinkTask
 (org.apache.kafka.connect.runtime.TaskConfig:347)
[2020-05-22 03:45:32,009] INFO Instantiated task data-output-stream-1 with version 2.3.0 of type com.google.pubsub.kafka.sink.CloudPubSubSinkTask (org.apache.kafka.connect.runtime.Worker:428)
[2020-05-22 03:45:32,010] INFO StringConverterConfig values: 
    converter.encoding = UTF8
    converter.type = key
 (org.apache.kafka.connect.storage.StringConverterConfig:347)
[2020-05-22 03:45:32,010] INFO Set up the key converter class org.apache.kafka.connect.storage.StringConverter for task data-output-stream-1 using the worker config (org.apache.kafka.connect.runtime.Worker:441)
[2020-05-22 03:45:32,010] INFO StringConverterConfig values: 
    converter.encoding = UTF8
    converter.type = value
 (org.apache.kafka.connect.storage.StringConverterConfig:347)
[2020-05-22 03:45:32,010] INFO Set up the value converter class org.apache.kafka.connect.storage.StringConverter for task data-output-stream-1 using the worker config (org.apache.kafka.connect.runtime.Worker:447)
[2020-05-22 03:45:32,011] INFO Set up the header converter class org.apache.kafka.connect.storage.SimpleHeaderConverter for task data-output-stream-1 using the worker config (org.apache.kafka.connect.runtime.Worker:454)
[2020-05-22 03:45:32,015] INFO Initializing: org.apache.kafka.connect.runtime.TransformationChain{} (org.apache.kafka.connect.runtime.Worker:522)
[2020-05-22 03:45:32,016] INFO SinkConnectorConfig values: 
    config.action.reload = restart
    connector.class = com.google.pubsub.kafka.sink.CloudPubSubSinkConnector
    errors.deadletterqueue.context.headers.enable = false
    errors.deadletterqueue.topic.name = 
    errors.deadletterqueue.topic.replication.factor = 3
    errors.log.enable = false
    errors.log.include.messages = false
    errors.retry.delay.max.ms = 60000
    errors.retry.timeout = 0
    errors.tolerance = none
    header.converter = null
    key.converter = null
    name = data-output-stream
    tasks.max = 10
    topics = [data-output-stream]
    topics.regex = 
    transforms = []
    value.converter = null
 (org.apache.kafka.connect.runtime.SinkConnectorConfig:347)
[2020-05-22 03:45:32,016] INFO EnrichedConnectorConfig values: 
    config.action.reload = restart
    connector.class = com.google.pubsub.kafka.sink.CloudPubSubSinkConnector
    errors.deadletterqueue.context.headers.enable = false
    errors.deadletterqueue.topic.name = 
    errors.deadletterqueue.topic.replication.factor = 3
    errors.log.enable = false
    errors.log.include.messages = false
    errors.retry.delay.max.ms = 60000
    errors.retry.timeout = 0
    errors.tolerance = none
    header.converter = null
    key.converter = null
    name = data-output-stream
    tasks.max = 10
    topics = [data-output-stream]
    topics.regex = 
    transforms = []
    value.converter = null
 (org.apache.kafka.connect.runtime.ConnectorConfig$EnrichedConnectorConfig:347)
[2020-05-22 03:45:32,017] INFO ConsumerConfig values: 
    allow.auto.create.topics = true
    auto.commit.interval.ms = 5000
    auto.offset.reset = earliest
    bootstrap.servers = [kafka:29092]
    check.crcs = true
    client.dns.lookup = default
    client.id = connector-consumer-data-output-stream-1
    client.rack = 
    connections.max.idle.ms = 540000
    default.api.timeout.ms = 60000
    enable.auto.commit = false
    exclude.internal.topics = true
    fetch.max.bytes = 52428800
    fetch.max.wait.ms = 500
    fetch.min.bytes = 1
    group.id = connect-data-output-stream
    group.instance.id = null
    heartbeat.interval.ms = 3000
    interceptor.classes = []
    internal.leave.group.on.close = true
    isolation.level = read_uncommitted
    key.deserializer = class org.apache.kafka.common.serialization.ByteArrayDeserializer
    max.partition.fetch.bytes = 1048576
    max.poll.interval.ms = 300000
    max.poll.records = 500
    metadata.max.age.ms = 300000
    metric.reporters = []
    metrics.num.samples = 2
    metrics.recording.level = INFO
    metrics.sample.window.ms = 30000
    partition.assignment.strategy = [class org.apache.kafka.clients.consumer.RangeAssignor]
    receive.buffer.bytes = 65536
    reconnect.backoff.max.ms = 1000
    reconnect.backoff.ms = 50
    request.timeout.ms = 30000
    retry.backoff.ms = 100
    sasl.client.callback.handler.class = null
    sasl.jaas.config = null
    sasl.kerberos.kinit.cmd = /usr/bin/kinit
    sasl.kerberos.min.time.before.relogin = 60000
    sasl.kerberos.service.name = null
    sasl.kerberos.ticket.renew.jitter = 0.05
    sasl.kerberos.ticket.renew.window.factor = 0.8
    sasl.login.callback.handler.class = null
    sasl.login.class = null
    sasl.login.refresh.buffer.seconds = 300
    sasl.login.refresh.min.period.seconds = 60
    sasl.login.refresh.window.factor = 0.8
    sasl.login.refresh.window.jitter = 0.05
    sasl.mechanism = GSSAPI
    security.protocol = PLAINTEXT
    send.buffer.bytes = 131072
    session.timeout.ms = 10000
    ssl.cipher.suites = null
    ssl.enabled.protocols = [TLSv1.2, TLSv1.1, TLSv1]
    ssl.endpoint.identification.algorithm = https
    ssl.key.password = null
    ssl.keymanager.algorithm = SunX509
    ssl.keystore.location = null
    ssl.keystore.password = null
    ssl.keystore.type = JKS
    ssl.protocol = TLS
    ssl.provider = null
    ssl.secure.random.implementation = null
    ssl.trustmanager.algorithm = PKIX
    ssl.truststore.location = null
    ssl.truststore.password = null
    ssl.truststore.type = JKS
    value.deserializer = class org.apache.kafka.common.serialization.ByteArrayDeserializer
 (org.apache.kafka.clients.consumer.ConsumerConfig:347)
[2020-05-22 03:45:32,034] INFO [Consumer clientId=connector-consumer-data-output-stream-0, groupId=connect-data-output-stream] Subscribed to topic(s): data-output-stream (org.apache.kafka.clients.consumer.KafkaConsumer:964)
[2020-05-22 03:45:32,036] INFO Kafka version: 2.3.0 (org.apache.kafka.common.utils.AppInfoParser:117)
[2020-05-22 03:45:32,037] INFO Kafka commitId: fc1aaa116b661c8a (org.apache.kafka.common.utils.AppInfoParser:118)
[2020-05-22 03:45:32,037] INFO Kafka startTimeMs: 1590119132036 (org.apache.kafka.common.utils.AppInfoParser:119)
[2020-05-22 03:45:32,047] INFO Creating task data-output-stream-2 (org.apache.kafka.connect.runtime.Worker:414)
[2020-05-22 03:45:32,047] INFO ConnectorConfig values: 
    config.action.reload = restart
    connector.class = com.google.pubsub.kafka.sink.CloudPubSubSinkConnector
    errors.log.enable = false
    errors.log.include.messages = false
    errors.retry.delay.max.ms = 60000
    errors.retry.timeout = 0
    errors.tolerance = none
    header.converter = null
    key.converter = null
    name = data-output-stream
    tasks.max = 10
    transforms = []
    value.converter = null
 (org.apache.kafka.connect.runtime.ConnectorConfig:347)
[2020-05-22 03:45:32,048] INFO EnrichedConnectorConfig values: 
    config.action.reload = restart
    connector.class = com.google.pubsub.kafka.sink.CloudPubSubSinkConnector
    errors.log.enable = false
    errors.log.include.messages = false
    errors.retry.delay.max.ms = 60000
    errors.retry.timeout = 0
    errors.tolerance = none
    header.converter = null
    key.converter = null
    name = data-output-stream
    tasks.max = 10
    transforms = []
    value.converter = null
 (org.apache.kafka.connect.runtime.ConnectorConfig$EnrichedConnectorConfig:347)
[2020-05-22 03:45:32,048] INFO TaskConfig values: 
    task.class = class com.google.pubsub.kafka.sink.CloudPubSubSinkTask
 (org.apache.kafka.connect.runtime.TaskConfig:347)
[2020-05-22 03:45:32,049] INFO Instantiated task data-output-stream-2 with version 2.3.0 of type com.google.pubsub.kafka.sink.CloudPubSubSinkTask (org.apache.kafka.connect.runtime.Worker:428)
[2020-05-22 03:45:32,050] INFO StringConverterConfig values: 
    converter.encoding = UTF8
    converter.type = key
 (org.apache.kafka.connect.storage.StringConverterConfig:347)
[2020-05-22 03:45:32,050] INFO Set up the key converter class org.apache.kafka.connect.storage.StringConverter for task data-output-stream-2 using the worker config (org.apache.kafka.connect.runtime.Worker:441)
[2020-05-22 03:45:32,051] INFO StringConverterConfig values: 
    converter.encoding = UTF8
    converter.type = value
 (org.apache.kafka.connect.storage.StringConverterConfig:347)
[2020-05-22 03:45:32,051] INFO Set up the value converter class org.apache.kafka.connect.storage.StringConverter for task data-output-stream-2 using the worker config (org.apache.kafka.connect.runtime.Worker:447)
[2020-05-22 03:45:32,052] INFO Set up the header converter class org.apache.kafka.connect.storage.SimpleHeaderConverter for task data-output-stream-2 using the worker config (org.apache.kafka.connect.runtime.Worker:454)
[2020-05-22 03:45:32,055] INFO Initializing: org.apache.kafka.connect.runtime.TransformationChain{} (org.apache.kafka.connect.runtime.Worker:522)
[2020-05-22 03:45:32,056] INFO [Consumer clientId=connector-consumer-data-output-stream-1, groupId=connect-data-output-stream] Subscribed to topic(s): data-output-stream (org.apache.kafka.clients.consumer.KafkaConsumer:964)
[2020-05-22 03:45:32,058] INFO SinkConnectorConfig values: 
    config.action.reload = restart
    connector.class = com.google.pubsub.kafka.sink.CloudPubSubSinkConnector
    errors.deadletterqueue.context.headers.enable = false
    errors.deadletterqueue.topic.name = 
    errors.deadletterqueue.topic.replication.factor = 3
    errors.log.enable = false
    errors.log.include.messages = false
    errors.retry.delay.max.ms = 60000
    errors.retry.timeout = 0
    errors.tolerance = none
    header.converter = null
    key.converter = null
    name = data-output-stream
    tasks.max = 10
    topics = [data-output-stream]
    topics.regex = 
    transforms = []
    value.converter = null
 (org.apache.kafka.connect.runtime.SinkConnectorConfig:347)
[2020-05-22 03:45:32,059] INFO EnrichedConnectorConfig values: 
    config.action.reload = restart
    connector.class = com.google.pubsub.kafka.sink.CloudPubSubSinkConnector
    errors.deadletterqueue.context.headers.enable = false
    errors.deadletterqueue.topic.name = 
    errors.deadletterqueue.topic.replication.factor = 3
    errors.log.enable = false
    errors.log.include.messages = false
    errors.retry.delay.max.ms = 60000
    errors.retry.timeout = 0
    errors.tolerance = none
    header.converter = null
    key.converter = null
    name = data-output-stream
    tasks.max = 10
    topics = [data-output-stream]
    topics.regex = 
    transforms = []
    value.converter = null
 (org.apache.kafka.connect.runtime.ConnectorConfig$EnrichedConnectorConfig:347)
[2020-05-22 03:45:32,060] INFO ConsumerConfig values: 
    allow.auto.create.topics = true
    auto.commit.interval.ms = 5000
    auto.offset.reset = earliest
    bootstrap.servers = [kafka:29092]
    check.crcs = true
    client.dns.lookup = default
    client.id = connector-consumer-data-output-stream-2
    client.rack = 
    connections.max.idle.ms = 540000
    default.api.timeout.ms = 60000
    enable.auto.commit = false
    exclude.internal.topics = true
    fetch.max.bytes = 52428800
    fetch.max.wait.ms = 500
    fetch.min.bytes = 1
    group.id = connect-data-output-stream
    group.instance.id = null
    heartbeat.interval.ms = 3000
    interceptor.classes = []
    internal.leave.group.on.close = true
    isolation.level = read_uncommitted
    key.deserializer = class org.apache.kafka.common.serialization.ByteArrayDeserializer
    max.partition.fetch.bytes = 1048576
    max.poll.interval.ms = 300000
    max.poll.records = 500
    metadata.max.age.ms = 300000
    metric.reporters = []
    metrics.num.samples = 2
    metrics.recording.level = INFO
    metrics.sample.window.ms = 30000
    partition.assignment.strategy = [class org.apache.kafka.clients.consumer.RangeAssignor]
    receive.buffer.bytes = 65536
    reconnect.backoff.max.ms = 1000
    reconnect.backoff.ms = 50
    request.timeout.ms = 30000
    retry.backoff.ms = 100
    sasl.client.callback.handler.class = null
    sasl.jaas.config = null
    sasl.kerberos.kinit.cmd = /usr/bin/kinit
    sasl.kerberos.min.time.before.relogin = 60000
    sasl.kerberos.service.name = null
    sasl.kerberos.ticket.renew.jitter = 0.05
    sasl.kerberos.ticket.renew.window.factor = 0.8
    sasl.login.callback.handler.class = null
    sasl.login.class = null
    sasl.login.refresh.buffer.seconds = 300
    sasl.login.refresh.min.period.seconds = 60
    sasl.login.refresh.window.factor = 0.8
    sasl.login.refresh.window.jitter = 0.05
    sasl.mechanism = GSSAPI
    security.protocol = PLAINTEXT
    send.buffer.bytes = 131072
    session.timeout.ms = 10000
    ssl.cipher.suites = null
    ssl.enabled.protocols = [TLSv1.2, TLSv1.1, TLSv1]
    ssl.endpoint.identification.algorithm = https
    ssl.key.password = null
    ssl.keymanager.algorithm = SunX509
    ssl.keystore.location = null
    ssl.keystore.password = null
    ssl.keystore.type = JKS
    ssl.protocol = TLS
    ssl.provider = null
    ssl.secure.random.implementation = null
    ssl.trustmanager.algorithm = PKIX
    ssl.truststore.location = null
    ssl.truststore.password = null
    ssl.truststore.type = JKS
    value.deserializer = class org.apache.kafka.common.serialization.ByteArrayDeserializer
 (org.apache.kafka.clients.consumer.ConsumerConfig:347)
[2020-05-22 03:45:32,075] INFO Kafka version: 2.3.0 (org.apache.kafka.common.utils.AppInfoParser:117)
[2020-05-22 03:45:32,076] INFO Kafka commitId: fc1aaa116b661c8a (org.apache.kafka.common.utils.AppInfoParser:118)
[2020-05-22 03:45:32,076] INFO Kafka startTimeMs: 1590119132075 (org.apache.kafka.common.utils.AppInfoParser:119)
[2020-05-22 03:45:32,083] INFO Creating task data-output-stream-3 (org.apache.kafka.connect.runtime.Worker:414)
[2020-05-22 03:45:32,084] INFO [Consumer clientId=connector-consumer-data-output-stream-2, groupId=connect-data-output-stream] Subscribed to topic(s): data-output-stream (org.apache.kafka.clients.consumer.KafkaConsumer:964)
[2020-05-22 03:45:32,085] INFO ConnectorConfig values: 
    config.action.reload = restart
    connector.class = com.google.pubsub.kafka.sink.CloudPubSubSinkConnector
    errors.log.enable = false
    errors.log.include.messages = false
    errors.retry.delay.max.ms = 60000
    errors.retry.timeout = 0
    errors.tolerance = none
    header.converter = null
    key.converter = null
    name = data-output-stream
    tasks.max = 10
    transforms = []
    value.converter = null
 (org.apache.kafka.connect.runtime.ConnectorConfig:347)
[2020-05-22 03:45:32,087] INFO EnrichedConnectorConfig values: 
    config.action.reload = restart
    connector.class = com.google.pubsub.kafka.sink.CloudPubSubSinkConnector
    errors.log.enable = false
    errors.log.include.messages = false
    errors.retry.delay.max.ms = 60000
    errors.retry.timeout = 0
    errors.tolerance = none
    header.converter = null
    key.converter = null
    name = data-output-stream
    tasks.max = 10
    transforms = []
    value.converter = null
 (org.apache.kafka.connect.runtime.ConnectorConfig$EnrichedConnectorConfig:347)
[2020-05-22 03:45:32,087] INFO TaskConfig values: 
    task.class = class com.google.pubsub.kafka.sink.CloudPubSubSinkTask
 (org.apache.kafka.connect.runtime.TaskConfig:347)
[2020-05-22 03:45:32,087] INFO Instantiated task data-output-stream-3 with version 2.3.0 of type com.google.pubsub.kafka.sink.CloudPubSubSinkTask (org.apache.kafka.connect.runtime.Worker:428)
[2020-05-22 03:45:32,088] INFO StringConverterConfig values: 
    converter.encoding = UTF8
    converter.type = key
 (org.apache.kafka.connect.storage.StringConverterConfig:347)
[2020-05-22 03:45:32,088] INFO Set up the key converter class org.apache.kafka.connect.storage.StringConverter for task data-output-stream-3 using the worker config (org.apache.kafka.connect.runtime.Worker:441)
[2020-05-22 03:45:32,088] INFO StringConverterConfig values: 
    converter.encoding = UTF8
    converter.type = value
 (org.apache.kafka.connect.storage.StringConverterConfig:347)
[2020-05-22 03:45:32,089] INFO Set up the value converter class org.apache.kafka.connect.storage.StringConverter for task data-output-stream-3 using the worker config (org.apache.kafka.connect.runtime.Worker:447)
[2020-05-22 03:45:32,090] INFO Set up the header converter class org.apache.kafka.connect.storage.SimpleHeaderConverter for task data-output-stream-3 using the worker config (org.apache.kafka.connect.runtime.Worker:454)
[2020-05-22 03:45:32,092] INFO Initializing: org.apache.kafka.connect.runtime.TransformationChain{} (org.apache.kafka.connect.runtime.Worker:522)
[2020-05-22 03:45:32,092] INFO SinkConnectorConfig values: 
    config.action.reload = restart
    connector.class = com.google.pubsub.kafka.sink.CloudPubSubSinkConnector
    errors.deadletterqueue.context.headers.enable = false
    errors.deadletterqueue.topic.name = 
    errors.deadletterqueue.topic.replication.factor = 3
    errors.log.enable = false
    errors.log.include.messages = false
    errors.retry.delay.max.ms = 60000
    errors.retry.timeout = 0
    errors.tolerance = none
    header.converter = null
    key.converter = null
    name = data-output-stream
    tasks.max = 10
    topics = [data-output-stream]
    topics.regex = 
    transforms = []
    value.converter = null
 (org.apache.kafka.connect.runtime.SinkConnectorConfig:347)
[2020-05-22 03:45:32,094] INFO EnrichedConnectorConfig values: 
    config.action.reload = restart
    connector.class = com.google.pubsub.kafka.sink.CloudPubSubSinkConnector
    errors.deadletterqueue.context.headers.enable = false
    errors.deadletterqueue.topic.name = 
    errors.deadletterqueue.topic.replication.factor = 3
    errors.log.enable = false
    errors.log.include.messages = false
    errors.retry.delay.max.ms = 60000
    errors.retry.timeout = 0
    errors.tolerance = none
    header.converter = null
    key.converter = null
    name = data-output-stream
    tasks.max = 10
    topics = [data-output-stream]
    topics.regex = 
    transforms = []
    value.converter = null
 (org.apache.kafka.connect.runtime.ConnectorConfig$EnrichedConnectorConfig:347)
[2020-05-22 03:45:32,095] INFO ConsumerConfig values: 
    allow.auto.create.topics = true
    auto.commit.interval.ms = 5000
    auto.offset.reset = earliest
    bootstrap.servers = [kafka:29092]
    check.crcs = true
    client.dns.lookup = default
    client.id = connector-consumer-data-output-stream-3
    client.rack = 
    connections.max.idle.ms = 540000
    default.api.timeout.ms = 60000
    enable.auto.commit = false
    exclude.internal.topics = true
    fetch.max.bytes = 52428800
    fetch.max.wait.ms = 500
    fetch.min.bytes = 1
    group.id = connect-data-output-stream
    group.instance.id = null
    heartbeat.interval.ms = 3000
    interceptor.classes = []
    internal.leave.group.on.close = true
    isolation.level = read_uncommitted
    key.deserializer = class org.apache.kafka.common.serialization.ByteArrayDeserializer
    max.partition.fetch.bytes = 1048576
    max.poll.interval.ms = 300000
    max.poll.records = 500
    metadata.max.age.ms = 300000
    metric.reporters = []
    metrics.num.samples = 2
    metrics.recording.level = INFO
    metrics.sample.window.ms = 30000
    partition.assignment.strategy = [class org.apache.kafka.clients.consumer.RangeAssignor]
    receive.buffer.bytes = 65536
    reconnect.backoff.max.ms = 1000
    reconnect.backoff.ms = 50
    request.timeout.ms = 30000
    retry.backoff.ms = 100
    sasl.client.callback.handler.class = null
    sasl.jaas.config = null
    sasl.kerberos.kinit.cmd = /usr/bin/kinit
    sasl.kerberos.min.time.before.relogin = 60000
    sasl.kerberos.service.name = null
    sasl.kerberos.ticket.renew.jitter = 0.05
    sasl.kerberos.ticket.renew.window.factor = 0.8
    sasl.login.callback.handler.class = null
    sasl.login.class = null
    sasl.login.refresh.buffer.seconds = 300
    sasl.login.refresh.min.period.seconds = 60
    sasl.login.refresh.window.factor = 0.8
    sasl.login.refresh.window.jitter = 0.05
    sasl.mechanism = GSSAPI
    security.protocol = PLAINTEXT
    send.buffer.bytes = 131072
    session.timeout.ms = 10000
    ssl.cipher.suites = null
    ssl.enabled.protocols = [TLSv1.2, TLSv1.1, TLSv1]
    ssl.endpoint.identification.algorithm = https
    ssl.key.password = null
    ssl.keymanager.algorithm = SunX509
    ssl.keystore.location = null
    ssl.keystore.password = null
    ssl.keystore.type = JKS
    ssl.protocol = TLS
    ssl.provider = null
    ssl.secure.random.implementation = null
    ssl.trustmanager.algorithm = PKIX
    ssl.truststore.location = null
    ssl.truststore.password = null
    ssl.truststore.type = JKS
    value.deserializer = class org.apache.kafka.common.serialization.ByteArrayDeserializer
 (org.apache.kafka.clients.consumer.ConsumerConfig:347)
[2020-05-22 03:45:32,107] INFO Kafka version: 2.3.0 (org.apache.kafka.common.utils.AppInfoParser:117)
[2020-05-22 03:45:32,107] INFO Kafka commitId: fc1aaa116b661c8a (org.apache.kafka.common.utils.AppInfoParser:118)
[2020-05-22 03:45:32,108] INFO Kafka startTimeMs: 1590119132107 (org.apache.kafka.common.utils.AppInfoParser:119)
[2020-05-22 03:45:32,113] INFO Creating task data-output-stream-4 (org.apache.kafka.connect.runtime.Worker:414)
[2020-05-22 03:45:32,114] INFO ConnectorConfig values: 
    config.action.reload = restart
    connector.class = com.google.pubsub.kafka.sink.CloudPubSubSinkConnector
    errors.log.enable = false
    errors.log.include.messages = false
    errors.retry.delay.max.ms = 60000
    errors.retry.timeout = 0
    errors.tolerance = none
    header.converter = null
    key.converter = null
    name = data-output-stream
    tasks.max = 10
    transforms = []
    value.converter = null
 (org.apache.kafka.connect.runtime.ConnectorConfig:347)
[2020-05-22 03:45:32,114] INFO EnrichedConnectorConfig values: 
    config.action.reload = restart
    connector.class = com.google.pubsub.kafka.sink.CloudPubSubSinkConnector
    errors.log.enable = false
    errors.log.include.messages = false
    errors.retry.delay.max.ms = 60000
    errors.retry.timeout = 0
    errors.tolerance = none
    header.converter = null
    key.converter = null
    name = data-output-stream
    tasks.max = 10
    transforms = []
    value.converter = null
 (org.apache.kafka.connect.runtime.ConnectorConfig$EnrichedConnectorConfig:347)
[2020-05-22 03:45:32,115] INFO TaskConfig values: 
    task.class = class com.google.pubsub.kafka.sink.CloudPubSubSinkTask
 (org.apache.kafka.connect.runtime.TaskConfig:347)
[2020-05-22 03:45:32,115] INFO Instantiated task data-output-stream-4 with version 2.3.0 of type com.google.pubsub.kafka.sink.CloudPubSubSinkTask (org.apache.kafka.connect.runtime.Worker:428)
[2020-05-22 03:45:32,116] INFO StringConverterConfig values: 
    converter.encoding = UTF8
    converter.type = key
 (org.apache.kafka.connect.storage.StringConverterConfig:347)
[2020-05-22 03:45:32,116] INFO Set up the key converter class org.apache.kafka.connect.storage.StringConverter for task data-output-stream-4 using the worker config (org.apache.kafka.connect.runtime.Worker:441)
[2020-05-22 03:45:32,116] INFO StringConverterConfig values: 
    converter.encoding = UTF8
    converter.type = value
 (org.apache.kafka.connect.storage.StringConverterConfig:347)
[2020-05-22 03:45:32,117] INFO Set up the value converter class org.apache.kafka.connect.storage.StringConverter for task data-output-stream-4 using the worker config (org.apache.kafka.connect.runtime.Worker:447)
[2020-05-22 03:45:32,117] INFO Set up the header converter class org.apache.kafka.connect.storage.SimpleHeaderConverter for task data-output-stream-4 using the worker config (org.apache.kafka.connect.runtime.Worker:454)
[2020-05-22 03:45:32,121] INFO Initializing: org.apache.kafka.connect.runtime.TransformationChain{} (org.apache.kafka.connect.runtime.Worker:522)
[2020-05-22 03:45:32,121] INFO SinkConnectorConfig values: 
    config.action.reload = restart
    connector.class = com.google.pubsub.kafka.sink.CloudPubSubSinkConnector
    errors.deadletterqueue.context.headers.enable = false
    errors.deadletterqueue.topic.name = 
    errors.deadletterqueue.topic.replication.factor = 3
    errors.log.enable = false
    errors.log.include.messages = false
    errors.retry.delay.max.ms = 60000
    errors.retry.timeout = 0
    errors.tolerance = none
    header.converter = null
    key.converter = null
    name = data-output-stream
    tasks.max = 10
    topics = [data-output-stream]
    topics.regex = 
    transforms = []
    value.converter = null
 (org.apache.kafka.connect.runtime.SinkConnectorConfig:347)
[2020-05-22 03:45:32,122] INFO EnrichedConnectorConfig values: 
    config.action.reload = restart
    connector.class = com.google.pubsub.kafka.sink.CloudPubSubSinkConnector
    errors.deadletterqueue.context.headers.enable = false
    errors.deadletterqueue.topic.name = 
    errors.deadletterqueue.topic.replication.factor = 3
    errors.log.enable = false
    errors.log.include.messages = false
    errors.retry.delay.max.ms = 60000
    errors.retry.timeout = 0
    errors.tolerance = none
    header.converter = null
    key.converter = null
    name = data-output-stream
    tasks.max = 10
    topics = [data-output-stream]
    topics.regex = 
    transforms = []
    value.converter = null
 (org.apache.kafka.connect.runtime.ConnectorConfig$EnrichedConnectorConfig:347)
[2020-05-22 03:45:32,124] INFO ConsumerConfig values: 
    allow.auto.create.topics = true
    auto.commit.interval.ms = 5000
    auto.offset.reset = earliest
    bootstrap.servers = [kafka:29092]
    check.crcs = true
    client.dns.lookup = default
    client.id = connector-consumer-data-output-stream-4
    client.rack = 
    connections.max.idle.ms = 540000
    default.api.timeout.ms = 60000
    enable.auto.commit = false
    exclude.internal.topics = true
    fetch.max.bytes = 52428800
    fetch.max.wait.ms = 500
    fetch.min.bytes = 1
    group.id = connect-data-output-stream
    group.instance.id = null
    heartbeat.interval.ms = 3000
    interceptor.classes = []
    internal.leave.group.on.close = true
    isolation.level = read_uncommitted
    key.deserializer = class org.apache.kafka.common.serialization.ByteArrayDeserializer
    max.partition.fetch.bytes = 1048576
    max.poll.interval.ms = 300000
    max.poll.records = 500
    metadata.max.age.ms = 300000
    metric.reporters = []
    metrics.num.samples = 2
    metrics.recording.level = INFO
    metrics.sample.window.ms = 30000
    partition.assignment.strategy = [class org.apache.kafka.clients.consumer.RangeAssignor]
    receive.buffer.bytes = 65536
    reconnect.backoff.max.ms = 1000
    reconnect.backoff.ms = 50
    request.timeout.ms = 30000
    retry.backoff.ms = 100
    sasl.client.callback.handler.class = null
    sasl.jaas.config = null
    sasl.kerberos.kinit.cmd = /usr/bin/kinit
    sasl.kerberos.min.time.before.relogin = 60000
    sasl.kerberos.service.name = null
    sasl.kerberos.ticket.renew.jitter = 0.05
    sasl.kerberos.ticket.renew.window.factor = 0.8
    sasl.login.callback.handler.class = null
    sasl.login.class = null
    sasl.login.refresh.buffer.seconds = 300
    sasl.login.refresh.min.period.seconds = 60
    sasl.login.refresh.window.factor = 0.8
    sasl.login.refresh.window.jitter = 0.05
    sasl.mechanism = GSSAPI
    security.protocol = PLAINTEXT
    send.buffer.bytes = 131072
    session.timeout.ms = 10000
    ssl.cipher.suites = null
    ssl.enabled.protocols = [TLSv1.2, TLSv1.1, TLSv1]
    ssl.endpoint.identification.algorithm = https
    ssl.key.password = null
    ssl.keymanager.algorithm = SunX509
    ssl.keystore.location = null
    ssl.keystore.password = null
    ssl.keystore.type = JKS
    ssl.protocol = TLS
    ssl.provider = null
    ssl.secure.random.implementation = null
    ssl.trustmanager.algorithm = PKIX
    ssl.truststore.location = null
    ssl.truststore.password = null
    ssl.truststore.type = JKS
    value.deserializer = class org.apache.kafka.common.serialization.ByteArrayDeserializer
 (org.apache.kafka.clients.consumer.ConsumerConfig:347)
[2020-05-22 03:45:32,130] INFO [Consumer clientId=connector-consumer-data-output-stream-3, groupId=connect-data-output-stream] Subscribed to topic(s): data-output-stream (org.apache.kafka.clients.consumer.KafkaConsumer:964)
[2020-05-22 03:45:32,141] INFO Kafka version: 2.3.0 (org.apache.kafka.common.utils.AppInfoParser:117)
[2020-05-22 03:45:32,142] INFO Kafka commitId: fc1aaa116b661c8a (org.apache.kafka.common.utils.AppInfoParser:118)
[2020-05-22 03:45:32,142] INFO Kafka startTimeMs: 1590119132141 (org.apache.kafka.common.utils.AppInfoParser:119)
[2020-05-22 03:45:32,150] INFO Creating task data-output-stream-5 (org.apache.kafka.connect.runtime.Worker:414)
[2020-05-22 03:45:32,152] INFO ConnectorConfig values: 
    config.action.reload = restart
    connector.class = com.google.pubsub.kafka.sink.CloudPubSubSinkConnector
    errors.log.enable = false
    errors.log.include.messages = false
    errors.retry.delay.max.ms = 60000
    errors.retry.timeout = 0
    errors.tolerance = none
    header.converter = null
    key.converter = null
    name = data-output-stream
    tasks.max = 10
    transforms = []
    value.converter = null
 (org.apache.kafka.connect.runtime.ConnectorConfig:347)
[2020-05-22 03:45:32,152] INFO EnrichedConnectorConfig values: 
    config.action.reload = restart
    connector.class = com.google.pubsub.kafka.sink.CloudPubSubSinkConnector
    errors.log.enable = false
    errors.log.include.messages = false
    errors.retry.delay.max.ms = 60000
    errors.retry.timeout = 0
    errors.tolerance = none
    header.converter = null
    key.converter = null
    name = data-output-stream
    tasks.max = 10
    transforms = []
    value.converter = null
 (org.apache.kafka.connect.runtime.ConnectorConfig$EnrichedConnectorConfig:347)
[2020-05-22 03:45:32,153] INFO TaskConfig values: 
    task.class = class com.google.pubsub.kafka.sink.CloudPubSubSinkTask
 (org.apache.kafka.connect.runtime.TaskConfig:347)
[2020-05-22 03:45:32,154] INFO Instantiated task data-output-stream-5 with version 2.3.0 of type com.google.pubsub.kafka.sink.CloudPubSubSinkTask (org.apache.kafka.connect.runtime.Worker:428)
[2020-05-22 03:45:32,154] INFO StringConverterConfig values: 
    converter.encoding = UTF8
    converter.type = key
 (org.apache.kafka.connect.storage.StringConverterConfig:347)
[2020-05-22 03:45:32,155] INFO Set up the key converter class org.apache.kafka.connect.storage.StringConverter for task data-output-stream-5 using the worker config (org.apache.kafka.connect.runtime.Worker:441)
[2020-05-22 03:45:32,155] INFO StringConverterConfig values: 
    converter.encoding = UTF8
    converter.type = value
 (org.apache.kafka.connect.storage.StringConverterConfig:347)
[2020-05-22 03:45:32,156] INFO Set up the value converter class org.apache.kafka.connect.storage.StringConverter for task data-output-stream-5 using the worker config (org.apache.kafka.connect.runtime.Worker:447)
[2020-05-22 03:45:32,157] INFO Set up the header converter class org.apache.kafka.connect.storage.SimpleHeaderConverter for task data-output-stream-5 using the worker config (org.apache.kafka.connect.runtime.Worker:454)
[2020-05-22 03:45:32,160] INFO Initializing: org.apache.kafka.connect.runtime.TransformationChain{} (org.apache.kafka.connect.runtime.Worker:522)
[2020-05-22 03:45:32,160] INFO SinkConnectorConfig values: 
    config.action.reload = restart
    connector.class = com.google.pubsub.kafka.sink.CloudPubSubSinkConnector
    errors.deadletterqueue.context.headers.enable = false
    errors.deadletterqueue.topic.name = 
    errors.deadletterqueue.topic.replication.factor = 3
    errors.log.enable = false
    errors.log.include.messages = false
    errors.retry.delay.max.ms = 60000
    errors.retry.timeout = 0
    errors.tolerance = none
    header.converter = null
    key.converter = null
    name = data-output-stream
    tasks.max = 10
    topics = [data-output-stream]
    topics.regex = 
    transforms = []
    value.converter = null
 (org.apache.kafka.connect.runtime.SinkConnectorConfig:347)
[2020-05-22 03:45:32,161] INFO EnrichedConnectorConfig values: 
    config.action.reload = restart
    connector.class = com.google.pubsub.kafka.sink.CloudPubSubSinkConnector
    errors.deadletterqueue.context.headers.enable = false
    errors.deadletterqueue.topic.name = 
    errors.deadletterqueue.topic.replication.factor = 3
    errors.log.enable = false
    errors.log.include.messages = false
    errors.retry.delay.max.ms = 60000
    errors.retry.timeout = 0
    errors.tolerance = none
    header.converter = null
    key.converter = null
    name = data-output-stream
    tasks.max = 10
    topics = [data-output-stream]
    topics.regex = 
    transforms = []
    value.converter = null
 (org.apache.kafka.connect.runtime.ConnectorConfig$EnrichedConnectorConfig:347)
[2020-05-22 03:45:32,163] INFO ConsumerConfig values: 
    allow.auto.create.topics = true
    auto.commit.interval.ms = 5000
    auto.offset.reset = earliest
    bootstrap.servers = [kafka:29092]
    check.crcs = true
    client.dns.lookup = default
    client.id = connector-consumer-data-output-stream-5
    client.rack = 
    connections.max.idle.ms = 540000
    default.api.timeout.ms = 60000
    enable.auto.commit = false
    exclude.internal.topics = true
    fetch.max.bytes = 52428800
    fetch.max.wait.ms = 500
    fetch.min.bytes = 1
    group.id = connect-data-output-stream
    group.instance.id = null
    heartbeat.interval.ms = 3000
    interceptor.classes = []
    internal.leave.group.on.close = true
    isolation.level = read_uncommitted
    key.deserializer = class org.apache.kafka.common.serialization.ByteArrayDeserializer
    max.partition.fetch.bytes = 1048576
    max.poll.interval.ms = 300000
    max.poll.records = 500
    metadata.max.age.ms = 300000
    metric.reporters = []
    metrics.num.samples = 2
    metrics.recording.level = INFO
    metrics.sample.window.ms = 30000
    partition.assignment.strategy = [class org.apache.kafka.clients.consumer.RangeAssignor]
    receive.buffer.bytes = 65536
    reconnect.backoff.max.ms = 1000
    reconnect.backoff.ms = 50
    request.timeout.ms = 30000
    retry.backoff.ms = 100
    sasl.client.callback.handler.class = null
    sasl.jaas.config = null
    sasl.kerberos.kinit.cmd = /usr/bin/kinit
    sasl.kerberos.min.time.before.relogin = 60000
    sasl.kerberos.service.name = null
    sasl.kerberos.ticket.renew.jitter = 0.05
    sasl.kerberos.ticket.renew.window.factor = 0.8
    sasl.login.callback.handler.class = null
    sasl.login.class = null
    sasl.login.refresh.buffer.seconds = 300
    sasl.login.refresh.min.period.seconds = 60
    sasl.login.refresh.window.factor = 0.8
    sasl.login.refresh.window.jitter = 0.05
    sasl.mechanism = GSSAPI
    security.protocol = PLAINTEXT
    send.buffer.bytes = 131072
    session.timeout.ms = 10000
    ssl.cipher.suites = null
    ssl.enabled.protocols = [TLSv1.2, TLSv1.1, TLSv1]
    ssl.endpoint.identification.algorithm = https
    ssl.key.password = null
    ssl.keymanager.algorithm = SunX509
    ssl.keystore.location = null
    ssl.keystore.password = null
    ssl.keystore.type = JKS
    ssl.protocol = TLS
    ssl.provider = null
    ssl.secure.random.implementation = null
    ssl.trustmanager.algorithm = PKIX
    ssl.truststore.location = null
    ssl.truststore.password = null
    ssl.truststore.type = JKS
    value.deserializer = class org.apache.kafka.common.serialization.ByteArrayDeserializer
 (org.apache.kafka.clients.consumer.ConsumerConfig:347)
[2020-05-22 03:45:32,169] INFO [Consumer clientId=connector-consumer-data-output-stream-4, groupId=connect-data-output-stream] Subscribed to topic(s): data-output-stream (org.apache.kafka.clients.consumer.KafkaConsumer:964)
[2020-05-22 03:45:32,180] INFO Kafka version: 2.3.0 (org.apache.kafka.common.utils.AppInfoParser:117)
[2020-05-22 03:45:32,181] INFO Kafka commitId: fc1aaa116b661c8a (org.apache.kafka.common.utils.AppInfoParser:118)
[2020-05-22 03:45:32,181] INFO Kafka startTimeMs: 1590119132180 (org.apache.kafka.common.utils.AppInfoParser:119)
[2020-05-22 03:45:32,191] INFO Creating task data-output-stream-6 (org.apache.kafka.connect.runtime.Worker:414)
[2020-05-22 03:45:32,197] INFO [Consumer clientId=connector-consumer-data-output-stream-5, groupId=connect-data-output-stream] Subscribed to topic(s): data-output-stream (org.apache.kafka.clients.consumer.KafkaConsumer:964)
[2020-05-22 03:45:32,198] INFO ConnectorConfig values: 
    config.action.reload = restart
    connector.class = com.google.pubsub.kafka.sink.CloudPubSubSinkConnector
    errors.log.enable = false
    errors.log.include.messages = false
    errors.retry.delay.max.ms = 60000
    errors.retry.timeout = 0
    errors.tolerance = none
    header.converter = null
    key.converter = null
    name = data-output-stream
    tasks.max = 10
    transforms = []
    value.converter = null
 (org.apache.kafka.connect.runtime.ConnectorConfig:347)
[2020-05-22 03:45:32,199] INFO EnrichedConnectorConfig values: 
    config.action.reload = restart
    connector.class = com.google.pubsub.kafka.sink.CloudPubSubSinkConnector
    errors.log.enable = false
    errors.log.include.messages = false
    errors.retry.delay.max.ms = 60000
    errors.retry.timeout = 0
    errors.tolerance = none
    header.converter = null
    key.converter = null
    name = data-output-stream
    tasks.max = 10
    transforms = []
    value.converter = null
 (org.apache.kafka.connect.runtime.ConnectorConfig$EnrichedConnectorConfig:347)
[2020-05-22 03:45:32,199] INFO TaskConfig values: 
    task.class = class com.google.pubsub.kafka.sink.CloudPubSubSinkTask
 (org.apache.kafka.connect.runtime.TaskConfig:347)
[2020-05-22 03:45:32,200] INFO Instantiated task data-output-stream-6 with version 2.3.0 of type com.google.pubsub.kafka.sink.CloudPubSubSinkTask (org.apache.kafka.connect.runtime.Worker:428)
[2020-05-22 03:45:32,206] INFO StringConverterConfig values: 
    converter.encoding = UTF8
    converter.type = key
 (org.apache.kafka.connect.storage.StringConverterConfig:347)
[2020-05-22 03:45:32,206] INFO Set up the key converter class org.apache.kafka.connect.storage.StringConverter for task data-output-stream-6 using the worker config (org.apache.kafka.connect.runtime.Worker:441)
[2020-05-22 03:45:32,211] INFO StringConverterConfig values: 
    converter.encoding = UTF8
    converter.type = value
 (org.apache.kafka.connect.storage.StringConverterConfig:347)
[2020-05-22 03:45:32,212] INFO Set up the value converter class org.apache.kafka.connect.storage.StringConverter for task data-output-stream-6 using the worker config (org.apache.kafka.connect.runtime.Worker:447)
[2020-05-22 03:45:32,213] INFO Set up the header converter class org.apache.kafka.connect.storage.SimpleHeaderConverter for task data-output-stream-6 using the worker config (org.apache.kafka.connect.runtime.Worker:454)
[2020-05-22 03:45:32,215] INFO Initializing: org.apache.kafka.connect.runtime.TransformationChain{} (org.apache.kafka.connect.runtime.Worker:522)
[2020-05-22 03:45:32,216] INFO SinkConnectorConfig values: 
    config.action.reload = restart
    connector.class = com.google.pubsub.kafka.sink.CloudPubSubSinkConnector
    errors.deadletterqueue.context.headers.enable = false
    errors.deadletterqueue.topic.name = 
    errors.deadletterqueue.topic.replication.factor = 3
    errors.log.enable = false
    errors.log.include.messages = false
    errors.retry.delay.max.ms = 60000
    errors.retry.timeout = 0
    errors.tolerance = none
    header.converter = null
    key.converter = null
    name = data-output-stream
    tasks.max = 10
    topics = [data-output-stream]
    topics.regex = 
    transforms = []
    value.converter = null
 (org.apache.kafka.connect.runtime.SinkConnectorConfig:347)
[2020-05-22 03:45:32,217] INFO EnrichedConnectorConfig values: 
    config.action.reload = restart
    connector.class = com.google.pubsub.kafka.sink.CloudPubSubSinkConnector
    errors.deadletterqueue.context.headers.enable = false
    errors.deadletterqueue.topic.name = 
    errors.deadletterqueue.topic.replication.factor = 3
    errors.log.enable = false
    errors.log.include.messages = false
    errors.retry.delay.max.ms = 60000
    errors.retry.timeout = 0
    errors.tolerance = none
    header.converter = null
    key.converter = null
    name = data-output-stream
    tasks.max = 10
    topics = [data-output-stream]
    topics.regex = 
    transforms = []
    value.converter = null
 (org.apache.kafka.connect.runtime.ConnectorConfig$EnrichedConnectorConfig:347)
[2020-05-22 03:45:32,231] INFO ConsumerConfig values: 
    allow.auto.create.topics = true
    auto.commit.interval.ms = 5000
    auto.offset.reset = earliest
    bootstrap.servers = [kafka:29092]
    check.crcs = true
    client.dns.lookup = default
    client.id = connector-consumer-data-output-stream-6
    client.rack = 
    connections.max.idle.ms = 540000
    default.api.timeout.ms = 60000
    enable.auto.commit = false
    exclude.internal.topics = true
    fetch.max.bytes = 52428800
    fetch.max.wait.ms = 500
    fetch.min.bytes = 1
    group.id = connect-data-output-stream
    group.instance.id = null
    heartbeat.interval.ms = 3000
    interceptor.classes = []
    internal.leave.group.on.close = true
    isolation.level = read_uncommitted
    key.deserializer = class org.apache.kafka.common.serialization.ByteArrayDeserializer
    max.partition.fetch.bytes = 1048576
    max.poll.interval.ms = 300000
    max.poll.records = 500
    metadata.max.age.ms = 300000
    metric.reporters = []
    metrics.num.samples = 2
    metrics.recording.level = INFO
    metrics.sample.window.ms = 30000
    partition.assignment.strategy = [class org.apache.kafka.clients.consumer.RangeAssignor]
    receive.buffer.bytes = 65536
    reconnect.backoff.max.ms = 1000
    reconnect.backoff.ms = 50
    request.timeout.ms = 30000
    retry.backoff.ms = 100
    sasl.client.callback.handler.class = null
    sasl.jaas.config = null
    sasl.kerberos.kinit.cmd = /usr/bin/kinit
    sasl.kerberos.min.time.before.relogin = 60000
    sasl.kerberos.service.name = null
    sasl.kerberos.ticket.renew.jitter = 0.05
    sasl.kerberos.ticket.renew.window.factor = 0.8
    sasl.login.callback.handler.class = null
    sasl.login.class = null
    sasl.login.refresh.buffer.seconds = 300
    sasl.login.refresh.min.period.seconds = 60
    sasl.login.refresh.window.factor = 0.8
    sasl.login.refresh.window.jitter = 0.05
    sasl.mechanism = GSSAPI
    security.protocol = PLAINTEXT
    send.buffer.bytes = 131072
    session.timeout.ms = 10000
    ssl.cipher.suites = null
    ssl.enabled.protocols = [TLSv1.2, TLSv1.1, TLSv1]
    ssl.endpoint.identification.algorithm = https
    ssl.key.password = null
    ssl.keymanager.algorithm = SunX509
    ssl.keystore.location = null
    ssl.keystore.password = null
    ssl.keystore.type = JKS
    ssl.protocol = TLS
    ssl.provider = null
    ssl.secure.random.implementation = null
    ssl.trustmanager.algorithm = PKIX
    ssl.truststore.location = null
    ssl.truststore.password = null
    ssl.truststore.type = JKS
    value.deserializer = class org.apache.kafka.common.serialization.ByteArrayDeserializer
 (org.apache.kafka.clients.consumer.ConsumerConfig:347)
[2020-05-22 03:45:32,244] INFO Kafka version: 2.3.0 (org.apache.kafka.common.utils.AppInfoParser:117)
[2020-05-22 03:45:32,244] INFO Kafka commitId: fc1aaa116b661c8a (org.apache.kafka.common.utils.AppInfoParser:118)
[2020-05-22 03:45:32,245] INFO Kafka startTimeMs: 1590119132244 (org.apache.kafka.common.utils.AppInfoParser:119)
[2020-05-22 03:45:32,248] INFO Creating task data-output-stream-7 (org.apache.kafka.connect.runtime.Worker:414)
[2020-05-22 03:45:32,249] INFO ConnectorConfig values: 
    config.action.reload = restart
    connector.class = com.google.pubsub.kafka.sink.CloudPubSubSinkConnector
    errors.log.enable = false
    errors.log.include.messages = false
    errors.retry.delay.max.ms = 60000
    errors.retry.timeout = 0
    errors.tolerance = none
    header.converter = null
    key.converter = null
    name = data-output-stream
    tasks.max = 10
    transforms = []
    value.converter = null
 (org.apache.kafka.connect.runtime.ConnectorConfig:347)
[2020-05-22 03:45:32,250] INFO EnrichedConnectorConfig values: 
    config.action.reload = restart
    connector.class = com.google.pubsub.kafka.sink.CloudPubSubSinkConnector
    errors.log.enable = false
    errors.log.include.messages = false
    errors.retry.delay.max.ms = 60000
    errors.retry.timeout = 0
    errors.tolerance = none
    header.converter = null
    key.converter = null
    name = data-output-stream
    tasks.max = 10
    transforms = []
    value.converter = null
 (org.apache.kafka.connect.runtime.ConnectorConfig$EnrichedConnectorConfig:347)
[2020-05-22 03:45:32,250] INFO TaskConfig values: 
    task.class = class com.google.pubsub.kafka.sink.CloudPubSubSinkTask
 (org.apache.kafka.connect.runtime.TaskConfig:347)
[2020-05-22 03:45:32,251] INFO Instantiated task data-output-stream-7 with version 2.3.0 of type com.google.pubsub.kafka.sink.CloudPubSubSinkTask (org.apache.kafka.connect.runtime.Worker:428)
[2020-05-22 03:45:32,251] INFO StringConverterConfig values: 
    converter.encoding = UTF8
    converter.type = key
 (org.apache.kafka.connect.storage.StringConverterConfig:347)
[2020-05-22 03:45:32,252] INFO Set up the key converter class org.apache.kafka.connect.storage.StringConverter for task data-output-stream-7 using the worker config (org.apache.kafka.connect.runtime.Worker:441)
[2020-05-22 03:45:32,253] INFO StringConverterConfig values: 
    converter.encoding = UTF8
    converter.type = value
 (org.apache.kafka.connect.storage.StringConverterConfig:347)
[2020-05-22 03:45:32,253] INFO Set up the value converter class org.apache.kafka.connect.storage.StringConverter for task data-output-stream-7 using the worker config (org.apache.kafka.connect.runtime.Worker:447)
[2020-05-22 03:45:32,253] INFO Set up the header converter class org.apache.kafka.connect.storage.SimpleHeaderConverter for task data-output-stream-7 using the worker config (org.apache.kafka.connect.runtime.Worker:454)
[2020-05-22 03:45:32,255] INFO Initializing: org.apache.kafka.connect.runtime.TransformationChain{} (org.apache.kafka.connect.runtime.Worker:522)
[2020-05-22 03:45:32,255] INFO SinkConnectorConfig values: 
    config.action.reload = restart
    connector.class = com.google.pubsub.kafka.sink.CloudPubSubSinkConnector
    errors.deadletterqueue.context.headers.enable = false
    errors.deadletterqueue.topic.name = 
    errors.deadletterqueue.topic.replication.factor = 3
    errors.log.enable = false
    errors.log.include.messages = false
    errors.retry.delay.max.ms = 60000
    errors.retry.timeout = 0
    errors.tolerance = none
    header.converter = null
    key.converter = null
    name = data-output-stream
    tasks.max = 10
    topics = [data-output-stream]
    topics.regex = 
    transforms = []
    value.converter = null
 (org.apache.kafka.connect.runtime.SinkConnectorConfig:347)
[2020-05-22 03:45:32,256] INFO EnrichedConnectorConfig values: 
    config.action.reload = restart
    connector.class = com.google.pubsub.kafka.sink.CloudPubSubSinkConnector
    errors.deadletterqueue.context.headers.enable = false
    errors.deadletterqueue.topic.name = 
    errors.deadletterqueue.topic.replication.factor = 3
    errors.log.enable = false
    errors.log.include.messages = false
    errors.retry.delay.max.ms = 60000
    errors.retry.timeout = 0
    errors.tolerance = none
    header.converter = null
    key.converter = null
    name = data-output-stream
    tasks.max = 10
    topics = [data-output-stream]
    topics.regex = 
    transforms = []
    value.converter = null
 (org.apache.kafka.connect.runtime.ConnectorConfig$EnrichedConnectorConfig:347)
[2020-05-22 03:45:32,257] INFO ConsumerConfig values: 
    allow.auto.create.topics = true
    auto.commit.interval.ms = 5000
    auto.offset.reset = earliest
    bootstrap.servers = [kafka:29092]
    check.crcs = true
    client.dns.lookup = default
    client.id = connector-consumer-data-output-stream-7
    client.rack = 
    connections.max.idle.ms = 540000
    default.api.timeout.ms = 60000
    enable.auto.commit = false
    exclude.internal.topics = true
    fetch.max.bytes = 52428800
    fetch.max.wait.ms = 500
    fetch.min.bytes = 1
    group.id = connect-data-output-stream
    group.instance.id = null
    heartbeat.interval.ms = 3000
    interceptor.classes = []
    internal.leave.group.on.close = true
    isolation.level = read_uncommitted
    key.deserializer = class org.apache.kafka.common.serialization.ByteArrayDeserializer
    max.partition.fetch.bytes = 1048576
    max.poll.interval.ms = 300000
    max.poll.records = 500
    metadata.max.age.ms = 300000
    metric.reporters = []
    metrics.num.samples = 2
    metrics.recording.level = INFO
    metrics.sample.window.ms = 30000
    partition.assignment.strategy = [class org.apache.kafka.clients.consumer.RangeAssignor]
    receive.buffer.bytes = 65536
    reconnect.backoff.max.ms = 1000
    reconnect.backoff.ms = 50
    request.timeout.ms = 30000
    retry.backoff.ms = 100
    sasl.client.callback.handler.class = null
    sasl.jaas.config = null
    sasl.kerberos.kinit.cmd = /usr/bin/kinit
    sasl.kerberos.min.time.before.relogin = 60000
    sasl.kerberos.service.name = null
    sasl.kerberos.ticket.renew.jitter = 0.05
    sasl.kerberos.ticket.renew.window.factor = 0.8
    sasl.login.callback.handler.class = null
    sasl.login.class = null
    sasl.login.refresh.buffer.seconds = 300
    sasl.login.refresh.min.period.seconds = 60
    sasl.login.refresh.window.factor = 0.8
    sasl.login.refresh.window.jitter = 0.05
    sasl.mechanism = GSSAPI
    security.protocol = PLAINTEXT
    send.buffer.bytes = 131072
    session.timeout.ms = 10000
    ssl.cipher.suites = null
    ssl.enabled.protocols = [TLSv1.2, TLSv1.1, TLSv1]
    ssl.endpoint.identification.algorithm = https
    ssl.key.password = null
    ssl.keymanager.algorithm = SunX509
    ssl.keystore.location = null
    ssl.keystore.password = null
    ssl.keystore.type = JKS
    ssl.protocol = TLS
    ssl.provider = null
    ssl.secure.random.implementation = null
    ssl.trustmanager.algorithm = PKIX
    ssl.truststore.location = null
    ssl.truststore.password = null
    ssl.truststore.type = JKS
    value.deserializer = class org.apache.kafka.common.serialization.ByteArrayDeserializer
 (org.apache.kafka.clients.consumer.ConsumerConfig:347)
[2020-05-22 03:45:32,270] INFO [Consumer clientId=connector-consumer-data-output-stream-6, groupId=connect-data-output-stream] Subscribed to topic(s): data-output-stream (org.apache.kafka.clients.consumer.KafkaConsumer:964)
[2020-05-22 03:45:32,273] INFO Kafka version: 2.3.0 (org.apache.kafka.common.utils.AppInfoParser:117)
[2020-05-22 03:45:32,274] INFO Kafka commitId: fc1aaa116b661c8a (org.apache.kafka.common.utils.AppInfoParser:118)
[2020-05-22 03:45:32,274] INFO Kafka startTimeMs: 1590119132272 (org.apache.kafka.common.utils.AppInfoParser:119)
[2020-05-22 03:45:32,284] INFO Creating task data-output-stream-8 (org.apache.kafka.connect.runtime.Worker:414)
[2020-05-22 03:45:32,284] INFO ConnectorConfig values: 
    config.action.reload = restart
    connector.class = com.google.pubsub.kafka.sink.CloudPubSubSinkConnector
    errors.log.enable = false
    errors.log.include.messages = false
    errors.retry.delay.max.ms = 60000
    errors.retry.timeout = 0
    errors.tolerance = none
    header.converter = null
    key.converter = null
    name = data-output-stream
    tasks.max = 10
    transforms = []
    value.converter = null
 (org.apache.kafka.connect.runtime.ConnectorConfig:347)
[2020-05-22 03:45:32,284] INFO EnrichedConnectorConfig values: 
    config.action.reload = restart
    connector.class = com.google.pubsub.kafka.sink.CloudPubSubSinkConnector
    errors.log.enable = false
    errors.log.include.messages = false
    errors.retry.delay.max.ms = 60000
    errors.retry.timeout = 0
    errors.tolerance = none
    header.converter = null
    key.converter = null
    name = data-output-stream
    tasks.max = 10
    transforms = []
    value.converter = null
 (org.apache.kafka.connect.runtime.ConnectorConfig$EnrichedConnectorConfig:347)
[2020-05-22 03:45:32,285] INFO TaskConfig values: 
    task.class = class com.google.pubsub.kafka.sink.CloudPubSubSinkTask
 (org.apache.kafka.connect.runtime.TaskConfig:347)
[2020-05-22 03:45:32,285] INFO Instantiated task data-output-stream-8 with version 2.3.0 of type com.google.pubsub.kafka.sink.CloudPubSubSinkTask (org.apache.kafka.connect.runtime.Worker:428)
[2020-05-22 03:45:32,285] INFO StringConverterConfig values: 
    converter.encoding = UTF8
    converter.type = key
 (org.apache.kafka.connect.storage.StringConverterConfig:347)
[2020-05-22 03:45:32,286] INFO Set up the key converter class org.apache.kafka.connect.storage.StringConverter for task data-output-stream-8 using the worker config (org.apache.kafka.connect.runtime.Worker:441)
[2020-05-22 03:45:32,286] INFO StringConverterConfig values: 
    converter.encoding = UTF8
    converter.type = value
 (org.apache.kafka.connect.storage.StringConverterConfig:347)
[2020-05-22 03:45:32,286] INFO Set up the value converter class org.apache.kafka.connect.storage.StringConverter for task data-output-stream-8 using the worker config (org.apache.kafka.connect.runtime.Worker:447)
[2020-05-22 03:45:32,286] INFO Set up the header converter class org.apache.kafka.connect.storage.SimpleHeaderConverter for task data-output-stream-8 using the worker config (org.apache.kafka.connect.runtime.Worker:454)
[2020-05-22 03:45:32,287] INFO Initializing: org.apache.kafka.connect.runtime.TransformationChain{} (org.apache.kafka.connect.runtime.Worker:522)
[2020-05-22 03:45:32,288] INFO SinkConnectorConfig values: 
    config.action.reload = restart
    connector.class = com.google.pubsub.kafka.sink.CloudPubSubSinkConnector
    errors.deadletterqueue.context.headers.enable = false
    errors.deadletterqueue.topic.name = 
    errors.deadletterqueue.topic.replication.factor = 3
    errors.log.enable = false
    errors.log.include.messages = false
    errors.retry.delay.max.ms = 60000
    errors.retry.timeout = 0
    errors.tolerance = none
    header.converter = null
    key.converter = null
    name = data-output-stream
    tasks.max = 10
    topics = [data-output-stream]
    topics.regex = 
    transforms = []
    value.converter = null
 (org.apache.kafka.connect.runtime.SinkConnectorConfig:347)
[2020-05-22 03:45:32,288] INFO EnrichedConnectorConfig values: 
    config.action.reload = restart
    connector.class = com.google.pubsub.kafka.sink.CloudPubSubSinkConnector
    errors.deadletterqueue.context.headers.enable = false
    errors.deadletterqueue.topic.name = 
    errors.deadletterqueue.topic.replication.factor = 3
    errors.log.enable = false
    errors.log.include.messages = false
    errors.retry.delay.max.ms = 60000
    errors.retry.timeout = 0
    errors.tolerance = none
    header.converter = null
    key.converter = null
    name = data-output-stream
    tasks.max = 10
    topics = [data-output-stream]
    topics.regex = 
    transforms = []
    value.converter = null
 (org.apache.kafka.connect.runtime.ConnectorConfig$EnrichedConnectorConfig:347)
[2020-05-22 03:45:32,291] INFO ConsumerConfig values: 
    allow.auto.create.topics = true
    auto.commit.interval.ms = 5000
    auto.offset.reset = earliest
    bootstrap.servers = [kafka:29092]
    check.crcs = true
    client.dns.lookup = default
    client.id = connector-consumer-data-output-stream-8
    client.rack = 
    connections.max.idle.ms = 540000
    default.api.timeout.ms = 60000
    enable.auto.commit = false
    exclude.internal.topics = true
    fetch.max.bytes = 52428800
    fetch.max.wait.ms = 500
    fetch.min.bytes = 1
    group.id = connect-data-output-stream
    group.instance.id = null
    heartbeat.interval.ms = 3000
    interceptor.classes = []
    internal.leave.group.on.close = true
    isolation.level = read_uncommitted
    key.deserializer = class org.apache.kafka.common.serialization.ByteArrayDeserializer
    max.partition.fetch.bytes = 1048576
    max.poll.interval.ms = 300000
    max.poll.records = 500
    metadata.max.age.ms = 300000
    metric.reporters = []
    metrics.num.samples = 2
    metrics.recording.level = INFO
    metrics.sample.window.ms = 30000
    partition.assignment.strategy = [class org.apache.kafka.clients.consumer.RangeAssignor]
    receive.buffer.bytes = 65536
    reconnect.backoff.max.ms = 1000
    reconnect.backoff.ms = 50
    request.timeout.ms = 30000
    retry.backoff.ms = 100
    sasl.client.callback.handler.class = null
    sasl.jaas.config = null
    sasl.kerberos.kinit.cmd = /usr/bin/kinit
    sasl.kerberos.min.time.before.relogin = 60000
    sasl.kerberos.service.name = null
    sasl.kerberos.ticket.renew.jitter = 0.05
    sasl.kerberos.ticket.renew.window.factor = 0.8
    sasl.login.callback.handler.class = null
    sasl.login.class = null
    sasl.login.refresh.buffer.seconds = 300
    sasl.login.refresh.min.period.seconds = 60
    sasl.login.refresh.window.factor = 0.8
    sasl.login.refresh.window.jitter = 0.05
    sasl.mechanism = GSSAPI
    security.protocol = PLAINTEXT
    send.buffer.bytes = 131072
    session.timeout.ms = 10000
    ssl.cipher.suites = null
    ssl.enabled.protocols = [TLSv1.2, TLSv1.1, TLSv1]
    ssl.endpoint.identification.algorithm = https
    ssl.key.password = null
    ssl.keymanager.algorithm = SunX509
    ssl.keystore.location = null
    ssl.keystore.password = null
    ssl.keystore.type = JKS
    ssl.protocol = TLS
    ssl.provider = null
    ssl.secure.random.implementation = null
    ssl.trustmanager.algorithm = PKIX
    ssl.truststore.location = null
    ssl.truststore.password = null
    ssl.truststore.type = JKS
    value.deserializer = class org.apache.kafka.common.serialization.ByteArrayDeserializer
 (org.apache.kafka.clients.consumer.ConsumerConfig:347)
[2020-05-22 03:45:32,305] INFO [Consumer clientId=connector-consumer-data-output-stream-7, groupId=connect-data-output-stream] Subscribed to topic(s): data-output-stream (org.apache.kafka.clients.consumer.KafkaConsumer:964)
[2020-05-22 03:45:32,311] INFO Kafka version: 2.3.0 (org.apache.kafka.common.utils.AppInfoParser:117)
[2020-05-22 03:45:32,312] INFO Kafka commitId: fc1aaa116b661c8a (org.apache.kafka.common.utils.AppInfoParser:118)
[2020-05-22 03:45:32,312] INFO Kafka startTimeMs: 1590119132311 (org.apache.kafka.common.utils.AppInfoParser:119)
[2020-05-22 03:45:32,318] INFO Creating task data-output-stream-9 (org.apache.kafka.connect.runtime.Worker:414)
[2020-05-22 03:45:32,319] INFO ConnectorConfig values: 
    config.action.reload = restart
    connector.class = com.google.pubsub.kafka.sink.CloudPubSubSinkConnector
    errors.log.enable = false
    errors.log.include.messages = false
    errors.retry.delay.max.ms = 60000
    errors.retry.timeout = 0
    errors.tolerance = none
    header.converter = null
    key.converter = null
    name = data-output-stream
    tasks.max = 10
    transforms = []
    value.converter = null
 (org.apache.kafka.connect.runtime.ConnectorConfig:347)
[2020-05-22 03:45:32,319] INFO EnrichedConnectorConfig values: 
    config.action.reload = restart
    connector.class = com.google.pubsub.kafka.sink.CloudPubSubSinkConnector
    errors.log.enable = false
    errors.log.include.messages = false
    errors.retry.delay.max.ms = 60000
    errors.retry.timeout = 0
    errors.tolerance = none
    header.converter = null
    key.converter = null
    name = data-output-stream
    tasks.max = 10
    transforms = []
    value.converter = null
 (org.apache.kafka.connect.runtime.ConnectorConfig$EnrichedConnectorConfig:347)
[2020-05-22 03:45:32,320] INFO TaskConfig values: 
    task.class = class com.google.pubsub.kafka.sink.CloudPubSubSinkTask
 (org.apache.kafka.connect.runtime.TaskConfig:347)
[2020-05-22 03:45:32,320] INFO Instantiated task data-output-stream-9 with version 2.3.0 of type com.google.pubsub.kafka.sink.CloudPubSubSinkTask (org.apache.kafka.connect.runtime.Worker:428)
[2020-05-22 03:45:32,320] INFO StringConverterConfig values: 
    converter.encoding = UTF8
    converter.type = key
 (org.apache.kafka.connect.storage.StringConverterConfig:347)
[2020-05-22 03:45:32,320] INFO Set up the key converter class org.apache.kafka.connect.storage.StringConverter for task data-output-stream-9 using the worker config (org.apache.kafka.connect.runtime.Worker:441)
[2020-05-22 03:45:32,321] INFO StringConverterConfig values: 
    converter.encoding = UTF8
    converter.type = value
 (org.apache.kafka.connect.storage.StringConverterConfig:347)
[2020-05-22 03:45:32,321] INFO Set up the value converter class org.apache.kafka.connect.storage.StringConverter for task data-output-stream-9 using the worker config (org.apache.kafka.connect.runtime.Worker:447)
[2020-05-22 03:45:32,321] INFO Set up the header converter class org.apache.kafka.connect.storage.SimpleHeaderConverter for task data-output-stream-9 using the worker config (org.apache.kafka.connect.runtime.Worker:454)
[2020-05-22 03:45:32,327] INFO Initializing: org.apache.kafka.connect.runtime.TransformationChain{} (org.apache.kafka.connect.runtime.Worker:522)
[2020-05-22 03:45:32,329] INFO [Consumer clientId=connector-consumer-data-output-stream-8, groupId=connect-data-output-stream] Subscribed to topic(s): data-output-stream (org.apache.kafka.clients.consumer.KafkaConsumer:964)
[2020-05-22 03:45:32,337] INFO SinkConnectorConfig values: 
    config.action.reload = restart
    connector.class = com.google.pubsub.kafka.sink.CloudPubSubSinkConnector
    errors.deadletterqueue.context.headers.enable = false
    errors.deadletterqueue.topic.name = 
    errors.deadletterqueue.topic.replication.factor = 3
    errors.log.enable = false
    errors.log.include.messages = false
    errors.retry.delay.max.ms = 60000
    errors.retry.timeout = 0
    errors.tolerance = none
    header.converter = null
    key.converter = null
    name = data-output-stream
    tasks.max = 10
    topics = [data-output-stream]
    topics.regex = 
    transforms = []
    value.converter = null
 (org.apache.kafka.connect.runtime.SinkConnectorConfig:347)
[2020-05-22 03:45:32,337] INFO EnrichedConnectorConfig values: 
    config.action.reload = restart
    connector.class = com.google.pubsub.kafka.sink.CloudPubSubSinkConnector
    errors.deadletterqueue.context.headers.enable = false
    errors.deadletterqueue.topic.name = 
    errors.deadletterqueue.topic.replication.factor = 3
    errors.log.enable = false
    errors.log.include.messages = false
    errors.retry.delay.max.ms = 60000
    errors.retry.timeout = 0
    errors.tolerance = none
    header.converter = null
    key.converter = null
    name = data-output-stream
    tasks.max = 10
    topics = [data-output-stream]
    topics.regex = 
    transforms = []
    value.converter = null
 (org.apache.kafka.connect.runtime.ConnectorConfig$EnrichedConnectorConfig:347)
[2020-05-22 03:45:32,338] INFO ConsumerConfig values: 
    allow.auto.create.topics = true
    auto.commit.interval.ms = 5000
    auto.offset.reset = earliest
    bootstrap.servers = [kafka:29092]
    check.crcs = true
    client.dns.lookup = default
    client.id = connector-consumer-data-output-stream-9
    client.rack = 
    connections.max.idle.ms = 540000
    default.api.timeout.ms = 60000
    enable.auto.commit = false
    exclude.internal.topics = true
    fetch.max.bytes = 52428800
    fetch.max.wait.ms = 500
    fetch.min.bytes = 1
    group.id = connect-data-output-stream
    group.instance.id = null
    heartbeat.interval.ms = 3000
    interceptor.classes = []
    internal.leave.group.on.close = true
    isolation.level = read_uncommitted
    key.deserializer = class org.apache.kafka.common.serialization.ByteArrayDeserializer
    max.partition.fetch.bytes = 1048576
    max.poll.interval.ms = 300000
    max.poll.records = 500
    metadata.max.age.ms = 300000
    metric.reporters = []
    metrics.num.samples = 2
    metrics.recording.level = INFO
    metrics.sample.window.ms = 30000
    partition.assignment.strategy = [class org.apache.kafka.clients.consumer.RangeAssignor]
    receive.buffer.bytes = 65536
    reconnect.backoff.max.ms = 1000
    reconnect.backoff.ms = 50
    request.timeout.ms = 30000
    retry.backoff.ms = 100
    sasl.client.callback.handler.class = null
    sasl.jaas.config = null
    sasl.kerberos.kinit.cmd = /usr/bin/kinit
    sasl.kerberos.min.time.before.relogin = 60000
    sasl.kerberos.service.name = null
    sasl.kerberos.ticket.renew.jitter = 0.05
    sasl.kerberos.ticket.renew.window.factor = 0.8
    sasl.login.callback.handler.class = null
    sasl.login.class = null
    sasl.login.refresh.buffer.seconds = 300
    sasl.login.refresh.min.period.seconds = 60
    sasl.login.refresh.window.factor = 0.8
    sasl.login.refresh.window.jitter = 0.05
    sasl.mechanism = GSSAPI
    security.protocol = PLAINTEXT
    send.buffer.bytes = 131072
    session.timeout.ms = 10000
    ssl.cipher.suites = null
    ssl.enabled.protocols = [TLSv1.2, TLSv1.1, TLSv1]
    ssl.endpoint.identification.algorithm = https
    ssl.key.password = null
    ssl.keymanager.algorithm = SunX509
    ssl.keystore.location = null
    ssl.keystore.password = null
    ssl.keystore.type = JKS
    ssl.protocol = TLS
    ssl.provider = null
    ssl.secure.random.implementation = null
    ssl.trustmanager.algorithm = PKIX
    ssl.truststore.location = null
    ssl.truststore.password = null
    ssl.truststore.type = JKS
    value.deserializer = class org.apache.kafka.common.serialization.ByteArrayDeserializer
 (org.apache.kafka.clients.consumer.ConsumerConfig:347)
[2020-05-22 03:45:32,345] INFO Kafka version: 2.3.0 (org.apache.kafka.common.utils.AppInfoParser:117)
[2020-05-22 03:45:32,345] INFO Kafka commitId: fc1aaa116b661c8a (org.apache.kafka.common.utils.AppInfoParser:118)
[2020-05-22 03:45:32,346] INFO Kafka startTimeMs: 1590119132345 (org.apache.kafka.common.utils.AppInfoParser:119)
[2020-05-22 03:45:32,355] INFO Created connector data-output-stream (org.apache.kafka.connect.cli.ConnectStandalone:112)
[2020-05-22 03:45:32,361] INFO [Consumer clientId=connector-consumer-data-output-stream-9, groupId=connect-data-output-stream] Subscribed to topic(s): data-output-stream (org.apache.kafka.clients.consumer.KafkaConsumer:964)
[2020-05-22 03:45:33,306] INFO Start CloudPubSubSinkTask (com.google.pubsub.kafka.sink.CloudPubSubSinkTask:145)
[2020-05-22 03:45:33,307] INFO Start CloudPubSubSinkTask (com.google.pubsub.kafka.sink.CloudPubSubSinkTask:145)
[2020-05-22 03:45:33,307] INFO WorkerSinkTask{id=data-output-stream-2} Sink task finished initialization and start (org.apache.kafka.connect.runtime.WorkerSinkTask:301)
[2020-05-22 03:45:33,307] INFO WorkerSinkTask{id=data-output-stream-6} Sink task finished initialization and start (org.apache.kafka.connect.runtime.WorkerSinkTask:301)
[2020-05-22 03:45:33,311] INFO Start CloudPubSubSinkTask (com.google.pubsub.kafka.sink.CloudPubSubSinkTask:145)
[2020-05-22 03:45:33,312] INFO Start CloudPubSubSinkTask (com.google.pubsub.kafka.sink.CloudPubSubSinkTask:145)
[2020-05-22 03:45:33,312] INFO Start CloudPubSubSinkTask (com.google.pubsub.kafka.sink.CloudPubSubSinkTask:145)
[2020-05-22 03:45:33,312] INFO WorkerSinkTask{id=data-output-stream-0} Sink task finished initialization and start (org.apache.kafka.connect.runtime.WorkerSinkTask:301)
[2020-05-22 03:45:33,313] INFO Start CloudPubSubSinkTask (com.google.pubsub.kafka.sink.CloudPubSubSinkTask:145)
[2020-05-22 03:45:33,313] INFO WorkerSinkTask{id=data-output-stream-1} Sink task finished initialization and start (org.apache.kafka.connect.runtime.WorkerSinkTask:301)
[2020-05-22 03:45:33,314] INFO Start CloudPubSubSinkTask (com.google.pubsub.kafka.sink.CloudPubSubSinkTask:145)
[2020-05-22 03:45:33,314] INFO WorkerSinkTask{id=data-output-stream-5} Sink task finished initialization and start (org.apache.kafka.connect.runtime.WorkerSinkTask:301)
[2020-05-22 03:45:33,314] INFO Start CloudPubSubSinkTask (com.google.pubsub.kafka.sink.CloudPubSubSinkTask:145)
[2020-05-22 03:45:33,312] INFO Start CloudPubSubSinkTask (com.google.pubsub.kafka.sink.CloudPubSubSinkTask:145)
[2020-05-22 03:45:33,315] INFO WorkerSinkTask{id=data-output-stream-7} Sink task finished initialization and start (org.apache.kafka.connect.runtime.WorkerSinkTask:301)
[2020-05-22 03:45:33,315] INFO WorkerSinkTask{id=data-output-stream-4} Sink task finished initialization and start (org.apache.kafka.connect.runtime.WorkerSinkTask:301)
[2020-05-22 03:45:33,314] INFO WorkerSinkTask{id=data-output-stream-3} Sink task finished initialization and start (org.apache.kafka.connect.runtime.WorkerSinkTask:301)
[2020-05-22 03:45:33,313] INFO WorkerSinkTask{id=data-output-stream-9} Sink task finished initialization and start (org.apache.kafka.connect.runtime.WorkerSinkTask:301)
[2020-05-22 03:45:33,318] INFO Start CloudPubSubSinkTask (com.google.pubsub.kafka.sink.CloudPubSubSinkTask:145)
[2020-05-22 03:45:33,319] INFO WorkerSinkTask{id=data-output-stream-8} Sink task finished initialization and start (org.apache.kafka.connect.runtime.WorkerSinkTask:301)
[2020-05-22 03:45:33,365] INFO [Consumer clientId=connector-consumer-data-output-stream-7, groupId=connect-data-output-stream] Cluster ID: 36MPGS-bQEGQTzqv2AM-rg (org.apache.kafka.clients.Metadata:266)
[2020-05-22 03:45:33,368] INFO [Consumer clientId=connector-consumer-data-output-stream-8, groupId=connect-data-output-stream] Cluster ID: 36MPGS-bQEGQTzqv2AM-rg (org.apache.kafka.clients.Metadata:266)
[2020-05-22 03:45:33,368] INFO [Consumer clientId=connector-consumer-data-output-stream-2, groupId=connect-data-output-stream] Cluster ID: 36MPGS-bQEGQTzqv2AM-rg (org.apache.kafka.clients.Metadata:266)
[2020-05-22 03:45:33,368] INFO [Consumer clientId=connector-consumer-data-output-stream-4, groupId=connect-data-output-stream] Cluster ID: 36MPGS-bQEGQTzqv2AM-rg (org.apache.kafka.clients.Metadata:266)
[2020-05-22 03:45:33,365] INFO [Consumer clientId=connector-consumer-data-output-stream-0, groupId=connect-data-output-stream] Cluster ID: 36MPGS-bQEGQTzqv2AM-rg (org.apache.kafka.clients.Metadata:266)
[2020-05-22 03:45:33,365] INFO [Consumer clientId=connector-consumer-data-output-stream-9, groupId=connect-data-output-stream] Cluster ID: 36MPGS-bQEGQTzqv2AM-rg (org.apache.kafka.clients.Metadata:266)
[2020-05-22 03:45:33,365] INFO [Consumer clientId=connector-consumer-data-output-stream-1, groupId=connect-data-output-stream] Cluster ID: 36MPGS-bQEGQTzqv2AM-rg (org.apache.kafka.clients.Metadata:266)
[2020-05-22 03:45:33,372] INFO [Consumer clientId=connector-consumer-data-output-stream-2, groupId=connect-data-output-stream] Discovered group coordinator kafka:29092 (id: 2147482646 rack: null) (org.apache.kafka.clients.consumer.internals.AbstractCoordinator:728)
[2020-05-22 03:45:33,372] INFO [Consumer clientId=connector-consumer-data-output-stream-7, groupId=connect-data-output-stream] Discovered group coordinator kafka:29092 (id: 2147482646 rack: null) (org.apache.kafka.clients.consumer.internals.AbstractCoordinator:728)
[2020-05-22 03:45:33,375] INFO [Consumer clientId=connector-consumer-data-output-stream-1, groupId=connect-data-output-stream] Discovered group coordinator kafka:29092 (id: 2147482646 rack: null) (org.apache.kafka.clients.consumer.internals.AbstractCoordinator:728)
[2020-05-22 03:45:33,377] INFO [Consumer clientId=connector-consumer-data-output-stream-9, groupId=connect-data-output-stream] Discovered group coordinator kafka:29092 (id: 2147482646 rack: null) (org.apache.kafka.clients.consumer.internals.AbstractCoordinator:728)
[2020-05-22 03:45:33,371] INFO [Consumer clientId=connector-consumer-data-output-stream-6, groupId=connect-data-output-stream] Cluster ID: 36MPGS-bQEGQTzqv2AM-rg (org.apache.kafka.clients.Metadata:266)
[2020-05-22 03:45:33,371] INFO [Consumer clientId=connector-consumer-data-output-stream-5, groupId=connect-data-output-stream] Cluster ID: 36MPGS-bQEGQTzqv2AM-rg (org.apache.kafka.clients.Metadata:266)
[2020-05-22 03:45:33,371] INFO [Consumer clientId=connector-consumer-data-output-stream-3, groupId=connect-data-output-stream] Cluster ID: 36MPGS-bQEGQTzqv2AM-rg (org.apache.kafka.clients.Metadata:266)
[2020-05-22 03:45:33,380] INFO [Consumer clientId=connector-consumer-data-output-stream-2, groupId=connect-data-output-stream] Revoking previously assigned partitions [] (org.apache.kafka.clients.consumer.internals.ConsumerCoordinator:476)
[2020-05-22 03:45:33,375] INFO [Consumer clientId=connector-consumer-data-output-stream-8, groupId=connect-data-output-stream] Discovered group coordinator kafka:29092 (id: 2147482646 rack: null) (org.apache.kafka.clients.consumer.internals.AbstractCoordinator:728)
[2020-05-22 03:45:33,375] INFO [Consumer clientId=connector-consumer-data-output-stream-4, groupId=connect-data-output-stream] Discovered group coordinator kafka:29092 (id: 2147482646 rack: null) (org.apache.kafka.clients.consumer.internals.AbstractCoordinator:728)
[2020-05-22 03:45:33,374] INFO [Consumer clientId=connector-consumer-data-output-stream-0, groupId=connect-data-output-stream] Discovered group coordinator kafka:29092 (id: 2147482646 rack: null) (org.apache.kafka.clients.consumer.internals.AbstractCoordinator:728)
[2020-05-22 03:45:33,386] INFO [Consumer clientId=connector-consumer-data-output-stream-6, groupId=connect-data-output-stream] Discovered group coordinator kafka:29092 (id: 2147482646 rack: null) (org.apache.kafka.clients.consumer.internals.AbstractCoordinator:728)
[2020-05-22 03:45:33,386] INFO [Consumer clientId=connector-consumer-data-output-stream-4, groupId=connect-data-output-stream] Revoking previously assigned partitions [] (org.apache.kafka.clients.consumer.internals.ConsumerCoordinator:476)
[2020-05-22 03:45:33,385] INFO [Consumer clientId=connector-consumer-data-output-stream-8, groupId=connect-data-output-stream] Revoking previously assigned partitions [] (org.apache.kafka.clients.consumer.internals.ConsumerCoordinator:476)
[2020-05-22 03:45:33,380] INFO [Consumer clientId=connector-consumer-data-output-stream-7, groupId=connect-data-output-stream] Revoking previously assigned partitions [] (org.apache.kafka.clients.consumer.internals.ConsumerCoordinator:476)
[2020-05-22 03:45:33,388] INFO [Consumer clientId=connector-consumer-data-output-stream-7, groupId=connect-data-output-stream] (Re-)joining group (org.apache.kafka.clients.consumer.internals.AbstractCoordinator:505)
[2020-05-22 03:45:33,380] INFO [Consumer clientId=connector-consumer-data-output-stream-1, groupId=connect-data-output-stream] Revoking previously assigned partitions [] (org.apache.kafka.clients.consumer.internals.ConsumerCoordinator:476)
[2020-05-22 03:45:33,383] INFO [Consumer clientId=connector-consumer-data-output-stream-3, groupId=connect-data-output-stream] Discovered group coordinator kafka:29092 (id: 2147482646 rack: null) (org.apache.kafka.clients.consumer.internals.AbstractCoordinator:728)
[2020-05-22 03:45:33,383] INFO [Consumer clientId=connector-consumer-data-output-stream-2, groupId=connect-data-output-stream] (Re-)joining group (org.apache.kafka.clients.consumer.internals.AbstractCoordinator:505)
[2020-05-22 03:45:33,383] INFO [Consumer clientId=connector-consumer-data-output-stream-5, groupId=connect-data-output-stream] Discovered group coordinator kafka:29092 (id: 2147482646 rack: null) (org.apache.kafka.clients.consumer.internals.AbstractCoordinator:728)
[2020-05-22 03:45:33,381] INFO [Consumer clientId=connector-consumer-data-output-stream-9, groupId=connect-data-output-stream] Revoking previously assigned partitions [] (org.apache.kafka.clients.consumer.internals.ConsumerCoordinator:476)
[2020-05-22 03:45:33,391] INFO [Consumer clientId=connector-consumer-data-output-stream-9, groupId=connect-data-output-stream] (Re-)joining group (org.apache.kafka.clients.consumer.internals.AbstractCoordinator:505)
[2020-05-22 03:45:33,389] INFO [Consumer clientId=connector-consumer-data-output-stream-1, groupId=connect-data-output-stream] (Re-)joining group (org.apache.kafka.clients.consumer.internals.AbstractCoordinator:505)
[2020-05-22 03:45:33,388] INFO [Consumer clientId=connector-consumer-data-output-stream-4, groupId=connect-data-output-stream] (Re-)joining group (org.apache.kafka.clients.consumer.internals.AbstractCoordinator:505)
[2020-05-22 03:45:33,388] INFO [Consumer clientId=connector-consumer-data-output-stream-8, groupId=connect-data-output-stream] (Re-)joining group (org.apache.kafka.clients.consumer.internals.AbstractCoordinator:505)
[2020-05-22 03:45:33,399] INFO [Consumer clientId=connector-consumer-data-output-stream-0, groupId=connect-data-output-stream] Revoking previously assigned partitions [] (org.apache.kafka.clients.consumer.internals.ConsumerCoordinator:476)
[2020-05-22 03:45:33,401] INFO [Consumer clientId=connector-consumer-data-output-stream-0, groupId=connect-data-output-stream] (Re-)joining group (org.apache.kafka.clients.consumer.internals.AbstractCoordinator:505)
[2020-05-22 03:45:33,402] INFO [Consumer clientId=connector-consumer-data-output-stream-6, groupId=connect-data-output-stream] Revoking previously assigned partitions [] (org.apache.kafka.clients.consumer.internals.ConsumerCoordinator:476)
[2020-05-22 03:45:33,402] INFO [Consumer clientId=connector-consumer-data-output-stream-6, groupId=connect-data-output-stream] (Re-)joining group (org.apache.kafka.clients.consumer.internals.AbstractCoordinator:505)
[2020-05-22 03:45:33,408] INFO [Consumer clientId=connector-consumer-data-output-stream-5, groupId=connect-data-output-stream] Revoking previously assigned partitions [] (org.apache.kafka.clients.consumer.internals.ConsumerCoordinator:476)
[2020-05-22 03:45:33,408] INFO [Consumer clientId=connector-consumer-data-output-stream-5, groupId=connect-data-output-stream] (Re-)joining group (org.apache.kafka.clients.consumer.internals.AbstractCoordinator:505)
[2020-05-22 03:45:33,413] INFO [Consumer clientId=connector-consumer-data-output-stream-2, groupId=connect-data-output-stream] (Re-)joining group (org.apache.kafka.clients.consumer.internals.AbstractCoordinator:505)
[2020-05-22 03:45:33,423] INFO [Consumer clientId=connector-consumer-data-output-stream-3, groupId=connect-data-output-stream] Revoking previously assigned partitions [] (org.apache.kafka.clients.consumer.internals.ConsumerCoordinator:476)
[2020-05-22 03:45:33,424] INFO [Consumer clientId=connector-consumer-data-output-stream-3, groupId=connect-data-output-stream] (Re-)joining group (org.apache.kafka.clients.consumer.internals.AbstractCoordinator:505)
[2020-05-22 03:45:33,425] INFO [Consumer clientId=connector-consumer-data-output-stream-5, groupId=connect-data-output-stream] (Re-)joining group (org.apache.kafka.clients.consumer.internals.AbstractCoordinator:505)
[2020-05-22 03:45:33,425] INFO [Consumer clientId=connector-consumer-data-output-stream-8, groupId=connect-data-output-stream] (Re-)joining group (org.apache.kafka.clients.consumer.internals.AbstractCoordinator:505)
[2020-05-22 03:45:33,430] INFO [Consumer clientId=connector-consumer-data-output-stream-9, groupId=connect-data-output-stream] (Re-)joining group (org.apache.kafka.clients.consumer.internals.AbstractCoordinator:505)
[2020-05-22 03:45:33,430] INFO [Consumer clientId=connector-consumer-data-output-stream-6, groupId=connect-data-output-stream] (Re-)joining group (org.apache.kafka.clients.consumer.internals.AbstractCoordinator:505)
[2020-05-22 03:45:33,432] INFO [Consumer clientId=connector-consumer-data-output-stream-1, groupId=connect-data-output-stream] (Re-)joining group (org.apache.kafka.clients.consumer.internals.AbstractCoordinator:505)
[2020-05-22 03:45:33,433] INFO [Consumer clientId=connector-consumer-data-output-stream-0, groupId=connect-data-output-stream] (Re-)joining group (org.apache.kafka.clients.consumer.internals.AbstractCoordinator:505)
[2020-05-22 03:45:33,435] INFO [Consumer clientId=connector-consumer-data-output-stream-3, groupId=connect-data-output-stream] (Re-)joining group (org.apache.kafka.clients.consumer.internals.AbstractCoordinator:505)
[2020-05-22 03:45:33,436] INFO [Consumer clientId=connector-consumer-data-output-stream-4, groupId=connect-data-output-stream] (Re-)joining group (org.apache.kafka.clients.consumer.internals.AbstractCoordinator:505)
[2020-05-22 03:45:33,438] INFO [Consumer clientId=connector-consumer-data-output-stream-7, groupId=connect-data-output-stream] (Re-)joining group (org.apache.kafka.clients.consumer.internals.AbstractCoordinator:505)
[2020-05-22 03:45:36,429] INFO [Consumer clientId=connector-consumer-data-output-stream-6, groupId=connect-data-output-stream] Successfully joined group with generation 33 (org.apache.kafka.clients.consumer.internals.AbstractCoordinator:469)
[2020-05-22 03:45:36,429] INFO [Consumer clientId=connector-consumer-data-output-stream-1, groupId=connect-data-output-stream] Successfully joined group with generation 33 (org.apache.kafka.clients.consumer.internals.AbstractCoordinator:469)
[2020-05-22 03:45:36,437] INFO [Consumer clientId=connector-consumer-data-output-stream-1, groupId=connect-data-output-stream] Setting newly assigned partitions:  (org.apache.kafka.clients.consumer.internals.ConsumerCoordinator:283)
[2020-05-22 03:45:36,436] INFO [Consumer clientId=connector-consumer-data-output-stream-2, groupId=connect-data-output-stream] Successfully joined group with generation 33 (org.apache.kafka.clients.consumer.internals.AbstractCoordinator:469)
[2020-05-22 03:45:36,436] INFO [Consumer clientId=connector-consumer-data-output-stream-4, groupId=connect-data-output-stream] Successfully joined group with generation 33 (org.apache.kafka.clients.consumer.internals.AbstractCoordinator:469)
[2020-05-22 03:45:36,436] INFO [Consumer clientId=connector-consumer-data-output-stream-8, groupId=connect-data-output-stream] Successfully joined group with generation 33 (org.apache.kafka.clients.consumer.internals.AbstractCoordinator:469)
[2020-05-22 03:45:36,436] INFO [Consumer clientId=connector-consumer-data-output-stream-3, groupId=connect-data-output-stream] Successfully joined group with generation 33 (org.apache.kafka.clients.consumer.internals.AbstractCoordinator:469)
[2020-05-22 03:45:36,436] INFO [Consumer clientId=connector-consumer-data-output-stream-7, groupId=connect-data-output-stream] Successfully joined group with generation 33 (org.apache.kafka.clients.consumer.internals.AbstractCoordinator:469)
[2020-05-22 03:45:36,436] INFO [Consumer clientId=connector-consumer-data-output-stream-0, groupId=connect-data-output-stream] Successfully joined group with generation 33 (org.apache.kafka.clients.consumer.internals.AbstractCoordinator:469)
[2020-05-22 03:45:36,436] INFO [Consumer clientId=connector-consumer-data-output-stream-9, groupId=connect-data-output-stream] Successfully joined group with generation 33 (org.apache.kafka.clients.consumer.internals.AbstractCoordinator:469)
[2020-05-22 03:45:36,435] INFO [Consumer clientId=connector-consumer-data-output-stream-5, groupId=connect-data-output-stream] Successfully joined group with generation 33 (org.apache.kafka.clients.consumer.internals.AbstractCoordinator:469)
[2020-05-22 03:45:36,429] INFO [Consumer clientId=connector-consumer-data-output-stream-6, groupId=connect-data-output-stream] Setting newly assigned partitions:  (org.apache.kafka.clients.consumer.internals.ConsumerCoordinator:283)
[2020-05-22 03:45:36,445] INFO [Consumer clientId=connector-consumer-data-output-stream-9, groupId=connect-data-output-stream] Setting newly assigned partitions:  (org.apache.kafka.clients.consumer.internals.ConsumerCoordinator:283)
[2020-05-22 03:45:36,445] INFO [Consumer clientId=connector-consumer-data-output-stream-5, groupId=connect-data-output-stream] Setting newly assigned partitions:  (org.apache.kafka.clients.consumer.internals.ConsumerCoordinator:283)
[2020-05-22 03:45:36,443] INFO [Consumer clientId=connector-consumer-data-output-stream-7, groupId=connect-data-output-stream] Setting newly assigned partitions:  (org.apache.kafka.clients.consumer.internals.ConsumerCoordinator:283)
[2020-05-22 03:45:36,442] INFO [Consumer clientId=connector-consumer-data-output-stream-8, groupId=connect-data-output-stream] Setting newly assigned partitions:  (org.apache.kafka.clients.consumer.internals.ConsumerCoordinator:283)
[2020-05-22 03:45:36,442] INFO [Consumer clientId=connector-consumer-data-output-stream-3, groupId=connect-data-output-stream] Setting newly assigned partitions:  (org.apache.kafka.clients.consumer.internals.ConsumerCoordinator:283)
[2020-05-22 03:45:36,440] INFO [Consumer clientId=connector-consumer-data-output-stream-2, groupId=connect-data-output-stream] Setting newly assigned partitions:  (org.apache.kafka.clients.consumer.internals.ConsumerCoordinator:283)
[2020-05-22 03:45:36,439] INFO [Consumer clientId=connector-consumer-data-output-stream-4, groupId=connect-data-output-stream] Setting newly assigned partitions:  (org.apache.kafka.clients.consumer.internals.ConsumerCoordinator:283)
[2020-05-22 03:45:36,455] INFO [Consumer clientId=connector-consumer-data-output-stream-0, groupId=connect-data-output-stream] Setting newly assigned partitions: data-output-stream-0 (org.apache.kafka.clients.consumer.internals.ConsumerCoordinator:283)
[2020-05-22 03:45:36,476] INFO [Consumer clientId=connector-consumer-data-output-stream-0, groupId=connect-data-output-stream] Setting offset for partition data-output-stream-0 to the committed offset FetchPosition{offset=600399626, offsetEpoch=Optional.empty, currentLeader=LeaderAndEpoch{leader=kafka:29092 (id: 1001 rack: null), epoch=0}} (org.apache.kafka.clients.consumer.internals.ConsumerCoordinator:525)
[2020-05-22 03:45:36,510] INFO [Consumer clientId=connector-consumer-data-output-stream-0, groupId=connect-data-output-stream] Fetch offset 600399626 is out of range for partition data-output-stream-0, resetting offset (org.apache.kafka.clients.consumer.internals.Fetcher:1214)
[2020-05-22 03:45:36,524] INFO [Consumer clientId=connector-consumer-data-output-stream-0, groupId=connect-data-output-stream] Resetting offset for partition data-output-stream-0 to offset 607625010. (org.apache.kafka.clients.consumer.internals.SubscriptionState:348)
[2020-05-22 03:46:37,017] ERROR WorkerSinkTask{id=data-output-stream-0} Offset commit failed, rewinding to last committed offsets (org.apache.kafka.connect.runtime.WorkerSinkTask:384)
java.lang.RuntimeException: java.util.concurrent.ExecutionException: com.google.api.gax.rpc.DeadlineExceededException: io.grpc.StatusRuntimeException: DEADLINE_EXCEEDED: deadline exceeded after 9.999810964s. [remote_addr=pubsub.googleapis.com/216.58.198.74:443]
    at com.google.pubsub.kafka.sink.CloudPubSubSinkTask.flush(CloudPubSubSinkTask.java:324)
    at org.apache.kafka.connect.sink.SinkTask.preCommit(SinkTask.java:125)
    at org.apache.kafka.connect.runtime.WorkerSinkTask.commitOffsets(WorkerSinkTask.java:378)
    at org.apache.kafka.connect.runtime.WorkerSinkTask.iteration(WorkerSinkTask.java:208)
    at org.apache.kafka.connect.runtime.WorkerSinkTask.execute(WorkerSinkTask.java:192)
    at org.apache.kafka.connect.runtime.WorkerTask.doRun(WorkerTask.java:177)
    at org.apache.kafka.connect.runtime.WorkerTask.run(WorkerTask.java:227)
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    at java.lang.Thread.run(Thread.java:748)
Caused by: java.util.concurrent.ExecutionException: com.google.api.gax.rpc.DeadlineExceededException: io.grpc.StatusRuntimeException: DEADLINE_EXCEEDED: deadline exceeded after 9.999810964s. [remote_addr=pubsub.googleapis.com/216.58.198.74:443]
    at com.google.common.util.concurrent.AbstractFuture.getDoneValue(AbstractFuture.java:500)
    at com.google.common.util.concurrent.AbstractFuture.get(AbstractFuture.java:479)
    at com.google.common.util.concurrent.AbstractFuture$TrustedFuture.get(AbstractFuture.java:76)
    at com.google.common.util.concurrent.ForwardingFuture.get(ForwardingFuture.java:62)
    at com.google.pubsub.kafka.sink.CloudPubSubSinkTask.flush(CloudPubSubSinkTask.java:322)
    ... 11 more
Caused by: com.google.api.gax.rpc.DeadlineExceededException: io.grpc.StatusRuntimeException: DEADLINE_EXCEEDED: deadline exceeded after 9.999810964s. [remote_addr=pubsub.googleapis.com/216.58.198.74:443]
    at com.google.api.gax.rpc.ApiExceptionFactory.createException(ApiExceptionFactory.java:51)
    at com.google.api.gax.grpc.GrpcApiExceptionFactory.create(GrpcApiExceptionFactory.java:72)
    at com.google.api.gax.grpc.GrpcApiExceptionFactory.create(GrpcApiExceptionFactory.java:60)
    at com.google.api.gax.grpc.GrpcExceptionCallable$ExceptionTransformingFuture.onFailure(GrpcExceptionCallable.java:97)
    at com.google.api.core.ApiFutures$1.onFailure(ApiFutures.java:68)
    at com.google.common.util.concurrent.Futures$4.run(Futures.java:1123)
    at com.google.common.util.concurrent.MoreExecutors$DirectExecutor.execute(MoreExecutors.java:435)
    at com.google.common.util.concurrent.AbstractFuture.executeListener(AbstractFuture.java:900)
    at com.google.common.util.concurrent.AbstractFuture.complete(AbstractFuture.java:811)
    at com.google.common.util.concurrent.AbstractFuture.setException(AbstractFuture.java:675)
    at io.grpc.stub.ClientCalls$GrpcFuture.setException(ClientCalls.java:526)
    at io.grpc.stub.ClientCalls$UnaryStreamToFuture.onClose(ClientCalls.java:501)
    at io.grpc.internal.ClientCallImpl.closeObserver(ClientCallImpl.java:426)
    at io.grpc.internal.ClientCallImpl.access$500(ClientCallImpl.java:66)
    at io.grpc.internal.ClientCallImpl$1CloseInContext.runInContext(ClientCallImpl.java:416)
    at io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
    at io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$201(ScheduledThreadPoolExecutor.java:180)
    at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:293)
    ... 3 more
Caused by: io.grpc.StatusRuntimeException: DEADLINE_EXCEEDED: deadline exceeded after 9.999810964s. [remote_addr=pubsub.googleapis.com/216.58.198.74:443]
    at io.grpc.Status.asRuntimeException(Status.java:533)
    ... 13 more
[2020-05-22 03:46:37,030] INFO [Consumer clientId=connector-consumer-data-output-stream-0, groupId=connect-data-output-stream] Seeking to offset 600399626 for partition data-output-stream-0 (org.apache.kafka.clients.consumer.KafkaConsumer:1545)
[2020-05-22 03:46:37,032] ERROR WorkerSinkTask{id=data-output-stream-0} Commit of offsets threw an unexpected exception for sequence number 1: null (org.apache.kafka.connect.runtime.WorkerSinkTask:259)
java.lang.RuntimeException: java.util.concurrent.ExecutionException: com.google.api.gax.rpc.DeadlineExceededException: io.grpc.StatusRuntimeException: DEADLINE_EXCEEDED: deadline exceeded after 9.999810964s. [remote_addr=pubsub.googleapis.com/216.58.198.74:443]
    at com.google.pubsub.kafka.sink.CloudPubSubSinkTask.flush(CloudPubSubSinkTask.java:324)
    at org.apache.kafka.connect.sink.SinkTask.preCommit(SinkTask.java:125)
    at org.apache.kafka.connect.runtime.WorkerSinkTask.commitOffsets(WorkerSinkTask.java:378)
    at org.apache.kafka.connect.runtime.WorkerSinkTask.iteration(WorkerSinkTask.java:208)
    at org.apache.kafka.connect.runtime.WorkerSinkTask.execute(WorkerSinkTask.java:192)
    at org.apache.kafka.connect.runtime.WorkerTask.doRun(WorkerTask.java:177)
    at org.apache.kafka.connect.runtime.WorkerTask.run(WorkerTask.java:227)
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    at java.lang.Thread.run(Thread.java:748)
Caused by: java.util.concurrent.ExecutionException: com.google.api.gax.rpc.DeadlineExceededException: io.grpc.StatusRuntimeException: DEADLINE_EXCEEDED: deadline exceeded after 9.999810964s. [remote_addr=pubsub.googleapis.com/216.58.198.74:443]
    at com.google.common.util.concurrent.AbstractFuture.getDoneValue(AbstractFuture.java:500)
    at com.google.common.util.concurrent.AbstractFuture.get(AbstractFuture.java:479)
    at com.google.common.util.concurrent.AbstractFuture$TrustedFuture.get(AbstractFuture.java:76)
    at com.google.common.util.concurrent.ForwardingFuture.get(ForwardingFuture.java:62)
    at com.google.pubsub.kafka.sink.CloudPubSubSinkTask.flush(CloudPubSubSinkTask.java:322)
    ... 11 more
Caused by: com.google.api.gax.rpc.DeadlineExceededException: io.grpc.StatusRuntimeException: DEADLINE_EXCEEDED: deadline exceeded after 9.999810964s. [remote_addr=pubsub.googleapis.com/216.58.198.74:443]
    at com.google.api.gax.rpc.ApiExceptionFactory.createException(ApiExceptionFactory.java:51)
    at com.google.api.gax.grpc.GrpcApiExceptionFactory.create(GrpcApiExceptionFactory.java:72)
    at com.google.api.gax.grpc.GrpcApiExceptionFactory.create(GrpcApiExceptionFactory.java:60)
    at com.google.api.gax.grpc.GrpcExceptionCallable$ExceptionTransformingFuture.onFailure(GrpcExceptionCallable.java:97)
    at com.google.api.core.ApiFutures$1.onFailure(ApiFutures.java:68)
    at com.google.common.util.concurrent.Futures$4.run(Futures.java:1123)
    at com.google.common.util.concurrent.MoreExecutors$DirectExecutor.execute(MoreExecutors.java:435)
    at com.google.common.util.concurrent.AbstractFuture.executeListener(AbstractFuture.java:900)
    at com.google.common.util.concurrent.AbstractFuture.complete(AbstractFuture.java:811)
    at com.google.common.util.concurrent.AbstractFuture.setException(AbstractFuture.java:675)
    at io.grpc.stub.ClientCalls$GrpcFuture.setException(ClientCalls.java:526)
    at io.grpc.stub.ClientCalls$UnaryStreamToFuture.onClose(ClientCalls.java:501)
    at io.grpc.internal.ClientCallImpl.closeObserver(ClientCallImpl.java:426)
    at io.grpc.internal.ClientCallImpl.access$500(ClientCallImpl.java:66)
    at io.grpc.internal.ClientCallImpl$1CloseInContext.runInContext(ClientCallImpl.java:416)
    at io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
    at io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$201(ScheduledThreadPoolExecutor.java:180)
    at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:293)
    ... 3 more
Caused by: io.grpc.StatusRuntimeException: DEADLINE_EXCEEDED: deadline exceeded after 9.999810964s. [remote_addr=pubsub.googleapis.com/216.58.198.74:443]
    at io.grpc.Status.asRuntimeException(Status.java:533)
    ... 13 more
[2020-05-22 03:46:37,038] INFO [Consumer clientId=connector-consumer-data-output-stream-0, groupId=connect-data-output-stream] Fetch offset 600399626 is out of range for partition data-output-stream-0, resetting offset (org.apache.kafka.clients.consumer.internals.Fetcher:1214)
[2020-05-22 03:46:37,040] INFO [Consumer clientId=connector-consumer-data-output-stream-0, groupId=connect-data-output-stream] Resetting offset for partition data-output-stream-0 to offset 607836681. (org.apache.kafka.clients.consumer.internals.SubscriptionState:348)
[2020-05-22 03:46:43,644] INFO WorkerSinkTask{id=data-output-stream-0} Committing offsets asynchronously using sequence number 2: {data-output-stream-0=OffsetAndMetadata{offset=607836887, leaderEpoch=null, metadata=''}} (org.apache.kafka.connect.runtime.WorkerSinkTask:344)
[2020-05-22 03:46:43,677] WARN WorkerSinkTask{id=data-output-stream-0} Commit of offsets timed out (org.apache.kafka.connect.runtime.WorkerSinkTask:217)
dariocazas commented 3 years ago

We found the DEADLINE_EXCEEDED in our systems in some cases, but at the moment we have not been able to reproduce them in a controlled environment. I subscribe because we are also after this case.