confluentinc / cp-docker-images

[DEPRECATED] Docker images for Confluent Platform.
Apache License 2.0

Kafka Connect v4.0.0 - Linkage Error / Classloader problem #595

Open odowdj opened 6 years ago

odowdj commented 6 years ago

Hi there,

I am experiencing an intermittent problem during a deployment of Kafka Connect v4.0.0. I have two connectors:

1: io.confluent.connect.elasticsearch.ElasticsearchSinkConnector
2: io.confluent.connect.s3.S3SinkConnector

Sometimes, during a Kafka Connect deployment, the S3SinkConnector fails with the following error:

Caused by: java.lang.LinkageError: loader (instance of org/apache/kafka/connect/runtime/isolation/PluginClassLoader): attempted duplicate class definition for name: "org/apache/xerces/impl/dv/dtd/DTDDVFactoryImpl"

I've posted the full stack trace below.

The DTDDVFactoryImpl class is provided by xercesImpl-2.9.1.jar.

It appears to be a class loading issue: the message indicates that the same PluginClassLoader instance attempted to define the class a second time, which suggests two threads loading it concurrently.
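For context, the JVM raises this LinkageError whenever a single class loader defines the same class name twice. The following standalone sketch (the Loader class is hypothetical and unrelated to Connect internals) reproduces the same JVM behavior by deliberately defining one class twice in one loader:

```java
import java.io.ByteArrayOutputStream;
import java.io.InputStream;

public class DuplicateDefineDemo {
    // Minimal loader that exposes the protected defineClass for the demo.
    static class Loader extends ClassLoader {
        Class<?> define(String name, byte[] b) {
            return defineClass(name, b, 0, b.length);
        }
    }

    public static void main(String[] args) throws Exception {
        // Read this class's own bytecode from the classpath.
        byte[] bytes;
        try (InputStream in = DuplicateDefineDemo.class
                .getResourceAsStream("DuplicateDefineDemo.class")) {
            ByteArrayOutputStream out = new ByteArrayOutputStream();
            byte[] buf = new byte[4096];
            for (int n; (n = in.read(buf)) != -1; ) out.write(buf, 0, n);
            bytes = out.toByteArray();
        }

        Loader loader = new Loader();
        loader.define("DuplicateDefineDemo", bytes); // first definition succeeds
        try {
            loader.define("DuplicateDefineDemo", bytes); // same loader, same name
            System.out.println("no error (unexpected)");
        } catch (LinkageError e) {
            System.out.println("caught LinkageError: " + e.getMessage());
        }
    }
}
```

In the Connect case the second definition is not deliberate; a race between threads resolving the same class through one PluginClassLoader could produce the same result, which would also explain why the failure is intermittent.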

There is only one physical copy of xercesImpl-2.9.1.jar in the Kafka Connect docker image (the other two entries are symlinks to it):

/usr/share/java/kafka-connect-storage-common/xercesImpl-2.9.1.jar
/usr/share/java/kafka-connect-s3/storage-common/xercesImpl-2.9.1.jar (symlink to the above file)
/usr/share/java/kafka-connect-hdfs/storage-common/xercesImpl-2.9.1.jar (symlink to the above file)

I am passing CONNECT_PLUGIN_PATH=/usr/share/java to Kafka Connect at startup. I am not setting any CLASSPATH variable.
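For reference, the worker is launched roughly like this (a sketch: the service name and image tag are illustrative; CONNECT_PLUGIN_PATH is the only setting under discussion):

```yaml
# Illustrative docker-compose fragment, not the exact deployment.
connect:
  image: confluentinc/cp-kafka-connect:4.0.0
  environment:
    # Every directory under this path is scanned for plugins. Note that
    # /usr/share/java also holds shared jars, not only connector plugins.
    CONNECT_PLUGIN_PATH: /usr/share/java
```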

I've found one other report of a similar error (with a different connector), noted here: https://groups.google.com/forum/#!topic/confluent-platform/WY9kR5RBcyI — but no resolution was posted.

Do you have any ideas on how to resolve this issue?

Thanks, James.

Full stack trace:

org.apache.kafka.connect.errors.ConnectException: Exiting WorkerSinkTask due to unrecoverable exception.
    at org.apache.kafka.connect.runtime.WorkerSinkTask.deliverMessages(WorkerSinkTask.java:517)
    at org.apache.kafka.connect.runtime.WorkerSinkTask.poll(WorkerSinkTask.java:288)
    at org.apache.kafka.connect.runtime.WorkerSinkTask.iteration(WorkerSinkTask.java:198)
    at org.apache.kafka.connect.runtime.WorkerSinkTask.execute(WorkerSinkTask.java:166)
    at org.apache.kafka.connect.runtime.WorkerTask.doRun(WorkerTask.java:170)
    at org.apache.kafka.connect.runtime.WorkerTask.run(WorkerTask.java:214)
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at java.lang.Thread.run(Thread.java:745)
Caused by: java.lang.LinkageError: loader (instance of org/apache/kafka/connect/runtime/isolation/PluginClassLoader): attempted duplicate class definition for name: "org/apache/xerces/impl/dv/dtd/DTDDVFactoryImpl"
    at java.lang.ClassLoader.defineClass1(Native Method)
    at java.lang.ClassLoader.defineClass(ClassLoader.java:763)
    at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
    at java.net.URLClassLoader.defineClass(URLClassLoader.java:467)
    at java.net.URLClassLoader.access$100(URLClassLoader.java:73)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:368)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:362)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(URLClassLoader.java:361)
    at org.apache.kafka.connect.runtime.isolation.PluginClassLoader.loadClass(PluginClassLoader.java:54)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
    at org.apache.xerces.impl.dv.ObjectFactory.findProviderClass(Unknown Source)
    at org.apache.xerces.impl.dv.ObjectFactory.newInstance(Unknown Source)
    at org.apache.xerces.impl.dv.DTDDVFactory.getInstance(Unknown Source)
    at org.apache.xerces.impl.dv.DTDDVFactory.getInstance(Unknown Source)
    at org.apache.xerces.parsers.XML11Configuration.<init>(Unknown Source)
    at org.apache.xerces.parsers.XIncludeAwareParserConfiguration.<init>(Unknown Source)
    at org.apache.xerces.parsers.XIncludeAwareParserConfiguration.<init>(Unknown Source)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
    at java.lang.Class.newInstance(Class.java:442)
    at org.apache.xerces.parsers.ObjectFactory.newInstance(Unknown Source)
    at org.apache.xerces.parsers.ObjectFactory.createObject(Unknown Source)
    at org.apache.xerces.parsers.ObjectFactory.createObject(Unknown Source)
    at org.apache.xerces.parsers.SAXParser.<init>(Unknown Source)
    at org.apache.xerces.parsers.SAXParser.<init>(Unknown Source)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
    at java.lang.Class.newInstance(Class.java:442)
    at org.xml.sax.helpers.NewInstance.newInstance(NewInstance.java:84)
    at org.xml.sax.helpers.XMLReaderFactory.loadClass(XMLReaderFactory.java:228)
    at org.xml.sax.helpers.XMLReaderFactory.createXMLReader(XMLReaderFactory.java:191)
    at com.amazonaws.services.s3.model.transform.XmlResponsesSaxParser.<init>(XmlResponsesSaxParser.java:111)
    at com.amazonaws.services.s3.model.transform.Unmarshallers$InitiateMultipartUploadResultUnmarshaller.unmarshall(Unmarshallers.java:237)
    at com.amazonaws.services.s3.model.transform.Unmarshallers$InitiateMultipartUploadResultUnmarshaller.unmarshall(Unmarshallers.java:234)
    at com.amazonaws.services.s3.internal.S3XmlResponseHandler.handle(S3XmlResponseHandler.java:62)
    at com.amazonaws.services.s3.internal.ResponseHeaderHandlerChain.handle(ResponseHeaderHandlerChain.java:44)
    at com.amazonaws.services.s3.internal.ResponseHeaderHandlerChain.handle(ResponseHeaderHandlerChain.java:30)
    at com.amazonaws.http.response.AwsResponseHandlerAdapter.handle(AwsResponseHandlerAdapter.java:70)
    at com.amazonaws.http.AmazonHttpClient$RequestExecutor.handleResponse(AmazonHttpClient.java:1501)
    at com.amazonaws.http.AmazonHttpClient$RequestExecutor.executeOneRequest(AmazonHttpClient.java:1222)
    at com.amazonaws.http.AmazonHttpClient$RequestExecutor.executeHelper(AmazonHttpClient.java:1035)
    at com.amazonaws.http.AmazonHttpClient$RequestExecutor.doExecute(AmazonHttpClient.java:747)
    at com.amazonaws.http.AmazonHttpClient$RequestExecutor.executeWithTimer(AmazonHttpClient.java:721)
    at com.amazonaws.http.AmazonHttpClient$RequestExecutor.execute(AmazonHttpClient.java:704)
    at com.amazonaws.http.AmazonHttpClient$RequestExecutor.access$500(AmazonHttpClient.java:672)
    at com.amazonaws.http.AmazonHttpClient$RequestExecutionBuilderImpl.execute(AmazonHttpClient.java:654)
    at com.amazonaws.http.AmazonHttpClient.execute(AmazonHttpClient.java:518)
    at com.amazonaws.services.s3.AmazonS3Client.invoke(AmazonS3Client.java:4185)
    at com.amazonaws.services.s3.AmazonS3Client.invoke(AmazonS3Client.java:4132)
    at com.amazonaws.services.s3.AmazonS3Client.initiateMultipartUpload(AmazonS3Client.java:3017)
    at io.confluent.connect.s3.storage.S3OutputStream.newMultipartUpload(S3OutputStream.java:192)
    at io.confluent.connect.s3.storage.S3OutputStream.uploadPart(S3OutputStream.java:119)
    at io.confluent.connect.s3.storage.S3OutputStream.commit(S3OutputStream.java:149)
    at io.confluent.connect.s3.format.avro.AvroRecordWriterProvider$1.commit(AvroRecordWriterProvider.java:97)
    at io.confluent.connect.s3.TopicPartitionWriter.commitFile(TopicPartitionWriter.java:493)
    at io.confluent.connect.s3.TopicPartitionWriter.commitFiles(TopicPartitionWriter.java:473)
    at io.confluent.connect.s3.TopicPartitionWriter.commitOnTimeIfNoData(TopicPartitionWriter.java:287)
    at io.confluent.connect.s3.TopicPartitionWriter.write(TopicPartitionWriter.java:177)
    at io.confluent.connect.s3.S3SinkTask.put(S3SinkTask.java:195)
    at org.apache.kafka.connect.runtime.WorkerSinkTask.deliverMessages(WorkerSinkTask.java:495)
hashhar commented 4 years ago

Any updates here? We are facing the same issue while deploying a custom connector, but the LinkageError is for the JsonNode class.

hashhar commented 4 years ago

For me the fix was to exclude all dependencies that are provided by Kafka or the Kafka Connect runtime.
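For Maven builds, one way to do this is to mark jars supplied by the Connect runtime as provided so they are not bundled into the connector's plugin directory. A sketch, with the version property as a placeholder:

```xml
<!-- Sketch: keep runtime-provided jars out of the packaged connector. -->
<dependency>
  <groupId>org.apache.kafka</groupId>
  <artifactId>connect-api</artifactId>
  <version>${kafka.version}</version>
  <scope>provided</scope>
</dependency>
```

With provided scope the jar is available at compile time but excluded from the packaged artifact, so the plugin class loader never sees a second copy of classes the worker already loads.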