I am having the exact same issue. The funny thing is that it was working fine yesterday. I added a few load balancers here and there (for Kafka, Schema, Connect, and ControlCenter), and now it does not work anymore.
Ok, I think I figured it out. For some reason, if you set Kafka replicas to 1 in your Kafka CRD deployment, the Kafka Connect CRD deployment does not adapt its default replication settings, which then exceed the number of available brokers and cause this error. Try adding the following configuration overrides to your Kafka Connect CRD deployment:
config.storage.replication.factor=1
offset.storage.replication.factor=1
status.storage.replication.factor=1
Your final Kafka Connect CRD deployment should look similar to this:
apiVersion: platform.confluent.io/v1beta1
kind: Connect
metadata:
  name: connect
  namespace: staging-kafka
spec:
  replicas: 1
  image:
    application: confluentinc/cp-server-connect:7.6.1
    init: confluentinc/confluent-init-container:2.8.2
  configOverrides:
    server:
      - config.storage.replication.factor=1
      - offset.storage.replication.factor=1
      - status.storage.replication.factor=1
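For reference, this is roughly what the single-broker Kafka CRD that these overrides are matched against could look like. This is just a sketch; the name, namespace, image versions, and volume size are placeholders from my setup, so adjust them to yours:
apiVersion: platform.confluent.io/v1beta1
kind: Kafka
metadata:
  name: kafka
  namespace: staging-kafka
spec:
  # A single broker means internal topics cannot be replicated three times,
  # which is why the Connect (and Control Center) overrides are needed.
  replicas: 1
  image:
    application: confluentinc/cp-server:7.6.1
    init: confluentinc/confluent-init-container:2.8.2
  dataVolumeCapacity: 10Gi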
Make sure to do the same thing for the Control Center CRD deployment, since it suffers from the same issue.
apiVersion: platform.confluent.io/v1beta1
kind: ControlCenter
metadata:
  name: controlcenter
  namespace: staging-kafka
spec:
  dataVolumeCapacity: 1Gi
  replicas: 1
  image:
    application: confluentinc/cp-enterprise-control-center:7.6.0
    init: confluentinc/confluent-init-container:2.8.0
  dependencies:
    schemaRegistry:
      url: http://schemaregistry:8081
    ksqldb:
      - name: ksqldb
        url: http://ksqldb:8088
    connect:
      - name: connect
        url: http://connect:8083
  configOverrides:
    server:
      - confluent.controlcenter.internal.topics.replication=1
      - confluent.controlcenter.command.topic.replication=1
      - confluent.monitoring.interceptor.topic.replication=1
      - confluent.metrics.topic.replication=1
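After applying both changes, one rough way to sanity-check that the internal topics were created with the right replication factor is something like the command below. The pod name, namespace, and listener port are assumptions from my setup; adjust them to your cluster and listener configuration:
# pod name, namespace, and bootstrap port are assumptions - adjust to your cluster
kubectl exec -n staging-kafka kafka-0 -- \
  kafka-topics --bootstrap-server localhost:9092 --describe | grep ReplicationFactor
Every Connect and Control Center internal topic should then show ReplicationFactor: 1.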
Thank you, this solution was very helpful for me.
I am trying to install Confluent Platform using the confluent-platform-zookeeper-7.6.0.yaml configuration file. Here is my YAML configuration:
When I start Connect, I encounter the following error:
[ERROR] 2024-05-28 08:27:46,321 [main] org.apache.kafka.connect.runtime.isolation.ReflectionScanner getPluginDesc - Failed to discover Converter in /usr/share/java/confluent-metadata-service: Unable to instantiate JsonConverter: Failed to invoke plugin constructor
java.lang.reflect.InvocationTargetException
at java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at java.base/jdk.internal.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.base/java.lang.reflect.Constructor.newInstance(Constructor.java:490)
at org.apache.kafka.connect.runtime.isolation.ReflectionScanner.versionFor(ReflectionScanner.java:73)
at org.apache.kafka.connect.runtime.isolation.ReflectionScanner.getPluginDesc(ReflectionScanner.java:136)
at org.apache.kafka.connect.runtime.isolation.ReflectionScanner.scanPlugins(ReflectionScanner.java:89)
at org.apache.kafka.connect.runtime.isolation.PluginScanner.scanUrlsAndAddPlugins(PluginScanner.java:79)
at org.apache.kafka.connect.runtime.isolation.PluginScanner.discoverPlugins(PluginScanner.java:67)
at org.apache.kafka.connect.runtime.isolation.Plugins.initLoaders(Plugins.java:91)
at org.apache.kafka.connect.runtime.isolation.Plugins.<init>(Plugins.java:75)
at org.apache.kafka.connect.runtime.isolation.Plugins.<init>(Plugins.java:64)
at org.apache.kafka.connect.cli.AbstractConnectCli.startConnect(AbstractConnectCli.java:128)
at org.apache.kafka.connect.cli.AbstractConnectCli.run(AbstractConnectCli.java:101)
at org.apache.kafka.connect.cli.ConnectDistributed.main(ConnectDistributed.java:116)
Caused by: java.lang.LinkageError: loader constraint violation: when resolving method 'com.fasterxml.jackson.databind.ObjectMapper com.fasterxml.jackson.databind.ObjectMapper.enable(com.fasterxml.jackson.core.JsonParser$Feature[])' the class loader org.apache.kafka.connect.runtime.isolation.PluginClassLoader @1827a871 of the current class, org/apache/kafka/connect/json/JsonDeserializer, and the class loader 'app' for the method's defining class, com/fasterxml/jackson/databind/ObjectMapper, have different Class objects for the type [Lcom/fasterxml/jackson/core/JsonParser$Feature; used in the signature (org.apache.kafka.connect.json.JsonDeserializer is in unnamed module of loader org.apache.kafka.connect.runtime.isolation.PluginClassLoader @1827a871, parent loader org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader @5d8445d7; com.fasterxml.jackson.databind.ObjectMapper is in unnamed module of loader 'app')
at org.apache.kafka.connect.json.JsonDeserializer.<init>(JsonDeserializer.java:55)
at org.apache.kafka.connect.json.JsonConverter.<init>(JsonConverter.java:244)
... 15 more
[ERROR] 2024-05-28 08:27:46,321 [main] org.apache.kafka.connect.runtime.isolation.ReflectionScanner getPluginDesc - Failed to discover HeaderConverter in /usr/share/java/confluent-metadata-service: Unable to instantiate JsonConverter: Failed to invoke plugin constructor
java.lang.reflect.InvocationTargetException
at java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at java.base/jdk.internal.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.base/java.lang.reflect.Constructor.newInstance(Constructor.java:490)
at org.apache.kafka.connect.runtime.isolation.ReflectionScanner.versionFor(ReflectionScanner.java:73)
at org.apache.kafka.connect.runtime.isolation.ReflectionScanner.getPluginDesc(ReflectionScanner.java:136)
at org.apache.kafka.connect.runtime.isolation.ReflectionScanner.scanPlugins(ReflectionScanner.java:90)
at org.apache.kafka.connect.runtime.isolation.PluginScanner.scanUrlsAndAddPlugins(PluginScanner.java:79)
at org.apache.kafka.connect.runtime.isolation.PluginScanner.discoverPlugins(PluginScanner.java:67)
at org.apache.kafka.connect.runtime.isolation.Plugins.initLoaders(Plugins.java:91)
at org.apache.kafka.connect.runtime.isolation.Plugins.<init>(Plugins.java:75)
at org.apache.kafka.connect.runtime.isolation.Plugins.<init>(Plugins.java:64)
at org.apache.kafka.connect.cli.AbstractConnectCli.startConnect(AbstractConnectCli.java:128)
at org.apache.kafka.connect.cli.AbstractConnectCli.run(AbstractConnectCli.java:101)
at org.apache.kafka.connect.cli.ConnectDistributed.main(ConnectDistributed.java:116)
Caused by: java.lang.LinkageError: loader constraint violation: when resolving method 'com.fasterxml.jackson.databind.ObjectMapper com.fasterxml.jackson.databind.ObjectMapper.enable(com.fasterxml.jackson.core.JsonParser$Feature[])' the class loader org.apache.kafka.connect.runtime.isolation.PluginClassLoader @1827a871 of the current class, org/apache/kafka/connect/json/JsonDeserializer, and the class loader 'app' for the method's defining class, com/fasterxml/jackson/databind/ObjectMapper, have different Class objects for the type [Lcom/fasterxml/jackson/core/JsonParser$Feature; used in the signature (org.apache.kafka.connect.json.JsonDeserializer is in unnamed module of loader org.apache.kafka.connect.runtime.isolation.PluginClassLoader @1827a871, parent loader org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader @5d8445d7; com.fasterxml.jackson.databind.ObjectMapper is in unnamed module of loader 'app')
at org.apache.kafka.connect.json.JsonDeserializer.<init>(JsonDeserializer.java:55)
at org.apache.kafka.connect.json.JsonConverter.<init>(JsonConverter.java:244)
... 15 more
I haven't found any similar issues online, so I'm at a loss for how to proceed. Could you please help me troubleshoot this issue?