Closed by andsel 1 year ago
The CI error that's happening on the 7.x branch is:
```
Java::OrgApacheKafkaCommon::KafkaException: Failed to construct kafka producer
# org.apache.kafka.clients.producer.KafkaProducer.<init>(org/apache/kafka/clients/producer/KafkaProducer.java:468)
# org.apache.kafka.clients.producer.KafkaProducer.<init>(org/apache/kafka/clients/producer/KafkaProducer.java:291)
# org.apache.kafka.clients.producer.KafkaProducer.<init>(org/apache/kafka/clients/producer/KafkaProducer.java:318)
# org.apache.kafka.clients.producer.KafkaProducer.<init>(org/apache/kafka/clients/producer/KafkaProducer.java:303)
# jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
# jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance(jdk/internal/reflect/NativeConstructorAccessorImpl.java:62)
# jdk.internal.reflect.DelegatingConstructorAccessorImpl.newInstance(jdk/internal/reflect/DelegatingConstructorAccessorImpl.java:45)
# java.lang.reflect.Constructor.newInstance(java/lang/reflect/Constructor.java:490)
# usr.share.plugins.plugin.spec.integration.inputs.kafka_spec.write_some_data_to(/usr/share/plugins/plugin/spec/integration/inputs/kafka_spec.rb:404)
# usr.share.plugins.plugin.spec.integration.inputs.kafka_spec./usr/share/plugins/plugin/spec/integration/inputs/kafka_spec.rb(/usr/share/plugins/plugin/spec/integration/inputs/kafka_spec.rb:411)
# java.lang.invoke.MethodHandle.invokeWithArguments(java/lang/invoke/MethodHandle.java:710)
# ------------------
# --- Caused by: ---
# Java::JavaLang::ClassNotFoundException: com.fasterxml.jackson.databind.json.JsonMapper
# java.base/java.net.URLClassLoader.findClass(URLClassLoader.java:476)
```
The same error also happens with 8.x, up to 8.2.3.
With Logstash 8.3.0, `jrjackson` switched from `0.4.14` to `0.4.15`, and this makes the Jackson JSON library switch from `2.9.10` to `2.13.3`.
Now, the class that can't be found, `com.fasterxml.jackson.databind.json.JsonMapper`, is present in the Jackson library only starting from version 2.10.
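A quick way to confirm which side of the 2.10 boundary a given classpath sits on is to probe for the class directly. This is a minimal standalone sketch, not part of the plugin:

```java
public class JacksonProbe {
    // Returns true when the named class can be loaded from the current classpath.
    static boolean isPresent(String className) {
        try {
            Class.forName(className);
            return true;
        } catch (ClassNotFoundException e) {
            return false;
        }
    }

    public static void main(String[] args) {
        // JsonMapper only exists from jackson-databind 2.10 onwards, so its
        // absence pinpoints an older Jackson on the classpath (e.g. the 2.9.10
        // shipped before Logstash 8.3.0).
        String cls = "com.fasterxml.jackson.databind.json.JsonMapper";
        System.out.println(cls + " present: " + isPresent(cls));
    }
}
```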
The Kafka client switched `jackson-databind`:

- from `2.9.9.3` to `2.10.0` with https://github.com/apache/kafka/pull/7411, shipped in Kafka 2.x;
- from `2.10.x` to `2.12.3` with https://github.com/apache/kafka/pull/10778, which is part of Kafka 3.0.
The failure is summarized as:
```
Caused by: java.lang.NoClassDefFoundError: com/fasterxml/jackson/databind/json/JsonMapper
    at io.confluent.kafka.schemaregistry.utils.JacksonMapper.<clinit>(JacksonMapper.java:27)
    at io.confluent.kafka.schemaregistry.client.rest.RestService.<clinit>(RestService.java:156)
    at io.confluent.kafka.schemaregistry.client.CachedSchemaRegistryClient.<init>(CachedSchemaRegistryClient.java:158)
    at io.confluent.kafka.schemaregistry.client.SchemaRegistryClientFactory.newClient(SchemaRegistryClientFactory.java:36)
    at io.confluent.kafka.serializers.AbstractKafkaSchemaSerDe.configureClientProperties(AbstractKafkaSchemaSerDe.java:72)
    at io.confluent.kafka.serializers.AbstractKafkaAvroSerializer.configure(AbstractKafkaAvroSerializer.java:66)
    at io.confluent.kafka.serializers.KafkaAvroSerializer.configure(KafkaAvroSerializer.java:50)
    at org.apache.kafka.clients.producer.KafkaProducer.<init>(KafkaProducer.java:396)
    at org.apache.kafka.clients.producer.KafkaProducer.<init>(KafkaProducer.java:291)
    at org.apache.kafka.clients.producer.KafkaProducer.<init>(KafkaProducer.java:318)
    at org.apache.kafka.clients.producer.KafkaProducer.<init>(KafkaProducer.java:303)
    at java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
    at java.base/jdk.internal.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.base/java.lang.reflect.Constructor.newInstance(Constructor.java:490)
    at org.jruby.dist/org.jruby.javasupport.JavaConstructor.newInstanceDirect(JavaConstructor.java:279)
    <JRuby interpreter part>
Caused by: java.lang.ClassNotFoundException: com.fasterxml.jackson.databind.json.JsonMapper
    at java.base/java.net.URLClassLoader.findClass(URLClassLoader.java:476)
    at java.base/java.lang.ClassLoader.loadClass(ClassLoader.java:589)
    at java.base/java.lang.ClassLoader.loadClass(ClassLoader.java:522)
```
So the error comes from `io.confluent:kafka-schema-registry-client`, not from `org.apache.kafka:kafka-clients`.
Looking closer at the problem, it originates from the fact that `io.confluent.kafka.schemaregistry.client.rest.RestService`, which uses a singleton of `io.confluent.kafka.schemaregistry.utils.JacksonMapper`, fails because the `JacksonMapper` class has a dependency on `com.fasterxml.jackson.databind.json.JsonMapper`.
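The `<clinit>` frames in the trace follow the usual JVM static-initialization pattern: the first touch of `JacksonMapper` runs its static initializer, and if that fails the class is marked unusable. The following self-contained sketch illustrates the mechanism, with a deliberately throwing initializer standing in for the missing Jackson class (in the real trace the resolution failure surfaces directly as `NoClassDefFoundError`):

```java
public class ClinitDemo {
    static class Fragile {
        // The static field forces a <clinit>; here it fails on purpose,
        // mimicking JacksonMapper.<clinit> blowing up when
        // com.fasterxml.jackson.databind.json.JsonMapper is absent.
        static final Object MAPPER = failingInit();

        static Object failingInit() {
            throw new RuntimeException("simulated missing dependency");
        }
    }

    // Touch the class and report which error the JVM surfaces.
    static String touch() {
        try {
            return Fragile.MAPPER.toString();
        } catch (Throwable t) {
            return t.getClass().getSimpleName();
        }
    }

    public static void main(String[] args) {
        // First touch triggers <clinit>, whose exception is wrapped in
        // ExceptionInInitializerError; any later touch of the now-erroneous
        // class yields NoClassDefFoundError.
        System.out.println("first use : " + touch());
        System.out.println("second use: " + touch());
    }
}
```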
The `io.confluent:kafka-schema-registry-client:6.2.2` used in the previous version of the plugin had a dependency on `jackson-databind:2.12.5`, which in theory should have presented the same `JsonMapper` class-not-found problem; but it didn't happen, because `io.confluent.kafka.schemaregistry.utils.JacksonMapper` switched implementation.
## Release notes
Update Kafka client to 3.3.1; requires Logstash >= 8.3.0. Deprecated the `default` value for setting `client_dns_lookup`; when used, it's forced to `use_all_dns_ips`.
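As an illustration of the forced setting (broker address and topic below are hypothetical, not from this PR), an explicit configuration on the new plugin version looks like:

```
input {
  kafka {
    bootstrap_servers => "localhost:9092"      # hypothetical broker
    topics            => ["example_topic"]     # hypothetical topic
    client_dns_lookup => "use_all_dns_ips"     # "default" is deprecated; the plugin forces this value
  }
}
```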
## What does this PR do?
Updated the Kafka client library from `2.8.1` to `3.3.1` and the Schema Registry client from `6.2.2` to `7.3.0`. Due to the removal of the `zookeeper` flag when launching the integration Kafka instance, this PR also updates the bash scripts used in integration testing. The update of the Schema Registry client forces the dependency on Logstash 8.3, because it's the first release that ships Jackson `2.13.3`, which is strictly required during the instantiation of the client.

## Why is it important/What is the impact to the user?
Updating the client requires a library (Jackson) that's available from Logstash `8.3.0`, so that is a requirement starting from this version of the plugin. The new Kafka client dropped the `default` value for the configuration `client_dns_lookup`: starting from Kafka `2.6.0` it had been deprecated, and it was removed in Kafka `3.0`. The default value for the plugin setting `client_dns_lookup` switched from `default` to `use_all_dns_ips`, which doesn't create any problem for users who don't customize it. If a user explicitly set `client_dns_lookup` to `default` and updates the plugin to this version, then the value is forced to `use_all_dns_ips`. For a simple explanation of why `default` is not a good choice, see here.

## Checklist
- [ ] I have made corresponding changes to the default configuration files (and/or docker env variables)
- [ ] I have added tests that prove my fix is effective or that my feature works

## Author's Checklist
## How to test this PR locally
- `cd`-ing into the uncompressed folder
- `logstash_integration_topic_plain` to be later used in the Logstash pipeline
- `Gemfile`
- `output { stdout { codec => rubydebug { metadata => true } } }`

The events have to be received and logged by Logstash.
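A minimal end-to-end pipeline for this manual check could look like the following sketch; the broker address is an assumption for a local test instance, while the topic name and the `stdout` output are taken from the steps above:

```
input {
  kafka {
    bootstrap_servers => "localhost:9092"   # assumed local test broker
    topics            => ["logstash_integration_topic_plain"]
    decorate_events   => true               # populates [@metadata][kafka] fields
  }
}
output { stdout { codec => rubydebug { metadata => true } } }
```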
Shutdown and cleanup:
## Related issues
## Use cases
As a Kafka user I want to be able to use a Logstash plugin aligned to the latest major version of Kafka.
## Screenshots

## Logs