confluentinc / schema-registry

Confluent Schema Registry for Kafka
https://docs.confluent.io/current/schema-registry/docs/index.html

Producer with Schema Registry connection not using ssl settings #943

Open madcap opened 5 years ago

madcap commented 5 years ago

I've got a Spring Boot app that produces JSON messages without using a schema registry; this works fine.

Now I am attempting to add a new Kafka producer that will produce Avro messages using the Confluent Schema Registry. I set up the producer properties the same way for both producers:

properties.put(CommonClientConfigs.SECURITY_PROTOCOL_CONFIG, securityProtocol);
properties.put(SslConfigs.SSL_KEYSTORE_LOCATION_CONFIG, keystorePath);
properties.put(SslConfigs.SSL_KEYSTORE_PASSWORD_CONFIG, keystorePassword);
properties.put(SslConfigs.SSL_TRUSTSTORE_LOCATION_CONFIG, truststorePath);
properties.put(SslConfigs.SSL_TRUSTSTORE_PASSWORD_CONFIG, truststorePassword);
properties.put(SslConfigs.SSL_KEY_PASSWORD_CONFIG, keystorePassword);
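
For context, a minimal sketch of how such a producer is typically finished off (bootstrapServers, schemaRegistryUrl and the value type are placeholders, not taken from the report); the point is that the Avro serializer talks to schema.registry.url over its own HTTPS connection, separate from the broker connection configured above:

// continuing the snippet above; variable names are illustrative
properties.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, bootstrapServers);
properties.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, org.apache.kafka.common.serialization.StringSerializer.class);
properties.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, io.confluent.kafka.serializers.KafkaAvroSerializer.class);
// the serializer's REST client registers/fetches schemas from this endpoint
properties.put("schema.registry.url", schemaRegistryUrl);

KafkaProducer<String, org.apache.avro.generic.GenericRecord> producer = new KafkaProducer<>(properties);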

However, the Avro Kafka producer doesn't seem to respect the SSL settings when connecting to the Schema Registry. Doing a Kafka send with the Avro producer gives me this stack trace:

org.apache.kafka.common.errors.SerializationException: Error serializing Avro message

Caused by: javax.net.ssl.SSLHandshakeException: sun.security.validator.ValidatorException: PKIX path building failed: sun.security.provider.certpath.SunCertPathBuilderException: unable to find valid certification path to requested target
    at sun.security.ssl.Alerts.getSSLException(Alerts.java:192)
    at sun.security.ssl.SSLSocketImpl.fatal(SSLSocketImpl.java:1959)
    at sun.security.ssl.Handshaker.fatalSE(Handshaker.java:328)
    at sun.security.ssl.Handshaker.fatalSE(Handshaker.java:322)
    at sun.security.ssl.ClientHandshaker.serverCertificate(ClientHandshaker.java:1614)
    at sun.security.ssl.ClientHandshaker.processMessage(ClientHandshaker.java:216)
    at sun.security.ssl.Handshaker.processLoop(Handshaker.java:1052)
    at sun.security.ssl.Handshaker.process_record(Handshaker.java:987)
    at sun.security.ssl.SSLSocketImpl.readRecord(SSLSocketImpl.java:1072)
    at sun.security.ssl.SSLSocketImpl.performInitialHandshake(SSLSocketImpl.java:1385)
    at sun.security.ssl.SSLSocketImpl.startHandshake(SSLSocketImpl.java:1413)
    at sun.security.ssl.SSLSocketImpl.startHandshake(SSLSocketImpl.java:1397)
    at sun.net.www.protocol.https.HttpsClient.afterConnect(HttpsClient.java:559)
    at sun.net.www.protocol.https.AbstractDelegateHttpsURLConnection.connect(AbstractDelegateHttpsURLConnection.java:185)
    at sun.net.www.protocol.http.HttpURLConnection.getOutputStream0(HttpURLConnection.java:1334)
    at sun.net.www.protocol.http.HttpURLConnection.getOutputStream(HttpURLConnection.java:1309)
    at sun.net.www.protocol.https.HttpsURLConnectionImpl.getOutputStream(HttpsURLConnectionImpl.java:259)
    at io.confluent.kafka.schemaregistry.client.rest.RestService.sendHttpRequest(RestService.java:142)
    at io.confluent.kafka.schemaregistry.client.rest.RestService.httpRequest(RestService.java:187)
    at io.confluent.kafka.schemaregistry.client.rest.RestService.registerSchema(RestService.java:238)
    at io.confluent.kafka.schemaregistry.client.rest.RestService.registerSchema(RestService.java:230)
    at io.confluent.kafka.schemaregistry.client.rest.RestService.registerSchema(RestService.java:225)
    at io.confluent.kafka.schemaregistry.client.CachedSchemaRegistryClient.registerAndGetId(CachedSchemaRegistryClient.java:59)
    at io.confluent.kafka.schemaregistry.client.CachedSchemaRegistryClient.register(CachedSchemaRegistryClient.java:91)
    at io.confluent.kafka.serializers.AbstractKafkaAvroSerializer.serializeImpl(AbstractKafkaAvroSerializer.java:72)
    at io.confluent.kafka.serializers.KafkaAvroSerializer.serialize(KafkaAvroSerializer.java:54)
    at org.apache.kafka.common.serialization.ExtendedSerializer$Wrapper.serialize(ExtendedSerializer.java:65)
    at org.apache.kafka.common.serialization.ExtendedSerializer$Wrapper.serialize(ExtendedSerializer.java:55)
    at org.apache.kafka.clients.producer.KafkaProducer.doSend(KafkaProducer.java:783)
    at org.apache.kafka.clients.producer.KafkaProducer.send(KafkaProducer.java:768)
    at com.mycompany.service.KafkaPocIntegrationSpec.test - send avro message(KafkaPocIntegrationSpec.groovy:55)
Caused by: sun.security.validator.ValidatorException: PKIX path building failed: sun.security.provider.certpath.SunCertPathBuilderException: unable to find valid certification path to requested target
    at sun.security.validator.PKIXValidator.doBuild(PKIXValidator.java:397)
    at sun.security.validator.PKIXValidator.engineValidate(PKIXValidator.java:302)
    at sun.security.validator.Validator.validate(Validator.java:260)
    at sun.security.ssl.X509TrustManagerImpl.validate(X509TrustManagerImpl.java:324)
    at sun.security.ssl.X509TrustManagerImpl.checkTrusted(X509TrustManagerImpl.java:229)
    at sun.security.ssl.X509TrustManagerImpl.checkServerTrusted(X509TrustManagerImpl.java:124)
    at sun.security.ssl.ClientHandshaker.serverCertificate(ClientHandshaker.java:1596)
    ... 26 more
Caused by: sun.security.provider.certpath.SunCertPathBuilderException: unable to find valid certification path to requested target
    at sun.security.provider.certpath.SunCertPathBuilder.build(SunCertPathBuilder.java:141)
    at sun.security.provider.certpath.SunCertPathBuilder.engineBuild(SunCertPathBuilder.java:126)
    at java.security.cert.CertPathBuilder.build(CertPathBuilder.java:280)
    at sun.security.validator.PKIXValidator.doBuild(PKIXValidator.java:392)
    ... 32 more

Following some advice from this page (near the bottom, they are experiencing the same issue I'm reporting here): https://groups.google.com/forum/#!topic/confluent-platform/i2N22ZYpplA

I was able to set system properties (note that these are system properties, not producer properties):

System.setProperty("javax.net.ssl.trustStore", truststorePath);
System.setProperty("javax.net.ssl.trustStorePassword", truststorePassword);
System.setProperty("javax.net.ssl.keyStore", keystorePath);
System.setProperty("javax.net.ssl.keyStorePassword", keystorePassword);

After setting these system properties the producer works just fine. However, this seems like a bug to me; I shouldn't have to do this (it negatively impacts other aspects of my app). The producer is not using the Kafka SSL config to talk to the Schema Registry; instead it's using the JVM default SSL settings.

madcap commented 5 years ago

I can recreate the same behavior with kafka-avro-console-producer:

$ cat ~/ssl.config 
security.protocol=SSL
ssl.truststore.location=/path/to/my/truststore/here
ssl.truststore.password=password
ssl.keystore.location=/path/to/my/keystore/here
ssl.keystore.password=password
ssl.key.password=password

$ kafka-avro-console-producer --broker-list my.brokers.here --topic my.topic.here --producer.config ~/ssl.config --property value.schema='{"type" : "record", "name" : "Example", "namespace" : "com.example", "fields" : [ {"name" : "uniqueId", "type" : [ "null", "string" ]} ]}' --property schema.registry.url=my.schema.registry

Enter value: {"uniqueId" : {"string":"1234567"}}

StackTrace:

{"uniqueId" : {"string":"1234567"}}
[2018-11-27 10:11:41,415] ERROR Failed to send HTTP request to endpoint: my.schema.registry.here (io.confluent.kafka.schemaregistry.client.rest.RestService:176)
javax.net.ssl.SSLHandshakeException: sun.security.validator.ValidatorException: PKIX path building failed: sun.security.provider.certpath.SunCertPathBuilderException: unable to find valid certification path to requested target
    at sun.security.ssl.Alerts.getSSLException(Alerts.java:192)
    at sun.security.ssl.SSLSocketImpl.fatal(SSLSocketImpl.java:1959)
    at sun.security.ssl.Handshaker.fatalSE(Handshaker.java:302)
    at sun.security.ssl.Handshaker.fatalSE(Handshaker.java:296)
    at sun.security.ssl.ClientHandshaker.serverCertificate(ClientHandshaker.java:1514)
    at sun.security.ssl.ClientHandshaker.processMessage(ClientHandshaker.java:216)
    at sun.security.ssl.Handshaker.processLoop(Handshaker.java:1026)
    at sun.security.ssl.Handshaker.process_record(Handshaker.java:961)
    at sun.security.ssl.SSLSocketImpl.readRecord(SSLSocketImpl.java:1072)
    at sun.security.ssl.SSLSocketImpl.performInitialHandshake(SSLSocketImpl.java:1385)
    at sun.security.ssl.SSLSocketImpl.startHandshake(SSLSocketImpl.java:1413)
    at sun.security.ssl.SSLSocketImpl.startHandshake(SSLSocketImpl.java:1397)
    at sun.net.www.protocol.https.HttpsClient.afterConnect(HttpsClient.java:559)
    at sun.net.www.protocol.https.AbstractDelegateHttpsURLConnection.connect(AbstractDelegateHttpsURLConnection.java:185)
    at sun.net.www.protocol.http.HttpURLConnection.getOutputStream0(HttpURLConnection.java:1334)
    at sun.net.www.protocol.http.HttpURLConnection.getOutputStream(HttpURLConnection.java:1309)
    at sun.net.www.protocol.https.HttpsURLConnectionImpl.getOutputStream(HttpsURLConnectionImpl.java:259)
    at io.confluent.kafka.schemaregistry.client.rest.RestService.sendHttpRequest(RestService.java:172)
    at io.confluent.kafka.schemaregistry.client.rest.RestService.httpRequest(RestService.java:229)
    at io.confluent.kafka.schemaregistry.client.rest.RestService.registerSchema(RestService.java:320)
    at io.confluent.kafka.schemaregistry.client.rest.RestService.registerSchema(RestService.java:312)
    at io.confluent.kafka.schemaregistry.client.rest.RestService.registerSchema(RestService.java:307)
    at io.confluent.kafka.schemaregistry.client.CachedSchemaRegistryClient.registerAndGetId(CachedSchemaRegistryClient.java:115)
    at io.confluent.kafka.schemaregistry.client.CachedSchemaRegistryClient.register(CachedSchemaRegistryClient.java:154)
    at io.confluent.kafka.serializers.AbstractKafkaAvroSerializer.serializeImpl(AbstractKafkaAvroSerializer.java:79)
    at io.confluent.kafka.formatter.AvroMessageReader.readMessage(AvroMessageReader.java:181)
    at kafka.tools.ConsoleProducer$.main(ConsoleProducer.scala:59)
    at kafka.tools.ConsoleProducer.main(ConsoleProducer.scala)
Caused by: sun.security.validator.ValidatorException: PKIX path building failed: sun.security.provider.certpath.SunCertPathBuilderException: unable to find valid certification path to requested target
    at sun.security.validator.PKIXValidator.doBuild(PKIXValidator.java:397)
    at sun.security.validator.PKIXValidator.engineValidate(PKIXValidator.java:302)
    at sun.security.validator.Validator.validate(Validator.java:260)
    at sun.security.ssl.X509TrustManagerImpl.validate(X509TrustManagerImpl.java:324)
    at sun.security.ssl.X509TrustManagerImpl.checkTrusted(X509TrustManagerImpl.java:229)
    at sun.security.ssl.X509TrustManagerImpl.checkServerTrusted(X509TrustManagerImpl.java:124)
    at sun.security.ssl.ClientHandshaker.serverCertificate(ClientHandshaker.java:1496)
    ... 23 more
Caused by: sun.security.provider.certpath.SunCertPathBuilderException: unable to find valid certification path to requested target
    at sun.security.provider.certpath.SunCertPathBuilder.build(SunCertPathBuilder.java:141)
    at sun.security.provider.certpath.SunCertPathBuilder.engineBuild(SunCertPathBuilder.java:126)
    at java.security.cert.CertPathBuilder.build(CertPathBuilder.java:280)
    at sun.security.validator.PKIXValidator.doBuild(PKIXValidator.java:392)
    ... 29 more
org.apache.kafka.common.errors.SerializationException: Error serializing Avro message
Caused by: javax.net.ssl.SSLHandshakeException: sun.security.validator.ValidatorException: PKIX path building failed: sun.security.provider.certpath.SunCertPathBuilderException: unable to find valid certification path to requested target
    at sun.security.ssl.Alerts.getSSLException(Alerts.java:192)
    at sun.security.ssl.SSLSocketImpl.fatal(SSLSocketImpl.java:1959)
    at sun.security.ssl.Handshaker.fatalSE(Handshaker.java:302)
    at sun.security.ssl.Handshaker.fatalSE(Handshaker.java:296)
    at sun.security.ssl.ClientHandshaker.serverCertificate(ClientHandshaker.java:1514)
    at sun.security.ssl.ClientHandshaker.processMessage(ClientHandshaker.java:216)
    at sun.security.ssl.Handshaker.processLoop(Handshaker.java:1026)
    at sun.security.ssl.Handshaker.process_record(Handshaker.java:961)
    at sun.security.ssl.SSLSocketImpl.readRecord(SSLSocketImpl.java:1072)
    at sun.security.ssl.SSLSocketImpl.performInitialHandshake(SSLSocketImpl.java:1385)
    at sun.security.ssl.SSLSocketImpl.startHandshake(SSLSocketImpl.java:1413)
    at sun.security.ssl.SSLSocketImpl.startHandshake(SSLSocketImpl.java:1397)
    at sun.net.www.protocol.https.HttpsClient.afterConnect(HttpsClient.java:559)
    at sun.net.www.protocol.https.AbstractDelegateHttpsURLConnection.connect(AbstractDelegateHttpsURLConnection.java:185)
    at sun.net.www.protocol.http.HttpURLConnection.getOutputStream0(HttpURLConnection.java:1334)
    at sun.net.www.protocol.http.HttpURLConnection.getOutputStream(HttpURLConnection.java:1309)
    at sun.net.www.protocol.https.HttpsURLConnectionImpl.getOutputStream(HttpsURLConnectionImpl.java:259)
    at io.confluent.kafka.schemaregistry.client.rest.RestService.sendHttpRequest(RestService.java:172)
    at io.confluent.kafka.schemaregistry.client.rest.RestService.httpRequest(RestService.java:229)
    at io.confluent.kafka.schemaregistry.client.rest.RestService.registerSchema(RestService.java:320)
    at io.confluent.kafka.schemaregistry.client.rest.RestService.registerSchema(RestService.java:312)
    at io.confluent.kafka.schemaregistry.client.rest.RestService.registerSchema(RestService.java:307)
    at io.confluent.kafka.schemaregistry.client.CachedSchemaRegistryClient.registerAndGetId(CachedSchemaRegistryClient.java:115)
    at io.confluent.kafka.schemaregistry.client.CachedSchemaRegistryClient.register(CachedSchemaRegistryClient.java:154)
    at io.confluent.kafka.serializers.AbstractKafkaAvroSerializer.serializeImpl(AbstractKafkaAvroSerializer.java:79)
    at io.confluent.kafka.formatter.AvroMessageReader.readMessage(AvroMessageReader.java:181)
    at kafka.tools.ConsoleProducer$.main(ConsoleProducer.scala:59)
    at kafka.tools.ConsoleProducer.main(ConsoleProducer.scala)
Caused by: sun.security.validator.ValidatorException: PKIX path building failed: sun.security.provider.certpath.SunCertPathBuilderException: unable to find valid certification path to requested target
    at sun.security.validator.PKIXValidator.doBuild(PKIXValidator.java:397)
    at sun.security.validator.PKIXValidator.engineValidate(PKIXValidator.java:302)
    at sun.security.validator.Validator.validate(Validator.java:260)
    at sun.security.ssl.X509TrustManagerImpl.validate(X509TrustManagerImpl.java:324)
    at sun.security.ssl.X509TrustManagerImpl.checkTrusted(X509TrustManagerImpl.java:229)
    at sun.security.ssl.X509TrustManagerImpl.checkServerTrusted(X509TrustManagerImpl.java:124)
    at sun.security.ssl.ClientHandshaker.serverCertificate(ClientHandshaker.java:1496)
    ... 23 more
Caused by: sun.security.provider.certpath.SunCertPathBuilderException: unable to find valid certification path to requested target
    at sun.security.provider.certpath.SunCertPathBuilder.build(SunCertPathBuilder.java:141)
    at sun.security.provider.certpath.SunCertPathBuilder.engineBuild(SunCertPathBuilder.java:126)
    at java.security.cert.CertPathBuilder.build(CertPathBuilder.java:280)
    at sun.security.validator.PKIXValidator.doBuild(PKIXValidator.java:392)
    ... 29 more

Now try with system properties set:

$ SCHEMA_REGISTRY_OPTS="-Djavax.net.ssl.keyStore=/path/to/my/keystore/here -Djavax.net.ssl.trustStore=/path/to/my/truststore/here -Djavax.net.ssl.keyStorePassword=password -Djavax.net.ssl.trustStorePassword=password" kafka-avro-console-producer --broker-list my.broker --topic my.topic --producer.config ~/ssl.config --property value.schema='{"type" : "record", "name" : "Example", "namespace" : "com.example", "fields" : [ {"name" : "uniqueId", "type" : [ "null", "string" ]} ]}' --property schema.registry.url=my.schema.registry
{"uniqueId" : {"string":"1234567"}}
^C
$ 

No issues when setting the JVM system properties.

zigarn commented 5 years ago

I can confirm this problem: when using kafka-rest with Schema Registry over SSL, we need to set the javax.net.ssl.* properties at the JVM level to be able to authenticate to the Schema Registry.

For some applications this can be problematic, as we may want the JVM to use one keystore to serve requests but another to call the Schema Registry.

Having specific SSL properties for the Schema Registry client would be nice, or at least having it use the SslConfigs.SSL_* ones.

sergeiwaigant commented 5 years ago

We can confirm this problem too.

The Avro serializer connects to the Schema Registry using SSL/TLS and the JVM keystores/truststores. This is different from what the rest of the Kafka clients (consumer, producer, admin, …) do: they use the keystore and truststore set via the Kafka parameters (ssl.keystore.* / ssl.truststore.*).

The AVRO serializer should have its own properties for specifying a key- and truststore or it should just use the standard client parameters for that.

mageshn commented 5 years ago

@zigarn @sergeiwaigant As discussed, the SR client currently relies on the JVM-level properties for its SSL configuration. There is an outstanding PR that should address this: https://github.com/confluentinc/schema-registry/pull/957. Let me know if you have any thoughts on the PR.
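
A minimal sketch of what this could look like with a client that includes that change (the schema.registry.-prefixed keys below are the ones that show up in later comments on this issue; broker addresses, paths and passwords are placeholders):

import io.confluent.kafka.serializers.KafkaAvroSerializer;
import org.apache.avro.generic.GenericRecord;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.common.serialization.StringSerializer;
import java.util.Properties;

public class AvroProducerWithTlsRegistry {
    public static KafkaProducer<String, GenericRecord> create() {
        Properties props = new Properties();
        // Broker connection: TLS handled by the Kafka client's own ssl.* settings
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "broker:9093");
        props.put("security.protocol", "SSL");
        props.put("ssl.truststore.location", "/path/to/kafka.truststore.jks");
        props.put("ssl.truststore.password", "changeit");

        // Serializers and the Schema Registry endpoint
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, KafkaAvroSerializer.class);
        props.put("schema.registry.url", "https://schema-registry:8081");

        // TLS for the serializer's REST client, under the schema.registry. prefix
        props.put("schema.registry.ssl.truststore.location", "/path/to/sr.truststore.jks");
        props.put("schema.registry.ssl.truststore.password", "changeit");
        props.put("schema.registry.ssl.keystore.location", "/path/to/sr.keystore.jks");
        props.put("schema.registry.ssl.keystore.password", "changeit");
        props.put("schema.registry.ssl.key.password", "changeit");

        return new KafkaProducer<>(props);
    }
}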

unixunion commented 4 years ago

Is this making any progress? I am stuck on this issue.

cah-nathan-zender commented 4 years ago

Now that #957 is merged, is there any chance this will make it into 5.4-beta, or is that closed to new features now? We are also impacted by this issue. We do have a workaround of running our client app in a separate JVM, but it's extra overhead as far as code/deployment/etc. goes.

sqilz commented 4 years ago

Bumping this for some answers on how to resolve this, as I have the same issue: I have Kafka on SSL and Schema Registry using HTTPS + Avro, and it works when the Schema Registry is on HTTP but not HTTPS.

Any updates on this?

ashwinkhai commented 4 years ago

Hi All,

I tried to configure the Schema Registry with SSL and basic auth. When I attempted to use it with an Avro producer, it failed with the error "No SAN Name found". I had configured it to ignore hostname checking, but it still didn't work. A similar configuration had worked for Kafka REST, Connect and KSQL.

I also tried to run a query from KSQL, which uses the Schema Registry. It failed with the same "No SAN Name found" error.

Below is the HTTPS configuration on the Schema Registry:

listeners=https://0.0.0.0:8081
ssl.keystore.location=/confluent-5.5.0/cert/kafka.server.keystore.jks
ssl.keystore.password=password
ssl.key.password=password

On the Avro producer and KSQL I had configured the truststore and, specifically, ssl.endpoint.identification.algorithm= (empty).

Can you please help confirm if this is the same open issue we have here?
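
Not an official answer, but for a self-signed certificate the registry-prefixed keys discussed above would be the place to put both the truststore and the hostname-verification override; a minimal sketch (paths and passwords are placeholders, and whether the empty algorithm is honored depends on the client version):

import java.util.Properties;

Properties props = new Properties();
// trust the self-signed Schema Registry certificate
props.put("schema.registry.ssl.truststore.location", "/path/to/truststore.jks");
props.put("schema.registry.ssl.truststore.password", "changeit");
// an empty value disables hostname (SAN/CN) checking for the registry connection only
props.put("schema.registry.ssl.endpoint.identification.algorithm", "");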

meticulo3366 commented 3 years ago

still an issue today

PatrickEifler commented 3 years ago

Same issue here.

singhfulda commented 3 years ago

Same issue here.

manpreet1992 commented 3 years ago

Facing the same issue with kafka-connect v5.5.2. Any updates?

walkamongus commented 3 years ago

Just leaving this here -- I discovered the settings included in https://github.com/confluentinc/cp-demo/pull/140/files#diff-8d068e8797e88947c320f79e856c3e16a72b730124a8f9d7031e2c4680dfa534 allowed both kafka-avro-console-consumer and kafka-avro-console-producer to work for me with no Java system properties needed.

        --property schema.registry.ssl.truststore.location
        --property schema.registry.ssl.truststore.password
        --property schema.registry.ssl.keystore.location
        --property schema.registry.ssl.keystore.password
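
The same four settings can also be supplied programmatically when building a consumer with the Avro deserializer; a minimal sketch assuming a client that supports the schema.registry.ssl.* keys (broker address, group id, paths and passwords are placeholders):

import io.confluent.kafka.serializers.KafkaAvroDeserializer;
import org.apache.avro.generic.GenericRecord;
import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.common.serialization.StringDeserializer;
import java.util.Properties;

Properties props = new Properties();
props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "broker:9093");
props.put(ConsumerConfig.GROUP_ID_CONFIG, "example-group");
props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);
props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, KafkaAvroDeserializer.class);
props.put("schema.registry.url", "https://schema-registry:8081");
// TLS for the deserializer's Schema Registry REST client
props.put("schema.registry.ssl.truststore.location", "/path/to/truststore.jks");
props.put("schema.registry.ssl.truststore.password", "changeit");
props.put("schema.registry.ssl.keystore.location", "/path/to/keystore.jks");
props.put("schema.registry.ssl.keystore.password", "changeit");

KafkaConsumer<String, GenericRecord> consumer = new KafkaConsumer<>(props);
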
manpreet1992 commented 3 years ago

@ashwinkhai, I tried creating a connector using the connect.json below, but it didn't work and I got the same error.

{
  "AvroConnector": {
    "connector.class": "io.confluent.connect.hdfs.HdfsSinkConnector",
    "hadoop.conf.dir": "/etc/kafka-connect/hadoop",
    "flush.size": "1",
    "tasks.max": "1",
    "topics": "av-check",
    "key.converter.schema.registry.url": "https://manpreetsr-ckaf-schema-registry.manpreet.svc.cluster.local:8081",
    "value.converter": "io.confluent.connect.avro.AvroConverter",
    "value.converter.schema.registry.url": "https://manpreetsr-ckaf-schema-registry.manpreet.svc.cluster.local:8081",
    "value.converter.schema.registry.ssl.key.password": "******",
    "value.converter.schema.registry.ssl.keystore.location": "/etc/kafka-connect/ssl/schema/keyStore",
    "value.converter.schema.registry.ssl.keystore.password": "*****",
    "value.converter.schema.registry.ssl.truststore.location": "/etc/kafka-connect/ssl/schema/trustStore",
    "value.converter.schema.registry.ssl.truststore.password": "******",
    "hdfs.url": "hdfs://namenodeHA:8020/user/cloud-user/performance/kconnect/topics"
  }
}

I also tried passing the above configuration directly in the properties file. Neither way worked for us. Can you please provide the configuration with which you had tested it?

vasaguin commented 3 years ago

My colleagues found this link:

https://github.com/hortonworks/registry/blob/master/schema-registry/client/src/main/resources/default-schema-registry-client.yaml

It seems there are many config attributes for setting up the registry client properly:

schema.registry.url : "http://localhost:9191/api/v1"
schema.registry.client.local.jars.path : "/tmp/schema-registry/local-jars"
schema.registry.client.class.loader.cache.size : 1024
schema.registry.client.class.loader.cache.expiry.interval : 3600
schema.registry.client.schema.version.cache.size : 1024
schema.registry.client.schema.version.cache.expiry.interval : 300
schema.registry.client.schema.metadata.cache.expiry.interval : 300
schema.registry.client.schema.text.cache.size : 1024
schema.registry.client.schema.text.cache.expiry.interval : 300
schema.registry.client.url.selector : "com.hortonworks.registries.schemaregistry.client.FailoverUrlSelector"

schema.registry.client.ssl:
  protocol: SSL
  hostnameVerifierClass:
  keyStoreType:
  keyStorePath:
  keyStorePassword:
  keyStoreProvider:
  keyPassword:
  keyManagerFactoryProvider:
  keyManagerFactoryAlgorithm:
  trustStoreType:
  trustStorePath:
  trustStorePassword:
  trustStoreProvider:
  trustManagerFactoryProvider:
  trustManagerFactoryAlgorithm:

schema.registry.auth.username:
schema.registry.auth.password:

lw-mcno commented 2 years ago

I've also encountered the problem and it was very surprising that the "global" settings weren't picked up by the producer's Schema Registry HTTP client. The workaround with overriding system properties works, but can have serious side effects, so it's not for everyone.

I've had luck with passing the following properties (no constants for trust store?) directly during SerDe configuration (version 6.1.1):

{
    Serde<KeyType> keySerde = new SpecificAvroSerde<>();
    Serde<ValueType> valueSerde = new SpecificAvroSerde<>();

    Map<String, ?> config = Map.of(
            AbstractKafkaSchemaSerDeConfig.SCHEMA_REGISTRY_URL_CONFIG, schemaRegistryUrl,
            "schema.registry.ssl.truststore.location", trustStoreLocation,
            "schema.registry.ssl.truststore.password", trustStorePassword);

    keySerde.configure(config, true);
    valueSerde.configure(config, false);
}
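
As a hypothetical follow-up, these configured serdes would then be used like any other serde, for example in a Kafka Streams topology (KeyType, ValueType and the topic name are placeholders, as in the snippet above):

import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.kstream.Consumed;
import org.apache.kafka.streams.kstream.KStream;

StreamsBuilder builder = new StreamsBuilder();
KStream<KeyType, ValueType> stream =
        builder.stream("example-topic", Consumed.with(keySerde, valueSerde));
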
TheKnowles commented 2 years ago

We just got burned by this moving from 6.2.2 to 7.1.1. At some point between these releases, the JVM SSL settings no longer worked. The CachedSchemaRegistryClient requires a schema.registry. prefix, but this class is created in the converter (in our case AvroConverter) and thus needs an additional prefix of value.converter.

The worker properties require this:

bootstrap.servers=...

key.converter=org.apache.kafka.connect.storage.StringConverter
value.converter=io.confluent.connect.avro.AvroConverter
value.converter.schema.registry.url=...

#value converter schema registry ssl settings
value.converter.schema.registry.ssl.truststore.location=/path/to/store
value.converter.schema.registry.ssl.truststore.password=password
value.converter.schema.registry.ssl.truststore.type=type
value.converter.schema.registry.ssl.keystore.location=/path/to/store
value.converter.schema.registry.ssl.keystore.password=password
value.converter.schema.registry.ssl.key.password=password
value.converter.schema.registry.ssl.keystore.type=type
value.converter.schema.registry.ssl.enabled.protocols=protocols
value.converter.schema.registry.ssl.endpoint.identification.algorithm=https # or empty if you don't need to verify the SR host

This is in addition to the normal ssl.* items (for the worker itself) and any other prefixed ssl.* items for other ecosystem usage, such as producer. for sources and consumer. for sinks.

babloo844pal commented 2 years ago

========================

Thread-9:ERROR:i.c.k.s.c.r.RestService(272): Failed to send HTTP request to endpoint: https://local:8081/subjects/abc-value/versions?normalize=false
javax.net.ssl.SSLHandshakeException: No name matching local found
    at sun.security.ssl.Alert.createSSLException(Alert.java:131) ~[?:?]
    at sun.security.ssl.TransportContext.fatal(TransportContext.java:353) ~[?:?]
    at sun.security.ssl.TransportContext.fatal(TransportContext.java:296) ~[?:?]
    at sun.security.ssl.TransportContext.fatal(TransportContext.java:291) ~[?:?]
    at sun.security.ssl.CertificateMessage$T13CertificateConsumer.checkServerCerts(CertificateMessage.java:1357) ~[?:?]
    at sun.security.ssl.CertificateMessage$T13CertificateConsumer.onConsumeCertificate(CertificateMessage.java:1232) ~[?:?]
    at sun.security.ssl.CertificateMessage$T13CertificateConsumer.consume(CertificateMessage.java:1175) ~[?:?]
    at sun.security.ssl.SSLHandshake.consume(SSLHandshake.java:392) ~[?:?]
    at sun.security.ssl.HandshakeContext.dispatch(HandshakeContext.java:443) ~[?:?]
    at sun.security.ssl.HandshakeContext.dispatch(HandshakeContext.java:421) ~[?:?]
    at sun.security.ssl.TransportContext.dispatch(TransportContext.java:183) ~[?:?]
    at sun.security.ssl.SSLTransport.decode(SSLTransport.java:172) ~[?:?]
    at sun.security.ssl.SSLSocketImpl.decode(SSLSocketImpl.java:1506) ~[?:?]
    at sun.security.ssl.SSLSocketImpl.readHandshakeRecord(SSLSocketImpl.java:1416) ~[?:?]
    at sun.security.ssl.SSLSocketImpl.startHandshake(SSLSocketImpl.java:456) ~[?:?]
    at sun.security.ssl.SSLSocketImpl.startHandshake(SSLSocketImpl.java:427) ~[?:?]
    at sun.net.www.protocol.https.HttpsClient.afterConnect(HttpsClient.java:572) ~[?:?]
    at sun.net.www.protocol.https.AbstractDelegateHttpsURLConnection.connect(AbstractDelegateHttpsURLConnection.java:197) ~[?:?]
    at sun.net.www.protocol.http.HttpURLConnection.getOutputStream0(HttpURLConnection.java:1367) ~[?:?]
    at sun.net.www.protocol.http.HttpURLConnection.getOutputStream(HttpURLConnection.java:1342) ~[?:?]
    at sun.net.www.protocol.https.HttpsURLConnectionImpl.getOutputStream(HttpsURLConnectionImpl.java:246) ~[?:?]
    at io.confluent.kafka.schemaregistry.client.rest.RestService.sendHttpRequest(RestService.java:268) [kafka-schema-registry-client-7.1.1.jar!/:?]
    at io.confluent.kafka.schemaregistry.client.rest.RestService.httpRequest(RestService.java:367) [kafka-schema-registry-client-7.1.1.jar!/:?]
    at io.confluent.kafka.schemaregistry.client.rest.RestService.registerSchema(RestService.java:544) [kafka-schema-registry-client-7.1.1.jar!/:?]
    at io.confluent.kafka.schemaregistry.client.rest.RestService.registerSchema(RestService.java:532) [kafka-schema-registry-client-7.1.1.jar!/:?]
    at io.confluent.kafka.schemaregistry.client.rest.RestService.registerSchema(RestService.java:490) [kafka-schema-registry-client-7.1.1.jar!/:?]
    at io.confluent.kafka.schemaregistry.client.CachedSchemaRegistryClient.registerAndGetId(CachedSchemaRegistryClient.java:257) [kafka-schema-registry-client-7.1.1.jar!/:?]
    at io.confluent.kafka.schemaregistry.client.CachedSchemaRegistryClient.register(CachedSchemaRegistryClient.java:366) [kafka-schema-registry-client-7.1.1.jar!/:?]
    at io.confluent.kafka.schemaregistry.client.CachedSchemaRegistryClient.register(CachedSchemaRegistryClient.java:337) [kafka-schema-registry-client-7.1.1.jar!/:?]
    at io.confluent.kafka.serializers.AbstractKafkaAvroSerializer.serializeImpl(AbstractKafkaAvroSerializer.java:115) [kafka-avro-serializer-7.1.1.jar!/:?]
    at io.confluent.kafka.serializers.KafkaAvroSerializer.serialize(KafkaAvroSerializer.java:61) [kafka-avro-serializer-7.1.1.jar!/:?]
    at org.apache.kafka.common.serialization.Serializer.serialize(Serializer.java:62) [kafka-clients-2.8.1.jar!/:?]
    at org.apache.kafka.clients.producer.KafkaProducer.doSend(KafkaProducer.java:925) [kafka-clients-2.8.1.jar!/:?]
    at org.apache.kafka.clients.producer.KafkaProducer.send(KafkaProducer.java:885) [kafka-clients-2.8.1.jar!/:?]
    at org.apache.kafka.clients.producer.KafkaProducer.send(KafkaProducer.java:773) [kafka-clients-2.8.1.jar!/:?]
Caused by: java.security.cert.CertificateException: No name matching local found
    at sun.security.util.HostnameChecker.matchDNS(HostnameChecker.java:234) ~[?:?]
    at sun.security.util.HostnameChecker.match(HostnameChecker.java:103) ~[?:?]
    at sun.security.ssl.X509TrustManagerImpl.checkIdentity(X509TrustManagerImpl.java:455) ~[?:?]
    at sun.security.ssl.X509TrustManagerImpl.checkIdentity(X509TrustManagerImpl.java:415) ~[?:?]
    at sun.security.ssl.X509TrustManagerImpl.checkTrusted(X509TrustManagerImpl.java:229) ~[?:?]
    at sun.security.ssl.X509TrustManagerImpl.checkServerTrusted(X509TrustManagerImpl.java:129) ~[?:?]
    at sun.security.ssl.CertificateMessage$T13CertificateConsumer.checkServerCerts(CertificateMessage.java:1341) ~[?:?]
    ... 33 more
Exception in thread "Thread-9" org.apache.kafka.common.errors.SerializationException: Error serializing Avro message
Caused by: javax.net.ssl.SSLHandshakeException: No name matching local found
    at java.base/sun.security.ssl.Alert.createSSLException(Alert.java:131)
    at java.base/sun.security.ssl.TransportContext.fatal(TransportContext.java:353)
    at java.base/sun.security.ssl.TransportContext.fatal(TransportContext.java:296)
    at java.base/sun.security.ssl.TransportContext.fatal(TransportContext.java:291)
    at java.base/sun.security.ssl.CertificateMessage$T13CertificateConsumer.checkServerCerts(CertificateMessage.java:1357)
    at java.base/sun.security.ssl.CertificateMessage$T13CertificateConsumer.onConsumeCertificate(CertificateMessage.java:1232)
    at java.base/sun.security.ssl.CertificateMessage$T13CertificateConsumer.consume(CertificateMessage.java:1175)
    at java.base/sun.security.ssl.SSLHandshake.consume(SSLHandshake.java:392)
    at java.base/sun.security.ssl.HandshakeContext.dispatch(HandshakeContext.java:443)
    at java.base/sun.security.ssl.HandshakeContext.dispatch(HandshakeContext.java:421)
    at java.base/sun.security.ssl.TransportContext.dispatch(TransportContext.java:183)
    at java.base/sun.security.ssl.SSLTransport.decode(SSLTransport.java:172)
    at java.base/sun.security.ssl.SSLSocketImpl.decode(SSLSocketImpl.java:1506)
    at java.base/sun.security.ssl.SSLSocketImpl.readHandshakeRecord(SSLSocketImpl.java:1416)
    at java.base/sun.security.ssl.SSLSocketImpl.startHandshake(SSLSocketImpl.java:456)
    at java.base/sun.security.ssl.SSLSocketImpl.startHandshake(SSLSocketImpl.java:427)
    at java.base/sun.net.www.protocol.https.HttpsClient.afterConnect(HttpsClient.java:572)
    at java.base/sun.net.www.protocol.https.AbstractDelegateHttpsURLConnection.connect(AbstractDelegateHttpsURLConnection.java:197)
    at java.base/sun.net.www.protocol.http.HttpURLConnection.getOutputStream0(HttpURLConnection.java:1367)
    at java.base/sun.net.www.protocol.http.HttpURLConnection.getOutputStream(HttpURLConnection.java:1342)
    at java.base/sun.net.www.protocol.https.HttpsURLConnectionImpl.getOutputStream(HttpsURLConnectionImpl.java:246)
    at io.confluent.kafka.schemaregistry.client.rest.RestService.sendHttpRequest(RestService.java:268)
    at io.confluent.kafka.schemaregistry.client.rest.RestService.httpRequest(RestService.java:367)
    at io.confluent.kafka.schemaregistry.client.rest.RestService.registerSchema(RestService.java:544)
    at io.confluent.kafka.schemaregistry.client.rest.RestService.registerSchema(RestService.java:532)
    at io.confluent.kafka.schemaregistry.client.rest.RestService.registerSchema(RestService.java:490)
    at io.confluent.kafka.schemaregistry.client.CachedSchemaRegistryClient.registerAndGetId(CachedSchemaRegistryClient.java:257)
    at io.confluent.kafka.schemaregistry.client.CachedSchemaRegistryClient.register(CachedSchemaRegistryClient.java:366)
    at io.confluent.kafka.schemaregistry.client.CachedSchemaRegistryClient.register(CachedSchemaRegistryClient.java:337)
    at io.confluent.kafka.serializers.AbstractKafkaAvroSerializer.serializeImpl(AbstractKafkaAvroSerializer.java:115)
    at io.confluent.kafka.serializers.KafkaAvroSerializer.serialize(KafkaAvroSerializer.java:61)
    at org.apache.kafka.common.serialization.Serializer.serialize(Serializer.java:62)
    at org.apache.kafka.clients.producer.KafkaProducer.doSend(KafkaProducer.java:925)
    at org.apache.kafka.clients.producer.KafkaProducer.send(KafkaProducer.java:885)
    at org.apache.kafka.clients.producer.KafkaProducer.send(KafkaProducer.java:773)
Caused by: java.security.cert.CertificateException: No name matching local found
    at java.base/sun.security.util.HostnameChecker.matchDNS(HostnameChecker.java:234)
    at java.base/sun.security.util.HostnameChecker.match(HostnameChecker.java:103)
    at java.base/sun.security.ssl.X509TrustManagerImpl.checkIdentity(X509TrustManagerImpl.java:455)
    at java.base/sun.security.ssl.X509TrustManagerImpl.checkIdentity(X509TrustManagerImpl.java:415)
    at java.base/sun.security.ssl.X509TrustManagerImpl.checkTrusted(X509TrustManagerImpl.java:229)
    at java.base/sun.security.ssl.X509TrustManagerImpl.checkServerTrusted(X509TrustManagerImpl.java:129)
    at java.base/sun.security.ssl.CertificateMessage$T13CertificateConsumer.checkServerCerts(CertificateMessage.java:1341)

========================

I am also getting the above error with the 7.1.1 kafka-avro-serializer jar. I use a self-signed certificate and want to ignore hostname verification when the Avro serializer connects to the Schema Registry. I am using the configuration below to connect the Avro producer to the Schema Registry:

main:INFO:i.c.k.s.KafkaAvroSerializerConfig(372): KafkaAvroSerializerConfig values:
    auto.register.schemas = true
    avro.reflection.allow.null = false
    avro.remove.java.properties = false
    avro.use.logical.type.converters = false
    basic.auth.credentials.source = URL
    basic.auth.user.info = [hidden]
    bearer.auth.credentials.source = STATIC_TOKEN
    bearer.auth.token = [hidden]
    context.name.strategy = class io.confluent.kafka.serializers.context.NullContextNameStrategy
    id.compatibility.strict = true
    key.subject.name.strategy = class io.confluent.kafka.serializers.subject.TopicNameStrategy
    latest.compatibility.strict = true
    max.schemas.per.subject = 1000
    normalize.schemas = false
    proxy.host =
    proxy.port = -1
    schema.reflection = false
    schema.registry.basic.auth.user.info = [hidden]
    schema.registry.ssl.cipher.suites = null
    schema.registry.ssl.enabled.protocols = [TLSv1.2]
    schema.registry.ssl.endpoint.identification.algorithm =
    schema.registry.ssl.engine.factory.class = null
    schema.registry.ssl.key.password = null
    schema.registry.ssl.keymanager.algorithm = SunX509
    schema.registry.ssl.keystore.certificate.chain = null
    schema.registry.ssl.keystore.key = null
    schema.registry.ssl.keystore.location = /cert-kafka/kafka01.keystore.jks
    schema.registry.ssl.keystore.password = [hidden]
    schema.registry.ssl.keystore.type = JKS
    schema.registry.ssl.protocol = TLSv1.3
    schema.registry.ssl.provider = null
    schema.registry.ssl.secure.random.implementation = null
    schema.registry.ssl.trustmanager.algorithm = PKIX
    schema.registry.ssl.truststore.certificates = null
    schema.registry.ssl.truststore.location = /cert-kafka/kafka.truststore.jks
    schema.registry.ssl.truststore.password = [hidden]
    schema.registry.ssl.truststore.type = JKS
    schema.registry.url = [https://local:8081]
    use.latest.version = false
    use.schema.id = -1
    value.subject.name.strategy = class io.confluent.kafka.serializers.subject.TopicNameStrategy

Some extra configuration I tried for the Schema Registry client of the Avro producer:

    //properties for schema registry
    props.put("schema.registry.ssl.truststore.location",sslTruststoreLocation);
    props.put("schema.registry.ssl.truststore.password",sslTruststorePassword);
    props.put("schema.registry.security.protocal",SSL);
    props.put("schema.registry.ssl.client.auth",none);
    props.put("schema.registry.ssl.endpoint.identification.algorithm","");
    props.put("schema.registry.ssl.truststore.type",JKS);
    props.put("schema.registry.ssl.enabled.protocols",TLSv1.2);

    //properties for key and value converter
    props.put("key.converter.schema.registry.ssl.truststore.location",sslTruststoreLocation);
    props.put("key.converter.schema.registry.ssl.truststore.password",sslTruststorePassword);
    props.put("key.converter.schema.registry.security.protocal",SSL);
    props.put("key.converter.schema.registry.ssl.client.auth",none);
    props.put("key.converter.schema.registry.ssl.endpoint.identification.algorithm","");
    props.put("key.converter.schema.registry.ssl.truststore.type",JKS);
    props.put("key.converter.schema.registry.ssl.enabled.protocols",TLSv1.2);

    props.put("value.converter.schema.registry.ssl.truststore.location",sslTruststoreLocation);
    props.put("value.converter.schema.registry.ssl.truststore.password",sslTruststorePassword);
    props.put("value.converter.schema.registry.security.protocal",SSL);
    props.put("value.converter.schema.registry.ssl.client.auth",none);
    props.put("value.converter.schema.registry.ssl.endpoint.identification.algorithm","");
    props.put("value.converter.schema.registry.ssl.truststore.type",JKS);
    props.put("value.converter.schema.registry.ssl.enabled.protocols",TLSv1.2);

If someone has already solved this issue, please let me know. I appreciate your help.

When can we expect a fix?

jainp32 commented 2 months ago

Hi, is there any solution to this problem? I am using Confluent version 7.5.0 and am still getting the PKIX error. The only solution that worked for me is setting the properties at the JVM level, which is not recommended.