datahub-project / datahub

The Metadata Platform for your Data Stack
https://datahubproject.io
Apache License 2.0

Unable to connect to secure elastic search in kubernetes. #8433

Status: Closed. LilMonk closed this issue 7 months ago.

LilMonk commented 1 year ago

Describe the bug

I'm trying to set up DataHub on Kubernetes. My environment contains Postgres, Kafka (Strimzi), and Elasticsearch (using the official Elasticsearch operator). I have enabled TLS/SSL on both Kafka and Elasticsearch. I can connect to Kafka, but not to Elasticsearch.
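For anyone debugging a similar setup: before digging into the JVM truststore settings, it can help to confirm whether the Elasticsearch endpoint's TLS handshake itself succeeds with the CA you expect. A minimal Python sketch (illustrative only, not part of DataHub) of the two client modes involved here, strict verification against a CA bundle versus the skip-everything mode that `ELASTICSEARCH_INSECURE=true` requests:

```python
import ssl
from typing import Optional

def es_ssl_context(ca_file: Optional[str] = None, insecure: bool = False) -> ssl.SSLContext:
    """Build a client-side TLS context: trust the given CA bundle,
    or (insecure mode) skip hostname and chain verification entirely."""
    ctx = ssl.create_default_context(cafile=ca_file)
    if insecure:
        # Rough equivalent of ELASTICSEARCH_INSECURE=true.
        # check_hostname must be disabled before verify_mode can be CERT_NONE.
        ctx.check_hostname = False
        ctx.verify_mode = ssl.CERT_NONE
    return ctx
```

Wrapping a socket to `elasticsearch-es-http.elasticsearch:9200` with the strict context (pointing `ca_file` at the operator-issued `ca.crt`) quickly shows whether the failure is in the handshake or somewhere later in the client stack.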

This is my YAML:

apiVersion: batch/v1
kind: Job
metadata:
  name: datahub-datahub-system-update-job
  labels:
    app.kubernetes.io/managed-by: "Helm"
    app.kubernetes.io/instance: "datahub"
    app.kubernetes.io/version: 0.10.2
    helm.sh/chart: "datahub-0.2.164"
  annotations:
    # This is what defines this resource as a hook. Without this line, the
    # job is considered part of the release.
    "helm.sh/hook": pre-install,pre-upgrade
    "helm.sh/hook-weight": "-4"
    "helm.sh/hook-delete-policy": before-hook-creation
spec:
  template:
    spec:
      volumes:
        - name: datahub-kafka-certs-dir
          secret:
            defaultMode: 0444
            secretName: kafka-user-certs
        - name: datahub-elasticsearch-certs-dir
          secret:
            defaultMode: 0444
            secretName: elasticsearch-user-certs
        - name: cacerts
          emptyDir: {}
        - name: tls
          secret:
            defaultMode: 256
            secretName: root-secret
      restartPolicy: Never
      securityContext:
        fsGroup: 1000
      initContainers:
        - name: init-cacerts
          image: "acryldata/datahub-upgrade:v0.10.4"
          command:
          - sh
          - -c
          - |
            cp -R /etc/ssl/certs/* /cacerts/
            cp /security/ca.crt /cacerts/ca.crt
          volumeMounts:
          - mountPath: /cacerts
            name: cacerts
          - mountPath: /security
            name: tls
      containers:
        - name: datahub-system-update-job
          image: "acryldata/datahub-upgrade:v0.10.4"
          imagePullPolicy: IfNotPresent
          args:
            - "-u"
            - "SystemUpdate"
          env:
            - name: DATAHUB_REVISION
              value: "1"
            - name: ENTITY_REGISTRY_CONFIG_PATH
              value: /datahub/datahub-gms/resources/entity-registry.yml
            - name: DATAHUB_GMS_HOST
              value: datahub-datahub-gms
            - name: DATAHUB_GMS_PORT
              value: "8080"
            - name: DATAHUB_MAE_CONSUMER_HOST
              value: datahub-datahub-mae-consumer
            - name: DATAHUB_MAE_CONSUMER_PORT
              value: "9091"
            - name: EBEAN_DATASOURCE_USERNAME
              value: "postgres"
            - name: EBEAN_DATASOURCE_PASSWORD
              valueFrom:
                secretKeyRef:
                  name: "postgres.postgres.credentials.postgresql.acid.zalan.do"
                  key: "password"
            - name: EBEAN_DATASOURCE_HOST
              value: "postgres.postgres"
            - name: EBEAN_DATASOURCE_URL
              value: "jdbc:postgresql://postgres.postgres:5432/datahub"
            - name: EBEAN_DATASOURCE_DRIVER
              value: "org.postgresql.Driver"
            - name: KAFKA_BOOTSTRAP_SERVER
              value: "kafka-kafka-ingresstls-bootstrap.kafka:9093"
            - name: KAFKA_SCHEMAREGISTRY_URL
              value: "http://schema-registry.kafka:8081"
            - name: ELASTICSEARCH_HOST
              value: "elasticsearch-es-http.elasticsearch"
            - name: ELASTICSEARCH_PORT
              value: "9200"
            - name: SKIP_ELASTICSEARCH_CHECK
              value: "false"
            - name: ELASTICSEARCH_INSECURE
              value: "true"
            - name: ELASTICSEARCH_USE_SSL
              value: "true"
            - name: ELASTICSEARCH_USERNAME
              value: elastic_user
            - name: ELASTICSEARCH_PASSWORD
              value: "elastic_pass"
            - name: ELASTICSEARCH_SSL_PROTOCOL
              value: "TLSv1.2"
            - name: ELASTICSEARCH_SSL_TRUSTSTORE_FILE
              value: "/mnt/datahub/certs/elasticsearch/truststore.jks"
            - name: ELASTICSEARCH_SSL_TRUSTSTORE_PASSWORD
              valueFrom:
                secretKeyRef:
                  name: elasticsearch-user-certs
                  key: truststore.password
            - name: ELASTICSEARCH_SSL_TRUSTSTORE_TYPE
              value: "JKS"
            - name: ELASTICSEARCH_SSL_KEYSTORE_FILE
              value: "/mnt/datahub/certs/elasticsearch/keystore.jks"
            - name: ELASTICSEARCH_SSL_KEYSTORE_PASSWORD
              valueFrom:
                secretKeyRef:
                  name: elasticsearch-user-certs
                  key: keystore.password
            - name: ELASTICSEARCH_SSL_KEYSTORE_TYPE
              value: "JKS"
            - name: ELASTICSEARCH_SSL_KEY_PASSWORD
              valueFrom:
                secretKeyRef:
                  name: elasticsearch-user-certs
                  key: keystore.password
            - name: GRAPH_SERVICE_IMPL
              value: elasticsearch
            - name: SPRING_KAFKA_PROPERTIES_KAFKASTORE_SECURITY_PROTOCOL
              value: "SSL"
            - name: SPRING_KAFKA_PROPERTIES_KAFKASTORE_SSL_TRUSTSTORE_LOCATION
              value: "/mnt/datahub/certs/kafka/truststore.jks"
            - name: SPRING_KAFKA_PROPERTIES_SECURITY_PROTOCOL
              value: "SSL"
            - name: SPRING_KAFKA_PROPERTIES_SSL_ENDPOINT_IDENTIFICATION_ALGORITHM
              value: ""
            - name: SPRING_KAFKA_PROPERTIES_SSL_KEYSTORE_LOCATION
              value: "/mnt/datahub/certs/kafka/keystore.jks"
            - name: SPRING_KAFKA_PROPERTIES_SSL_KEYSTORE_TYPE
              value: "JKS"
            - name: SPRING_KAFKA_PROPERTIES_SSL_PROTOCOL
              value: "TLS"
            - name: SPRING_KAFKA_PROPERTIES_SSL_TRUSTSTORE_LOCATION
              value: "/mnt/datahub/certs/kafka/truststore.jks"
            - name: SPRING_KAFKA_PROPERTIES_SSL_TRUSTSTORE_TYPE
              value: "JKS"
            - name: SPRING_KAFKA_PROPERTIES_SSL_KEYSTORE_PASSWORD
              valueFrom:
                secretKeyRef:
                  name: kafka-user-certs
                  key: keystore.password
            - name: SPRING_KAFKA_PROPERTIES_SSL_KEY_PASSWORD
              valueFrom:
                secretKeyRef:
                  name: kafka-user-certs
                  key: keystore.password
            - name: SPRING_KAFKA_PROPERTIES_SSL_TRUSTSTORE_PASSWORD
              valueFrom:
                secretKeyRef:
                  name: kafka-user-certs
                  key: truststore.password
            - name: METADATA_CHANGE_EVENT_NAME
              value: MetadataChangeEvent_v4
            - name: FAILED_METADATA_CHANGE_EVENT_NAME
              value: FailedMetadataChangeEvent_v4
            - name: METADATA_AUDIT_EVENT_NAME
              value: MetadataAuditEvent_v4
            - name: METADATA_CHANGE_PROPOSAL_TOPIC_NAME
              value: MetadataChangeProposal_v1
            - name: FAILED_METADATA_CHANGE_PROPOSAL_TOPIC_NAME
              value: FailedMetadataChangeProposal_v1
            - name: METADATA_CHANGE_LOG_VERSIONED_TOPIC_NAME
              value: MetadataChangeLog_Versioned_v1
            - name: METADATA_CHANGE_LOG_TIMESERIES_TOPIC_NAME
              value: MetadataChangeLog_Timeseries_v1
            - name: DATAHUB_UPGRADE_HISTORY_TOPIC_NAME
              value: DataHubUpgradeHistory_v1
            - name: DATAHUB_ANALYTICS_ENABLED
              value: "true"
            - name: SCHEMA_REGISTRY_TYPE
              value: "KAFKA"
            - name: ELASTICSEARCH_BUILD_INDICES_CLONE_INDICES
              value: "true"
            - name: ELASTICSEARCH_INDEX_BUILDER_MAPPINGS_REINDEX
              value: "true"
            - name: ELASTICSEARCH_INDEX_BUILDER_SETTINGS_REINDEX
              value: "true"
          securityContext:
            {}
          volumeMounts:
            - name: datahub-kafka-certs-dir
              mountPath: /mnt/datahub/certs/kafka
            - name: datahub-elasticsearch-certs-dir
              mountPath: /mnt/datahub/certs/elasticsearch
            - mountPath: /etc/ssl/certs
              name: cacerts
          resources:
            limits:
              cpu: 500m
              memory: 512Mi
            requests:
              cpu: 300m
              memory: 256Mi

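The `init-cacerts` container above merges the image's system CA bundle with the operator-issued `ca.crt` so that mounting `cacerts` over `/etc/ssl/certs` doesn't drop the public CAs. The two `cp` commands can be sketched (and tested locally) as a small script; this mirrors the init container's logic and is not DataHub code:

```python
import shutil
from pathlib import Path

def merge_ca_bundle(system_certs: Path, custom_ca: Path, out_dir: Path) -> None:
    """Copy the system CA directory, then add the custom ca.crt,
    so both public and private CAs end up trusted."""
    out_dir.mkdir(parents=True, exist_ok=True)
    for entry in system_certs.iterdir():          # cp -R /etc/ssl/certs/* /cacerts/
        if entry.is_dir():
            shutil.copytree(entry, out_dir / entry.name, dirs_exist_ok=True)
        else:
            shutil.copy2(entry, out_dir / entry.name)
    shutil.copy2(custom_ca, out_dir / "ca.crt")   # cp /security/ca.crt /cacerts/ca.crt
```

Note that this populates the OS-level bundle only; a JVM client reading `ELASTICSEARCH_SSL_TRUSTSTORE_FILE` still consults the JKS truststore, not `/etc/ssl/certs`, so both paths need the CA.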
The error log:

ERROR StatusLogger Log4j2 could not find a logging implementation. Please add log4j-core to the classpath. Using SimpleLogger to log to the console...

  .   ____          _            __ _ _
 /\\ / ___'_ __ _ _(_)_ __  __ _ \ \ \ \
( ( )\___ | '_ | '_| | '_ \/ _` | \ \ \ \
 \\/  ___)| |_)| | | | | || (_| |  ) ) ) )
  '  |____| .__|_| |_|_| |_\__, | / / / /
 =========|_|==============|___/=/_/_/_/
 :: Spring Boot ::               (v2.7.11)

2023-07-17 13:16:07,710 [main] INFO  io.ebean.EbeanVersion:31 - ebean version: 11.33.3
2023-07-17 13:16:07,811 [main] INFO  i.e.config.properties.LoadContext:83 - loaded properties from [application.yml]
2023-07-17 13:16:08,023 [main] INFO  i.e.datasource.pool.ConnectionPool:294 - DataSourcePool [gmsEbeanServiceConfig] autoCommit[false] transIsolation[READ_COMMITTED] min[2] max[50]
2023-07-17 13:16:10,202 [main] INFO  io.ebean.internal.DefaultContainer:208 - DatabasePlatform name:gmsEbeanServiceConfig platform:postgres
2023-07-17 13:16:11,624 [main] INFO  c.l.g.f.k.s.KafkaSchemaRegistryFactory:61 - creating schema registry config using url: http://schema-registry.kafka:8081
2023-07-17 13:16:11,915 [main] INFO  o.a.k.c.producer.ProducerConfig:347 - ProducerConfig values: 
    acks = 1
    batch.size = 16384
    bootstrap.servers = [kafka-kafka-ingresstls-bootstrap.kafka:9093]
    buffer.memory = 33554432
    client.dns.lookup = default
    client.id = producer-1
    compression.type = none
    connections.max.idle.ms = 540000
    delivery.timeout.ms = 30000
    enable.idempotence = false
    interceptor.classes = []
    key.serializer = class org.apache.kafka.common.serialization.StringSerializer
    linger.ms = 0
    max.block.ms = 60000
    max.in.flight.requests.per.connection = 5
    max.request.size = 1048576
    metadata.max.age.ms = 300000
    metadata.max.idle.ms = 300000
    metric.reporters = []
    metrics.num.samples = 2
    metrics.recording.level = INFO
    metrics.sample.window.ms = 30000
    partitioner.class = class org.apache.kafka.clients.producer.internals.DefaultPartitioner
    receive.buffer.bytes = 32768
    reconnect.backoff.max.ms = 1000
    reconnect.backoff.ms = 50
    request.timeout.ms = 3000
    retries = 3
    retry.backoff.ms = 500
    sasl.client.callback.handler.class = null
    sasl.jaas.config = null
    sasl.kerberos.kinit.cmd = /usr/bin/kinit
    sasl.kerberos.min.time.before.relogin = 60000
    sasl.kerberos.service.name = null
    sasl.kerberos.ticket.renew.jitter = 0.05
    sasl.kerberos.ticket.renew.window.factor = 0.8
    sasl.login.callback.handler.class = null
    sasl.login.class = null
    sasl.login.refresh.buffer.seconds = 300
    sasl.login.refresh.min.period.seconds = 60
    sasl.login.refresh.window.factor = 0.8
    sasl.login.refresh.window.jitter = 0.05
    sasl.mechanism = GSSAPI
    security.protocol = SSL
    security.providers = null
    send.buffer.bytes = 131072
    ssl.cipher.suites = null
    ssl.enabled.protocols = [TLSv1.2, TLSv1.1, TLSv1]
    ssl.endpoint.identification.algorithm = 
    ssl.key.password = [hidden]
    ssl.keymanager.algorithm = SunX509
    ssl.keystore.location = /mnt/datahub/certs/kafka/keystore.jks
    ssl.keystore.password = [hidden]
    ssl.keystore.type = JKS
    ssl.protocol = TLS
    ssl.provider = null
    ssl.secure.random.implementation = null
    ssl.trustmanager.algorithm = PKIX
    ssl.truststore.location = /mnt/datahub/certs/kafka/truststore.jks
    ssl.truststore.password = [hidden]
    ssl.truststore.type = JKS
    transaction.timeout.ms = 60000
    transactional.id = null
    value.serializer = class io.confluent.kafka.serializers.KafkaAvroSerializer

[the same ProducerConfig block is logged a second time; duplicate omitted]

2023-07-17 13:16:12,006 [main] INFO  i.c.k.s.KafkaAvroSerializerConfig:179 - KafkaAvroSerializerConfig values: 
    bearer.auth.token = [hidden]
    proxy.port = -1
    schema.reflection = false
    auto.register.schemas = true
    max.schemas.per.subject = 1000
    basic.auth.credentials.source = URL
    value.subject.name.strategy = class io.confluent.kafka.serializers.subject.TopicNameStrategy
    schema.registry.url = [http://schema-registry.kafka:8081]
    basic.auth.user.info = [hidden]
    proxy.host = 
    use.latest.version = false
    schema.registry.basic.auth.user.info = [hidden]
    bearer.auth.credentials.source = STATIC_TOKEN
    key.subject.name.strategy = class io.confluent.kafka.serializers.subject.TopicNameStrategy

2023-07-17 13:16:12,819 [main] WARN  o.a.k.c.producer.ProducerConfig:355 - The configuration 'schema.registry.security.protocol' was supplied but isn't a known config.
2023-07-17 13:16:12,819 [main] WARN  o.a.k.c.producer.ProducerConfig:355 - The configuration 'schema.registry.security.protocol' was supplied but isn't a known config.
2023-07-17 13:16:12,819 [main] WARN  o.a.k.c.producer.ProducerConfig:355 - The configuration 'kafkastore.ssl.truststore.location' was supplied but isn't a known config.
2023-07-17 13:16:12,819 [main] WARN  o.a.k.c.producer.ProducerConfig:355 - The configuration 'kafkastore.ssl.truststore.location' was supplied but isn't a known config.
2023-07-17 13:16:12,820 [main] WARN  o.a.k.c.producer.ProducerConfig:355 - The configuration 'kafkastore.security.protocol' was supplied but isn't a known config.
2023-07-17 13:16:12,820 [main] WARN  o.a.k.c.producer.ProducerConfig:355 - The configuration 'kafkastore.security.protocol' was supplied but isn't a known config.
2023-07-17 13:16:12,901 [main] INFO  o.a.kafka.common.utils.AppInfoParser:117 - Kafka version: 5.5.1-ccs
2023-07-17 13:16:12,902 [main] INFO  o.a.kafka.common.utils.AppInfoParser:118 - Kafka commitId: cb1873c1fdf5f5f9
2023-07-17 13:16:12,903 [main] INFO  o.a.kafka.common.utils.AppInfoParser:119 - Kafka startTimeMs: 1689599772821
2023-07-17 13:16:13,118 [main] INFO  c.l.m.m.r.ConfigEntityRegistry:88 - Loading bare config entity registry file at /datahub/datahub-gms/resources/entity-registry.yml
2023-07-17 13:16:15,607 [kafka-producer-network-thread | producer-1] INFO  org.apache.kafka.clients.Metadata:277 - [Producer clientId=producer-1] Cluster ID: K3DZrqfxRhOmUjTYbE1xlg
2023-07-17 13:16:15,607 [kafka-producer-network-thread | producer-1] INFO  org.apache.kafka.clients.Metadata:277 - [Producer clientId=producer-1] Cluster ID: K3DZrqfxRhOmUjTYbE1xlg
2023-07-17 13:16:24,920 [main] WARN  c.l.m.m.r.PluginEntityRegistryLoader:44 - /etc/datahub/plugins/models directory does not exist or is not a directory. Plugin scanning will be disabled.
2023-07-17 13:16:25,612 [main] INFO  c.l.m.m.r.MergedEntityRegistry:99 - dataHubPolicyKey schema is compatible with previous schema due to 
2023-07-17 13:16:25,614 [main] INFO  c.l.m.m.r.MergedEntityRegistry:99 - dataHubPolicyInfo schema is compatible with previous schema due to 
[over a hundred further "<aspect> schema is compatible with previous schema" INFO lines omitted; the pasted log ends before any Elasticsearch connection error appears]
2023-07-17 13:16:25,717 [main] INFO  c.l.m.m.r.MergedEntityRegistry:99 - dataPlatformInstance schema is compatible with previous schema due to 
2023-07-17 13:16:25,717 [main] INFO  c.l.m.m.r.MergedEntityRegistry:99 - mlFeatureProperties schema is compatible with previous schema due to 
2023-07-17 13:16:25,717 [main] INFO  c.l.m.m.r.MergedEntityRegistry:99 - institutionalMemory schema is compatible with previous schema due to 
2023-07-17 13:16:25,717 [main] INFO  c.l.m.m.r.MergedEntityRegistry:99 - mlFeatureKey schema is compatible with previous schema due to 
2023-07-17 13:16:25,717 [main] INFO  c.l.m.m.r.MergedEntityRegistry:99 - deprecation schema is compatible with previous schema due to 
2023-07-17 13:16:25,718 [main] INFO  c.l.m.m.r.MergedEntityRegistry:99 - globalTags schema is compatible with previous schema due to 
2023-07-17 13:16:25,718 [main] INFO  c.l.m.m.r.MergedEntityRegistry:99 - status schema is compatible with previous schema due to 
2023-07-17 13:16:25,718 [main] INFO  c.l.m.m.r.MergedEntityRegistry:99 - editableSchemaMetadata schema is compatible with previous schema due to 
2023-07-17 13:16:25,718 [main] INFO  c.l.m.m.r.MergedEntityRegistry:99 - datasetKey schema is compatible with previous schema due to 
2023-07-17 13:16:25,718 [main] INFO  c.l.m.m.r.MergedEntityRegistry:99 - datasetUpstreamLineage schema is compatible with previous schema due to 
2023-07-17 13:16:25,718 [main] INFO  c.l.m.m.r.MergedEntityRegistry:99 - viewProperties schema is compatible with previous schema due to 
2023-07-17 13:16:25,719 [main] INFO  c.l.m.m.r.MergedEntityRegistry:99 - datasetProperties schema is compatible with previous schema due to 
2023-07-17 13:16:25,719 [main] INFO  c.l.m.m.r.MergedEntityRegistry:99 - globalTags schema is compatible with previous schema due to 
2023-07-17 13:16:25,719 [main] INFO  c.l.m.m.r.MergedEntityRegistry:99 - browsePaths schema is compatible with previous schema due to 
2023-07-17 13:16:25,719 [main] INFO  c.l.m.m.r.MergedEntityRegistry:99 - ownership schema is compatible with previous schema due to 
2023-07-17 13:16:25,719 [main] INFO  c.l.m.m.r.MergedEntityRegistry:99 - dataPlatformInstance schema is compatible with previous schema due to 
2023-07-17 13:16:25,719 [main] INFO  c.l.m.m.r.MergedEntityRegistry:99 - datasetDeprecation schema is compatible with previous schema due to 
2023-07-17 13:16:25,719 [main] INFO  c.l.m.m.r.MergedEntityRegistry:99 - editableDatasetProperties schema is compatible with previous schema due to 
2023-07-17 13:16:25,719 [main] INFO  c.l.m.m.r.MergedEntityRegistry:99 - glossaryTerms schema is compatible with previous schema due to 
2023-07-17 13:16:25,719 [main] INFO  c.l.m.m.r.MergedEntityRegistry:99 - institutionalMemory schema is compatible with previous schema due to 
2023-07-17 13:16:25,719 [main] INFO  c.l.m.m.r.MergedEntityRegistry:99 - upstreamLineage schema is compatible with previous schema due to 
2023-07-17 13:16:25,720 [main] INFO  c.l.m.m.r.MergedEntityRegistry:99 - schemaMetadata schema is compatible with previous schema due to 
2023-07-17 13:16:25,720 [main] INFO  c.l.m.m.r.MergedEntityRegistry:99 - status schema is compatible with previous schema due to 
2023-07-17 13:16:25,720 [main] INFO  c.l.m.m.r.MergedEntityRegistry:99 - chartQuery schema is compatible with previous schema due to 
2023-07-17 13:16:25,720 [main] INFO  c.l.m.m.r.MergedEntityRegistry:99 - browsePaths schema is compatible with previous schema due to 
2023-07-17 13:16:25,721 [main] INFO  c.l.m.m.r.MergedEntityRegistry:99 - ownership schema is compatible with previous schema due to 
2023-07-17 13:16:25,721 [main] INFO  c.l.m.m.r.MergedEntityRegistry:99 - dataPlatformInstance schema is compatible with previous schema due to 
2023-07-17 13:16:25,721 [main] INFO  c.l.m.m.r.MergedEntityRegistry:99 - chartInfo schema is compatible with previous schema due to 
2023-07-17 13:16:25,721 [main] INFO  c.l.m.m.r.MergedEntityRegistry:99 - glossaryTerms schema is compatible with previous schema due to 
2023-07-17 13:16:25,721 [main] INFO  c.l.m.m.r.MergedEntityRegistry:99 - editableChartProperties schema is compatible with previous schema due to 
2023-07-17 13:16:25,721 [main] INFO  c.l.m.m.r.MergedEntityRegistry:99 - institutionalMemory schema is compatible with previous schema due to 
2023-07-17 13:16:25,721 [main] INFO  c.l.m.m.r.MergedEntityRegistry:99 - globalTags schema is compatible with previous schema due to 
2023-07-17 13:16:25,721 [main] INFO  c.l.m.m.r.MergedEntityRegistry:99 - chartKey schema is compatible with previous schema due to 
2023-07-17 13:16:25,721 [main] INFO  c.l.m.m.r.MergedEntityRegistry:99 - status schema is compatible with previous schema due to 
2023-07-17 13:16:26,704 [main] WARN  c.l.r.t.h.client.HttpClientFactory:917 - No scheduled executor is provided to HttpClientFactory, using it's own scheduled executor.
2023-07-17 13:16:26,705 [main] WARN  c.l.r.t.h.client.HttpClientFactory:926 - No callback executor is provided to HttpClientFactory, using it's own call back executor.
2023-07-17 13:16:26,705 [main] WARN  c.l.r.t.h.client.HttpClientFactory:934 - No Compression executor is provided to HttpClientFactory, using it's own compression executor.
2023-07-17 13:16:26,715 [main] INFO  c.l.r.t.h.client.HttpClientFactory:1394 - The service 'null' has been assigned to the ChannelPoolManager with key 'noSpecifiedNamePrefix 1138266797 ', http.protocolVersion=HTTP_1_1, usePipelineV2=false, requestTimeout=10000ms, streamingTimeout=-1ms
2023-07-17 13:16:32,004 [main] INFO  c.l.g.f.s.ElasticSearchServiceFactory:56 - Search configuration: SearchConfiguration(maxTermBucketSize=20, exactMatch=ExactMatchConfiguration(exclusive=false, withPrefix=true, prefixFactor=1.6, exactFactor=10.0, caseSensitivityFactor=0.7, enableStructured=true), partial=PartialConfiguration(urnFactor=0.5, factor=0.4), custom=CustomConfiguration(enabled=false, file=search_config.yml), graph=GraphQueryConfiguration(timeoutSeconds=50, batchSize=1000, maxResult=10000))
2023-07-17 13:16:32,111 [main] INFO  c.l.m.c.search.CustomConfiguration:40 - Custom search configuration disabled.
2023-07-17 13:16:32,508 [main] INFO  c.l.g.f.k.s.DUHESchemaRegistryFactory:29 - DataHub System Update Registry
2023-07-17 13:16:32,511 [main] INFO  o.a.k.c.producer.ProducerConfig:347 - ProducerConfig values: 
    acks = 1
    batch.size = 16384
    bootstrap.servers = [kafka-kafka-ingresstls-bootstrap.kafka:9093]
    buffer.memory = 33554432
    client.dns.lookup = default
    client.id = producer-2
    compression.type = none
    connections.max.idle.ms = 540000
    delivery.timeout.ms = 30000
    enable.idempotence = false
    interceptor.classes = []
    key.serializer = class org.apache.kafka.common.serialization.StringSerializer
    linger.ms = 0
    max.block.ms = 60000
    max.in.flight.requests.per.connection = 5
    max.request.size = 1048576
    metadata.max.age.ms = 300000
    metadata.max.idle.ms = 300000
    metric.reporters = []
    metrics.num.samples = 2
    metrics.recording.level = INFO
    metrics.sample.window.ms = 30000
    partitioner.class = class org.apache.kafka.clients.producer.internals.DefaultPartitioner
    receive.buffer.bytes = 32768
    reconnect.backoff.max.ms = 1000
    reconnect.backoff.ms = 50
    request.timeout.ms = 3000
    retries = 3
    retry.backoff.ms = 500
    sasl.client.callback.handler.class = null
    sasl.jaas.config = null
    sasl.kerberos.kinit.cmd = /usr/bin/kinit
    sasl.kerberos.min.time.before.relogin = 60000
    sasl.kerberos.service.name = null
    sasl.kerberos.ticket.renew.jitter = 0.05
    sasl.kerberos.ticket.renew.window.factor = 0.8
    sasl.login.callback.handler.class = null
    sasl.login.class = null
    sasl.login.refresh.buffer.seconds = 300
    sasl.login.refresh.min.period.seconds = 60
    sasl.login.refresh.window.factor = 0.8
    sasl.login.refresh.window.jitter = 0.05
    sasl.mechanism = GSSAPI
    security.protocol = SSL
    security.providers = null
    send.buffer.bytes = 131072
    ssl.cipher.suites = null
    ssl.enabled.protocols = [TLSv1.2, TLSv1.1, TLSv1]
    ssl.endpoint.identification.algorithm = 
    ssl.key.password = [hidden]
    ssl.keymanager.algorithm = SunX509
    ssl.keystore.location = /mnt/datahub/certs/kafka/keystore.jks
    ssl.keystore.password = [hidden]
    ssl.keystore.type = JKS
    ssl.protocol = TLS
    ssl.provider = null
    ssl.secure.random.implementation = null
    ssl.trustmanager.algorithm = PKIX
    ssl.truststore.location = /mnt/datahub/certs/kafka/truststore.jks
    ssl.truststore.password = [hidden]
    ssl.truststore.type = JKS
    transaction.timeout.ms = 60000
    transactional.id = null
    value.serializer = class com.linkedin.metadata.boot.kafka.MockDUHESerializer

2023-07-17 13:16:33,203 [main] INFO  i.c.k.s.KafkaAvroSerializerConfig:179 - KafkaAvroSerializerConfig values: 
    bearer.auth.token = [hidden]
    proxy.port = -1
    schema.reflection = false
    auto.register.schemas = true
    max.schemas.per.subject = 1000
    basic.auth.credentials.source = URL
    value.subject.name.strategy = class io.confluent.kafka.serializers.subject.TopicNameStrategy
    schema.registry.url = [http://schema-registry.kafka:8081]
    basic.auth.user.info = [hidden]
    proxy.host = 
    use.latest.version = false
    schema.registry.basic.auth.user.info = [hidden]
    bearer.auth.credentials.source = STATIC_TOKEN
    key.subject.name.strategy = class io.confluent.kafka.serializers.subject.TopicNameStrategy

2023-07-17 13:16:33,217 [main] WARN  o.a.k.c.producer.ProducerConfig:355 - The configuration 'kafkastore.ssl.truststore.location' was supplied but isn't a known config.
2023-07-17 13:16:33,217 [main] WARN  o.a.k.c.producer.ProducerConfig:355 - The configuration 'kafkastore.security.protocol' was supplied but isn't a known config.
2023-07-17 13:16:33,217 [main] INFO  o.a.kafka.common.utils.AppInfoParser:117 - Kafka version: 5.5.1-ccs
2023-07-17 13:16:33,217 [main] INFO  o.a.kafka.common.utils.AppInfoParser:118 - Kafka commitId: cb1873c1fdf5f5f9
2023-07-17 13:16:33,217 [main] INFO  o.a.kafka.common.utils.AppInfoParser:119 - Kafka startTimeMs: 1689599793217
2023-07-17 13:16:33,404 [kafka-producer-network-thread | producer-2] INFO  org.apache.kafka.clients.Metadata:277 - [Producer clientId=producer-2] Cluster ID: K3DZrqfxRhOmUjTYbE1xlg
2023-07-17 13:16:33,520 [main] WARN  c.l.r.t.h.client.HttpClientFactory:917 - No scheduled executor is provided to HttpClientFactory, using it's own scheduled executor.
2023-07-17 13:16:33,520 [main] WARN  c.l.r.t.h.client.HttpClientFactory:926 - No callback executor is provided to HttpClientFactory, using it's own call back executor.
2023-07-17 13:16:33,520 [main] WARN  c.l.r.t.h.client.HttpClientFactory:934 - No Compression executor is provided to HttpClientFactory, using it's own compression executor.
2023-07-17 13:16:33,521 [main] INFO  c.l.r.t.h.client.HttpClientFactory:1394 - The service 'null' has been assigned to the ChannelPoolManager with key 'noSpecifiedNamePrefix 1138266797 ', http.protocolVersion=HTTP_1_1, usePipelineV2=false, requestTimeout=10000ms, streamingTimeout=-1ms
2023-07-17 13:16:37,911 [main] WARN  c.d.p.configuration.ConfigProvider:39 - Configuration config.yml file not found at location /etc/datahub/plugins/auth
2023-07-17 13:16:37,912 [main] INFO  c.l.g.f.auth.AuthorizerChainFactory:75 - Default DataHubAuthorizer is enabled. Appending it to the authorization chain.
2023-07-17 13:16:38,006 [main] INFO  c.l.g.f.k.KafkaEventConsumerFactory:100 - Event-based KafkaListenerContainerFactory built successfully. Consumer concurrency = 1
2023-07-17 13:16:38,013 [main] INFO  c.l.g.f.k.KafkaEventConsumerFactory:116 - Event-based DUHE KafkaListenerContainerFactory built successfully. Consumer concurrency = 1
2023-07-17 13:16:38,016 [main] INFO  c.l.g.f.k.SimpleKafkaConsumerFactory:48 - Simple KafkaListenerContainerFactory built successfully
2023-07-17 13:16:40,521 [main] INFO  c.l.d.u.impl.DefaultUpgradeReport:16 - Starting upgrade with id SystemUpdate...
2023-07-17 13:16:40,523 [main] INFO  c.l.d.u.impl.DefaultUpgradeReport:16 - Executing Step 1/5: BuildIndicesPreStep...
2023-07-17 13:16:41,111 [main] ERROR c.l.d.u.s.e.s.BuildIndicesPreStep:81 - BuildIndicesPreStep failed.
javax.net.ssl.SSLHandshakeException: PKIX path building failed: sun.security.provider.certpath.SunCertPathBuilderException: unable to find valid certification path to requested target
    at org.elasticsearch.client.RestClient.extractAndWrapCause(RestClient.java:874)
    at org.elasticsearch.client.RestClient.performRequest(RestClient.java:283)
    at org.elasticsearch.client.RestClient.performRequest(RestClient.java:270)
    at org.elasticsearch.client.RestHighLevelClient.internalPerformRequest(RestHighLevelClient.java:1632)
    at org.elasticsearch.client.RestHighLevelClient.performRequest(RestHighLevelClient.java:1617)
    at org.elasticsearch.client.IndicesClient.exists(IndicesClient.java:974)
    at com.linkedin.metadata.search.elasticsearch.indexbuilder.ESIndexBuilder.buildReindexState(ESIndexBuilder.java:141)
    at com.linkedin.metadata.graph.elastic.ElasticSearchGraphService.getReindexConfigs(ElasticSearchGraphService.java:331)
    at com.linkedin.datahub.upgrade.system.elasticsearch.util.IndexUtils.getAllReindexConfigs(IndexUtils.java:34)
    at com.linkedin.datahub.upgrade.system.elasticsearch.steps.BuildIndicesPreStep.lambda$executable$0(BuildIndicesPreStep.java:53)
    at com.linkedin.datahub.upgrade.impl.DefaultUpgradeManager.executeStepInternal(DefaultUpgradeManager.java:110)
    at com.linkedin.datahub.upgrade.impl.DefaultUpgradeManager.executeInternal(DefaultUpgradeManager.java:68)
    at com.linkedin.datahub.upgrade.impl.DefaultUpgradeManager.executeInternal(DefaultUpgradeManager.java:42)
    at com.linkedin.datahub.upgrade.impl.DefaultUpgradeManager.execute(DefaultUpgradeManager.java:33)
    at com.linkedin.datahub.upgrade.UpgradeCli.run(UpgradeCli.java:80)
    at org.springframework.boot.SpringApplication.callRunner(SpringApplication.java:768)
    at org.springframework.boot.SpringApplication.callRunners(SpringApplication.java:752)
    at org.springframework.boot.SpringApplication.run(SpringApplication.java:314)
    at org.springframework.boot.builder.SpringApplicationBuilder.run(SpringApplicationBuilder.java:164)
    at com.linkedin.datahub.upgrade.UpgradeCliApplication.main(UpgradeCliApplication.java:23)
    at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.base/java.lang.reflect.Method.invoke(Method.java:566)
    at org.springframework.boot.loader.MainMethodRunner.run(MainMethodRunner.java:49)
    at org.springframework.boot.loader.Launcher.launch(Launcher.java:108)
    at org.springframework.boot.loader.Launcher.launch(Launcher.java:58)
    at org.springframework.boot.loader.JarLauncher.main(JarLauncher.java:65)
Caused by: javax.net.ssl.SSLHandshakeException: PKIX path building failed: sun.security.provider.certpath.SunCertPathBuilderException: unable to find valid certification path to requested target
    at java.base/sun.security.ssl.Alert.createSSLException(Alert.java:131)
    at java.base/sun.security.ssl.TransportContext.fatal(TransportContext.java:360)
    at java.base/sun.security.ssl.TransportContext.fatal(TransportContext.java:303)
    at java.base/sun.security.ssl.TransportContext.fatal(TransportContext.java:298)
    at java.base/sun.security.ssl.CertificateMessage$T13CertificateConsumer.checkServerCerts(CertificateMessage.java:1357)
    at java.base/sun.security.ssl.CertificateMessage$T13CertificateConsumer.onConsumeCertificate(CertificateMessage.java:1232)
    at java.base/sun.security.ssl.CertificateMessage$T13CertificateConsumer.consume(CertificateMessage.java:1175)
    at java.base/sun.security.ssl.SSLHandshake.consume(SSLHandshake.java:392)
    at java.base/sun.security.ssl.HandshakeContext.dispatch(HandshakeContext.java:443)
    at java.base/sun.security.ssl.SSLEngineImpl$DelegatedTask$DelegatedAction.run(SSLEngineImpl.java:1076)
    at java.base/sun.security.ssl.SSLEngineImpl$DelegatedTask$DelegatedAction.run(SSLEngineImpl.java:1063)
    at java.base/java.security.AccessController.doPrivileged(Native Method)
    at java.base/sun.security.ssl.SSLEngineImpl$DelegatedTask.run(SSLEngineImpl.java:1010)
    at org.apache.http.nio.reactor.ssl.SSLIOSession.doRunTask(SSLIOSession.java:285)
    at org.apache.http.nio.reactor.ssl.SSLIOSession.doHandshake(SSLIOSession.java:345)
    at org.apache.http.nio.reactor.ssl.SSLIOSession.isAppInputReady(SSLIOSession.java:523)
    at org.apache.http.impl.nio.reactor.AbstractIODispatch.inputReady(AbstractIODispatch.java:120)
    at org.apache.http.impl.nio.reactor.BaseIOReactor.readable(BaseIOReactor.java:162)
    at org.apache.http.impl.nio.reactor.AbstractIOReactor.processEvent(AbstractIOReactor.java:337)
    at org.apache.http.impl.nio.reactor.AbstractIOReactor.processEvents(AbstractIOReactor.java:315)
    at org.apache.http.impl.nio.reactor.AbstractIOReactor.execute(AbstractIOReactor.java:276)
    at org.apache.http.impl.nio.reactor.BaseIOReactor.execute(BaseIOReactor.java:104)
    at org.apache.http.impl.nio.reactor.AbstractMultiworkerIOReactor$Worker.run(AbstractMultiworkerIOReactor.java:591)
    at java.base/java.lang.Thread.run(Thread.java:829)
Caused by: sun.security.validator.ValidatorException: PKIX path building failed: sun.security.provider.certpath.SunCertPathBuilderException: unable to find valid certification path to requested target
    at java.base/sun.security.validator.PKIXValidator.doBuild(PKIXValidator.java:439)
    at java.base/sun.security.validator.PKIXValidator.engineValidate(PKIXValidator.java:306)
    at java.base/sun.security.validator.Validator.validate(Validator.java:264)
    at java.base/sun.security.ssl.X509TrustManagerImpl.validate(X509TrustManagerImpl.java:313)
    at java.base/sun.security.ssl.X509TrustManagerImpl.checkTrusted(X509TrustManagerImpl.java:276)
    at java.base/sun.security.ssl.X509TrustManagerImpl.checkServerTrusted(X509TrustManagerImpl.java:141)
    at java.base/sun.security.ssl.CertificateMessage$T13CertificateConsumer.checkServerCerts(CertificateMessage.java:1335)
    ... 19 common frames omitted
Caused by: sun.security.provider.certpath.SunCertPathBuilderException: unable to find valid certification path to requested target
    at java.base/sun.security.provider.certpath.SunCertPathBuilder.build(SunCertPathBuilder.java:146)
    at java.base/sun.security.provider.certpath.SunCertPathBuilder.engineBuild(SunCertPathBuilder.java:127)
    at java.base/java.security.cert.CertPathBuilder.build(CertPathBuilder.java:297)
    at java.base/sun.security.validator.PKIXValidator.doBuild(PKIXValidator.java:434)
    ... 25 common frames omitted
2023-07-17 13:16:41,113 [main] INFO  c.l.d.u.impl.DefaultUpgradeReport:16 - Failed Step 1/5: BuildIndicesPreStep. Failed after 0 retries.
2023-07-17 13:16:41,113 [main] INFO  c.l.d.u.impl.DefaultUpgradeReport:16 - Exiting upgrade SystemUpdate with failure.
2023-07-17 13:16:41,114 [main] INFO  c.l.d.u.impl.DefaultUpgradeReport:16 - Upgrade SystemUpdate completed with result FAILED. Exiting...
2023-07-17 13:16:41,205 [SpringApplicationShutdownHook] INFO  c.l.r.t.h.c.c.AbstractNettyClient:249 - Shutdown requested
2023-07-17 13:16:41,205 [SpringApplicationShutdownHook] INFO  c.l.r.t.h.c.c.AbstractNettyClient:252 - Shutting down
2023-07-17 13:16:41,210 [SpringApplicationShutdownHook] INFO  c.l.r.t.h.c.c.ChannelPoolManagerImpl:152 - Shutting down 0 connection pools
2023-07-17 13:16:41,211 [SpringApplicationShutdownHook] INFO  c.l.r.t.h.c.c.ChannelPoolManagerImpl:162 - All connection pools shutdown
2023-07-17 13:16:41,212 [SpringApplicationShutdownHook] INFO  c.l.r.t.h.c.c.ChannelPoolManagerImpl:119 - All connection pools shut down, closing all channels
2023-07-17 13:16:41,310 [SpringApplicationShutdownHook] INFO  c.l.r.t.h.c.c.ChannelPoolManagerImpl:152 - Shutting down 0 connection pools
2023-07-17 13:16:41,310 [R2 Nio Event Loop-3-1] INFO  c.l.r.t.h.c.c.ChannelPoolManagerImpl:103 - Shutdown complete
2023-07-17 13:16:41,310 [SpringApplicationShutdownHook] INFO  c.l.r.t.h.c.c.ChannelPoolManagerImpl:162 - All connection pools shutdown
2023-07-17 13:16:41,310 [SpringApplicationShutdownHook] INFO  c.l.r.t.h.c.c.ChannelPoolManagerImpl:119 - All connection pools shut down, closing all channels
2023-07-17 13:16:41,310 [R2 Nio Event Loop-3-2] INFO  c.l.r.t.h.c.c.ChannelPoolManagerImpl:103 - Shutdown complete
2023-07-17 13:16:41,311 [SpringApplicationShutdownHook] INFO  c.l.r.t.h.c.c.AbstractNettyClient:249 - Shutdown requested
2023-07-17 13:16:41,311 [SpringApplicationShutdownHook] INFO  c.l.r.t.h.c.c.AbstractNettyClient:252 - Shutting down
2023-07-17 13:16:41,311 [SpringApplicationShutdownHook] INFO  c.l.r.t.h.c.c.ChannelPoolManagerImpl:152 - Shutting down 0 connection pools
2023-07-17 13:16:41,311 [SpringApplicationShutdownHook] INFO  c.l.r.t.h.c.c.ChannelPoolManagerImpl:162 - All connection pools shutdown
2023-07-17 13:16:41,311 [SpringApplicationShutdownHook] INFO  c.l.r.t.h.c.c.ChannelPoolManagerImpl:119 - All connection pools shut down, closing all channels
2023-07-17 13:16:41,311 [SpringApplicationShutdownHook] INFO  c.l.r.t.h.c.c.ChannelPoolManagerImpl:152 - Shutting down 0 connection pools
2023-07-17 13:16:41,311 [SpringApplicationShutdownHook] INFO  c.l.r.t.h.c.c.ChannelPoolManagerImpl:162 - All connection pools shutdown
2023-07-17 13:16:41,311 [SpringApplicationShutdownHook] INFO  c.l.r.t.h.c.c.ChannelPoolManagerImpl:119 - All connection pools shut down, closing all channels
2023-07-17 13:16:41,311 [R2 Nio Event Loop-3-1] INFO  c.l.r.t.h.c.c.ChannelPoolManagerImpl:103 - Shutdown complete
2023-07-17 13:16:41,312 [R2 Nio Event Loop-3-2] INFO  c.l.r.t.h.c.c.ChannelPoolManagerImpl:103 - Shutdown complete
2023-07-17 13:16:41,612 [SpringApplicationShutdownHook] INFO  c.l.r.t.h.c.c.AbstractNettyClient:249 - Shutdown requested
2023-07-17 13:16:41,612 [SpringApplicationShutdownHook] INFO  c.l.r.t.h.c.c.AbstractNettyClient:252 - Shutting down
2023-07-17 13:16:41,613 [SpringApplicationShutdownHook] INFO  c.l.r.t.h.c.c.ChannelPoolManagerImpl:152 - Shutting down 0 connection pools
2023-07-17 13:16:41,613 [SpringApplicationShutdownHook] INFO  c.l.r.t.h.c.c.ChannelPoolManagerImpl:162 - All connection pools shutdown
2023-07-17 13:16:41,613 [SpringApplicationShutdownHook] INFO  c.l.r.t.h.c.c.ChannelPoolManagerImpl:119 - All connection pools shut down, closing all channels
2023-07-17 13:16:41,613 [SpringApplicationShutdownHook] INFO  c.l.r.t.h.c.c.ChannelPoolManagerImpl:152 - Shutting down 0 connection pools
2023-07-17 13:16:41,614 [SpringApplicationShutdownHook] INFO  c.l.r.t.h.c.c.ChannelPoolManagerImpl:162 - All connection pools shutdown
2023-07-17 13:16:41,614 [R2 Nio Event Loop-1-1] INFO  c.l.r.t.h.c.c.ChannelPoolManagerImpl:103 - Shutdown complete
2023-07-17 13:16:41,614 [SpringApplicationShutdownHook] INFO  c.l.r.t.h.c.c.ChannelPoolManagerImpl:119 - All connection pools shut down, closing all channels
2023-07-17 13:16:41,614 [SpringApplicationShutdownHook] INFO  c.l.r.t.h.c.c.AbstractNettyClient:249 - Shutdown requested
2023-07-17 13:16:41,614 [SpringApplicationShutdownHook] INFO  c.l.r.t.h.c.c.AbstractNettyClient:252 - Shutting down
2023-07-17 13:16:41,614 [SpringApplicationShutdownHook] INFO  c.l.r.t.h.c.c.ChannelPoolManagerImpl:152 - Shutting down 0 connection pools
2023-07-17 13:16:41,614 [R2 Nio Event Loop-1-2] INFO  c.l.r.t.h.c.c.ChannelPoolManagerImpl:103 - Shutdown complete
2023-07-17 13:16:41,614 [SpringApplicationShutdownHook] INFO  c.l.r.t.h.c.c.ChannelPoolManagerImpl:162 - All connection pools shutdown
2023-07-17 13:16:41,614 [SpringApplicationShutdownHook] INFO  c.l.r.t.h.c.c.ChannelPoolManagerImpl:119 - All connection pools shut down, closing all channels
2023-07-17 13:16:41,615 [SpringApplicationShutdownHook] INFO  c.l.r.t.h.c.c.ChannelPoolManagerImpl:152 - Shutting down 0 connection pools
2023-07-17 13:16:41,615 [R2 Nio Event Loop-1-1] INFO  c.l.r.t.h.c.c.ChannelPoolManagerImpl:103 - Shutdown complete
2023-07-17 13:16:41,615 [SpringApplicationShutdownHook] INFO  c.l.r.t.h.c.c.ChannelPoolManagerImpl:162 - All connection pools shutdown
2023-07-17 13:16:41,615 [SpringApplicationShutdownHook] INFO  c.l.r.t.h.c.c.ChannelPoolManagerImpl:119 - All connection pools shut down, closing all channels
2023-07-17 13:16:41,615 [R2 Nio Event Loop-1-2] INFO  c.l.r.t.h.c.c.ChannelPoolManagerImpl:103 - Shutdown complete
2023-07-17 13:16:41,616 [SpringApplicationShutdownHook] INFO  o.a.k.clients.producer.KafkaProducer:1182 - [Producer clientId=producer-1] Closing the Kafka producer with timeoutMillis = 9223372036854775807 ms.
ANTLR Tool version 4.5 used for code generation does not match the current runtime version 4.7.2
ANTLR Runtime version 4.5 used for parser compilation does not match the current runtime version 4.7.2
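The failure above is a PKIX chain-validation error: the JVM's trust manager cannot build a path from the Elasticsearch server certificate to any CA in the truststore the job is using. The same check can be reproduced outside the JVM with `openssl`. The sketch below uses throwaway scratch certificates (all file names here are temporary, not the real DataHub or ES paths) purely to illustrate the passing and failing cases:

```shell
# Recreate the PKIX check outside the JVM with scratch certificates.
set -e
WORK=$(mktemp -d)
cd "$WORK"

# 1. Self-signed CA (stands in for the ES operator's CA).
openssl req -x509 -newkey rsa:2048 -nodes -keyout ca.key -out ca.crt \
  -subj "/CN=scratch-ca" -days 1

# 2. Server key + CSR, signed by that CA (stands in for tls.crt).
openssl req -newkey rsa:2048 -nodes -keyout tls.key -out tls.csr \
  -subj "/CN=elasticsearch-es-http"
openssl x509 -req -in tls.csr -CA ca.crt -CAkey ca.key -CAcreateserial \
  -out tls.crt -days 1

# 3. Chain verification succeeds with the right CA...
openssl verify -CAfile ca.crt tls.crt    # prints: tls.crt: OK

# 4. ...and fails with an unrelated CA -- the condition the JVM reports
# as "unable to find valid certification path to requested target".
openssl req -x509 -newkey rsa:2048 -nodes -keyout other.key -out other.crt \
  -subj "/CN=other-ca" -days 1
openssl verify -CAfile other.crt tls.crt || echo "chain broken, as expected"
```

Running the same `openssl verify -CAfile ca.crt tls.crt` against the real files extracted from the `elasticsearch-es-http-certs-public` secret narrows the problem: if it fails, the truststore is built from the wrong CA; if it succeeds, the job is likely not picking up the mounted truststore at all.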

**To Reproduce**
Steps to reproduce the behavior:

1. Deploy Elasticsearch using the official Elasticsearch operator:

   ```yaml
   apiVersion: elasticsearch.k8s.elastic.co/v1
   kind: Elasticsearch
   metadata:
     name: elasticsearch
     namespace: elasticsearch
   spec:
     http:
       tls:
         certificate:
           secretName: es-ca-cert
     version: 7.17.7
     auth:
       fileRealm:
         - secretName: elasticsearch-secret
     nodeSets:
       - name: default
         count: 1
         config:
           node.store.allow_mmap: false
         volumeClaimTemplates:
           - metadata:
               name: elasticsearch-data # Do not change this name unless you set up a volume mount for the data path.
             spec:
               accessModes:
                 - ReadWriteOnce
               resources:
                 requests:
                   storage: 2Gi
               storageClassName: es-storage-class
   ```
  2. use this script to generate truststore and keystore.
    
#!/bin/sh

ES_HOME=$(pwd)
ES_CRTS=$ES_HOME/crts
ES_CLUSTER_NAMESPACE=elasticsearch
ES_PASSWORD=es_password
TARGET_NAMESPACE=data-hub

cd $ES_HOME

echo "Current working dir: $(pwd)"

makeCrtsDir () {
  [ -d "$ES_CRTS" ] || mkdir -p "$ES_CRTS"
}

clearCrts () {
  rm -rf "$ES_CRTS"
  makeCrtsDir
}

getCrts () {
  # Extract the CA cert, server cert, and server key from the ECK-managed secrets
  kubectl get secret elasticsearch-es-http-certs-public \
    --namespace=$ES_CLUSTER_NAMESPACE \
    --output=go-template='{{index .data "ca.crt" | base64decode }}' \
    > $ES_CRTS/ca.crt

  kubectl get secret elasticsearch-es-http-certs-public \
    --namespace=$ES_CLUSTER_NAMESPACE \
    --output=go-template='{{index .data "tls.crt" | base64decode }}' \
    > $ES_CRTS/tls.crt

  kubectl get secret elasticsearch-es-http-certs-internal \
    --namespace=$ES_CLUSTER_NAMESPACE \
    --output=go-template='{{index .data "tls.key" | base64decode }}' \
    > $ES_CRTS/tls.key
}

createCrtsStore () {
  # Bundle the server cert and key into a PKCS12 keystore
  openssl pkcs12 -export \
    -in $ES_CRTS/tls.crt \
    -inkey $ES_CRTS/tls.key \
    -out $ES_CRTS/keystore.p12 \
    -name elasticsearch \
    -CAfile $ES_CRTS/ca.crt \
    -caname elasticsearch \
    -password pass:$ES_PASSWORD

  # Convert the PKCS12 keystore into a JKS keystore
  keytool -importkeystore \
    -deststorepass $ES_PASSWORD \
    -destkeypass $ES_PASSWORD \
    -destkeystore $ES_CRTS/keystore.jks \
    -srckeystore $ES_CRTS/keystore.p12 \
    -srcstoretype PKCS12 \
    -srcstorepass $ES_PASSWORD \
    -alias elasticsearch \
    -noprompt

  # Import the CA cert into a JKS truststore
  keytool -import \
    -trustcacerts \
    -alias root \
    -file $ES_CRTS/ca.crt \
    -keystore $ES_CRTS/truststore.jks \
    -storepass $ES_PASSWORD -noprompt
}

createSecretAtTarget () {
  kubectl create secret generic elasticsearch-user-certs \
    --from-file=$ES_CRTS/keystore.jks \
    --from-file=$ES_CRTS/truststore.jks \
    --from-literal=keystore.password=$ES_PASSWORD \
    --from-literal=truststore.password=$ES_PASSWORD \
    --namespace=$TARGET_NAMESPACE
}

makeCrtsDir
clearCrts
getCrts
createCrtsStore
createSecretAtTarget
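The cluster-dependent parts of the script above (the `kubectl` calls) can't be exercised without a cluster, but the certificate-bundling step can. Below is a minimal, self-contained sanity check of the `openssl pkcs12` round trip, using a throwaway self-signed certificate in place of the ECK secrets (all file names and the password are illustrative):

```shell
#!/bin/sh
# Demo only: generate a throwaway self-signed cert, bundle it into a
# PKCS12 keystore, then read it back -- mirrors the createCrtsStore step.
set -e
TMP=$(mktemp -d)

# Throwaway key + self-signed cert (stand-ins for tls.key / tls.crt)
openssl req -x509 -newkey rsa:2048 -nodes \
  -keyout "$TMP/tls.key" -out "$TMP/tls.crt" \
  -days 1 -subj "/CN=elasticsearch" 2>/dev/null

# Bundle into PKCS12 under the alias used above
openssl pkcs12 -export \
  -in "$TMP/tls.crt" -inkey "$TMP/tls.key" \
  -out "$TMP/keystore.p12" -name elasticsearch \
  -password pass:es_password

# Read it back: the alias must survive the round trip
openssl pkcs12 -in "$TMP/keystore.p12" -nokeys \
  -password pass:es_password | grep -q "friendlyName: elasticsearch"
echo "PKCS12 round-trip OK"
rm -rf "$TMP"
```

If this check passes but the in-cluster job still fails, the problem is more likely in how the truststore is mounted or referenced than in the stores themselves.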



**Expected behavior**
The job should complete successfully.

**Desktop (please complete the following information):**
 - OS: Pop!_OS 22.04 LTS x86_64

Please let me know if any additional information is required.
godocean commented 1 year ago

I met the same issue as you, not sure how to fix it.

cccadet commented 1 year ago

Same here.

shicholas commented 1 year ago

Same here

github-actions[bot] commented 11 months ago

This issue is stale because it has been open for 30 days with no activity. If you believe this is still an issue on the latest DataHub release please leave a comment with the version that you tested it with. If this is a question/discussion please head to https://slack.datahubproject.io. For feature requests please use https://feature-requests.datahubproject.io

Gerrit-K commented 10 months ago

Not stale, same issue here with similar setup.

For the record, because I thought it just didn't pick up my environment variables, I tried these variants:

I could confirm by trial & error that the first two variable variants are indeed picked up, but the client doesn't accept the self-signed certificate from elasticsearch.

wei-jiang-dns53 commented 10 months ago

I ran an Nginx HTTP proxy in front of Elasticsearch to mitigate this issue.

ozmoze commented 10 months ago

I had the same issue using datahub-gms:v0.12.0 along with Elasticsearch 8.10.4.

After trying most of the Elasticsearch environment variable combinations suggested by @LilMonk and @Gerrit-K, I finally got it to work by setting the JAVA_OPTS env variable in the datahub-gms section.

datahub-gms:
  enabled: true
  /// truncated ///
  extraEnvs:
    - name: ELASTICSEARCH_SSL_PROTOCOL
      value: SSL
    - name: ELASTICSEARCH_SSL_TRUSTSTORE_TYPE
      value: PKCS12
    - name: ELASTICSEARCH_SSL_TRUSTSTORE_FILE
      value: /elastic-certificates/truststore-elastic.p12
    - name: ELASTICSEARCH_SSL_TRUSTSTORE_PASSWORD
      valueFrom:
        secretKeyRef:
          name: elasticsearch-certs
          key: truststore.password
    - name: JAVA_OPTS
      value: -Djavax.net.ssl.trustStore=$(ELASTICSEARCH_SSL_TRUSTSTORE_FILE) -Djavax.net.ssl.trustStoreType=$(ELASTICSEARCH_SSL_TRUSTSTORE_TYPE) -Djavax.net.ssl.trustStorePassword=$(ELASTICSEARCH_SSL_TRUSTSTORE_PASSWORD)

Definitely something that should be addressed in the DataHub Helm chart.

Gerrit-K commented 9 months ago

Thanks @ozmoze, that was a good hint. Although, it didn't work for me right away, since the JAVA_OPTS variable (contrary to my prior belief) isn't a standardized option picked up by the JVM (see this comment). It's just often used in scripts, including the GMS startup script, but unfortunately not the datahub-system-update-job that I was testing the connection with.

However, the variable JDK_JAVA_OPTIONS is a standardized option and is automatically picked up by the JVM. And with this, I was finally able to get it to work with these values:

values.yaml

```yaml
.esSslCaCertVolume: &esSslCaCertVolume
  name: es-ca-certs
  secret:
    secretName: your-eck-elasticsearch-es-http-certs-public

.esSslCaCertVolumeMount: &esSslCaCertVolumeMount
  name: es-ca-certs
  mountPath: /mnt/es-ca-certs

.esSslTruststoreVolume: &esSslTruststoreVolume
  name: es-truststore
  emptyDir: {}

.esSslTruststoreVolumeMount: &esSslTruststoreVolumeMount
  name: es-truststore
  mountPath: /mnt/es-truststore

.esSslTruststoreFileEnv: &esSslTruststoreFileEnv
  name: ELASTICSEARCH_SSL_TRUSTSTORE_FILE
  value: /mnt/es-truststore/ca.p12

.esSslTruststoreTypeEnv: &esSslTruststoreTypeEnv
  name: ELASTICSEARCH_SSL_TRUSTSTORE_TYPE
  value: PKCS12

.esSslTruststorePasswordEnv: &esSslTruststorePasswordEnv
  name: ELASTICSEARCH_SSL_TRUSTSTORE_PASSWORD
  value: datahub

.esSslJdkJavaOptionsEnv: &esSslJdkJavaOptionsEnv
  name: JDK_JAVA_OPTIONS
  value: "-Djavax.net.ssl.trustStore=$(ELASTICSEARCH_SSL_TRUSTSTORE_FILE) -Djavax.net.ssl.trustStoreType=$(ELASTICSEARCH_SSL_TRUSTSTORE_TYPE) -Djavax.net.ssl.trustStorePassword=$(ELASTICSEARCH_SSL_TRUSTSTORE_PASSWORD)"

.esSslTruststoreInitContainer: &esSslTruststoreInitContainer
  name: convert-certs
  image: openjdk
  volumeMounts:
    - *esSslCaCertVolumeMount
    - *esSslTruststoreVolumeMount
  env:
    - *esSslTruststoreFileEnv
    - *esSslTruststorePasswordEnv
  command:
    - sh
    - -c
    - 'keytool -importcert -storetype PKCS12 -trustcacerts -noprompt -file /mnt/es-ca-certs/ca.crt -keystore "$ELASTICSEARCH_SSL_TRUSTSTORE_FILE" -storepass "$ELASTICSEARCH_SSL_TRUSTSTORE_PASSWORD"'

datahub:
  global:
    elasticsearch:
      host: your-eck-elasticsearch-es-http
      useSSL: "true"
      skipcheck: "true" # skips waiting for elasticsearch in "dockerize", as that cannot handle self-signed certs
      auth:
        username: &esUser elastic # FIXME: the upstream chart doesn't support reading this from the secret yet
        password:
          secretRef: your-eck-elasticsearch-es-elastic-user
          secretKey: *esUser

  elasticsearchSetupJob:
    enabled: true
    extraVolumes:
      - *esSslCaCertVolume
    extraVolumeMounts:
      - *esSslCaCertVolumeMount
    extraEnvs:
      - name: CURL_CA_BUNDLE
        value: /mnt/es-ca-certs/ca.crt

  datahub-gms:
    extraVolumes:
      - *esSslCaCertVolume
      - *esSslTruststoreVolume
    extraVolumeMounts:
      - *esSslCaCertVolumeMount
      - *esSslTruststoreVolumeMount
    extraEnvs:
      - *esSslTruststoreFileEnv
      - *esSslTruststoreTypeEnv
      - *esSslTruststorePasswordEnv
      - *esSslJdkJavaOptionsEnv
    extraInitContainers:
      - *esSslTruststoreInitContainer

  datahub-mce-consumer:
    extraVolumes:
      - *esSslCaCertVolume
      - *esSslTruststoreVolume
    extraVolumeMounts:
      - *esSslCaCertVolumeMount
      - *esSslTruststoreVolumeMount
    extraEnvs:
      - *esSslTruststoreFileEnv
      - *esSslTruststoreTypeEnv
      - *esSslTruststorePasswordEnv
      - *esSslJdkJavaOptionsEnv
    extraInitContainers:
      - *esSslTruststoreInitContainer

  datahub-mae-consumer:
    extraVolumes:
      - *esSslCaCertVolume
      - *esSslTruststoreVolume
    extraVolumeMounts:
      - *esSslCaCertVolumeMount
      - *esSslTruststoreVolumeMount
    extraEnvs:
      - *esSslTruststoreFileEnv
      - *esSslTruststoreTypeEnv
      - *esSslTruststorePasswordEnv
      - *esSslJdkJavaOptionsEnv
    extraInitContainers:
      - *esSslTruststoreInitContainer

  datahub-frontend:
    extraVolumes:
      - *esSslCaCertVolume
      - *esSslTruststoreVolume
    extraVolumeMounts:
      - *esSslCaCertVolumeMount
      - *esSslTruststoreVolumeMount
    extraEnvs:
      - *esSslTruststoreFileEnv
      - *esSslTruststoreTypeEnv
      - *esSslTruststorePasswordEnv
      - *esSslJdkJavaOptionsEnv
    extraInitContainers:
      - *esSslTruststoreInitContainer

  datahubSystemUpdate:
    extraVolumes:
      - *esSslCaCertVolume
      - *esSslTruststoreVolume
    extraVolumeMounts:
      - *esSslTruststoreVolumeMount
    extraEnvs:
      - *esSslTruststoreFileEnv
      - *esSslTruststoreTypeEnv
      - *esSslTruststorePasswordEnv
      - *esSslJdkJavaOptionsEnv
    extraInitContainers:
      - *esSslTruststoreInitContainer
```

Needless to say, this is waaay too much logic for a Helm "values" file and should be integrated in the Helm chart.
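Gerrit-K's distinction between the two variables is easy to verify locally. A small sketch (assumes a JDK is on `PATH`; the `foo.bar` property is made up for the demo) showing that the JVM itself honors `JDK_JAVA_OPTIONS`, whereas `JAVA_OPTS` is only read by wrapper scripts:

```shell
#!/bin/sh
# Skip gracefully on machines without a JDK
command -v java >/dev/null 2>&1 || { echo "no JDK on PATH, skipping"; exit 0; }

# The JVM picks up JDK_JAVA_OPTIONS on its own: the system property
# appears in the settings dump (java also logs a "Picked up" notice).
JDK_JAVA_OPTIONS="-Dfoo.bar=baz" java -XshowSettings:properties -version 2>&1 \
  | grep "foo.bar"
```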

LilMonk commented 9 months ago

I ended up using the Istio service mesh with mTLS. Even though all the components talk plain HTTP, the Envoy sidecar proxies encrypt the traffic, so service-to-service communication actually happens over HTTPS.
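For reference, the mesh-level workaround could look roughly like this: a sketch of an Istio `PeerAuthentication` resource enforcing strict mTLS for the namespace (assumes Istio is installed with sidecar injection enabled; the namespace name is the hypothetical one from the earlier script):

```yaml
apiVersion: security.istio.io/v1beta1
kind: PeerAuthentication
metadata:
  name: default
  namespace: data-hub
spec:
  mtls:
    mode: STRICT
```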

github-actions[bot] commented 8 months ago

This issue is stale because it has been open for 30 days with no activity. If you believe this is still an issue on the latest DataHub release please leave a comment with the version that you tested it with. If this is a question/discussion please head to https://slack.datahubproject.io. For feature requests please use https://feature-requests.datahubproject.io

github-actions[bot] commented 7 months ago

This issue was closed because it has been inactive for 30 days since being marked as stale.

fcecagno commented 1 month ago

Just used the instructions in https://github.com/datahub-project/datahub/issues/8433#issuecomment-1843072352 to get Elasticsearch working. It should definitely be part of the chart, to avoid so much configuration just to allow a custom CA.