The Spark job fails with the following exception:

```
javax.net.ssl.SSLHandshakeException: No subject alternative DNS name matching kafka.stackable-products.svc.cluster.local found.
```
The certificates of the Kafka brokers contain the following SANs:

```
X509v3 Subject Alternative Name: critical
    DNS:kafka-broker-default.stackable-products.svc.cluster.local,
    DNS:kafka-broker-default-0.kafka-broker-default.stackable-products.svc.cluster.local,
    DNS:aks-userpool-28268085-vmss00000p,
    IP Address:10.224.1.34, IP Address:172.201.136.222
```
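The handshake fails because the address taken from the discovery ConfigMap, `kafka.stackable-products.svc.cluster.local`, does not appear among the certificate's DNS SANs. As a minimal sketch of the check the JVM's hostname verifier performs (a simplified illustration of RFC 6125-style matching, not the exact JSSE algorithm; hostnames are copied from this report):

```python
def matches_san(hostname: str, san: str) -> bool:
    """Case-insensitive DNS-SAN match; a leading '*.' wildcard covers one label."""
    hostname, san = hostname.lower(), san.lower()
    if san.startswith("*."):
        return "." in hostname and hostname.split(".", 1)[1] == san[2:]
    return hostname == san

# DNS SANs from the broker certificate in this report
sans = [
    "kafka-broker-default.stackable-products.svc.cluster.local",
    "kafka-broker-default-0.kafka-broker-default.stackable-products.svc.cluster.local",
    "aks-userpool-28268085-vmss00000p",
]

# Address the Spark client connects to (from the discovery ConfigMap)
client_addr = "kafka.stackable-products.svc.cluster.local"

print(any(matches_san(client_addr, san) for san in sans))  # → False
```

Since no SAN matches the client-facing address, the verifier rejects the certificate, which is exactly the `SSLHandshakeException` above.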
# Definition of Done Checklist
Not all of these items are applicable to all PRs; the author should update this template to only leave in the boxes that are relevant. Please make sure all of these things are done and tick the boxes.
# Author
- [ ] Changes are OpenShift compatible
- [ ] CRD changes approved
- [ ] CRD documentation for all fields, following the [style guide](https://docs.stackable.tech/home/nightly/contributor/docs/style-guide).
- [ ] Helm chart can be installed and deployed operator works
- [ ] Integration tests passed (for non-trivial changes)
- [ ] Changes need to be "offline" compatible
# Reviewer
- [x] Code contains useful comments
- [x] Changelog updated
- [x] Cargo.toml only contains references to git tags (not specific commits or branches)
# Acceptance
- [x] Feature Tracker has been updated
- [x] Proper release label has been added
- [x] [Roadmap](https://github.com/orgs/stackabletech/projects/25/views/1) has been updated
# Description
Reported in https://stackable-workspace.slack.com/archives/C0312UB3LLE/p1722427738668959.
I am using the broker address from the discovery ConfigMap.
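For context, a Stackable discovery ConfigMap exposes the product's connection address under a product-specific data key; the shape is roughly as follows (the ConfigMap name, the `KAFKA` key, and the port are assumptions based on Stackable's discovery convention, not taken from the actual cluster):

```yaml
apiVersion: v1
kind: ConfigMap
metadata:
  name: kafka                    # assumed discovery ConfigMap name
  namespace: stackable-products
data:
  # assumed key and port; the hostname here is the one the client used
  KAFKA: kafka.stackable-products.svc.cluster.local:9093
```

If the operator publishes an address here that is not covered by the brokers' certificate SANs, every TLS client that follows the discovery ConfigMap will fail hostname verification as shown above.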