Open samsaida opened 4 years ago
You should include your configuration file(s) and the JSON you send to create the connector (obviously with any sensitive data, such as passwords, removed). That would make it easier to help you.
Please find the configurations below.
- name: KAFKA_DEBUG
  value: 'y'
- name: CONNECT_LOG4J_ROOT_LOGLEVEL
  value: INFO
- name: CONNECT_TOOLS_LOG4J_LOGLEVEL
  value: INFO
- name: KAFKA_OPTS
  value: <>
- name: CONNECT_BOOTSTRAP_SERVERS
  value: <>
- name: CONNECT_REST_HOST_NAME
  value: connect
- name: CONNECT_REST_ADVERTISED_HOST_NAME
  value: connect
- name: CONNECT_REST_PORT
  value: "8083"
- name: KAFKA_ZOOKEEPER_CONNECT
  value: <>
- name: CONNECT_GROUP_ID
  value: "compose-connect-group"
- name: CONNECT_CONFIG_STORAGE_TOPIC
  value: "docker-connect-configs"
- name: CONNECT_CONFIG_STORAGE_REPLICATION_FACTOR
  value: "1"
- name: CONNECT_OFFSET_STORAGE_TOPIC
  value: "docker-connect-offsets"
- name: CONNECT_OFFSET_STORAGE_REPLICATION_FACTOR
  value: "1"
- name: CONNECT_OFFSET_FLUSH_INTERVAL_MS
  value: "10000"
- name: CONNECT_STATUS_STORAGE_TOPIC
  value: "docker-connect-status"
- name: CONNECT_STATUS_STORAGE_REPLICATION_FACTOR
  value: "1"
- name: CONNECT_KEY_CONVERTER
  value: org.apache.kafka.connect.storage.StringConverter
- name: CONNECT_KEY_CONVERTER_SCHEMA_REGISTRY_URL
  value: <>
- name: CONNECT_VALUE_CONVERTER
  value: io.confluent.connect.avro.AvroConverter
- name: CONNECT_VALUE_CONVERTER_SCHEMA_REGISTRY_URL
  value: <>
- name: CONNECT_INTERNAL_KEY_CONVERTER
  value: org.apache.kafka.connect.json.JsonConverter
- name: CONNECT_INTERNAL_VALUE_CONVERTER
  value: org.apache.kafka.connect.json.JsonConverter
- name: CONNECT_PLUGIN_PATH
  value: <>
- name: CONNECT_CONFIG_PROVIDERS
  value: <>
- name: CONNECT_CONFIG_PROVIDERS_VAULT_CLASS
  value: <>
- name: CONNECT_CONFIG_PROVIDERS_VAULT_PARAM_URI
  value: <>
- name: CONNECT_CONFIG_PROVIDERS_VAULT_PARAM_TOKEN
  value: <>
- name: CONNECT_SECURITY_PROTOCOL
  value: SSL
- name: CONNECT_SSL_KEYSTORE_LOCATION
  value: <>
- name: CONNECT_SSL_KEYSTORE_PASSWORD
  value: <>
- name: CONNECT_SSL_KEY_PASSWORD
  value: <>
- name: CONNECT_SSL_TRUSTSTORE_LOCATION
  value: <>
- name: CONNECT_SSL_TRUSTSTORE_PASSWORD
  value: <>
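As a sanity check on the SSL block above: the Confluent Docker images derive worker properties from the CONNECT_* environment variables by dropping the prefix, lowercasing, and replacing underscores with dots, so (if I'm reading the image's behavior right) those variables should surface in the generated worker config roughly as:

```properties
security.protocol=SSL
ssl.keystore.location=<>
ssl.keystore.password=<>
ssl.key.password=<>
ssl.truststore.location=<>
ssl.truststore.password=<>
```

Worth confirming these actually end up in the worker's properties file inside the container, since a mis-mapped variable would fail silently.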
Connector configuration:
{
  "name": "jdbc-sink",
  "config": {
    "connector.class": "
I don't see anything obvious right away. You didn't change anything else except adding SSL? Do you have a way to run the worker "locally" and view its output? And I assume you have double-checked that the keystore/truststore files exist and that their paths and passwords are specified correctly?
You need to configure the ojdbc.properties file to help the driver locate truststore.jks and keystore.jks.
https://www.oracle.com/database/technologies/java-connectivity-to-atp.html
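If I remember the linked Oracle page correctly, the JKS variant of the wallet expects an ojdbc.properties along these lines in the directory pointed to by TNS_ADMIN (the passwords are placeholders, not real values):

```properties
# Point the Oracle thin driver at the JKS keystore/truststore from the wallet
javax.net.ssl.trustStore=${TNS_ADMIN}/truststore.jks
javax.net.ssl.trustStorePassword=<password>
javax.net.ssl.keyStore=${TNS_ADMIN}/keystore.jks
javax.net.ssl.keyStorePassword=<password>
```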
Hi, I have configured a JDBC sink connector in an SSL-enabled Kafka Connect cluster. I am trying to sink data to an Oracle DB through a wallet, but the sink is not working and no errors show up in the logs. The whole setup is deployed on Kubernetes.
I have the connect and worker properties configured according to the documentation. Without SSL, the sink was working properly. Could you please let me know if there are any known issues with Connect when SSL is enabled? Any pointers to the problem would be a great help.
Thanks in advance.