exasol / kafka-connect-jdbc-exasol

Exasol dialect for the Kafka Connect JDBC Connector
Apache License 2.0

Null pointer exception after connector configuration #19

Closed mst94 closed 3 years ago

mst94 commented 3 years ago

Hello again,

Due to some infrastructure changes, I had to set up the whole system again.

Unfortunately, it does not work anymore, and I have to ask for your help once again (this time the missing hard disk space should not be the problem ;)).

After starting up Exasol in my Docker environment via docker-compose up -d and creating the Exasol table successfully, I want to configure the source and sink connectors. My exasol-source.json looks like this:

{ "name": "exasol-source", "config": { "connector.class": "io.confluent.connect.jdbc.JdbcSourceConnector", "tasks.max": "1", "connection.url": "jdbc:exa:exasol-db:8563;schema=country_schema;user=sys;password=exasol", "mode": "timestamp+incrementing", "timestamp.column.name": "UPDATED_AT", "incrementing.column.name": "ID", "topic.prefix": "EXASOL_" } }

When checking the status after creation, I get a null pointer exception in the resulting JSON status message:

{"name":"exasol-source","config":{"connector.class":"io.confluent.connect.jdbc.JdbcSourceConnector","tasks.max":"1","connection.url":"jdbc:exa:exasol-db:8563;schema=country_schema;user=sys;password=exasol","mode":"timestamp+incrementing","timestamp.column.name":"UPDATED_AT","incrementing.column.name":"ID","topic.prefix":"EXASOL_","name":"exasol-source"},"tasks":[],"type":"source"}vagrant@master:/vagrant/spark-cluster$ curl localhost:8083/connectors/exasol-source/status {"name":"exasol-source","connector":{"state":"FAILED","worker_id":"kafka-connect:8083","trace":"java.lang.NullPointerException\n\tat io.confluent.connect.jdbc.dialect.DatabaseDialects.findBestFor(DatabaseDialects.java:134)\n\tat io.confluent.connect.jdbc.JdbcSourceConnector.start(JdbcSourceConnector.java:85)\n\tat org.apache.kafka.connect.runtime.WorkerConnector.doStart(WorkerConnector.java:185)\n\tat org.apache.kafka.connect.runtime.WorkerConnector.start(WorkerConnector.java:210)\n\tat org.apache.kafka.connect.runtime.WorkerConnector.doTransitionTo(WorkerConnector.java:349)\n\tat org.apache.kafka.connect.runtime.WorkerConnector.doTransitionTo(WorkerConnector.java:332)\n\tat org.apache.kafka.connect.runtime.WorkerConnector.doRun(WorkerConnector.java:140)\n\tat org.apache.kafka.connect.runtime.WorkerConnector.run(WorkerConnector.java:117)\n\tat java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515)\n\tat java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)\n\tat java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)\n\tat java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)\n\tat java.base/java.lang.Thread.run(Thread.java:834)\n"},"tasks":[],"type":"source"

When configuring the sink, the same problem occurs, and the EXASOL_ topic is not created either.

I built kafka-connect-jdbc-exasol-1.0.0.jar with Maven as explained in the readme, after setting the right version numbers in the corresponding POM. I also added exasol-jdbc-6.0.8.jar from the Maven repository (I also tried the most recent one). Both jars are in the specified jars directory, which is mounted correctly in my docker-compose file and accessible to the Docker container. kafka-connect-jdbc in version 5.5.2 (10.0.0 also tried) is installed automatically via confluent-hub-client by a start-up script in the docker-compose file.
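A quick way to confirm that the jars are actually visible inside the container (a sketch; the container name and directories are assumptions about this setup):

```bash
# Check that both jars ended up in the mounted plugin directory inside the container
# (container name "kafka-connect" and the directories below are assumptions)
docker exec -it kafka-connect ls -l /etc/kafka-connect/jars
docker exec -it kafka-connect ls -l /usr/share/confluent-hub-components
```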

But it seems that the Kafka JDBC connector does not find the Exasol JDBC driver?

Thanks in advance for your help again.

Mario

morazow commented 3 years ago

Hello @mst94,

But it seems that the Kafka JDBC connector does not find the Exasol JDBC driver?

Yes, this seems to be the case. Could you please check whether the jar plugin path has changed?

You can search for plugin.path in the Kafka Connect configuration files and put the jars into that path, or check the CONNECT_PLUGIN_PATH environment variable if you set it for the Docker container.
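For example, something along these lines (the container name and file locations are only examples and depend on your setup):

```bash
# Check which plugin path the Connect worker actually uses
# (container name and properties locations are assumptions about this setup)
docker exec -it kafka-connect printenv CONNECT_PLUGIN_PATH
docker exec -it kafka-connect grep -r "plugin.path" /etc/kafka-connect/ /etc/kafka/ 2>/dev/null
```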

mst94 commented 3 years ago

Hi @morazow,

they should all be in the right place.

I set the Connect plugin path in my docker-compose file: CONNECT_PLUGIN_PATH: /usr/share/java,/etc/kafka-connect/jars,/usr/share/confluent-hub-components/

This path is also set in the kafka-connect properties file, and I can find the jars in that location when checking via the Kafka Connect container's bash.
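As an additional check, the standard Connect REST endpoint can list which plugins the worker has actually picked up:

```bash
# List all connector plugins the worker discovered on its plugin path;
# io.confluent.connect.jdbc.JdbcSourceConnector should show up here
curl localhost:8083/connector-plugins
```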

Are there any known restrictions or compatibility issues regarding the different versions of the exasol-jdbc jar that can be downloaded from the linked Maven repository? Currently I am using version 6.0.8, but I have also tried the most recent one.

morazow commented 3 years ago

Hello @mst94,

Are there any known restrictions or compatibility issues regarding the different versions of the exasol-jdbc jar that can be downloaded from the linked Maven repository?

There should not be any compatibility issues. If it is downloaded from Maven, the resolver path should be set to "https://maven.exasol.com/artifactory/exasol-releases". But you can also directly copy the JDBC jar into the plugin path together with the connector jar.
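For example, roughly like this (the Maven coordinates, version, and target path below are only illustrative):

```bash
# Fetch the Exasol JDBC driver from the Exasol Maven repository
# (coordinates and version are examples; adjust to your setup)
mvn dependency:get \
    -Dartifact=com.exasol:exasol-jdbc:6.0.8 \
    -DremoteRepositories=https://maven.exasol.com/artifactory/exasol-releases

# Copy the driver together with the connector jar into the worker's plugin path
cp ~/.m2/repository/com/exasol/exasol-jdbc/6.0.8/exasol-jdbc-6.0.8.jar /etc/kafka-connect/jars/
cp target/kafka-connect-jdbc-exasol-1.0.0.jar /etc/kafka-connect/jars/
```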

We can have a call session if you cannot resolve the issue. Please feel free to write an email to opensource at exasol dot com.

mst94 commented 3 years ago

Hello @morazow,

thanks for the information. I will try it out one more time now and contact you soon via mail if the problem persists. In that case, a call session will be the most promising way forward.

mst94 commented 3 years ago

I was able to fix the issue by putting kafka-connect-jdbc.jar into the plugin folder where the JDBC driver jar and kafka-connect-exasol.jar are located. I thought this step was not necessary because I had installed the kafka-connect-jdbc plugin via confluent-hub, but this additional step seems to be required. Thanks for your help!
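In other words, roughly this (the confluent-hub install path is only an example and depends on where the plugin was installed):

```bash
# Copy the JDBC connector jar next to the Exasol JDBC driver and dialect jars,
# then restart the Connect worker so it reloads its plugin path
docker exec -it kafka-connect bash -c \
  'cp /usr/share/confluent-hub-components/confluentinc-kafka-connect-jdbc/lib/kafka-connect-jdbc-*.jar /etc/kafka-connect/jars/'
docker restart kafka-connect
```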