confluentinc / kafka-connect-jdbc

Kafka Connect connector for JDBC-compatible databases

Can the same table name be used in different schemas? #1334

Closed Hyunho-Jung closed 1 year ago

Hyunho-Jung commented 1 year ago

I have a MySQL source database with two schemas, db1 and db2, on a single server. There is a table named sbtest1 in the db1 schema and another sbtest1 in the db2 schema.

I am using the Kafka JDBC sink connector to replicate data to PostgreSQL. The JDBC sink connector connects to each schema separately, using currentSchema=db1 and currentSchema=db2 in the connection URLs:

"connection.url": "jdbc:postgresql://xxxxxx:5433/xxxxxx?currentSchema=db1"

"connection.url": "jdbc:postgresql://xxxxxx:5433/xxxxxx?currentSchema=db2"

However, when I try to insert data into sbtest1 in the db2 schema, a "table does not exist" error occurs. The connector assumes the table already exists and tries to insert into it, which produces the error below:

Caused by: java.sql.SQLException: Exception chain: java.sql.BatchUpdateException: Batch entry 0 INSERT INTO "sbtest1" ("id","k","c","pad") VALUES (...') ON CONFLICT ("id") DO UPDATE SET "k"=EXCLUDED."k","c"=EXCLUDED."c","pad"=EXCLUDED."pad" was aborted: ERROR: relation "sbtest1" does not exist Position: 13 Call getNextException to see other errors in the batch. org.postgresql.util.PSQLException: ERROR: relation "sbtest1" does not exist
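For context, the INSERT in the error uses an unqualified name ("sbtest1"), which PostgreSQL resolves through the connection's search_path (this is what the currentSchema URL parameter sets). A diagnostic sketch, assuming you can connect with psql as the connector's database user, to check how the name is being resolved:

```sql
-- What does this connection's search_path contain?
-- currentSchema=db2 should make it resolve names in db2 first.
SHOW search_path;

-- Returns the fully qualified name if "sbtest1" is visible on the
-- current search_path, or NULL if it is not.
SELECT to_regclass('sbtest1');

-- List every schema that actually contains a table named sbtest1.
SELECT table_schema
FROM information_schema.tables
WHERE table_name = 'sbtest1';
```

If to_regclass returns NULL while the table exists only in db1, the error above is consistent: the relation simply is not visible to the db2-scoped connection.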

What I'm trying to configure is for MySQL db1.sbtest1 to land in PostgreSQL as db1.sbtest1, and MySQL db2.sbtest1 to land in PostgreSQL as db2.sbtest1.
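For reference, a minimal sketch of what the second sink's configuration might look like. The connector name, topic name, and key fields below are placeholders, not taken from the issue; table.name.format, insert.mode, pk.mode, and auto.create are standard JdbcSinkConnector settings, and enabling auto.create would let the connector create the table in the schema selected by currentSchema if it is missing:

```json
{
  "name": "pg-sink-db2",
  "config": {
    "connector.class": "io.confluent.connect.jdbc.JdbcSinkConnector",
    "connection.url": "jdbc:postgresql://xxxxxx:5433/xxxxxx?currentSchema=db2",
    "topics": "db2.sbtest1",
    "table.name.format": "sbtest1",
    "insert.mode": "upsert",
    "pk.mode": "record_key",
    "pk.fields": "id",
    "auto.create": "true"
  }
}
```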

In this case, I was wondering whether the logic that checks if the table already exists is behaving as intended, and whether there are other settings that would resolve this issue.