confluentinc / kafka-connect-jdbc

Kafka Connect connector for JDBC-compatible databases

Is it possible to include the key from a Kafka message in the sink record? #1278

Closed · cyberjar09 closed this 1 year ago

cyberjar09 commented 1 year ago

Hi, I'm trying to find either native support or support via transforms to write the Kafka record key into the sink payload.

I see that the S3 sink connector has support for this (link), but I don't see any such option in the JDBC sink connector.

I'd appreciate the guidance. I'm new to this ecosystem, so please excuse the question if I'm overlooking something obvious. =)

OneCricketeer commented 1 year ago

The primary key of the database table should map to the key of the Kafka record.

cyberjar09 commented 1 year ago

Hey, thanks for the response. What I'm trying to achieve, though, is to include the PK of the DB row in the record value as well. It only appears in the key, and the key is not written to the JDBC sink.

OneCricketeer commented 1 year ago

> include the PK of the db row in the record value as well.

Use a transform.

> the key is not written to the JDBC sink

Are you sure about that? The record key becomes the primary key when you use `pk.mode=record_key`. See https://rmoff.net/2021/03/12/kafka-connect-jdbc-sink-deep-dive-working-with-primary-keys/
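
For reference, a minimal sketch of a JDBC sink configuration taking the primary key from the record key. The connector name, topic, connection URL, and the `id` column are placeholder assumptions, not from this thread:

```properties
# Hypothetical sink config: the Kafka record key becomes the table's PK.
name=jdbc-sink-example
connector.class=io.confluent.connect.jdbc.JdbcSinkConnector
topics=orders
connection.url=jdbc:postgresql://localhost:5432/mydb
auto.create=true
# Take the PK from the record key. With a primitive key, pk.fields names
# the single column that will hold it; with a struct key, it lists which
# key fields to use.
pk.mode=record_key
pk.fields=id
```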

cyberjar09 commented 1 year ago

I'm using `pk.mode=kafka`.
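
(For context: with `pk.mode=kafka` the connector keys each row on the record's Kafka coordinates rather than its key. A sketch of the relevant settings; the column names below are the connector's documented defaults, overridable via `pk.fields`:)

```properties
# pk.mode=kafka keys rows on (topic, partition, offset), stored in the
# three columns listed in pk.fields (these are the defaults).
pk.mode=kafka
pk.fields=__connect_topic,__connect_partition,__connect_offset
```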

OneCricketeer commented 1 year ago

Then change it? You asked me if it was possible to put the key in the database, and it is.

cyberjar09 commented 1 year ago

Hey @OneCricketeer, thanks for the response. Unfortunately, my requirements mean I have to keep track of the Kafka offset etc., so `pk.mode=kafka` is what I must use. No matter; with the help of Robin Moffatt I was directed to

so this should solve my problem.

OneCricketeer commented 1 year ago

> keep track of the kafka offset etc

You can use the InsertField transform to add that offset, partition, etc. to the record value. That still leaves open the possibility of using `pk.mode=record_key`.
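
For example, a sketch of the InsertField approach; the transform alias and target field names (`kafka_topic` etc.) are placeholders:

```properties
# InsertField$Value copies the record's Kafka coordinates into the value,
# so they land in the table as ordinary columns (sink connectors only).
transforms=insertKafkaCoords
transforms.insertKafkaCoords.type=org.apache.kafka.connect.transforms.InsertField$Value
transforms.insertKafkaCoords.topic.field=kafka_topic
transforms.insertKafkaCoords.partition.field=kafka_partition
transforms.insertKafkaCoords.offset.field=kafka_offset
```

Combined with `pk.mode=record_key`, this keeps the record key as the table's PK while still tracking the offset and partition in each row.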