averemee-si / oracdc

Oracle database CDC (Change Data Capture)
http://a2-solutions.eu/
Apache License 2.0

The connector must be restarted to get the data #31

Closed Gogo-scc closed 1 year ago

Gogo-scc commented 1 year ago

Hi, I encountered the following problem while testing:

1. Using Oracle 11gR2 (11.2.0.4).

2. Archiving is enabled and supplemental logging is added (a verification query is shown after the console output below):

alter database add supplemental log data;

alter table ABC.TEST32 add supplemental log data (ALL) columns;

3. Start Kafka Connect:

bin/connect-standalone etc/kafka/connect-standalone.properties etc/kafka/logminer-source-testapps.properties

4. Create the table and insert rows:

create table test32( myid int not null primary key, myname varchar(500) )

insert into test32 values (4,'22');
insert into test32 values (5,'23');
insert into test32 values (6,'24');
insert into test32 values (7,'25');
insert into test32 values (8,'26');

5. The Kafka topic receives the data when I start Kafka Connect, which is fine. But when I keep inserting rows into the table, no new data arrives in the topic. I only see the new rows after restarting the connector.

[root@kafka2 confluent-5.4.8]# bin/kafka-console-consumer --bootstrap-server 172.18.6.112:9092 --topic ABC-TEST32 --property print.key=true --property print.value=true --from-beginning
[2022-10-24 12:22:39,669] WARN [Consumer clientId=consumer-console-consumer-79494-1, groupId=console-consumer-79494] Error while fetching metadata with correlation id 2 : {ABC-TEST32=LEADER_NOT_AVAILABLE} (org.apache.kafka.clients.NetworkClient)
{"schema":{"type":"struct","fields":[{"type":"bytes","optional":false,"name":"org.apache.kafka.connect.data.Decimal","version":1,"parameters":{"scale":"0"},"field":"MYID"}],"optional":false,"name":"ABC.TEST32.Key","version":1},"payload":{"MYID":"BA=="}} {"schema":{"type":"struct","fields":[{"type":"string","optional":true,"field":"MYNAME"}],"optional":true,"name":"ABC.TEST32.Value","version":1},"payload":{"MYNAME":"22"}}
{"schema":{"type":"struct","fields":[{"type":"bytes","optional":false,"name":"org.apache.kafka.connect.data.Decimal","version":1,"parameters":{"scale":"0"},"field":"MYID"}],"optional":false,"name":"ABC.TEST32.Key","version":1},"payload":{"MYID":"BQ=="}} {"schema":{"type":"struct","fields":[{"type":"string","optional":true,"field":"MYNAME"}],"optional":true,"name":"ABC.TEST32.Value","version":1},"payload":{"MYNAME":"23"}}
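As a quick sanity check of the setup in steps 1 and 2, a query along the following lines (standard Oracle dictionary views; the ABC.TEST32 names are the ones used in this thread, so adjust as needed) should confirm that the database is in ARCHIVELOG mode and that supplemental logging is in place:

-- expected: LOG_MODE = 'ARCHIVELOG' and SUPPLEMENTAL_LOG_DATA_MIN = 'YES'
select LOG_MODE, SUPPLEMENTAL_LOG_DATA_MIN from V$DATABASE;

-- expected: an ALL COLUMN LOGGING group on ABC.TEST32 (from the ALTER TABLE above)
select LOG_GROUP_NAME, LOG_GROUP_TYPE, ALWAYS
  from DBA_LOG_GROUPS
 where OWNER = 'ABC' and TABLE_NAME = 'TEST32';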

Gogo-scc commented 1 year ago

The Kafka Connect log:

[2022-10-24 12:22:47,644] INFO Connector logminer-source-testapps connected to Oracle Database 11g Enterprise Edition , 11.2.0.4.0 $ORACLE_SID=orcl1, running on orcl1, OS Linux x86 64-bit. (solutions.a2.cdc.oracle.OraCdcLogMinerTask:247)
[2022-10-24 12:22:47,671] INFO No data present in connector's offset storage for orcl1_orcl1:1452920421 (solutions.a2.cdc.oracle.OraCdcLogMinerTask:541)
[2022-10-24 12:22:47,671] INFO oracdc will start from minimum available SCN in V$ARCHIVED_LOG = 0. (solutions.a2.cdc.oracle.OraCdcLogMinerTask:559)
[2022-10-24 12:22:47,680] INFO Initializing oracdc logminer archivelog worker thread (solutions.a2.cdc.oracle.OraCdcLogMinerWorkerThread:138)
[2022-10-24 12:22:47,735] INFO LogMiner will start from SCN 0 (solutions.a2.cdc.oracle.OraCdcV$ArchivedLogImpl:105)
[2022-10-24 12:22:47,736] INFO Mining database orcl1 is in OPEN mode (solutions.a2.cdc.oracle.OraCdcLogMinerWorkerThread:250)
[2022-10-24 12:22:47,736] INFO Same database will be used for dictionary query and mining (solutions.a2.cdc.oracle.OraCdcLogMinerWorkerThread:252)
[2022-10-24 12:22:47,736] INFO RowPrefetch size for accessing V$LOGMNR_CONTENTS set to 32. (solutions.a2.cdc.oracle.OraCdcLogMinerWorkerThread:270)
[2022-10-24 12:22:47,827] INFO Initializing oracdc initial load thread (solutions.a2.cdc.oracle.OraCdcInitialLoadThread:64)
[2022-10-24 12:22:47,828] INFO DB cores available 2, Kafka Cores available 4. (solutions.a2.cdc.oracle.OraCdcInitialLoadThread:74)
[2022-10-24 12:22:47,828] INFO {} parallel loaders for select phase will be used. (solutions.a2.cdc.oracle.OraCdcInitialLoadThread:75)
[2022-10-24 12:22:47,828] INFO BEGIN: OraCdcInitialLoadThread.run() (solutions.a2.cdc.oracle.OraCdcInitialLoadThread:86)
[2022-10-24 12:22:47,831] INFO WorkerSourceTask{id=logminer-source-testapps-0} Source task finished initialization and start (org.apache.kafka.connect.runtime.WorkerSourceTask:215)
[2022-10-24 12:22:47,831] INFO BEGIN: OraCdcLogMinerWorkerThread.run() (solutions.a2.cdc.oracle.OraCdcLogMinerWorkerThread:338)
WARNING: An illegal reflective access operation has occurred
WARNING: Illegal reflective access using Lookup on net.openhft.chronicle.core.Jvm (file:/kafka/connect/lib/chronicle-core-2.21.95.jar) to class java.lang.reflect.AccessibleObject
WARNING: Please consider reporting this to the maintainers of net.openhft.chronicle.core.Jvm
WARNING: Use --illegal-access=warn to enable warnings of further illegal reflective access operations
WARNING: All illegal access operations will be denied in a future release
[2022-10-24 12:22:47,881] INFO Chronicle core loaded from file:/kafka/connect/lib/chronicle-core-2.21.95.jar (net.openhft.chronicle.core.Jvm:169)
Oct 24, 2022 12:22:47 PM net.openhft.chronicle.core.cleaner.impl.reflect.ReflectionBasedByteBufferCleanerService
WARNING: Make sure you have set the command line option "--illegal-access=permit --add-exports java.base/jdk.internal.ref=ALL-UNNAMED" to enable ReflectionBasedByteBufferCleanerService
[2022-10-24 12:22:47,927] INFO Took 6 ms to add mapping for /kafka/tempdir/ABC.TABLE_TEST.15121835068505381446/metadata.cq4t (net.openhft.chronicle.bytes.MappedFile:56)
[2022-10-24 12:22:47,981] INFO Running under OpenJDK Runtime Environment 11.0.8+10-LTS with 4 processors reported. (net.openhft.chronicle.core.internal.announcer.InternalAnnouncer:56)
[2022-10-24 12:22:47,983] INFO Process id: 18273 :: Chronicle Queue (5.21.99) (net.openhft.chronicle.core.internal.announcer.InternalAnnouncer:56)
[2022-10-24 12:22:47,984] INFO Analytics: Chronicle Queue reports usage statistics. Learn more or turn off: https://github.com/OpenHFT/Chronicle-Queue/blob/master/DISCLAIMER.adoc (net.openhft.chronicle.core.internal.announcer.InternalAnnouncer:56)
[2022-10-24 12:22:48,003] INFO Table ABC.TABLE_TEST (DEPENDENCY='DISABLED') initial load (read phase) started. (solutions.a2.cdc.oracle.OraTable4InitialLoad:582)
[2022-10-24 12:22:48,003] INFO Table ABC.TEST32 (DEPENDENCY='DISABLED') initial load (read phase) started. (solutions.a2.cdc.oracle.OraTable4InitialLoad:582)
[2022-10-24 12:22:48,014] INFO Took 1.78 ms to pollDiskSpace for /kafka/tempdir/ABC.TEST32.6097305490013168521 (net.openhft.chronicle.threads.DiskSpaceMonitor:56)
[2022-10-24 12:22:48,014] INFO Took 2.725 ms to pollDiskSpace for /kafka/tempdir/ABC.TABLE_TEST.15121835068505381446 (net.openhft.chronicle.threads.DiskSpaceMonitor:56)
[2022-10-24 12:22:48,046] INFO Table ABC.TABLE_TEST initial load (read phase) completed. 7 rows read. (solutions.a2.cdc.oracle.OraTable4InitialLoad:597)
[2022-10-24 12:22:48,046] INFO Table ABC.TEST32 initial load (read phase) completed. 2 rows read. (solutions.a2.cdc.oracle.OraTable4InitialLoad:597)
[2022-10-24 12:22:48,047] INFO END: OraCdcInitialLoadThread.run(), elapsed time 216 ms (solutions.a2.cdc.oracle.OraCdcInitialLoadThread:117)
[2022-10-24 12:22:49,832] INFO Table ABC.TABLE_TEST initial load (send to Kafka phase) started. (solutions.a2.cdc.oracle.OraCdcLogMinerTask:807)
[2022-10-24 12:22:49,865] WARN [Producer clientId=connector-producer-logminer-source-testapps-0] Error while fetching metadata with correlation id 3 : {ABC-TABLE_TEST=LEADER_NOT_AVAILABLE} (org.apache.kafka.clients.NetworkClient:1075)
[2022-10-24 12:22:49,985] INFO Table ABC.TABLE_TEST initial load (send to Kafka phase) completed. (solutions.a2.cdc.oracle.OraCdcLogMinerTask:821)
[2022-10-24 12:22:49,987] INFO Table ABC.TEST32 initial load (send to Kafka phase) started. (solutions.a2.cdc.oracle.OraCdcLogMinerTask:807)
[2022-10-24 12:22:49,998] INFO Table ABC.TEST32 initial load (send to Kafka phase) completed.
(solutions.a2.cdc.oracle.OraCdcLogMinerTask:821) [2022-10-24 12:22:52,049] INFO Initial load completed (solutions.a2.cdc.oracle.OraCdcLogMinerTask:791) [2022-10-24 12:22:56,870] INFO WorkerSourceTask{id=logminer-source-testapps-0} Committing offsets (org.apache.kafka.connect.runtime.WorkerSourceTask:422) [2022-10-24 12:22:56,871] INFO WorkerSourceTask{id=logminer-source-testapps-0} flushing 0 outstanding messages for offset commit (org.apache.kafka.connect.runtime.WorkerSourceTask:439) [2022-10-24 12:22:56,875] INFO WorkerSourceTask{id=logminer-source-testapps-0} Finished commitOffsets successfully in 4 ms (org.apache.kafka.connect.runtime.WorkerSourceTask:521) [2022-10-24 12:23:06,875] INFO WorkerSourceTask{id=logminer-source-testapps-0} Committing offsets (org.apache.kafka.connect.runtime.WorkerSourceTask:422) [2022-10-24 12:23:06,875] INFO WorkerSourceTask{id=logminer-source-testapps-0} flushing 0 outstanding messages for offset commit (org.apache.kafka.connect.runtime.WorkerSourceTask:439) [2022-10-24 12:23:16,876] INFO WorkerSourceTask{id=logminer-source-testapps-0} Committing offsets (org.apache.kafka.connect.runtime.WorkerSourceTask:422) [2022-10-24 12:23:16,876] INFO WorkerSourceTask{id=logminer-source-testapps-0} flushing 0 outstanding messages for offset commit (org.apache.kafka.connect.runtime.WorkerSourceTask:439) [2022-10-24 12:23:26,876] INFO WorkerSourceTask{id=logminer-source-testapps-0} Committing offsets (org.apache.kafka.connect.runtime.WorkerSourceTask:422) [2022-10-24 12:23:26,877] INFO WorkerSourceTask{id=logminer-source-testapps-0} flushing 0 outstanding messages for offset commit (org.apache.kafka.connect.runtime.WorkerSourceTask:439) [2022-10-24 12:23:36,877] INFO WorkerSourceTask{id=logminer-source-testapps-0} Committing offsets (org.apache.kafka.connect.runtime.WorkerSourceTask:422) [2022-10-24 12:23:36,877] INFO WorkerSourceTask{id=logminer-source-testapps-0} flushing 0 outstanding messages for offset commit (org.apache.kafka.connect.runtime.WorkerSourceTask:439) [2022-10-24 12:23:46,878] INFO WorkerSourceTask{id=logminer-source-testapps-0} Committing offsets (org.apache.kafka.connect.runtime.WorkerSourceTask:422) [2022-10-24 12:23:46,878] INFO WorkerSourceTask{id=logminer-source-testapps-0} flushing 0 outstanding messages for offset commit (org.apache.kafka.connect.runtime.WorkerSourceTask:439) [2022-10-24 12:23:56,878] INFO WorkerSourceTask{id=logminer-source-testapps-0} Committing offsets (org.apache.kafka.connect.runtime.WorkerSourceTask:422) [2022-10-24 12:23:56,879] INFO WorkerSourceTask{id=logminer-source-testapps-0} flushing 0 outstanding messages for offset commit (org.apache.kafka.connect.runtime.WorkerSourceTask:439) [2022-10-24 12:24:06,879] INFO WorkerSourceTask{id=logminer-source-testapps-0} Committing offsets (org.apache.kafka.connect.runtime.WorkerSourceTask:422) [2022-10-24 12:24:06,880] INFO WorkerSourceTask{id=logminer-source-testapps-0} flushing 0 outstanding messages for offset commit (org.apache.kafka.connect.runtime.WorkerSourceTask:439) [2022-10-24 12:24:16,880] INFO WorkerSourceTask{id=logminer-source-testapps-0} Committing offsets (org.apache.kafka.connect.runtime.WorkerSourceTask:422) [2022-10-24 12:24:16,881] INFO WorkerSourceTask{id=logminer-source-testapps-0} flushing 0 outstanding messages for offset commit (org.apache.kafka.connect.runtime.WorkerSourceTask:439) [2022-10-24 12:24:26,881] INFO WorkerSourceTask{id=logminer-source-testapps-0} Committing offsets (org.apache.kafka.connect.runtime.WorkerSourceTask:422) [2022-10-24 
12:24:26,881] INFO WorkerSourceTask{id=logminer-source-testapps-0} flushing 0 outstanding messages for offset commit (org.apache.kafka.connect.runtime.WorkerSourceTask:439) [2022-10-24 12:24:36,882] INFO WorkerSourceTask{id=logminer-source-testapps-0} Committing offsets (org.apache.kafka.connect.runtime.WorkerSourceTask:422) [2022-10-24 12:24:36,882] INFO WorkerSourceTask{id=logminer-source-testapps-0} flushing 0 outstanding messages for offset commit (org.apache.kafka.connect.runtime.WorkerSourceTask:439) [2022-10-24 12:24:46,882] INFO WorkerSourceTask{id=logminer-source-testapps-0} Committing offsets (org.apache.kafka.connect.runtime.WorkerSourceTask:422) [2022-10-24 12:24:46,883] INFO WorkerSourceTask{id=logminer-source-testapps-0} flushing 0 outstanding messages for offset commit (org.apache.kafka.connect.runtime.WorkerSourceTask:439) [2022-10-24 12:24:56,883] INFO WorkerSourceTask{id=logminer-source-testapps-0} Committing offsets (org.apache.kafka.connect.runtime.WorkerSourceTask:422) [2022-10-24 12:24:56,883] INFO WorkerSourceTask{id=logminer-source-testapps-0} flushing 0 outstanding messages for offset commit (org.apache.kafka.connect.runtime.WorkerSourceTask:439) [2022-10-24 12:25:06,884] INFO WorkerSourceTask{id=logminer-source-testapps-0} Committing offsets (org.apache.kafka.connect.runtime.WorkerSourceTask:422) [2022-10-24 12:25:06,884] INFO WorkerSourceTask{id=logminer-source-testapps-0} flushing 0 outstanding messages for offset commit (org.apache.kafka.connect.runtime.WorkerSourceTask:439) [2022-10-24 12:25:16,884] INFO WorkerSourceTask{id=logminer-source-testapps-0} Committing offsets (org.apache.kafka.connect.runtime.WorkerSourceTask:422) [2022-10-24 12:25:16,885] INFO WorkerSourceTask{id=logminer-source-testapps-0} flushing 0 outstanding messages for offset commit (org.apache.kafka.connect.runtime.WorkerSourceTask:439) [2022-10-24 12:25:26,885] INFO WorkerSourceTask{id=logminer-source-testapps-0} Committing offsets (org.apache.kafka.connect.runtime.WorkerSourceTask:422) [2022-10-24 12:25:26,885] INFO WorkerSourceTask{id=logminer-source-testapps-0} flushing 0 outstanding messages for offset commit (org.apache.kafka.connect.runtime.WorkerSourceTask:439) [2022-10-24 12:25:36,886] INFO WorkerSourceTask{id=logminer-source-testapps-0} Committing offsets (org.apache.kafka.connect.runtime.WorkerSourceTask:422) [2022-10-24 12:25:36,886] INFO WorkerSourceTask{id=logminer-source-testapps-0} flushing 0 outstanding messages for offset commit (org.apache.kafka.connect.runtime.WorkerSourceTask:439) [2022-10-24 12:25:46,886] INFO WorkerSourceTask{id=logminer-source-testapps-0} Committing offsets (org.apache.kafka.connect.runtime.WorkerSourceTask:422) [2022-10-24 12:25:46,887] INFO WorkerSourceTask{id=logminer-source-testapps-0} flushing 0 outstanding messages for offset commit (org.apache.kafka.connect.runtime.WorkerSourceTask:439) [2022-10-24 12:25:56,887] INFO WorkerSourceTask{id=logminer-source-testapps-0} Committing offsets (org.apache.kafka.connect.runtime.WorkerSourceTask:422) [2022-10-24 12:25:56,887] INFO WorkerSourceTask{id=logminer-source-testapps-0} flushing 0 outstanding messages for offset commit (org.apache.kafka.connect.runtime.WorkerSourceTask:439) [2022-10-24 12:26:06,888] INFO WorkerSourceTask{id=logminer-source-testapps-0} Committing offsets (org.apache.kafka.connect.runtime.WorkerSourceTask:422) [2022-10-24 12:26:06,888] INFO WorkerSourceTask{id=logminer-source-testapps-0} flushing 0 outstanding messages for offset commit 
(org.apache.kafka.connect.runtime.WorkerSourceTask:439) [2022-10-24 12:26:16,888] INFO WorkerSourceTask{id=logminer-source-testapps-0} Committing offsets (org.apache.kafka.connect.runtime.WorkerSourceTask:422) [2022-10-24 12:26:16,889] INFO WorkerSourceTask{id=logminer-source-testapps-0} flushing 0 outstanding messages for offset commit (org.apache.kafka.connect.runtime.WorkerSourceTask:439) [2022-10-24 12:26:26,889] INFO WorkerSourceTask{id=logminer-source-testapps-0} Committing offsets (org.apache.kafka.connect.runtime.WorkerSourceTask:422) [2022-10-24 12:26:26,890] INFO WorkerSourceTask{id=logminer-source-testapps-0} flushing 0 outstanding messages for offset commit (org.apache.kafka.connect.runtime.WorkerSourceTask:439) [2022-10-24 12:26:36,890] INFO WorkerSourceTask{id=logminer-source-testapps-0} Committing offsets (org.apache.kafka.connect.runtime.WorkerSourceTask:422) [2022-10-24 12:26:36,890] INFO WorkerSourceTask{id=logminer-source-testapps-0} flushing 0 outstanding messages for offset commit (org.apache.kafka.connect.runtime.WorkerSourceTask:439) [2022-10-24 12:26:46,891] INFO WorkerSourceTask{id=logminer-source-testapps-0} Committing offsets (org.apache.kafka.connect.runtime.WorkerSourceTask:422) [2022-10-24 12:26:46,891] INFO WorkerSourceTask{id=logminer-source-testapps-0} flushing 0 outstanding messages for offset commit (org.apache.kafka.connect.runtime.WorkerSourceTask:439) [2022-10-24 12:26:56,891] INFO WorkerSourceTask{id=logminer-source-testapps-0} Committing offsets (org.apache.kafka.connect.runtime.WorkerSourceTask:422) [2022-10-24 12:26:56,892] INFO WorkerSourceTask{id=logminer-source-testapps-0} flushing 0 outstanding messages for offset commit (org.apache.kafka.connect.runtime.WorkerSourceTask:439) [2022-10-24 12:27:06,892] INFO WorkerSourceTask{id=logminer-source-testapps-0} Committing offsets (org.apache.kafka.connect.runtime.WorkerSourceTask:422) [2022-10-24 12:27:06,893] INFO WorkerSourceTask{id=logminer-source-testapps-0} flushing 0 outstanding messages for offset commit (org.apache.kafka.connect.runtime.WorkerSourceTask:439) [2022-10-24 12:27:16,893] INFO WorkerSourceTask{id=logminer-source-testapps-0} Committing offsets (org.apache.kafka.connect.runtime.WorkerSourceTask:422) [2022-10-24 12:27:16,894] INFO WorkerSourceTask{id=logminer-source-testapps-0} flushing 0 outstanding messages for offset commit (org.apache.kafka.connect.runtime.WorkerSourceTask:439) [2022-10-24 12:27:26,894] INFO WorkerSourceTask{id=logminer-source-testapps-0} Committing offsets (org.apache.kafka.connect.runtime.WorkerSourceTask:422) [2022-10-24 12:27:26,894] INFO WorkerSourceTask{id=logminer-source-testapps-0} flushing 0 outstanding messages for offset commit (org.apache.kafka.connect.runtime.WorkerSourceTask:439) [2022-10-24 12:27:36,895] INFO WorkerSourceTask{id=logminer-source-testapps-0} Committing offsets (org.apache.kafka.connect.runtime.WorkerSourceTask:422) [2022-10-24 12:27:36,895] INFO WorkerSourceTask{id=logminer-source-testapps-0} flushing 0 outstanding messages for offset commit (org.apache.kafka.connect.runtime.WorkerSourceTask:439) [2022-10-24 12:27:46,895] INFO WorkerSourceTask{id=logminer-source-testapps-0} Committing offsets (org.apache.kafka.connect.runtime.WorkerSourceTask:422) [2022-10-24 12:27:46,896] INFO WorkerSourceTask{id=logminer-source-testapps-0} flushing 0 outstanding messages for offset commit (org.apache.kafka.connect.runtime.WorkerSourceTask:439)

averemee-si commented 1 year ago

Hello,

According to the log provided, there is no data in V$ARCHIVED_LOG: I don't see any reads from archived redo. oracdc works with data from archived logs only. To test, please execute the following as SYSDBA after the commit:

alter system switch logfile
/

and retest the issue.
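If it helps, a quick way to confirm that the switch produced archived redo that the connector can see is to query V$ARCHIVED_LOG directly (this is the same view oracdc reads at startup; the query below is only an illustrative check):

-- each row is an archived redo log; the test inserts must fall inside
-- some FIRST_CHANGE# .. NEXT_CHANGE# range to be visible to LogMiner
select SEQUENCE#, NAME, FIRST_CHANGE#, NEXT_CHANGE#, STATUS
  from V$ARCHIVED_LOG
 order by SEQUENCE#;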

Hope this helps.

Best regards, Aleksei

Gogo-scc commented 1 year ago

Thanks for the reply! I get the following error message when I execute “alter system switch logfile”:

[2022-10-26 15:40:42,078] INFO WorkerSourceTask{id=logminer-source-testapps-0} flushing 0 outstanding messages for offset commit (org.apache.kafka.connect.runtime.WorkerSourceTask:439)
[2022-10-26 15:40:52,078] INFO WorkerSourceTask{id=logminer-source-testapps-0} Committing offsets (org.apache.kafka.connect.runtime.WorkerSourceTask:422)
[2022-10-26 15:40:52,078] INFO WorkerSourceTask{id=logminer-source-testapps-0} flushing 0 outstanding messages for offset commit (org.apache.kafka.connect.runtime.WorkerSourceTask:439)
[2022-10-26 15:40:55,193] INFO Adding archived log /home/oracle/archive_log/1_1_1119103460.dbf thread# 1 sequence# 1 first change number 925702 next log first change 940365 (solutions.a2.cdc.oracle.OraCdcV$ArchivedLogImpl:201)
[2022-10-26 15:40:55,459] ERROR Data dictionary corruption for OBJECT_ID '87343' (solutions.a2.cdc.oracle.OraCdcLogMinerWorkerThread:656)
[2022-10-26 15:40:55,459] ERROR Data dictionary corruption! (solutions.a2.cdc.oracle.OraCdcLogMinerWorkerThread:756)
[2022-10-26 15:40:55,459] ERROR SQL errorCode = 0, SQL state = 'null' (solutions.a2.cdc.oracle.OraCdcLogMinerWorkerThread:759)
[2022-10-26 15:40:55,459] ERROR Last read row information: SCN=928752, RS_ID=' 0x000001.000012e5.014c ', SSN=0, XID='0100210089020000' (solutions.a2.cdc.oracle.OraCdcLogMinerWorkerThread:762)
[2022-10-26 15:40:55,459] ERROR Current query is: select SCN, TIMESTAMP, OPERATION_CODE, XID, RS_ID, SSN, CSF, ROW_ID, DATA_OBJ#, DATA_OBJD#, SQL_REDO from V$LOGMNR_CONTENTS where ((OPERATION_CODE in (1,2,3,5,9,68,70) and (DATA_OBJ# in (87350))) or OPERATION_CODE in (7,36)) or (OPERATION_CODE=0 and DATA_OBJ#=DATA_OBJD# and DATA_OBJ#!=0) (solutions.a2.cdc.oracle.OraCdcLogMinerWorkerThread:765)
[2022-10-26 15:40:55,460] ERROR Data dictionary corruption! java.sql.SQLException: Data dictionary corruption! at solutions.a2.cdc.oracle.OraCdcLogMinerWorkerThread.run(OraCdcLogMinerWorkerThread.java:659) (solutions.a2.cdc.oracle.OraCdcLogMinerWorkerThread:767)
[2022-10-26 15:40:55,460] INFO Stopping oracdc logminer source task. (solutions.a2.cdc.oracle.OraCdcLogMinerTask:988)
Exception in thread "OraCdcLogMinerWorkerThread-4746304871752589" org.apache.kafka.connect.errors.ConnectException: java.sql.SQLException: Data dictionary corruption!
    at solutions.a2.cdc.oracle.OraCdcLogMinerWorkerThread.run(OraCdcLogMinerWorkerThread.java:773)
Caused by: java.sql.SQLException: Data dictionary corruption!
    at solutions.a2.cdc.oracle.OraCdcLogMinerWorkerThread.run(OraCdcLogMinerWorkerThread.java:659)
[2022-10-26 15:41:02,079] INFO WorkerSourceTask{id=logminer-source-testapps-0} Committing offsets (org.apache.kafka.connect.runtime.WorkerSourceTask:422)
[2022-10-26 15:41:02,079] INFO WorkerSourceTask{id=logminer-source-testapps-0} flushing 0 outstanding messages for offset commit (org.apache.kafka.connect.runtime.WorkerSourceTask:439)

averemee-si commented 1 year ago

Hello,

Please send me the full connector log (connect.log) and the output of:

select OBJECT_NAME,OBJECT_TYPE from DBA_OBJECTS where OBJECT_ID=87350;

Best regards, Aleksei

Gogo-scc commented 1 year ago

Thanks for the reply! Here is my new test:

I restarted the connector again:

[2022-10-31 14:31:49,460] INFO Took 11 ms to add mapping for /kafka/tempdir/ABC.TEST.7394427838691497766/metadata.cq4t (net.openhft.chronicle.bytes.MappedFile:56)
[2022-10-31 14:31:49,518] INFO Running under OpenJDK Runtime Environment 11.0.8+10-LTS with 4 processors reported. (net.openhft.chronicle.core.internal.announcer.InternalAnnouncer:56)
[2022-10-31 14:31:49,520] INFO Process id: 25691 :: Chronicle Queue (5.21.99) (net.openhft.chronicle.core.internal.announcer.InternalAnnouncer:56)
[2022-10-31 14:31:49,521] INFO Analytics: Chronicle Queue reports usage statistics. Learn more or turn off: https://github.com/OpenHFT/Chronicle-Queue/blob/master/DISCLAIMER.adoc (net.openhft.chronicle.core.internal.announcer.InternalAnnouncer:56)
[2022-10-31 14:31:49,537] INFO Table ABC.TEST (DEPENDENCY='DISABLED') initial load (read phase) started. (solutions.a2.cdc.oracle.OraTable4InitialLoad:582)
[2022-10-31 14:31:49,549] INFO Took 2.361 ms to pollDiskSpace for /kafka/tempdir/ABC.TEST.7394427838691497766 (net.openhft.chronicle.threads.DiskSpaceMonitor:56)
[2022-10-31 14:31:49,559] INFO Table ABC.TEST initial load (read phase) completed. 3 rows read. (solutions.a2.cdc.oracle.OraTable4InitialLoad:597)
[2022-10-31 14:31:49,560] INFO END: OraCdcInitialLoadThread.run(), elapsed time 204 ms (solutions.a2.cdc.oracle.OraCdcInitialLoadThread:117)
[2022-10-31 14:31:49,638] ERROR Data dictionary corruption for OBJECT_ID '87343' (solutions.a2.cdc.oracle.OraCdcLogMinerWorkerThread:656)
[2022-10-31 14:31:49,638] ERROR Data dictionary corruption! (solutions.a2.cdc.oracle.OraCdcLogMinerWorkerThread:756)
[2022-10-31 14:31:49,638] ERROR SQL errorCode = 0, SQL state = 'null' (solutions.a2.cdc.oracle.OraCdcLogMinerWorkerThread:759)
[2022-10-31 14:31:49,638] ERROR Last read row information: SCN=928752, RS_ID=' 0x000001.000012e5.014c ', SSN=0, XID='0100210089020000' (solutions.a2.cdc.oracle.OraCdcLogMinerWorkerThread:762)
[2022-10-31 14:31:49,638] ERROR Current query is: select SCN, TIMESTAMP, OPERATION_CODE, XID, RS_ID, SSN, CSF, ROW_ID, DATA_OBJ#, DATA_OBJD#, SQL_REDO from V$LOGMNR_CONTENTS where ((OPERATION_CODE in (1,2,3,5,9,68,70) and (DATA_OBJ# in (87351))) or OPERATION_CODE in (7,36)) or (OPERATION_CODE=0 and DATA_OBJ#=DATA_OBJD# and DATA_OBJ#!=0) (solutions.a2.cdc.oracle.OraCdcLogMinerWorkerThread:765)
[2022-10-31 14:31:49,639] ERROR Data dictionary corruption! java.sql.SQLException: Data dictionary corruption! at solutions.a2.cdc.oracle.OraCdcLogMinerWorkerThread.run(OraCdcLogMinerWorkerThread.java:659) (solutions.a2.cdc.oracle.OraCdcLogMinerWorkerThread:767)
[2022-10-31 14:31:49,639] INFO Stopping oracdc logminer source task. (solutions.a2.cdc.oracle.OraCdcLogMinerTask:988)
Exception in thread "OraCdcLogMinerWorkerThread-5174421251908545" org.apache.kafka.connect.errors.ConnectException: java.sql.SQLException: Data dictionary corruption!
    at solutions.a2.cdc.oracle.OraCdcLogMinerWorkerThread.run(OraCdcLogMinerWorkerThread.java:773)
Caused by: java.sql.SQLException: Data dictionary corruption!
    at solutions.a2.cdc.oracle.OraCdcLogMinerWorkerThread.run(OraCdcLogMinerWorkerThread.java:659)
[2022-10-31 14:31:58,399] INFO WorkerSourceTask{id=logminer-source-testapps-0} Committing offsets (org.apache.kafka.connect.runtime.WorkerSourceTask:422)
[2022-10-31 14:31:58,400] INFO WorkerSourceTask{id=logminer-source-testapps-0} flushing 0 outstanding messages for offset commit (org.apache.kafka.connect.runtime.WorkerSourceTask:439)
[2022-10-31 14:32:08,401] INFO WorkerSourceTask{id=logminer-source-testapps-0} Committing offsets (org.apache.kafka.connect.runtime.WorkerSourceTask:422)

The SQL output:

select OBJECT_NAME,OBJECT_TYPE from DBA_OBJECTS where OBJECT_ID=87350;

OBJECT_NAME  OBJECT_TYPE
(no rows returned)

select OBJECT_NAME,OBJECT_TYPE from DBA_OBJECTS where OBJECT_ID=87351;

OBJECT_NAME  OBJECT_TYPE
TEST         TABLE

averemee-si commented 1 year ago

Hello,

Thanks for the update. The complete connector log is required, not a fragment of it. Or at least: 1) the connector parameters, 2) the connector version.

Best regards, Aleksei

averemee-si commented 1 year ago

select OBJECT_NAME,OBJECT_TYPE from DBA_OBJECTS where OBJECT_ID=87350; OBJECT_NAME OBJECT_TYPE

No rows means that the restart information (stored in files or in a Kafka topic, depending on configuration) contains a reference to OBJECT_ID 87350, which does not exist in the database.
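Since the connector's V$LOGMNR_CONTENTS query shown in the log filters on DATA_OBJ#/DATA_OBJD#, it can also be worth checking both OBJECT_ID and DATA_OBJECT_ID in DBA_OBJECTS (they usually match, but can diverge after operations such as TRUNCATE or ALTER TABLE ... MOVE). An illustrative lookup for the ids mentioned in this thread:

-- lists every object that still owns one of these ids, either as its
-- dictionary OBJECT_ID or as its physical DATA_OBJECT_ID
select OBJECT_ID, DATA_OBJECT_ID, OWNER, OBJECT_NAME, OBJECT_TYPE
  from DBA_OBJECTS
 where OBJECT_ID in (87343, 87350, 87351)
    or DATA_OBJECT_ID in (87343, 87350, 87351);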

Best regards, Aleksei

Gogo-scc commented 1 year ago

OK! Thanks, averemee. Yes, I restarted Kafka Connect.

1. The connector parameters are as follows. Are these settings correct?

root@kafka2 confluent-5.4.8]# cat etc/kafka/logminer-source-testapps.properties 
name=logminer-source-testapps
connector.class=solutions.a2.cdc.oracle.OraCdcLogMinerConnector
tasks.max=1

errors.tolerance=none
errors.log.enable=true
errors.log.include.messages=true

a2.poll.interval=2000
a2.batch.size=1
a2.process.lobs=true
a2.schema.type=kafka
wla2.redo.size=20000
#a2.resiliency.type=fault-tolerant
a2.jdbc.url=jdbc:oracle:thin:@172.18.6.33:1521/orcl1
a2.jdbc.username=abc
a2.jdbc.password=123456
a2.tmpdir=/kafka/tempdir
a2.persistent.state.file=/kafka/tempdir/oracdc.state
#a2.wallet.location=/kafka/wallets
#a2.tns.admin=/lib/oracle/19.9/client64/lib/network/admin
#a2.tns.alias=ORCL1-33
a2.topic.name.delimiter=-
a2.topic.name.style=SCHEMA_TABLE
a2.include=ABC.TEST
a2.initial.load=EXECUTE

2. I tested with connector version 1.2.2:


[root@kafka2 confluent-5.4.8]# ls ../oracdc/target/oracdc-kafka-1.2.2.jar 
../oracdc/target/oracdc-kafka-1.2.2.jar

averemee-si commented 1 year ago

Thanks!

Please upload the content of /kafka/tempdir/oracdc.state. Also, if you like, we can schedule a Google Meet session (I'm located in the CET time zone) to speed up resolution of this issue.

Best regards, Aleksei

Gogo-scc commented 1 year ago

Hi averemee, I am sorry! I wanted to reset the Kafka environment, so I cleared this directory. Now there is no oracdc.state file when I start Kafka Connect again. Can you tell me how to reset this environment? I can then do a whole new test.

    [root@kafka2 confluent-5.4.8]# ls /kafka/tempdir/
ABC.TEST.14359795331171046859

averemee-si commented 1 year ago

Hello,

A fix for this issue will be ready soon.

Best regards, Aleksei

averemee-si commented 1 year ago

Hello,

I've replaced the exception with debug information about the LOB object missing from the data dictionary in https://github.com/averemee-si/oracdc/commit/43a978cc63428bf690763ebb26faa9a71b6aef41. Please rebuild the connector using the latest code and retest the issue.
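Since that commit concerns a LOB object missing from the data dictionary and a2.process.lobs=true is set in the configuration above, a query along these lines (illustrative only; it uses the ABC schema from this thread and may return no rows if the schema has no LOB columns) lists the LOB segments the connector could encounter, together with their object ids:

-- LOB segments appear in DBA_OBJECTS with OBJECT_TYPE = 'LOB';
-- joining to DBA_LOBS shows which table/column each segment belongs to
select O.OBJECT_ID, O.DATA_OBJECT_ID, L.TABLE_NAME, L.COLUMN_NAME, L.SEGMENT_NAME
  from DBA_LOBS L
  join DBA_OBJECTS O
    on O.OWNER = L.OWNER
   and O.OBJECT_NAME = L.SEGMENT_NAME
   and O.OBJECT_TYPE = 'LOB'
 where L.OWNER = 'ABC';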

Best regards, Aleksei

Gogo-scc commented 1 year ago

Hi averemee, thanks for the quick fix. I rebuilt the connector using the latest code, but there are no files under my /kafka/tempdir/. The connector log output:

[2022-11-23 18:09:12,039] INFO Connector logminer-source-testapps connected to Oracle Database 11g Enterprise Edition , 11.2.0.4.0
    $ORACLE_SID=orcl1, running on orcl1, OS Linux x86 64-bit. (solutions.a2.cdc.oracle.OraCdcLogMinerTask:247)
[2022-11-23 18:09:12,078] INFO Initial load set to COMPLETED (value from offset) (solutions.a2.cdc.oracle.OraCdcLogMinerTask:491)
[2022-11-23 18:09:12,078] INFO Point in time from offset data to start reading reading from SCN=4732683, RS_ID (RBA)=' 0x000014.0004b3b8.0010 ', SSN=0 (solutions.a2.cdc.oracle.OraCdcLogMinerTask:516)
[2022-11-23 18:09:12,088] INFO Initializing oracdc logminer archivelog worker thread (solutions.a2.cdc.oracle.OraCdcLogMinerWorkerThread:138)
[2022-11-23 18:09:12,144] INFO LogMiner will start from SCN 4732683 (solutions.a2.cdc.oracle.OraCdcV$ArchivedLogImpl:105)
[2022-11-23 18:09:12,144] INFO Mining database orcl1 is in OPEN mode (solutions.a2.cdc.oracle.OraCdcLogMinerWorkerThread:250)
[2022-11-23 18:09:12,144] INFO Same database will be used for dictionary query and mining (solutions.a2.cdc.oracle.OraCdcLogMinerWorkerThread:252)
[2022-11-23 18:09:12,144] INFO RowPrefetch size for accessing V$LOGMNR_CONTENTS set to 32. (solutions.a2.cdc.oracle.OraCdcLogMinerWorkerThread:270)
[2022-11-23 18:09:12,149] INFO Adding archived log /home/oracle/archive_log/1_20_1119103460.dbf thread# 1 sequence# 20 first change number 4621919 next log first change 4811177 (solutions.a2.cdc.oracle.OraCdcV$ArchivedLogImpl:201)
[2022-11-23 18:09:12,155] INFO Rewinding LogMiner ResultSet to first position after SCN = 4732683, RS_ID = ' 0x000014.0004b3b8.0010 ', SSN = 0. (solutions.a2.cdc.oracle.OraCdcLogMinerWorkerThread:291)
[2022-11-23 18:09:12,424] INFO Total records scipped while rewinding: 1, elapsed time ms: 0 (solutions.a2.cdc.oracle.OraCdcLogMinerWorkerThread:329)
[2022-11-23 18:09:12,426] INFO BEGIN: OraCdcLogMinerWorkerThread.run() (solutions.a2.cdc.oracle.OraCdcLogMinerWorkerThread:338)
[2022-11-23 18:09:12,427] INFO WorkerSourceTask{id=logminer-source-testapps-0} Source task finished initialization and start (org.apache.kafka.connect.runtime.WorkerSourceTask:215)
[2022-11-23 18:09:20,302] INFO Adding archived log /home/oracle/archive_log/1_21_1119103460.dbf thread# 1 sequence# 21 first change number 4811177 next log first change 5064790 (solutions.a2.cdc.oracle.OraCdcV$ArchivedLogImpl:201)
[2022-11-23 18:09:21,185] INFO WorkerSourceTask{id=logminer-source-testapps-0} Committing offsets (org.apache.kafka.connect.runtime.WorkerSourceTask:422)
[2022-11-23 18:09:21,185] INFO WorkerSourceTask{id=logminer-source-testapps-0} flushing 0 outstanding messages for offset commit (org.apache.kafka.connect.runtime.WorkerSourceTask:439)
[2022-11-23 18:09:31,186] INFO WorkerSourceTask{id=logminer-source-testapps-0} Committing offsets (org.apache.kafka.connect.runtime.WorkerSourceTask:422)
[2022-11-23 18:09:31,186] INFO WorkerSourceTask{id=logminer-source-testapps-0} flushing 0 outstanding messages for offset commit (org.apache.kafka.connect.runtime.WorkerSourceTask:439)
[2022-11-23 18:09:40,680] INFO Adding archived log /home/oracle/archive_log/1_22_1119103460.dbf thread# 1 sequence# 22 first change number 5064790 next log first change 5254983 (solutions.a2.cdc.oracle.OraCdcV$ArchivedLogImpl:201)
[2022-11-23 18:09:41,186] INFO WorkerSourceTask{id=logminer-source-testapps-0} Committing offsets (org.apache.kafka.connect.runtime.WorkerSourceTask:422)
[2022-11-23 18:09:41,187] INFO WorkerSourceTask{id=logminer-source-testapps-0} flushing 0 outstanding messages for offset commit (org.apache.kafka.connect.runtime.WorkerSourceTask:439)
^C[2022-11-23 18:09:49,059] INFO Kafka Connect stopping (org.apache.kafka.connect.runtime.Connect:66)
[2022-11-23 18:09:49,060] INFO Stopping REST server (org.apache.kafka.connect.runtime.rest.RestServer:321)
[2022-11-23 18:09:49,068] INFO Stopped http_8083@58e85c6f{HTTP/1.1, (http/1.1)}{0.0.0.0:8083} (org.eclipse.jetty.server.AbstractConnector:381)
[2022-11-23 18:09:49,068] INFO node0 Stopped scavenging (org.eclipse.jetty.server.session:149)
[2022-11-23 18:09:49,071] INFO REST server stopped (org.apache.kafka.connect.runtime.rest.RestServer:338)
[2022-11-23 18:09:49,071] INFO Herder stopping (org.apache.kafka.connect.runtime.standalone.StandaloneHerder:100)
[2022-11-23 18:09:49,071] INFO Stopping task logminer-source-testapps-0 (org.apache.kafka.connect.runtime.Worker:706)
[2022-11-23 18:09:49,072] INFO Stopping oracdc logminer source task. (solutions.a2.cdc.oracle.OraCdcLogMinerTask:988)
[2022-11-23 18:09:49,072] INFO Stopping oracdc logminer archivelog worker thread... (solutions.a2.cdc.oracle.OraCdcLogMinerWorkerThread:798)
[2022-11-23 18:09:49,097] INFO END: OraCdcLogMinerWorkerThread.run() (solutions.a2.cdc.oracle.OraCdcLogMinerWorkerThread:778)
[2022-11-23 18:09:50,447] WARN Caught SQLException oracle.ucp.UniversalConnectionPoolException: Invalid life cycle state. Check the status of the Universal Connection Pool while stopping oracdc task. (solutions.a2.cdc.oracle.OraCdcLogMinerTask:967)
[2022-11-23 18:09:50,447] INFO WorkerSourceTask{id=logminer-source-testapps-0} Committing offsets (org.apache.kafka.connect.runtime.WorkerSourceTask:422)
[2022-11-23 18:09:50,447] INFO WorkerSourceTask{id=logminer-source-testapps-0} flushing 0 outstanding messages for offset commit (org.apache.kafka.connect.runtime.WorkerSourceTask:439)
[2022-11-23 18:09:50,448] INFO [Producer clientId=connector-producer-logminer-source-testapps-0] Closing the Kafka producer with timeoutMillis = 30000 ms. (org.apache.kafka.clients.producer.KafkaProducer:1183)
[2022-11-23 18:09:50,455] INFO Stopping connector logminer-source-testapps (org.apache.kafka.connect.runtime.Worker:360)
[2022-11-23 18:09:50,456] INFO Stopping oracdc logminer source connector (solutions.a2.cdc.oracle.OraCdcLogMinerConnector:243)
[2022-11-23 18:09:50,456] INFO Stopped connector logminer-source-testapps (org.apache.kafka.connect.runtime.Worker:376)
[2022-11-23 18:09:50,456] INFO Worker stopping (org.apache.kafka.connect.runtime.Worker:198)
[2022-11-23 18:09:50,457] INFO Stopped FileOffsetBackingStore (org.apache.kafka.connect.storage.FileOffsetBackingStore:66)
[2022-11-23 18:09:50,457] INFO Worker stopped (org.apache.kafka.connect.runtime.Worker:219)
[2022-11-23 18:09:50,458] INFO Herder stopped (org.apache.kafka.connect.runtime.standalone.StandaloneHerder:117)
[2022-11-23 18:09:50,458] INFO Kafka Connect stopped (org.apache.kafka.connect.runtime.Connect:71)
[root@kafka2 confluent-5.4.8]# bin/connect-standalone etc/kafka/connect-standalone.properties etc/kafka/logminer-source-testapps.properties 
[2022-11-23 18:10:17,309] INFO Kafka Connect standalone worker initializing ... (org.apache.kafka.connect.cli.ConnectStandalone:69)
[2022-11-23 18:10:17,317] INFO WorkerInfo values: 
    jvm.args = -Xms256M, -Xmx2G, -XX:+UseG1GC, -XX:MaxGCPauseMillis=20, -XX:InitiatingHeapOccupancyPercent=35, -XX:+ExplicitGCInvokesConcurrent, -Djava.awt.headless=true, -Dcom.sun.management.jmxremote, -Dcom.sun.management.jmxremote.authenticate=false, -Dcom.sun.management.jmxremote.ssl=false, -Dkafka.logs.dir=/home/confluent-5.4.8/bin/../logs, -Dlog4j.configuration=file:bin/../etc/kafka/connect-log4j.properties
    jvm.spec = Oracle Corporation, OpenJDK 64-Bit Server VM, 11.0.8, 11.0.8+10-LTS
    jvm.classpath = .:/etc/alternatives/java_sdk_11_openjdk/lib:/etc/alternatives/java_sdk_11_openjdk/jre/lib::/home/confluent-5.4.8/share/java/kafka/support-metrics-client-5.4.8-ccs.jar:/home/confluent-5.4.8/share/java/kafka/connect-mirror-client-5.4.8-ccs.jar:/home/confluent-5.4.8/share/java/kafka/commons-compress-1.21.jar:/home/confluent-5.4.8/share/java/kafka/connect-api-5.4.8-ccs.jar:/home/confluent-5.4.8/share/java/kafka/kafka-clients-5.4.8-ccs.jar:/home/confluent-5.4.8/share/java/kafka/jetty-http-9.4.44.v20210927.jar:/home/confluent-5.4.8/share/java/kafka/jakarta.inject-2.6.1.jar:/home/confluent-5.4.8/share/java/kafka/jackson-jaxrs-base-2.13.2.jar:/home/confluent-5.4.8/share/java/kafka/audience-annotations-0.5.0.jar:/home/confluent-5.4.8/share/java/kafka/kafka-streams-examples-5.4.8-ccs.jar:/home/confluent-5.4.8/share/java/kafka/kafka-log4j-appender-5.4.8-ccs.jar:/home/confluent-5.4.8/share/java/kafka/netty-buffer-4.1.73.Final.jar:/home/confluent-5.4.8/share/java/kafka/slf4j-log4j12-1.7.28.jar:/home/confluent-5.4.8/share/java/kafka/zookeeper-jute-3.5.9.jar:/home/confluent-5.4.8/share/java/kafka/connect-transforms-5.4.8-ccs.jar:/home/confluent-5.4.8/share/java/kafka/javassist-3.25.0-GA.jar:/home/confluent-5.4.8/share/java/kafka/jetty-servlet-9.4.44.v20210927.jar:/home/confluent-5.4.8/share/java/kafka/netty-handler-4.1.73.Final.jar:/home/confluent-5.4.8/share/java/kafka/javassist-3.26.0-GA.jar:/home/confluent-5.4.8/share/java/kafka/jetty-util-ajax-9.4.44.v20210927.jar:/home/confluent-5.4.8/share/java/kafka/kafka-streams-5.4.8-ccs.jar:/home/confluent-5.4.8/share/java/kafka/rocksdbjni-5.18.3.jar:/home/confluent-5.4.8/share/java/kafka/commons-codec-1.15.jar:/home/confluent-5.4.8/share/java/kafka/kafka_2.12-5.4.8-ccs-test-sources.jar:/home/confluent-5.4.8/share/java/kafka/commons-logging-1.2.jar:/home/confluent-5.4.8/share/java/kafka/jakarta.ws.rs-api-2.1.6.jar:/home/confluent-5.4.8/share/java/kafka/kafka-tools-5.4.8-ccs.jar:/home/confluent-5.4.8/share/java/kafka/jackson-annotations-2.13.2.jar:/home/confluent-5.4.8/share/java/kafka/zookeeper-3.5.9.jar:/home/confluent-5.4.8/share/java/kafka/commons-cli-1.4.jar:/home/confluent-5.4.8/share/java/kafka/jetty-security-9.4.44.v20210927.jar:/home/confluent-5.4.8/share/java/kafka/jackson-databind-2.13.2.2.jar:/home/confluent-5.4.8/share/java/kafka/connect-mirror-5.4.8-ccs.jar:/home/confluent-5.4.8/share/java/kafka/kafka_2.12-5.4.8-ccs-test.jar:/home/confluent-5.4.8/share/java/kafka/netty-transport-4.1.73.Final.jar:/home/confluent-5.4.8/share/java/kafka/confluent-log4j-1.2.17-cp2.2.jar:/home/confluent-5.4.8/share/java/kafka/jakarta.validation-api-2.0.2.jar:/home/confluent-5.4.8/share/java/kafka/jakarta.xml.bind-api-2.3.3.jar:/home/confluent-5.4.8/share/java/kafka/netty-common-4.1.73.Final.jar:/home/confluent-5.4.8/share/java/kafka/jakarta.annotation-api-1.3.5.jar:/home/confluent-5.4.8/share/java/kafka/kafka-streams-scala_2.12-5.4.8-ccs.jar:/home/confluent-5.4.8/share/java/kafka/jersey-server-2.34.jar:/home/confluent-5.4.8/share/java/kafka/hk2-utils-2.6.1.jar:/home/confluent-5.4.8/share/java/kafka/jackson-datatype-jdk8-2.13.2.jar:/home/confluent-5.4.8/share/java/kafka/netty-codec-4.1.73.Final.jar:/home/confluent-5.4.8/share/java/kafka/jackson-jaxrs-json-provider-2.13.2.jar:/home/confluent-5.4.8/share/java/kafka/jetty-client-9.4.44.v20210927.jar:/home/confluent-5.4.8/share/java/kafka/maven-artifact-3.8.1.jar:/home/confluent-5.4.8/share/java/kafka/scala-library-2.12.10.jar:/home/confluent-5.4.8/share/java/kafka/netty-transport-native-unix-common-4.1.73
.Final.jar:/home/confluent-5.4.8/share/java/kafka/metrics-core-2.2.0.jar:/home/confluent-5.4.8/share/java/kafka/httpcore-4.4.13.jar:/home/confluent-5.4.8/share/java/kafka/jackson-module-scala_2.12-2.13.2.jar:/home/confluent-5.4.8/share/java/kafka/scala-java8-compat_2.12-0.9.0.jar:/home/confluent-5.4.8/share/java/kafka/kafka-streams-test-utils-5.4.8-ccs.jar:/home/confluent-5.4.8/share/java/kafka/httpclient-4.5.13.jar:/home/confluent-5.4.8/share/java/kafka/jetty-servlets-9.4.44.v20210927.jar:/home/confluent-5.4.8/share/java/kafka/reflections-0.9.12.jar:/home/confluent-5.4.8/share/java/kafka/jetty-server-9.4.44.v20210927.jar:/home/confluent-5.4.8/share/java/kafka/jackson-module-jaxb-annotations-2.13.2.jar:/home/confluent-5.4.8/share/java/kafka/kafka_2.12-5.4.8-ccs.jar:/home/confluent-5.4.8/share/java/kafka/argparse4j-0.7.0.jar:/home/confluent-5.4.8/share/java/kafka/activation-1.1.1.jar:/home/confluent-5.4.8/share/java/kafka/jetty-util-9.4.44.v20210927.jar:/home/confluent-5.4.8/share/java/kafka/connect-json-5.4.8-ccs.jar:/home/confluent-5.4.8/share/java/kafka/support-metrics-common-5.4.8-ccs.jar:/home/confluent-5.4.8/share/java/kafka/kafka_2.12-5.4.8-ccs-javadoc.jar:/home/confluent-5.4.8/share/java/kafka/jakarta.activation-api-1.2.2.jar:/home/confluent-5.4.8/share/java/kafka/jetty-io-9.4.44.v20210927.jar:/home/confluent-5.4.8/share/java/kafka/jersey-container-servlet-2.34.jar:/home/confluent-5.4.8/share/java/kafka/kafka_2.12-5.4.8-ccs-scaladoc.jar:/home/confluent-5.4.8/share/java/kafka/zstd-jni-1.4.3-1.jar:/home/confluent-5.4.8/share/java/kafka/jersey-container-servlet-core-2.34.jar:/home/confluent-5.4.8/share/java/kafka/netty-transport-native-epoll-4.1.73.Final.jar:/home/confluent-5.4.8/share/java/kafka/httpmime-4.5.13.jar:/home/confluent-5.4.8/share/java/kafka/connect-runtime-5.4.8-ccs.jar:/home/confluent-5.4.8/share/java/kafka/scala-reflect-2.12.10.jar:/home/confluent-5.4.8/share/java/kafka/jaxb-api-2.3.0.jar:/home/confluent-5.4.8/share/java/kafka/jersey-hk2-2.34.jar:/home/confluent-5.4.8/share/java/kafka/javax.servlet-api-3.1.0.jar:/home/confluent-5.4.8/share/java/kafka/jetty-continuation-9.4.44.v20210927.jar:/home/confluent-5.4.8/share/java/kafka/commons-lang3-3.8.1.jar:/home/confluent-5.4.8/share/java/kafka/plexus-utils-3.2.1.jar:/home/confluent-5.4.8/share/java/kafka/jackson-dataformat-csv-2.13.2.jar:/home/confluent-5.4.8/share/java/kafka/netty-tcnative-classes-2.0.46.Final.jar:/home/confluent-5.4.8/share/java/kafka/javax.ws.rs-api-2.1.1.jar:/home/confluent-5.4.8/share/java/kafka/jersey-common-2.34.jar:/home/confluent-5.4.8/share/java/kafka/scala-logging_2.12-3.9.2.jar:/home/confluent-5.4.8/share/java/kafka/connect-basic-auth-extension-5.4.8-ccs.jar:/home/confluent-5.4.8/share/java/kafka/kafka.jar:/home/confluent-5.4.8/share/java/kafka/jackson-core-2.13.2.jar:/home/confluent-5.4.8/share/java/kafka/osgi-resource-locator-1.0.3.jar:/home/confluent-5.4.8/share/java/kafka/aopalliance-repackaged-2.6.1.jar:/home/confluent-5.4.8/share/java/kafka/hk2-locator-2.6.1.jar:/home/confluent-5.4.8/share/java/kafka/netty-resolver-4.1.73.Final.jar:/home/confluent-5.4.8/share/java/kafka/jopt-simple-5.0.4.jar:/home/confluent-5.4.8/share/java/kafka/lz4-java-1.6.0.jar:/home/confluent-5.4.8/share/java/kafka/snappy-java-1.1.7.3.jar:/home/confluent-5.4.8/share/java/kafka/netty-transport-classes-epoll-4.1.73.Final.jar:/home/confluent-5.4.8/share/java/kafka/hk2-api-2.6.1.jar:/home/confluent-5.4.8/share/java/kafka/paranamer-2.8.jar:/home/confluent-5.4.8/share/java/kafka/slf4j-api-1.7.28.jar:/home/confluent-5.4.8/sha
re/java/kafka/kafka_2.12-5.4.8-ccs-sources.jar:/home/confluent-5.4.8/share/java/kafka/avro-1.9.2.jar:/home/confluent-5.4.8/share/java/kafka/jersey-client-2.34.jar:/home/confluent-5.4.8/share/java/kafka/scala-collection-compat_2.12-2.1.2.jar:/home/confluent-5.4.8/share/java/confluent-common/common-metrics-5.4.8.jar:/home/confluent-5.4.8/share/java/confluent-common/common-utils-5.4.8.jar:/home/confluent-5.4.8/share/java/confluent-common/build-tools-5.4.8.jar:/home/confluent-5.4.8/share/java/confluent-common/common-config-5.4.8.jar:/home/confluent-5.4.8/share/java/confluent-common/slf4j-api-1.7.26.jar:/home/confluent-5.4.8/share/java/kafka-serde-tools/failureaccess-1.0.1.jar:/home/confluent-5.4.8/share/java/kafka-serde-tools/commons-compress-1.21.jar:/home/confluent-5.4.8/share/java/kafka-serde-tools/jackson-dataformat-yaml-2.13.2.jar:/home/confluent-5.4.8/share/java/kafka-serde-tools/kafka-avro-serializer-5.4.8.jar:/home/confluent-5.4.8/share/java/kafka-serde-tools/swagger-models-1.6.2.jar:/home/confluent-5.4.8/share/java/kafka-serde-tools/snakeyaml-1.27.jar:/home/confluent-5.4.8/share/java/kafka-serde-tools/jackson-annotations-2.13.2.jar:/home/confluent-5.4.8/share/java/kafka-serde-tools/jackson-databind-2.13.2.2.jar:/home/confluent-5.4.8/share/java/kafka-serde-tools/j2objc-annotations-1.3.jar:/home/confluent-5.4.8/share/java/kafka-serde-tools/guava-30.1.1-jre.jar:/home/confluent-5.4.8/share/java/kafka-serde-tools/swagger-annotations-1.5.22.jar:/home/confluent-5.4.8/share/java/kafka-serde-tools/listenablefuture-9999.0-empty-to-avoid-conflict-with-guava.jar:/home/confluent-5.4.8/share/java/kafka-serde-tools/kafka-schema-registry-client-5.4.8.jar:/home/confluent-5.4.8/share/java/kafka-serde-tools/checker-qual-3.8.0.jar:/home/confluent-5.4.8/share/java/kafka-serde-tools/jsr305-3.0.2.jar:/home/confluent-5.4.8/share/java/kafka-serde-tools/commons-lang3-3.8.1.jar:/home/confluent-5.4.8/share/java/kafka-serde-tools/swagger-core-1.6.2.jar:/home/confluent-5.4.8/share/java/kafka-serde-tools/kafka-connect-avro-converter-5.4.8.jar:/home/confluent-5.4.8/share/java/kafka-serde-tools/jackson-core-2.13.2.jar:/home/confluent-5.4.8/share/java/kafka-serde-tools/kafka-streams-avro-serde-5.4.8.jar:/home/confluent-5.4.8/share/java/kafka-serde-tools/kafka-json-serializer-5.4.8.jar:/home/confluent-5.4.8/share/java/kafka-serde-tools/error_prone_annotations-2.5.1.jar:/home/confluent-5.4.8/share/java/kafka-serde-tools/avro-1.9.2.jar:/home/confluent-5.4.8/bin/../share/java/kafka/support-metrics-client-5.4.8-ccs.jar:/home/confluent-5.4.8/bin/../share/java/kafka/connect-mirror-client-5.4.8-ccs.jar:/home/confluent-5.4.8/bin/../share/java/kafka/commons-compress-1.21.jar:/home/confluent-5.4.8/bin/../share/java/kafka/connect-api-5.4.8-ccs.jar:/home/confluent-5.4.8/bin/../share/java/kafka/kafka-clients-5.4.8-ccs.jar:/home/confluent-5.4.8/bin/../share/java/kafka/jetty-http-9.4.44.v20210927.jar:/home/confluent-5.4.8/bin/../share/java/kafka/jakarta.inject-2.6.1.jar:/home/confluent-5.4.8/bin/../share/java/kafka/jackson-jaxrs-base-2.13.2.jar:/home/confluent-5.4.8/bin/../share/java/kafka/audience-annotations-0.5.0.jar:/home/confluent-5.4.8/bin/../share/java/kafka/kafka-streams-examples-5.4.8-ccs.jar:/home/confluent-5.4.8/bin/../share/java/kafka/kafka-log4j-appender-5.4.8-ccs.jar:/home/confluent-5.4.8/bin/../share/java/kafka/netty-buffer-4.1.73.Final.jar:/home/confluent-5.4.8/bin/../share/java/kafka/slf4j-log4j12-1.7.28.jar:/home/confluent-5.4.8/bin/../share/java/kafka/zookeeper-jute-3.5.9.jar:/home/confluent-5.4.8/bin/../share/java/
kafka/connect-transforms-5.4.8-ccs.jar:/home/confluent-5.4.8/bin/../share/java/kafka/javassist-3.25.0-GA.jar:/home/confluent-5.4.8/bin/../share/java/kafka/jetty-servlet-9.4.44.v20210927.jar:/home/confluent-5.4.8/bin/../share/java/kafka/netty-handler-4.1.73.Final.jar:/home/confluent-5.4.8/bin/../share/java/kafka/javassist-3.26.0-GA.jar:/home/confluent-5.4.8/bin/../share/java/kafka/jetty-util-ajax-9.4.44.v20210927.jar:/home/confluent-5.4.8/bin/../share/java/kafka/kafka-streams-5.4.8-ccs.jar:/home/confluent-5.4.8/bin/../share/java/kafka/rocksdbjni-5.18.3.jar:/home/confluent-5.4.8/bin/../share/java/kafka/commons-codec-1.15.jar:/home/confluent-5.4.8/bin/../share/java/kafka/kafka_2.12-5.4.8-ccs-test-sources.jar:/home/confluent-5.4.8/bin/../share/java/kafka/commons-logging-1.2.jar:/home/confluent-5.4.8/bin/../share/java/kafka/jakarta.ws.rs-api-2.1.6.jar:/home/confluent-5.4.8/bin/../share/java/kafka/kafka-tools-5.4.8-ccs.jar:/home/confluent-5.4.8/bin/../share/java/kafka/jackson-annotations-2.13.2.jar:/home/confluent-5.4.8/bin/../share/java/kafka/zookeeper-3.5.9.jar:/home/confluent-5.4.8/bin/../share/java/kafka/commons-cli-1.4.jar:/home/confluent-5.4.8/bin/../share/java/kafka/jetty-security-9.4.44.v20210927.jar:/home/confluent-5.4.8/bin/../share/java/kafka/jackson-databind-2.13.2.2.jar:/home/confluent-5.4.8/bin/../share/java/kafka/connect-mirror-5.4.8-ccs.jar:/home/confluent-5.4.8/bin/../share/java/kafka/kafka_2.12-5.4.8-ccs-test.jar:/home/confluent-5.4.8/bin/../share/java/kafka/netty-transport-4.1.73.Final.jar:/home/confluent-5.4.8/bin/../share/java/kafka/confluent-log4j-1.2.17-cp2.2.jar:/home/confluent-5.4.8/bin/../share/java/kafka/jakarta.validation-api-2.0.2.jar:/home/confluent-5.4.8/bin/../share/java/kafka/jakarta.xml.bind-api-2.3.3.jar:/home/confluent-5.4.8/bin/../share/java/kafka/netty-common-4.1.73.Final.jar:/home/confluent-5.4.8/bin/../share/java/kafka/jakarta.annotation-api-1.3.5.jar:/home/confluent-5.4.8/bin/../share/java/kafka/kafka-streams-scala_2.12-5.4.8-ccs.jar:/home/confluent-5.4.8/bin/../share/java/kafka/jersey-server-2.34.jar:/home/confluent-5.4.8/bin/../share/java/kafka/hk2-utils-2.6.1.jar:/home/confluent-5.4.8/bin/../share/java/kafka/jackson-datatype-jdk8-2.13.2.jar:/home/confluent-5.4.8/bin/../share/java/kafka/netty-codec-4.1.73.Final.jar:/home/confluent-5.4.8/bin/../share/java/kafka/jackson-jaxrs-json-provider-2.13.2.jar:/home/confluent-5.4.8/bin/../share/java/kafka/jetty-client-9.4.44.v20210927.jar:/home/confluent-5.4.8/bin/../share/java/kafka/maven-artifact-3.8.1.jar:/home/confluent-5.4.8/bin/../share/java/kafka/scala-library-2.12.10.jar:/home/confluent-5.4.8/bin/../share/java/kafka/netty-transport-native-unix-common-4.1.73.Final.jar:/home/confluent-5.4.8/bin/../share/java/kafka/metrics-core-2.2.0.jar:/home/confluent-5.4.8/bin/../share/java/kafka/httpcore-4.4.13.jar:/home/confluent-5.4.8/bin/../share/java/kafka/jackson-module-scala_2.12-2.13.2.jar:/home/confluent-5.4.8/bin/../share/java/kafka/scala-java8-compat_2.12-0.9.0.jar:/home/confluent-5.4.8/bin/../share/java/kafka/kafka-streams-test-utils-5.4.8-ccs.jar:/home/confluent-5.4.8/bin/../share/java/kafka/httpclient-4.5.13.jar:/home/confluent-5.4.8/bin/../share/java/kafka/jetty-servlets-9.4.44.v20210927.jar:/home/confluent-5.4.8/bin/../share/java/kafka/reflections-0.9.12.jar:/home/confluent-5.4.8/bin/../share/java/kafka/jetty-server-9.4.44.v20210927.jar:/home/confluent-5.4.8/bin/../share/java/kafka/jackson-module-jaxb-annotations-2.13.2.jar:/home/confluent-5.4.8/bin/../share/java/kafka/kafka_2.12-5.4.8-ccs.jar:/home/confluent
-5.4.8/bin/../share/java/kafka/argparse4j-0.7.0.jar:/home/confluent-5.4.8/bin/../share/java/kafka/activation-1.1.1.jar:/home/confluent-5.4.8/bin/../share/java/kafka/jetty-util-9.4.44.v20210927.jar:/home/confluent-5.4.8/bin/../share/java/kafka/connect-json-5.4.8-ccs.jar:/home/confluent-5.4.8/bin/../share/java/kafka/support-metrics-common-5.4.8-ccs.jar:/home/confluent-5.4.8/bin/../share/java/kafka/kafka_2.12-5.4.8-ccs-javadoc.jar:/home/confluent-5.4.8/bin/../share/java/kafka/jakarta.activation-api-1.2.2.jar:/home/confluent-5.4.8/bin/../share/java/kafka/jetty-io-9.4.44.v20210927.jar:/home/confluent-5.4.8/bin/../share/java/kafka/jersey-container-servlet-2.34.jar:/home/confluent-5.4.8/bin/../share/java/kafka/kafka_2.12-5.4.8-ccs-scaladoc.jar:/home/confluent-5.4.8/bin/../share/java/kafka/zstd-jni-1.4.3-1.jar:/home/confluent-5.4.8/bin/../share/java/kafka/jersey-container-servlet-core-2.34.jar:/home/confluent-5.4.8/bin/../share/java/kafka/netty-transport-native-epoll-4.1.73.Final.jar:/home/confluent-5.4.8/bin/../share/java/kafka/httpmime-4.5.13.jar:/home/confluent-5.4.8/bin/../share/java/kafka/connect-runtime-5.4.8-ccs.jar:/home/confluent-5.4.8/bin/../share/java/kafka/scala-reflect-2.12.10.jar:/home/confluent-5.4.8/bin/../share/java/kafka/jaxb-api-2.3.0.jar:/home/confluent-5.4.8/bin/../share/java/kafka/jersey-hk2-2.34.jar:/home/confluent-5.4.8/bin/../share/java/kafka/javax.servlet-api-3.1.0.jar:/home/confluent-5.4.8/bin/../share/java/kafka/jetty-continuation-9.4.44.v20210927.jar:/home/confluent-5.4.8/bin/../share/java/kafka/commons-lang3-3.8.1.jar:/home/confluent-5.4.8/bin/../share/java/kafka/plexus-utils-3.2.1.jar:/home/confluent-5.4.8/bin/../share/java/kafka/jackson-dataformat-csv-2.13.2.jar:/home/confluent-5.4.8/bin/../share/java/kafka/netty-tcnative-classes-2.0.46.Final.jar:/home/confluent-5.4.8/bin/../share/java/kafka/javax.ws.rs-api-2.1.1.jar:/home/confluent-5.4.8/bin/../share/java/kafka/jersey-common-2.34.jar:/home/confluent-5.4.8/bin/../share/java/kafka/scala-logging_2.12-3.9.2.jar:/home/confluent-5.4.8/bin/../share/java/kafka/connect-basic-auth-extension-5.4.8-ccs.jar:/home/confluent-5.4.8/bin/../share/java/kafka/kafka.jar:/home/confluent-5.4.8/bin/../share/java/kafka/jackson-core-2.13.2.jar:/home/confluent-5.4.8/bin/../share/java/kafka/osgi-resource-locator-1.0.3.jar:/home/confluent-5.4.8/bin/../share/java/kafka/aopalliance-repackaged-2.6.1.jar:/home/confluent-5.4.8/bin/../share/java/kafka/hk2-locator-2.6.1.jar:/home/confluent-5.4.8/bin/../share/java/kafka/netty-resolver-4.1.73.Final.jar:/home/confluent-5.4.8/bin/../share/java/kafka/jopt-simple-5.0.4.jar:/home/confluent-5.4.8/bin/../share/java/kafka/lz4-java-1.6.0.jar:/home/confluent-5.4.8/bin/../share/java/kafka/snappy-java-1.1.7.3.jar:/home/confluent-5.4.8/bin/../share/java/kafka/netty-transport-classes-epoll-4.1.73.Final.jar:/home/confluent-5.4.8/bin/../share/java/kafka/hk2-api-2.6.1.jar:/home/confluent-5.4.8/bin/../share/java/kafka/paranamer-2.8.jar:/home/confluent-5.4.8/bin/../share/java/kafka/slf4j-api-1.7.28.jar:/home/confluent-5.4.8/bin/../share/java/kafka/kafka_2.12-5.4.8-ccs-sources.jar:/home/confluent-5.4.8/bin/../share/java/kafka/avro-1.9.2.jar:/home/confluent-5.4.8/bin/../share/java/kafka/jersey-client-2.34.jar:/home/confluent-5.4.8/bin/../share/java/kafka/scala-collection-compat_2.12-2.1.2.jar:/home/confluent-5.4.8/bin/../support-metrics-client/build/dependant-libs-2.12.10/*:/home/confluent-5.4.8/bin/../support-metrics-client/build/libs/*:/usr/share/java/support-metrics-client/*
    os.spec = Linux, amd64, 3.10.0-1127.el7.x86_64
    os.vcpus = 4
 (org.apache.kafka.connect.runtime.WorkerInfo:71)
[2022-11-23 18:10:17,321] INFO Scanning for plugin classes. This might take a moment ... (org.apache.kafka.connect.cli.ConnectStandalone:78)
[2022-11-23 18:10:17,334] INFO Loading plugin from: /kafka/connect/lib (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:242)
[2022-11-23 18:10:18,595] INFO Registered loader: PluginClassLoader{pluginLocation=file:/kafka/connect/lib/} (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:265)
[2022-11-23 18:10:18,595] INFO Added plugin 'solutions.a2.cdc.oracle.OraCdcLogMinerConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:194)
[2022-11-23 18:10:18,596] INFO Added plugin 'solutions.a2.cdc.oracle.OraCdcSourceConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:194)
[2022-11-23 18:10:18,596] INFO Added plugin 'solutions.a2.cdc.oracle.OraCdcJdbcSinkConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:194)
[2022-11-23 18:10:18,596] INFO Added plugin 'org.apache.kafka.connect.storage.StringConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:194)
[2022-11-23 18:10:18,596] INFO Added plugin 'org.apache.kafka.connect.storage.SimpleHeaderConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:194)
[2022-11-23 18:10:18,596] INFO Added plugin 'org.apache.kafka.common.config.provider.FileConfigProvider' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:194)
[2022-11-23 18:10:18,596] INFO Added plugin 'org.apache.kafka.common.config.provider.DirectoryConfigProvider' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:194)
[2022-11-23 18:10:18,596] INFO Added plugin 'org.apache.kafka.connect.connector.policy.AllConnectorClientConfigOverridePolicy' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:194)
[2022-11-23 18:10:18,596] INFO Added plugin 'org.apache.kafka.connect.connector.policy.PrincipalConnectorClientConfigOverridePolicy' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:194)
[2022-11-23 18:10:18,596] INFO Added plugin 'org.apache.kafka.connect.connector.policy.NoneConnectorClientConfigOverridePolicy' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:194)
[2022-11-23 18:10:20,099] INFO Registered loader: jdk.internal.loader.ClassLoaders$AppClassLoader@75b84c92 (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:265)
[2022-11-23 18:10:20,100] INFO Added plugin 'org.apache.kafka.connect.tools.VerifiableSinkConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:194)
[2022-11-23 18:10:20,100] INFO Added plugin 'org.apache.kafka.connect.tools.VerifiableSourceConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:194)
[2022-11-23 18:10:20,100] INFO Added plugin 'org.apache.kafka.connect.tools.MockSinkConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:194)
[2022-11-23 18:10:20,100] INFO Added plugin 'org.apache.kafka.connect.mirror.MirrorSourceConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:194)
[2022-11-23 18:10:20,100] INFO Added plugin 'org.apache.kafka.connect.mirror.MirrorHeartbeatConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:194)
[2022-11-23 18:10:20,100] INFO Added plugin 'org.apache.kafka.connect.tools.MockConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:194)
[2022-11-23 18:10:20,101] INFO Added plugin 'org.apache.kafka.connect.tools.MockSourceConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:194)
[2022-11-23 18:10:20,101] INFO Added plugin 'org.apache.kafka.connect.tools.SchemaSourceConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:194)
[2022-11-23 18:10:20,101] INFO Added plugin 'org.apache.kafka.connect.mirror.MirrorCheckpointConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:194)
[2022-11-23 18:10:20,101] INFO Added plugin 'org.apache.kafka.connect.converters.FloatConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:194)
[2022-11-23 18:10:20,101] INFO Added plugin 'org.apache.kafka.connect.converters.DoubleConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:194)
[2022-11-23 18:10:20,101] INFO Added plugin 'io.confluent.connect.avro.AvroConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:194)
[2022-11-23 18:10:20,101] INFO Added plugin 'org.apache.kafka.connect.converters.ByteArrayConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:194)
[2022-11-23 18:10:20,101] INFO Added plugin 'org.apache.kafka.connect.converters.LongConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:194)
[2022-11-23 18:10:20,101] INFO Added plugin 'org.apache.kafka.connect.converters.IntegerConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:194)
[2022-11-23 18:10:20,101] INFO Added plugin 'org.apache.kafka.connect.json.JsonConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:194)
[2022-11-23 18:10:20,102] INFO Added plugin 'org.apache.kafka.connect.converters.ShortConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:194)
[2022-11-23 18:10:20,102] INFO Added plugin 'org.apache.kafka.connect.transforms.ReplaceField$Value' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:194)
[2022-11-23 18:10:20,102] INFO Added plugin 'org.apache.kafka.connect.transforms.SetSchemaMetadata$Value' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:194)
[2022-11-23 18:10:20,102] INFO Added plugin 'org.apache.kafka.connect.transforms.ReplaceField$Key' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:194)
[2022-11-23 18:10:20,102] INFO Added plugin 'org.apache.kafka.connect.transforms.InsertField$Value' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:194)
[2022-11-23 18:10:20,102] INFO Added plugin 'org.apache.kafka.connect.transforms.TimestampConverter$Key' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:194)
[2022-11-23 18:10:20,102] INFO Added plugin 'org.apache.kafka.connect.transforms.MaskField$Value' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:194)
[2022-11-23 18:10:20,102] INFO Added plugin 'org.apache.kafka.connect.transforms.TimestampRouter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:194)
[2022-11-23 18:10:20,102] INFO Added plugin 'org.apache.kafka.connect.transforms.RegexRouter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:194)
[2022-11-23 18:10:20,102] INFO Added plugin 'org.apache.kafka.connect.transforms.HoistField$Value' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:194)
[2022-11-23 18:10:20,102] INFO Added plugin 'org.apache.kafka.connect.transforms.ValueToKey' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:194)
[2022-11-23 18:10:20,102] INFO Added plugin 'org.apache.kafka.connect.transforms.MaskField$Key' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:194)
[2022-11-23 18:10:20,103] INFO Added plugin 'org.apache.kafka.connect.transforms.Cast$Key' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:194)
[2022-11-23 18:10:20,103] INFO Added plugin 'org.apache.kafka.connect.transforms.Cast$Value' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:194)
[2022-11-23 18:10:20,103] INFO Added plugin 'org.apache.kafka.connect.transforms.ExtractField$Key' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:194)
[2022-11-23 18:10:20,103] INFO Added plugin 'org.apache.kafka.connect.transforms.Flatten$Value' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:194)
[2022-11-23 18:10:20,103] INFO Added plugin 'org.apache.kafka.connect.transforms.InsertField$Key' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:194)
[2022-11-23 18:10:20,103] INFO Added plugin 'org.apache.kafka.connect.transforms.Flatten$Key' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:194)
[2022-11-23 18:10:20,103] INFO Added plugin 'org.apache.kafka.connect.transforms.SetSchemaMetadata$Key' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:194)
[2022-11-23 18:10:20,103] INFO Added plugin 'org.apache.kafka.connect.transforms.ExtractField$Value' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:194)
[2022-11-23 18:10:20,103] INFO Added plugin 'org.apache.kafka.connect.transforms.TimestampConverter$Value' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:194)
[2022-11-23 18:10:20,103] INFO Added plugin 'org.apache.kafka.connect.transforms.HoistField$Key' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:194)
[2022-11-23 18:10:20,103] INFO Added plugin 'org.apache.kafka.connect.rest.basic.auth.extension.BasicAuthSecurityRestExtension' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:194)
[2022-11-23 18:10:20,104] INFO Added aliases 'MirrorCheckpointConnector' and 'MirrorCheckpoint' to plugin 'org.apache.kafka.connect.mirror.MirrorCheckpointConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:422)
[2022-11-23 18:10:20,104] INFO Added aliases 'MirrorHeartbeatConnector' and 'MirrorHeartbeat' to plugin 'org.apache.kafka.connect.mirror.MirrorHeartbeatConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:422)
[2022-11-23 18:10:20,104] INFO Added aliases 'MirrorSourceConnector' and 'MirrorSource' to plugin 'org.apache.kafka.connect.mirror.MirrorSourceConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:422)
[2022-11-23 18:10:20,104] INFO Added aliases 'MockConnector' and 'Mock' to plugin 'org.apache.kafka.connect.tools.MockConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:422)
[2022-11-23 18:10:20,104] INFO Added aliases 'MockSinkConnector' and 'MockSink' to plugin 'org.apache.kafka.connect.tools.MockSinkConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:422)
[2022-11-23 18:10:20,105] INFO Added aliases 'MockSourceConnector' and 'MockSource' to plugin 'org.apache.kafka.connect.tools.MockSourceConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:422)
[2022-11-23 18:10:20,105] INFO Added aliases 'SchemaSourceConnector' and 'SchemaSource' to plugin 'org.apache.kafka.connect.tools.SchemaSourceConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:422)
[2022-11-23 18:10:20,105] INFO Added aliases 'VerifiableSinkConnector' and 'VerifiableSink' to plugin 'org.apache.kafka.connect.tools.VerifiableSinkConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:422)
[2022-11-23 18:10:20,105] INFO Added aliases 'VerifiableSourceConnector' and 'VerifiableSource' to plugin 'org.apache.kafka.connect.tools.VerifiableSourceConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:422)
[2022-11-23 18:10:20,105] INFO Added aliases 'OraCdcJdbcSinkConnector' and 'OraCdcJdbcSink' to plugin 'solutions.a2.cdc.oracle.OraCdcJdbcSinkConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:422)
[2022-11-23 18:10:20,105] INFO Added aliases 'OraCdcLogMinerConnector' and 'OraCdcLogMiner' to plugin 'solutions.a2.cdc.oracle.OraCdcLogMinerConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:422)
[2022-11-23 18:10:20,105] INFO Added aliases 'OraCdcSourceConnector' and 'OraCdcSource' to plugin 'solutions.a2.cdc.oracle.OraCdcSourceConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:422)
[2022-11-23 18:10:20,105] INFO Added aliases 'AvroConverter' and 'Avro' to plugin 'io.confluent.connect.avro.AvroConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:422)
[2022-11-23 18:10:20,105] INFO Added aliases 'ByteArrayConverter' and 'ByteArray' to plugin 'org.apache.kafka.connect.converters.ByteArrayConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:422)
[2022-11-23 18:10:20,106] INFO Added aliases 'DoubleConverter' and 'Double' to plugin 'org.apache.kafka.connect.converters.DoubleConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:422)
[2022-11-23 18:10:20,106] INFO Added aliases 'FloatConverter' and 'Float' to plugin 'org.apache.kafka.connect.converters.FloatConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:422)
[2022-11-23 18:10:20,106] INFO Added aliases 'IntegerConverter' and 'Integer' to plugin 'org.apache.kafka.connect.converters.IntegerConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:422)
[2022-11-23 18:10:20,106] INFO Added aliases 'LongConverter' and 'Long' to plugin 'org.apache.kafka.connect.converters.LongConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:422)
[2022-11-23 18:10:20,106] INFO Added aliases 'ShortConverter' and 'Short' to plugin 'org.apache.kafka.connect.converters.ShortConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:422)
[2022-11-23 18:10:20,106] INFO Added aliases 'JsonConverter' and 'Json' to plugin 'org.apache.kafka.connect.json.JsonConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:422)
[2022-11-23 18:10:20,106] INFO Added aliases 'StringConverter' and 'String' to plugin 'org.apache.kafka.connect.storage.StringConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:422)
[2022-11-23 18:10:20,106] INFO Added aliases 'ByteArrayConverter' and 'ByteArray' to plugin 'org.apache.kafka.connect.converters.ByteArrayConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:422)
[2022-11-23 18:10:20,106] INFO Added aliases 'DoubleConverter' and 'Double' to plugin 'org.apache.kafka.connect.converters.DoubleConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:422)
[2022-11-23 18:10:20,106] INFO Added aliases 'FloatConverter' and 'Float' to plugin 'org.apache.kafka.connect.converters.FloatConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:422)
[2022-11-23 18:10:20,107] INFO Added aliases 'IntegerConverter' and 'Integer' to plugin 'org.apache.kafka.connect.converters.IntegerConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:422)
[2022-11-23 18:10:20,107] INFO Added aliases 'LongConverter' and 'Long' to plugin 'org.apache.kafka.connect.converters.LongConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:422)
[2022-11-23 18:10:20,107] INFO Added aliases 'ShortConverter' and 'Short' to plugin 'org.apache.kafka.connect.converters.ShortConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:422)
[2022-11-23 18:10:20,107] INFO Added aliases 'JsonConverter' and 'Json' to plugin 'org.apache.kafka.connect.json.JsonConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:422)
[2022-11-23 18:10:20,107] INFO Added alias 'SimpleHeaderConverter' to plugin 'org.apache.kafka.connect.storage.SimpleHeaderConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:419)
[2022-11-23 18:10:20,107] INFO Added aliases 'StringConverter' and 'String' to plugin 'org.apache.kafka.connect.storage.StringConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:422)
[2022-11-23 18:10:20,107] INFO Added alias 'RegexRouter' to plugin 'org.apache.kafka.connect.transforms.RegexRouter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:419)
[2022-11-23 18:10:20,107] INFO Added alias 'TimestampRouter' to plugin 'org.apache.kafka.connect.transforms.TimestampRouter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:419)
[2022-11-23 18:10:20,107] INFO Added alias 'ValueToKey' to plugin 'org.apache.kafka.connect.transforms.ValueToKey' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:419)
[2022-11-23 18:10:20,107] INFO Added alias 'BasicAuthSecurityRestExtension' to plugin 'org.apache.kafka.connect.rest.basic.auth.extension.BasicAuthSecurityRestExtension' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:419)
[2022-11-23 18:10:20,108] INFO Added aliases 'AllConnectorClientConfigOverridePolicy' and 'All' to plugin 'org.apache.kafka.connect.connector.policy.AllConnectorClientConfigOverridePolicy' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:422)
[2022-11-23 18:10:20,108] INFO Added aliases 'NoneConnectorClientConfigOverridePolicy' and 'None' to plugin 'org.apache.kafka.connect.connector.policy.NoneConnectorClientConfigOverridePolicy' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:422)
[2022-11-23 18:10:20,108] INFO Added aliases 'PrincipalConnectorClientConfigOverridePolicy' and 'Principal' to plugin 'org.apache.kafka.connect.connector.policy.PrincipalConnectorClientConfigOverridePolicy' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:422)
[2022-11-23 18:10:20,130] INFO StandaloneConfig values: 
    access.control.allow.methods = 
    access.control.allow.origin = 
    admin.listeners = null
    bootstrap.servers = [172.18.6.112:9092]
    client.dns.lookup = default
    config.providers = []
    connector.client.config.override.policy = None
    header.converter = class org.apache.kafka.connect.storage.SimpleHeaderConverter
    internal.key.converter = class org.apache.kafka.connect.json.JsonConverter
    internal.value.converter = class org.apache.kafka.connect.json.JsonConverter
    key.converter = class org.apache.kafka.connect.json.JsonConverter
    listeners = null
    metric.reporters = []
    metrics.num.samples = 2
    metrics.recording.level = INFO
    metrics.sample.window.ms = 30000
    offset.flush.interval.ms = 10000
    offset.flush.timeout.ms = 5000
    offset.storage.file.filename = /tmp/connect.offsets
    plugin.path = [/kafka/connect]
    rest.advertised.host.name = null
    rest.advertised.listener = null
    rest.advertised.port = null
    rest.extension.classes = []
    rest.host.name = null
    rest.port = 8083
    ssl.cipher.suites = null
    ssl.client.auth = none
    ssl.enabled.protocols = [TLSv1.2, TLSv1.1, TLSv1]
    ssl.endpoint.identification.algorithm = https
    ssl.key.password = null
    ssl.keymanager.algorithm = SunX509
    ssl.keystore.location = null
    ssl.keystore.password = null
    ssl.keystore.type = JKS
    ssl.protocol = TLS
    ssl.provider = null
    ssl.secure.random.implementation = null
    ssl.trustmanager.algorithm = PKIX
    ssl.truststore.location = null
    ssl.truststore.password = null
    ssl.truststore.type = JKS
    task.shutdown.graceful.timeout.ms = 5000
    value.converter = class org.apache.kafka.connect.json.JsonConverter
 (org.apache.kafka.connect.runtime.standalone.StandaloneConfig:347)
[2022-11-23 18:10:20,131] INFO Creating Kafka admin client (org.apache.kafka.connect.util.ConnectUtils:43)
[2022-11-23 18:10:20,134] INFO AdminClientConfig values: 
    bootstrap.servers = [172.18.6.112:9092]
    client.dns.lookup = default
    client.id = 
    connections.max.idle.ms = 300000
    metadata.max.age.ms = 300000
    metric.reporters = []
    metrics.num.samples = 2
    metrics.recording.level = INFO
    metrics.sample.window.ms = 30000
    receive.buffer.bytes = 65536
    reconnect.backoff.max.ms = 1000
    reconnect.backoff.ms = 50
    request.timeout.ms = 120000
    retries = 5
    retry.backoff.ms = 100
    sasl.client.callback.handler.class = null
    sasl.jaas.config = null
    sasl.kerberos.kinit.cmd = /usr/bin/kinit
    sasl.kerberos.min.time.before.relogin = 60000
    sasl.kerberos.service.name = null
    sasl.kerberos.ticket.renew.jitter = 0.05
    sasl.kerberos.ticket.renew.window.factor = 0.8
    sasl.login.callback.handler.class = null
    sasl.login.class = null
    sasl.login.refresh.buffer.seconds = 300
    sasl.login.refresh.min.period.seconds = 60
    sasl.login.refresh.window.factor = 0.8
    sasl.login.refresh.window.jitter = 0.05
    sasl.mechanism = GSSAPI
    security.protocol = PLAINTEXT
    security.providers = null
    send.buffer.bytes = 131072
    ssl.cipher.suites = null
    ssl.enabled.protocols = [TLSv1.2, TLSv1.1, TLSv1]
    ssl.endpoint.identification.algorithm = https
    ssl.key.password = null
    ssl.keymanager.algorithm = SunX509
    ssl.keystore.location = null
    ssl.keystore.password = null
    ssl.keystore.type = JKS
    ssl.protocol = TLS
    ssl.provider = null
    ssl.secure.random.implementation = null
    ssl.trustmanager.algorithm = PKIX
    ssl.truststore.location = null
    ssl.truststore.password = null
    ssl.truststore.type = JKS
 (org.apache.kafka.clients.admin.AdminClientConfig:347)
[2022-11-23 18:10:20,177] WARN The configuration 'offset.flush.interval.ms' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:355)
[2022-11-23 18:10:20,177] WARN The configuration 'key.converter.schemas.enable' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:355)
[2022-11-23 18:10:20,177] WARN The configuration 'offset.storage.file.filename' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:355)
[2022-11-23 18:10:20,177] WARN The configuration 'value.converter.schemas.enable' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:355)
[2022-11-23 18:10:20,177] WARN The configuration 'plugin.path' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:355)
[2022-11-23 18:10:20,177] WARN The configuration 'value.converter' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:355)
[2022-11-23 18:10:20,177] WARN The configuration 'key.converter' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:355)
[2022-11-23 18:10:20,178] INFO Kafka version: 5.4.8-ccs (org.apache.kafka.common.utils.AppInfoParser:117)
[2022-11-23 18:10:20,178] INFO Kafka commitId: 3b3c46bb11268041 (org.apache.kafka.common.utils.AppInfoParser:118)
[2022-11-23 18:10:20,178] INFO Kafka startTimeMs: 1669198220177 (org.apache.kafka.common.utils.AppInfoParser:119)
[2022-11-23 18:10:20,379] INFO Kafka cluster ID: UT0FkBSwQRm0sSGMBI7VFg (org.apache.kafka.connect.util.ConnectUtils:59)
[2022-11-23 18:10:20,393] INFO Logging initialized @3470ms to org.eclipse.jetty.util.log.Slf4jLog (org.eclipse.jetty.util.log:170)
[2022-11-23 18:10:20,426] INFO Added connector for http://:8083 (org.apache.kafka.connect.runtime.rest.RestServer:131)
[2022-11-23 18:10:20,426] INFO Initializing REST server (org.apache.kafka.connect.runtime.rest.RestServer:203)
[2022-11-23 18:10:20,431] INFO jetty-9.4.44.v20210927; built: 2021-09-27T23:02:44.612Z; git: 8da83308eeca865e495e53ef315a249d63ba9332; jvm 11.0.8+10-LTS (org.eclipse.jetty.server.Server:375)
[2022-11-23 18:10:20,452] INFO Started http_8083@a137d7a{HTTP/1.1, (http/1.1)}{0.0.0.0:8083} (org.eclipse.jetty.server.AbstractConnector:331)
[2022-11-23 18:10:20,452] INFO Started @3529ms (org.eclipse.jetty.server.Server:415)
[2022-11-23 18:10:20,500] INFO Advertised URI: http://172.18.6.111:8083/ (org.apache.kafka.connect.runtime.rest.RestServer:365)
[2022-11-23 18:10:20,500] INFO REST server listening at http://172.18.6.111:8083/, advertising URL http://172.18.6.111:8083/ (org.apache.kafka.connect.runtime.rest.RestServer:218)
[2022-11-23 18:10:20,500] INFO Advertised URI: http://172.18.6.111:8083/ (org.apache.kafka.connect.runtime.rest.RestServer:365)
[2022-11-23 18:10:20,500] INFO REST admin endpoints at http://172.18.6.111:8083/ (org.apache.kafka.connect.runtime.rest.RestServer:219)
[2022-11-23 18:10:20,500] INFO Advertised URI: http://172.18.6.111:8083/ (org.apache.kafka.connect.runtime.rest.RestServer:365)
[2022-11-23 18:10:20,501] INFO Setting up None Policy for ConnectorClientConfigOverride. This will disallow any client configuration to be overridden (org.apache.kafka.connect.connector.policy.NoneConnectorClientConfigOverridePolicy:45)
[2022-11-23 18:10:20,509] INFO Kafka version: 5.4.8-ccs (org.apache.kafka.common.utils.AppInfoParser:117)
[2022-11-23 18:10:20,509] INFO Kafka commitId: 3b3c46bb11268041 (org.apache.kafka.common.utils.AppInfoParser:118)
[2022-11-23 18:10:20,509] INFO Kafka startTimeMs: 1669198220509 (org.apache.kafka.common.utils.AppInfoParser:119)
[2022-11-23 18:10:20,610] INFO JsonConverterConfig values: 
    converter.type = key
    decimal.format = BASE64
    schemas.cache.size = 1000
    schemas.enable = false
 (org.apache.kafka.connect.json.JsonConverterConfig:347)
[2022-11-23 18:10:20,611] INFO JsonConverterConfig values: 
    converter.type = value
    decimal.format = BASE64
    schemas.cache.size = 1000
    schemas.enable = false
 (org.apache.kafka.connect.json.JsonConverterConfig:347)
[2022-11-23 18:10:20,617] INFO Kafka Connect standalone worker initialization took 3307ms (org.apache.kafka.connect.cli.ConnectStandalone:100)
[2022-11-23 18:10:20,617] INFO Kafka Connect starting (org.apache.kafka.connect.runtime.Connect:50)
[2022-11-23 18:10:20,617] INFO Herder starting (org.apache.kafka.connect.runtime.standalone.StandaloneHerder:93)
[2022-11-23 18:10:20,617] INFO Worker starting (org.apache.kafka.connect.runtime.Worker:184)
[2022-11-23 18:10:20,617] INFO Starting FileOffsetBackingStore with file /tmp/connect.offsets (org.apache.kafka.connect.storage.FileOffsetBackingStore:58)
[2022-11-23 18:10:20,620] INFO Worker started (org.apache.kafka.connect.runtime.Worker:191)
[2022-11-23 18:10:20,620] INFO Herder started (org.apache.kafka.connect.runtime.standalone.StandaloneHerder:95)
[2022-11-23 18:10:20,620] INFO Initializing REST resources (org.apache.kafka.connect.runtime.rest.RestServer:223)
[2022-11-23 18:10:20,650] INFO Adding admin resources to main listener (org.apache.kafka.connect.runtime.rest.RestServer:240)
[2022-11-23 18:10:20,699] INFO DefaultSessionIdManager workerName=node0 (org.eclipse.jetty.server.session:334)
[2022-11-23 18:10:20,700] INFO No SessionScavenger set, using defaults (org.eclipse.jetty.server.session:339)
[2022-11-23 18:10:20,700] INFO node0 Scavenging every 660000ms (org.eclipse.jetty.server.session:132)
Nov 23, 2022 6:10:20 PM org.glassfish.jersey.internal.inject.Providers checkProviderRuntime
WARNING: A provider org.apache.kafka.connect.runtime.rest.resources.RootResource registered in SERVER runtime does not implement any provider interfaces applicable in the SERVER runtime. Due to constraint configuration problems the provider org.apache.kafka.connect.runtime.rest.resources.RootResource will be ignored. 
Nov 23, 2022 6:10:20 PM org.glassfish.jersey.internal.inject.Providers checkProviderRuntime
WARNING: A provider org.apache.kafka.connect.runtime.rest.resources.ConnectorsResource registered in SERVER runtime does not implement any provider interfaces applicable in the SERVER runtime. Due to constraint configuration problems the provider org.apache.kafka.connect.runtime.rest.resources.ConnectorsResource will be ignored. 
Nov 23, 2022 6:10:20 PM org.glassfish.jersey.internal.inject.Providers checkProviderRuntime
WARNING: A provider org.apache.kafka.connect.runtime.rest.resources.ConnectorPluginsResource registered in SERVER runtime does not implement any provider interfaces applicable in the SERVER runtime. Due to constraint configuration problems the provider org.apache.kafka.connect.runtime.rest.resources.ConnectorPluginsResource will be ignored. 
Nov 23, 2022 6:10:20 PM org.glassfish.jersey.internal.inject.Providers checkProviderRuntime
WARNING: A provider org.apache.kafka.connect.runtime.rest.resources.LoggingResource registered in SERVER runtime does not implement any provider interfaces applicable in the SERVER runtime. Due to constraint configuration problems the provider org.apache.kafka.connect.runtime.rest.resources.LoggingResource will be ignored. 
Nov 23, 2022 6:10:21 PM org.glassfish.jersey.internal.Errors logErrors
WARNING: The following warnings have been detected: WARNING: The (sub)resource method listLoggers in org.apache.kafka.connect.runtime.rest.resources.LoggingResource contains empty path annotation.
WARNING: The (sub)resource method createConnector in org.apache.kafka.connect.runtime.rest.resources.ConnectorsResource contains empty path annotation.
WARNING: The (sub)resource method listConnectors in org.apache.kafka.connect.runtime.rest.resources.ConnectorsResource contains empty path annotation.
WARNING: The (sub)resource method listConnectorPlugins in org.apache.kafka.connect.runtime.rest.resources.ConnectorPluginsResource contains empty path annotation.
WARNING: The (sub)resource method serverInfo in org.apache.kafka.connect.runtime.rest.resources.RootResource contains empty path annotation.

[2022-11-23 18:10:21,045] INFO Started o.e.j.s.ServletContextHandler@77b919a3{/,null,AVAILABLE} (org.eclipse.jetty.server.handler.ContextHandler:915)
[2022-11-23 18:10:21,046] INFO REST resources initialized; server is started and ready to handle requests (org.apache.kafka.connect.runtime.rest.RestServer:313)
[2022-11-23 18:10:21,046] INFO Kafka Connect started (org.apache.kafka.connect.runtime.Connect:56)
[2022-11-23 18:10:21,051] INFO AbstractConfig values: 
 (org.apache.kafka.common.config.AbstractConfig:347)
[2022-11-23 18:10:21,057] INFO ConnectorConfig values: 
    config.action.reload = restart
    connector.class = solutions.a2.cdc.oracle.OraCdcLogMinerConnector
    errors.log.enable = true
    errors.log.include.messages = true
    errors.retry.delay.max.ms = 60000
    errors.retry.timeout = 0
    errors.tolerance = none
    header.converter = null
    key.converter = null
    name = logminer-source-testapps
    tasks.max = 1
    transforms = []
    value.converter = null
 (org.apache.kafka.connect.runtime.ConnectorConfig:347)
[2022-11-23 18:10:21,058] INFO EnrichedConnectorConfig values: 
    config.action.reload = restart
    connector.class = solutions.a2.cdc.oracle.OraCdcLogMinerConnector
    errors.log.enable = true
    errors.log.include.messages = true
    errors.retry.delay.max.ms = 60000
    errors.retry.timeout = 0
    errors.tolerance = none
    header.converter = null
    key.converter = null
    name = logminer-source-testapps
    tasks.max = 1
    transforms = []
    value.converter = null
 (org.apache.kafka.connect.runtime.ConnectorConfig$EnrichedConnectorConfig:347)
[2022-11-23 18:10:21,058] INFO Creating connector logminer-source-testapps of type solutions.a2.cdc.oracle.OraCdcLogMinerConnector (org.apache.kafka.connect.runtime.Worker:253)
[2022-11-23 18:10:21,060] INFO Instantiated connector logminer-source-testapps with version 1.2.2 of type class solutions.a2.cdc.oracle.OraCdcLogMinerConnector (org.apache.kafka.connect.runtime.Worker:256)
[2022-11-23 18:10:21,060] INFO 
   _   ____                            _      
  /_\ |___ \    ___  _ __ __ _  ___ __| | ___ 
 //_\\  __) |  / _ \| '__/ _` |/ __/ _` |/ __|
/  _  \/ __/  | (_) | | | (_| | (_| (_| | (__ 
\_/ \_/_____|  \___/|_|  \__,_|\___\__,_|\___|

 (solutions.a2.cdc.oracle.OraCdcLogMinerConnector:72)
[2022-11-23 18:10:21,061] INFO Starting oracdc 'logminer-source-testapps' logminer source connector (solutions.a2.cdc.oracle.OraCdcLogMinerConnector:73)
[2022-11-23 18:10:21,061] INFO OraCdcSourceConnectorConfig values: 
    __a2.internal.rac.urls = []
    a2.archived.log.catalog = solutions.a2.cdc.oracle.OraCdcV$ArchivedLogImpl
    a2.batch.size = 1
    a2.connection.backoff = 30000
    a2.dictionary.file = 
    a2.distributed.activate = false
    a2.distributed.jdbc.url = 
    a2.distributed.target.host = 
    a2.distributed.target.port = 21521
    a2.distributed.wallet.location = 
    a2.exclude = []
    a2.fetch.size = 32
    a2.first.change = 0
    a2.include = [ABC.TEST]
    a2.initial.load = EXECUTE
    a2.jdbc.password = [hidden]
    a2.jdbc.url = jdbc:oracle:thin:@172.18.6.33:1521/orcl1
    a2.jdbc.username = abc
    a2.kafka.topic = oracdc-topic
    a2.lob.transformation.class = solutions.a2.cdc.oracle.data.OraCdcDefaultLobTransformationsImpl
    a2.logminer.trace = false
    a2.oracdc.schemas = false
    a2.persistent.state.file = /kafka/tempdir/oracdc.state
    a2.poll.interval = 2000
    a2.process.lobs = true
    a2.redo.count = 1
    a2.redo.size = 20000
    a2.resiliency.type = fault-tolerant
    a2.schema.type = kafka
    a2.standby.activate = false
    a2.standby.jdbc.url = 
    a2.standby.wallet.location = 
    a2.table.list.style = static
    a2.tmpdir = /kafka/tempdir
    a2.topic.name.delimiter = -
    a2.topic.name.style = SCHEMA_TABLE
    a2.topic.partition = 0
    a2.topic.prefix = 
    a2.use.rac = false
    a2.wallet.location = 
 (solutions.a2.cdc.oracle.OraCdcSourceConnectorConfig:347)
[2022-11-23 18:10:21,066] INFO Connection to RDBMS will be performed using Oracle username 'abc' (solutions.a2.cdc.oracle.OraCdcLogMinerConnector:137)
[2022-11-23 18:10:21,067] INFO Redo size threshold will be used instead of count of redo files. (solutions.a2.cdc.oracle.OraCdcLogMinerConnector:214)
[2022-11-23 18:10:21,067] INFO Finished creating connector logminer-source-testapps (org.apache.kafka.connect.runtime.Worker:275)
[2022-11-23 18:10:21,068] INFO SourceConnectorConfig values: 
    config.action.reload = restart
    connector.class = solutions.a2.cdc.oracle.OraCdcLogMinerConnector
    errors.log.enable = true
    errors.log.include.messages = true
    errors.retry.delay.max.ms = 60000
    errors.retry.timeout = 0
    errors.tolerance = none
    header.converter = null
    key.converter = null
    name = logminer-source-testapps
    tasks.max = 1
    transforms = []
    value.converter = null
 (org.apache.kafka.connect.runtime.SourceConnectorConfig:347)
[2022-11-23 18:10:21,068] INFO EnrichedConnectorConfig values: 
    config.action.reload = restart
    connector.class = solutions.a2.cdc.oracle.OraCdcLogMinerConnector
    errors.log.enable = true
    errors.log.include.messages = true
    errors.retry.delay.max.ms = 60000
    errors.retry.timeout = 0
    errors.tolerance = none
    header.converter = null
    key.converter = null
    name = logminer-source-testapps
    tasks.max = 1
    transforms = []
    value.converter = null
 (org.apache.kafka.connect.runtime.ConnectorConfig$EnrichedConnectorConfig:347)
[2022-11-23 18:10:21,070] INFO Creating task logminer-source-testapps-0 (org.apache.kafka.connect.runtime.Worker:421)
[2022-11-23 18:10:21,071] INFO ConnectorConfig values: 
    config.action.reload = restart
    connector.class = solutions.a2.cdc.oracle.OraCdcLogMinerConnector
    errors.log.enable = true
    errors.log.include.messages = true
    errors.retry.delay.max.ms = 60000
    errors.retry.timeout = 0
    errors.tolerance = none
    header.converter = null
    key.converter = null
    name = logminer-source-testapps
    tasks.max = 1
    transforms = []
    value.converter = null
 (org.apache.kafka.connect.runtime.ConnectorConfig:347)
[2022-11-23 18:10:21,072] INFO EnrichedConnectorConfig values: 
    config.action.reload = restart
    connector.class = solutions.a2.cdc.oracle.OraCdcLogMinerConnector
    errors.log.enable = true
    errors.log.include.messages = true
    errors.retry.delay.max.ms = 60000
    errors.retry.timeout = 0
    errors.tolerance = none
    header.converter = null
    key.converter = null
    name = logminer-source-testapps
    tasks.max = 1
    transforms = []
    value.converter = null
 (org.apache.kafka.connect.runtime.ConnectorConfig$EnrichedConnectorConfig:347)
[2022-11-23 18:10:21,075] INFO TaskConfig values: 
    task.class = class solutions.a2.cdc.oracle.OraCdcLogMinerTask
 (org.apache.kafka.connect.runtime.TaskConfig:347)
[2022-11-23 18:10:21,075] INFO Instantiated task logminer-source-testapps-0 with version 1.2.2 of type solutions.a2.cdc.oracle.OraCdcLogMinerTask (org.apache.kafka.connect.runtime.Worker:436)
[2022-11-23 18:10:21,075] INFO JsonConverterConfig values: 
    converter.type = key
    decimal.format = BASE64
    schemas.cache.size = 1000
    schemas.enable = true
 (org.apache.kafka.connect.json.JsonConverterConfig:347)
[2022-11-23 18:10:21,075] INFO Set up the key converter class org.apache.kafka.connect.json.JsonConverter for task logminer-source-testapps-0 using the worker config (org.apache.kafka.connect.runtime.Worker:449)
[2022-11-23 18:10:21,075] INFO JsonConverterConfig values: 
    converter.type = value
    decimal.format = BASE64
    schemas.cache.size = 1000
    schemas.enable = true
 (org.apache.kafka.connect.json.JsonConverterConfig:347)
[2022-11-23 18:10:21,075] INFO Set up the value converter class org.apache.kafka.connect.json.JsonConverter for task logminer-source-testapps-0 using the worker config (org.apache.kafka.connect.runtime.Worker:455)
[2022-11-23 18:10:21,076] INFO Set up the header converter class org.apache.kafka.connect.storage.SimpleHeaderConverter for task logminer-source-testapps-0 using the worker config (org.apache.kafka.connect.runtime.Worker:462)
[2022-11-23 18:10:21,079] INFO Initializing: org.apache.kafka.connect.runtime.TransformationChain{} (org.apache.kafka.connect.runtime.Worker:516)
[2022-11-23 18:10:21,083] INFO ProducerConfig values: 
    acks = all
    batch.size = 16384
    bootstrap.servers = [172.18.6.112:9092]
    buffer.memory = 33554432
    client.dns.lookup = default
    client.id = connector-producer-logminer-source-testapps-0
    compression.type = none
    connections.max.idle.ms = 540000
    delivery.timeout.ms = 2147483647
    enable.idempotence = false
    interceptor.classes = []
    key.serializer = class org.apache.kafka.common.serialization.ByteArraySerializer
    linger.ms = 0
    max.block.ms = 9223372036854775807
    max.in.flight.requests.per.connection = 1
    max.request.size = 1048576
    metadata.max.age.ms = 300000
    metric.reporters = []
    metrics.num.samples = 2
    metrics.recording.level = INFO
    metrics.sample.window.ms = 30000
    partitioner.class = class org.apache.kafka.clients.producer.internals.DefaultPartitioner
    receive.buffer.bytes = 32768
    reconnect.backoff.max.ms = 1000
    reconnect.backoff.ms = 50
    request.timeout.ms = 2147483647
    retries = 2147483647
    retry.backoff.ms = 100
    sasl.client.callback.handler.class = null
    sasl.jaas.config = null
    sasl.kerberos.kinit.cmd = /usr/bin/kinit
    sasl.kerberos.min.time.before.relogin = 60000
    sasl.kerberos.service.name = null
    sasl.kerberos.ticket.renew.jitter = 0.05
    sasl.kerberos.ticket.renew.window.factor = 0.8
    sasl.login.callback.handler.class = null
    sasl.login.class = null
    sasl.login.refresh.buffer.seconds = 300
    sasl.login.refresh.min.period.seconds = 60
    sasl.login.refresh.window.factor = 0.8
    sasl.login.refresh.window.jitter = 0.05
    sasl.mechanism = GSSAPI
    security.protocol = PLAINTEXT
    security.providers = null
    send.buffer.bytes = 131072
    ssl.cipher.suites = null
    ssl.enabled.protocols = [TLSv1.2, TLSv1.1, TLSv1]
    ssl.endpoint.identification.algorithm = https
    ssl.key.password = null
    ssl.keymanager.algorithm = SunX509
    ssl.keystore.location = null
    ssl.keystore.password = null
    ssl.keystore.type = JKS
    ssl.protocol = TLS
    ssl.provider = null
    ssl.secure.random.implementation = null
    ssl.trustmanager.algorithm = PKIX
    ssl.truststore.location = null
    ssl.truststore.password = null
    ssl.truststore.type = JKS
    transaction.timeout.ms = 60000
    transactional.id = null
    value.serializer = class org.apache.kafka.common.serialization.ByteArraySerializer
 (org.apache.kafka.clients.producer.ProducerConfig:347)
[2022-11-23 18:10:21,099] INFO Kafka version: 5.4.8-ccs (org.apache.kafka.common.utils.AppInfoParser:117)
[2022-11-23 18:10:21,099] INFO Kafka commitId: 3b3c46bb11268041 (org.apache.kafka.common.utils.AppInfoParser:118)
[2022-11-23 18:10:21,099] INFO Kafka startTimeMs: 1669198221099 (org.apache.kafka.common.utils.AppInfoParser:119)
[2022-11-23 18:10:21,104] INFO Created connector logminer-source-testapps (org.apache.kafka.connect.cli.ConnectStandalone:112)
[2022-11-23 18:10:21,106] INFO Starting oracdc logminer source task for connector logminer-source-testapps. (solutions.a2.cdc.oracle.OraCdcLogMinerTask:126)
[2022-11-23 18:10:21,107] INFO OraCdcSourceConnectorConfig values: 
    __a2.internal.rac.urls = []
    a2.archived.log.catalog = solutions.a2.cdc.oracle.OraCdcV$ArchivedLogImpl
    a2.batch.size = 1
    a2.connection.backoff = 30000
    a2.dictionary.file = 
    a2.distributed.activate = false
    a2.distributed.jdbc.url = 
    a2.distributed.target.host = 
    a2.distributed.target.port = 21521
    a2.distributed.wallet.location = 
    a2.exclude = []
    a2.fetch.size = 32
    a2.first.change = 0
    a2.include = [ABC.TEST]
    a2.initial.load = EXECUTE
    a2.jdbc.password = [hidden]
    a2.jdbc.url = jdbc:oracle:thin:@172.18.6.33:1521/orcl1
    a2.jdbc.username = abc
    a2.kafka.topic = oracdc-topic
    a2.lob.transformation.class = solutions.a2.cdc.oracle.data.OraCdcDefaultLobTransformationsImpl
    a2.logminer.trace = false
    a2.oracdc.schemas = false
    a2.persistent.state.file = /kafka/tempdir/oracdc.state
    a2.poll.interval = 2000
    a2.process.lobs = true
    a2.redo.count = 1
    a2.redo.size = 20000
    a2.resiliency.type = fault-tolerant
    a2.schema.type = kafka
    a2.standby.activate = false
    a2.standby.jdbc.url = 
    a2.standby.wallet.location = 
    a2.table.list.style = static
    a2.tmpdir = /kafka/tempdir
    a2.topic.name.delimiter = -
    a2.topic.name.style = SCHEMA_TABLE
    a2.topic.partition = 0
    a2.topic.prefix = 
    a2.use.rac = false
    a2.wallet.location = 
 (solutions.a2.cdc.oracle.OraCdcSourceConnectorConfig:347)
[2022-11-23 18:10:21,114] INFO [Producer clientId=connector-producer-logminer-source-testapps-0] Cluster ID: UT0FkBSwQRm0sSGMBI7VFg (org.apache.kafka.clients.Metadata:259)
[2022-11-23 18:10:21,149] INFO oracdc will process Oracle LOBs using solutions.a2.cdc.oracle.data.OraCdcDefaultLobTransformationsImpl LOB transformations implementation (solutions.a2.cdc.oracle.OraCdcLogMinerTask:204)
[2022-11-23 18:10:21,904] INFO Connector logminer-source-testapps connected to Oracle Database 11g Enterprise Edition , 11.2.0.4.0
    $ORACLE_SID=orcl1, running on orcl1, OS Linux x86 64-bit. (solutions.a2.cdc.oracle.OraCdcLogMinerTask:247)
[2022-11-23 18:10:21,927] INFO No data present in connector's offset storage for orcl1_orcl1:1503552930 (solutions.a2.cdc.oracle.OraCdcLogMinerTask:541)
[2022-11-23 18:10:21,927] INFO oracdc will start from minimum available SCN in V$ARCHIVED_LOG = 4213287. (solutions.a2.cdc.oracle.OraCdcLogMinerTask:559)
[2022-11-23 18:10:21,935] INFO Initializing oracdc logminer archivelog worker thread (solutions.a2.cdc.oracle.OraCdcLogMinerWorkerThread:138)
[2022-11-23 18:10:21,989] INFO LogMiner will start from SCN 4213287 (solutions.a2.cdc.oracle.OraCdcV$ArchivedLogImpl:105)
[2022-11-23 18:10:21,989] INFO Mining database orcl1 is in OPEN mode (solutions.a2.cdc.oracle.OraCdcLogMinerWorkerThread:250)
[2022-11-23 18:10:21,989] INFO Same database will be used for dictionary query and mining (solutions.a2.cdc.oracle.OraCdcLogMinerWorkerThread:252)
[2022-11-23 18:10:21,989] INFO RowPrefetch size for accessing V$LOGMNR_CONTENTS set to 32. (solutions.a2.cdc.oracle.OraCdcLogMinerWorkerThread:270)
[2022-11-23 18:10:21,993] INFO Adding archived log /home/oracle/archive_log/1_18_1119103460.dbf thread# 1 sequence# 18 first change number 4213287 next log first change 4405566 (solutions.a2.cdc.oracle.OraCdcV$ArchivedLogImpl:201)
[2022-11-23 18:10:22,055] INFO Initializing oracdc initial load thread (solutions.a2.cdc.oracle.OraCdcInitialLoadThread:64)
[2022-11-23 18:10:22,056] INFO DB cores available 2, Kafka Cores available 4. (solutions.a2.cdc.oracle.OraCdcInitialLoadThread:74)
[2022-11-23 18:10:22,056] INFO {} parallel loaders for select phase will be used. (solutions.a2.cdc.oracle.OraCdcInitialLoadThread:75)
[2022-11-23 18:10:22,057] INFO WorkerSourceTask{id=logminer-source-testapps-0} Source task finished initialization and start (org.apache.kafka.connect.runtime.WorkerSourceTask:215)
[2022-11-23 18:10:22,057] INFO BEGIN: OraCdcInitialLoadThread.run() (solutions.a2.cdc.oracle.OraCdcInitialLoadThread:86)
[2022-11-23 18:10:22,061] INFO BEGIN: OraCdcLogMinerWorkerThread.run() (solutions.a2.cdc.oracle.OraCdcLogMinerWorkerThread:338)
WARNING: An illegal reflective access operation has occurred
WARNING: Illegal reflective access using Lookup on net.openhft.chronicle.core.Jvm (file:/kafka/connect/lib/chronicle-core-2.21.95.jar) to class java.lang.reflect.AccessibleObject
WARNING: Please consider reporting this to the maintainers of net.openhft.chronicle.core.Jvm
WARNING: Use --illegal-access=warn to enable warnings of further illegal reflective access operations
WARNING: All illegal access operations will be denied in a future release
[2022-11-23 18:10:22,100] INFO Chronicle core loaded from file:/kafka/connect/lib/chronicle-core-2.21.95.jar (net.openhft.chronicle.core.Jvm:169)
Nov 23, 2022 6:10:22 PM net.openhft.chronicle.core.cleaner.impl.reflect.ReflectionBasedByteBufferCleanerService <clinit>
WARNING: Make sure you have set the command line option "--illegal-access=permit --add-exports java.base/jdk.internal.ref=ALL-UNNAMED" to enable ReflectionBasedByteBufferCleanerService
[2022-11-23 18:10:22,153] INFO Took 7 ms to add mapping for /kafka/tempdir/ABC.TEST.1389930030663974984/metadata.cq4t (net.openhft.chronicle.bytes.MappedFile:56)
[2022-11-23 18:10:22,211] INFO Running under OpenJDK Runtime Environment 11.0.8+10-LTS with 4 processors reported. (net.openhft.chronicle.core.internal.announcer.InternalAnnouncer:56)
[2022-11-23 18:10:22,213] INFO Process id: 13020 :: Chronicle Queue (5.21.99) (net.openhft.chronicle.core.internal.announcer.InternalAnnouncer:56)
[2022-11-23 18:10:22,214] INFO Analytics: Chronicle Queue reports usage statistics. Learn more or turn off: https://github.com/OpenHFT/Chronicle-Queue/blob/master/DISCLAIMER.adoc (net.openhft.chronicle.core.internal.announcer.InternalAnnouncer:56)
[2022-11-23 18:10:22,229] INFO Table ABC.TEST (DEPENDENCY='DISABLED') initial load (read phase) started. (solutions.a2.cdc.oracle.OraTable4InitialLoad:582)
[2022-11-23 18:10:22,234] INFO Table ABC.TEST initial load (read phase) completed. 0 rows read. (solutions.a2.cdc.oracle.OraTable4InitialLoad:597)
[2022-11-23 18:10:22,234] INFO END: OraCdcInitialLoadThread.run(), elapsed time 177 ms (solutions.a2.cdc.oracle.OraCdcInitialLoadThread:117)
[2022-11-23 18:10:24,057] INFO Table ABC.TEST initial load (send to Kafka phase) started. (solutions.a2.cdc.oracle.OraCdcLogMinerTask:807)
[2022-11-23 18:10:24,059] INFO Table ABC.TEST initial load (send to Kafka phase) completed. (solutions.a2.cdc.oracle.OraCdcLogMinerTask:821)
[2022-11-23 18:10:26,111] INFO Initial load completed (solutions.a2.cdc.oracle.OraCdcLogMinerTask:791)
[2022-11-23 18:10:31,113] INFO WorkerSourceTask{id=logminer-source-testapps-0} Committing offsets (org.apache.kafka.connect.runtime.WorkerSourceTask:422)
[2022-11-23 18:10:31,116] INFO WorkerSourceTask{id=logminer-source-testapps-0} flushing 0 outstanding messages for offset commit (org.apache.kafka.connect.runtime.WorkerSourceTask:439)
[2022-11-23 18:10:41,117] INFO WorkerSourceTask{id=logminer-source-testapps-0} Committing offsets (org.apache.kafka.connect.runtime.WorkerSourceTask:422)
[2022-11-23 18:10:41,117] INFO WorkerSourceTask{id=logminer-source-testapps-0} flushing 0 outstanding messages for offset commit (org.apache.kafka.connect.runtime.WorkerSourceTask:439)
[2022-11-23 18:10:41,931] INFO Adding archived log /home/oracle/archive_log/1_19_1119103460.dbf thread# 1 sequence# 19 first change number 4405566 next log first change 4621919 (solutions.a2.cdc.oracle.OraCdcV$ArchivedLogImpl:201)
[2022-11-23 18:10:42,934] INFO Took 2.375 ms to pollDiskSpace for /kafka/tempdir/03000A00F60F0000.3552841194382771644 (net.openhft.chronicle.threads.DiskSpaceMonitor:56)
[2022-11-23 18:10:42,988] INFO Took 0.296 ms to pollDiskSpace for /kafka/tempdir/09000700E60F0000.5176074577321270222 (net.openhft.chronicle.threads.DiskSpaceMonitor:56)
[2022-11-23 18:10:43,106] WARN Unsupported DDL operation 'truncate table test' at SCN 4421942 for object ID 87351 (solutions.a2.cdc.oracle.OraCdcLogMinerWorkerThread:605)
[2022-11-23 18:10:43,114] INFO Took 0.252 ms to pollDiskSpace for /kafka/tempdir/02000600EA0F0000.3665596025644710700 (net.openhft.chronicle.threads.DiskSpaceMonitor:56)
[2022-11-23 18:10:43,124] INFO Took 0.284 ms to pollDiskSpace for /kafka/tempdir/03001300F80F0000.14851711087510529675 (net.openhft.chronicle.threads.DiskSpaceMonitor:56)
[2022-11-23 18:10:43,140] INFO Took 0.381 ms to pollDiskSpace for /kafka/tempdir/060011009D100000.18223063542442876840 (net.openhft.chronicle.threads.DiskSpaceMonitor:56)
[2022-11-23 18:10:43,145] INFO Took 0.342 ms to pollDiskSpace for /kafka/tempdir/0A002000EB0B0000.3078809154074983254 (net.openhft.chronicle.threads.DiskSpaceMonitor:56)
[2022-11-23 18:10:51,117] INFO WorkerSourceTask{id=logminer-source-testapps-0} Committing offsets (org.apache.kafka.connect.runtime.WorkerSourceTask:422)
[2022-11-23 18:10:51,118] INFO WorkerSourceTask{id=logminer-source-testapps-0} flushing 0 outstanding messages for offset commit (org.apache.kafka.connect.runtime.WorkerSourceTask:439)
[2022-11-23 18:10:51,122] INFO WorkerSourceTask{id=logminer-source-testapps-0} Finished commitOffsets successfully in 4 ms (org.apache.kafka.connect.runtime.WorkerSourceTask:521)
[2022-11-23 18:11:01,002] INFO Adding archived log /home/oracle/archive_log/1_20_1119103460.dbf thread# 1 sequence# 20 first change number 4621919 next log first change 4811177 (solutions.a2.cdc.oracle.OraCdcV$ArchivedLogImpl:201)
[2022-11-23 18:11:01,123] INFO WorkerSourceTask{id=logminer-source-testapps-0} Committing offsets (org.apache.kafka.connect.runtime.WorkerSourceTask:422)
[2022-11-23 18:11:01,123] INFO WorkerSourceTask{id=logminer-source-testapps-0} flushing 0 outstanding messages for offset commit (org.apache.kafka.connect.runtime.WorkerSourceTask:439)
[2022-11-23 18:11:11,123] INFO WorkerSourceTask{id=logminer-source-testapps-0} Committing offsets (org.apache.kafka.connect.runtime.WorkerSourceTask:422)
[2022-11-23 18:11:11,123] INFO WorkerSourceTask{id=logminer-source-testapps-0} flushing 0 outstanding messages for offset commit (org.apache.kafka.connect.runtime.WorkerSourceTask:439)
[2022-11-23 18:11:13,089] INFO Took 0.26 ms to pollDiskSpace for /kafka/tempdir/03000E0016110000.5544017224543104765 (net.openhft.chronicle.threads.DiskSpaceMonitor:56)
[2022-11-23 18:11:21,002] INFO Adding archived log /home/oracle/archive_log/1_21_1119103460.dbf thread# 1 sequence# 21 first change number 4811177 next log first change 5064790 (solutions.a2.cdc.oracle.OraCdcV$ArchivedLogImpl:201)
[2022-11-23 18:11:21,124] INFO WorkerSourceTask{id=logminer-source-testapps-0} Committing offsets (org.apache.kafka.connect.runtime.WorkerSourceTask:422)
[2022-11-23 18:11:21,124] INFO WorkerSourceTask{id=logminer-source-testapps-0} flushing 0 outstanding messages for offset commit (org.apache.kafka.connect.runtime.WorkerSourceTask:439)
[2022-11-23 18:11:21,126] INFO WorkerSourceTask{id=logminer-source-testapps-0} Finished commitOffsets successfully in 2 ms (org.apache.kafka.connect.runtime.WorkerSourceTask:521)
[2022-11-23 18:11:31,126] INFO WorkerSourceTask{id=logminer-source-testapps-0} Committing offsets (org.apache.kafka.connect.runtime.WorkerSourceTask:422)
[2022-11-23 18:11:31,126] INFO WorkerSourceTask{id=logminer-source-testapps-0} flushing 0 outstanding messages for offset commit (org.apache.kafka.connect.runtime.WorkerSourceTask:439)
[2022-11-23 18:11:40,325] INFO Adding archived log /home/oracle/archive_log/1_22_1119103460.dbf thread# 1 sequence# 22 first change number 5064790 next log first change 5254983 (solutions.a2.cdc.oracle.OraCdcV$ArchivedLogImpl:201)
[2022-11-23 18:11:41,127] INFO WorkerSourceTask{id=logminer-source-testapps-0} Committing offsets (org.apache.kafka.connect.runtime.WorkerSourceTask:422)
[2022-11-23 18:11:41,127] INFO WorkerSourceTask{id=logminer-source-testapps-0} flushing 0 outstanding messages for offset commit (org.apache.kafka.connect.runtime.WorkerSourceTask:439)
[2022-11-23 18:11:51,127] INFO WorkerSourceTask{id=logminer-source-testapps-0} Committing offsets (org.apache.kafka.connect.runtime.WorkerSourceTask:422)
[2022-11-23 18:11:51,127] INFO WorkerSourceTask{id=logminer-source-testapps-0} flushing 0 outstanding messages for offset commit (org.apache.kafka.connect.runtime.WorkerSourceTask:439)
[2022-11-23 18:11:59,195] INFO Adding archived log /home/oracle/archive_log/1_23_1119103460.dbf thread# 1 sequence# 23 first change number 5254983 next log first change 5503415 (solutions.a2.cdc.oracle.OraCdcV$ArchivedLogImpl:201)
[2022-11-23 18:12:01,128] INFO WorkerSourceTask{id=logminer-source-testapps-0} Committing offsets (org.apache.kafka.connect.runtime.WorkerSourceTask:422)
[2022-11-23 18:12:01,128] INFO WorkerSourceTask{id=logminer-source-testapps-0} flushing 0 outstanding messages for offset commit (org.apache.kafka.connect.runtime.WorkerSourceTask:439)
[2022-11-23 18:12:11,129] INFO WorkerSourceTask{id=logminer-source-testapps-0} Committing offsets (org.apache.kafka.connect.runtime.WorkerSourceTask:422)
[2022-11-23 18:12:11,129] INFO WorkerSourceTask{id=logminer-source-testapps-0} flushing 0 outstanding messages for offset commit (org.apache.kafka.connect.runtime.WorkerSourceTask:439)

I tested this table:

insert into test values(10)

The data obtained by the consumer is as follows:

[2022-11-23 18:10:30,868] WARN [Consumer clientId=consumer-console-consumer-82334-1, groupId=console-consumer-82334] Error while fetching metadata with correlation id 2 : {ABC-TEST=LEADER_NOT_AVAILABLE} (org.apache.kafka.clients.NetworkClient)
{"schema":{"type":"struct","fields":[{"type":"string","optional":true,"field":"MYID"}],"optional":true,"name":"ABC.TEST.Value","version":1},"payload":{"MYID":"1"}}
{"schema":{"type":"struct","fields":[{"type":"string","optional":true,"field":"MYID"}],"optional":true,"name":"ABC.TEST.Value","version":1},"payload":{"MYID":"1"}}
{"schema":{"type":"struct","fields":[{"type":"string","optional":true,"field":"MYID"}],"optional":true,"name":"ABC.TEST.Value","version":1},"payload":{"MYID":"1"}}
{"schema":{"type":"struct","fields":[{"type":"string","optional":true,"field":"MYID"}],"optional":true,"name":"ABC.TEST.Value","version":1},"payload":{"MYID":"1"}}
{"schema":{"type":"struct","fields":[{"type":"string","optional":true,"field":"MYID"}],"optional":true,"name":"ABC.TEST.Value","version":1},"payload":{"MYID":"1"}}
{"schema":{"type":"struct","fields":[{"type":"string","optional":true,"field":"MYID"}],"optional":true,"name":"ABC.TEST.Value","version":1},"payload":{"MYID":"1"}}
{"schema":{"type":"struct","fields":[{"type":"string","optional":true,"field":"MYID"}],"optional":true,"name":"ABC.TEST.Value","version":1},"payload":{"MYID":"1"}}
{"schema":{"type":"struct","fields":[{"type":"string","optional":true,"field":"MYID"}],"optional":true,"name":"ABC.TEST.Value","version":1},"payload":{"MYID":"1"}}
{"schema":{"type":"struct","fields":[{"type":"string","optional":true,"field":"MYID"}],"optional":true,"name":"ABC.TEST.Value","version":1},"payload":{"MYID":"1"}}
{"schema":{"type":"struct","fields":[{"type":"string","optional":true,"field":"MYID"}],"optional":true,"name":"ABC.TEST.Value","version":1},"payload":{"MYID":"1"}}
{"schema":{"type":"struct","fields":[{"type":"string","optional":true,"field":"MYID"}],"optional":true,"name":"ABC.TEST.Value","version":1},"payload":{"MYID":"1"}}
averemee-si commented 1 year ago

Hello,

To clean everything up, just rename the connector from logminer-source-testapps to a new name, for instance logminer-source-testapps-00, as in the sketch below. It is expected that /kafka/tempdir/ is empty, as I wrote before. If you like, we can schedule a Google Meet session (I'm located in the CET time zone) to speed up the resolution of this issue.
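
A minimal sketch of that rename, assuming the connector is still run from its standalone properties file (all other properties unchanged and omitted here). Kafka Connect keys stored source offsets by connector name, so the renamed connector starts from a clean offset state:

# hypothetical excerpt from the connector's standalone properties file
# old:
#   name=logminer-source-testapps
# the renamed connector has no stored offsets under the new name
name=logminer-source-testapps-00
connector.class=solutions.a2.cdc.oracle.OraCdcLogMinerConnector

After the rename, restart the standalone worker with the same worker and connector properties files; with /kafka/tempdir/ empty, as noted above, the task rebuilds its state from scratch.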

Best regards, Aleksei