apache / flink-cdc

Flink CDC is a streaming data integration tool
https://nightlies.apache.org/flink/flink-cdc-docs-stable
Apache License 2.0

What causes this error (io.debezium.DebeziumException: Failed to deserialize data of EventHeaderV4{timestamp=1627541881000,)? #276

Closed hejiay closed 3 years ago

hejiay commented 3 years ago

org.apache.kafka.connect.errors.ConnectException: An exception occurred in the change event producer. This connector will be stopped.
  at io.debezium.pipeline.ErrorHandler.setProducerThrowable(ErrorHandler.java:42)
  at io.debezium.connector.mysql.MySqlStreamingChangeEventSource$ReaderThreadLifecycleListener.onEventDeserializationFailure(MySqlStreamingChangeEventSource.java:1189)
  at com.github.shyiko.mysql.binlog.BinaryLogClient.listenForEventPackets(BinaryLogClient.java:958)
  at com.github.shyiko.mysql.binlog.BinaryLogClient.connect(BinaryLogClient.java:606)
  at com.github.shyiko.mysql.binlog.BinaryLogClient$7.run(BinaryLogClient.java:850)
  at java.lang.Thread.run(Thread.java:748)
Caused by: io.debezium.DebeziumException: Failed to deserialize data of EventHeaderV4{timestamp=1627541881000, eventType=WRITE_ROWS, serverId=2, headerLength=19, dataLength=15, nextPosition=8448, flags=0}
  at io.debezium.connector.mysql.MySqlStreamingChangeEventSource.wrap(MySqlStreamingChangeEventSource.java:1142)
  ... 5 more
Caused by: com.github.shyiko.mysql.binlog.event.deserialization.EventDataDeserializationException: Failed to deserialize data of EventHeaderV4{timestamp=1627541881000, eventType=WRITE_ROWS, serverId=2, headerLength=19, dataLength=15, nextPosition=8448, flags=0}
  at com.github.shyiko.mysql.binlog.event.deserialization.EventDeserializer.deserializeEventData(EventDeserializer.java:309)
  at com.github.shyiko.mysql.binlog.event.deserialization.EventDeserializer.nextEvent(EventDeserializer.java:232)
  at io.debezium.connector.mysql.MySqlStreamingChangeEventSource$1.nextEvent(MySqlStreamingChangeEventSource.java:233)
  at com.github.shyiko.mysql.binlog.BinaryLogClient.listenForEventPackets(BinaryLogClient.java:945)
  ... 3 more
Caused by: com.github.shyiko.mysql.binlog.event.deserialization.MissingTableMapEventException: No TableMapEventData has been found for table id:65. Usually that means that you have started reading binary log 'within the logical event group' (e.g. from WRITE_ROWS and not proceeding TABLE_MAP
  at com.github.shyiko.mysql.binlog.event.deserialization.AbstractRowsEventDataDeserializer.deserializeRow(AbstractRowsEventDataDeserializer.java:109)
  at com.github.shyiko.mysql.binlog.event.deserialization.WriteRowsEventDataDeserializer.deserializeRows(WriteRowsEventDataDeserializer.java:64)
  at com.github.shyiko.mysql.binlog.event.deserialization.WriteRowsEventDataDeserializer.deserialize(WriteRowsEventDataDeserializer.java:56)
  at com.github.shyiko.mysql.binlog.event.deserialization.WriteRowsEventDataDeserializer.deserialize(WriteRowsEventDataDeserializer.java:32)
  at com.github.shyiko.mysql.binlog.event.deserialization.EventDeserializer.deserializeEventData(EventDeserializer.java:303)
  ... 6 more
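The root cause named in the last frame is worth unpacking: in the MySQL binlog, row events (WRITE_ROWS/UPDATE_ROWS/DELETE_ROWS) carry only a numeric table id, while the column metadata needed to decode them arrives in a preceding TABLE_MAP event of the same logical event group. A reader that resumes from an offset inside the group sees the rows event without ever having cached the TABLE_MAP, and deserialization fails. A toy sketch of that dependency (simplified event model, not the real binlog wire format):

```python
class MissingTableMapEventError(Exception):
    """Raised when a rows event references a table id with no cached TABLE_MAP."""

class BinlogReader:
    """Toy model of the TABLE_MAP / rows-event dependency in a binlog stream."""

    def __init__(self):
        self.table_map = {}  # table id -> column metadata, filled by TABLE_MAP events

    def on_event(self, event_type, table_id, payload=None):
        if event_type == "TABLE_MAP":
            # Cache metadata so following rows events in the group can be decoded.
            self.table_map[table_id] = payload
            return None
        if event_type in ("WRITE_ROWS", "UPDATE_ROWS", "DELETE_ROWS"):
            if table_id not in self.table_map:
                # The situation behind MissingTableMapEventException: reading
                # started inside the logical event group, after the TABLE_MAP.
                raise MissingTableMapEventError(
                    f"No TableMapEventData has been found for table id:{table_id}")
            return ("decoded", self.table_map[table_id], payload)
        return None

# Resuming at the start of the group works:
reader = BinlogReader()
reader.on_event("TABLE_MAP", 65, ["id INT", "name VARCHAR"])
reader.on_event("WRITE_ROWS", 65, b"row-bytes")

# Resuming mid-group (rows event seen first) reproduces the failure:
fresh = BinlogReader()
try:
    fresh.on_event("WRITE_ROWS", 65, b"row-bytes")
    raise AssertionError("expected MissingTableMapEventError")
except MissingTableMapEventError:
    pass
```

This is why restarting the connector (so it re-syncs to a group boundary) usually clears the error, and why resuming from an arbitrary stored offset can trigger it.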

enterwhat commented 3 years ago

Did you solve it? What was the cause?

ericshao commented 3 years ago


The same issue occurred here (identical stack trace).

hejiay commented 3 years ago

Did you solve it? What was the cause?

Not yet. Have you solved it?

leonardBang commented 3 years ago

I will fix this one, guys

leonardBang commented 3 years ago

Posting the detailed information:

java.lang.RuntimeException: One or more fetchers have encountered exception
  at org.apache.flink.connector.base.source.reader.fetcher.SplitFetcherManager.checkErrors(SplitFetcherManager.java:199)
  at org.apache.flink.connector.base.source.reader.SourceReaderBase.getNextFetch(SourceReaderBase.java:163)
  at org.apache.flink.connector.base.source.reader.SourceReaderBase.pollNext(SourceReaderBase.java:125)
  at org.apache.flink.connector.base.source.reader.SourceReaderBase.pollNext(SourceReaderBase.java:150)
  at org.apache.flink.streaming.api.operators.SourceOperator.emitNext(SourceOperator.java:314)
  at org.apache.flink.streaming.runtime.io.StreamTaskSourceInput.emitNext(StreamTaskSourceInput.java:69)
  at org.apache.flink.streaming.runtime.io.StreamOneInputProcessor.processInput(StreamOneInputProcessor.java:66)
  at org.apache.flink.streaming.runtime.tasks.StreamTask.processInput(StreamTask.java:423)
  at org.apache.flink.streaming.runtime.tasks.mailbox.MailboxProcessor.runMailboxLoop(MailboxProcessor.java:204)
  at org.apache.flink.streaming.runtime.tasks.StreamTask.runMailboxLoop(StreamTask.java:684)
  at org.apache.flink.streaming.runtime.tasks.StreamTask.executeInvoke(StreamTask.java:639)
  at org.apache.flink.streaming.runtime.tasks.StreamTask.runWithCleanUpOnFail(StreamTask.java:650)
  at org.apache.flink.streaming.runtime.tasks.StreamTask.invoke(StreamTask.java:623)
  at org.apache.flink.runtime.taskmanager.Task.doRun(Task.java:779)
  at org.apache.flink.runtime.taskmanager.Task.run(Task.java:566)
  at java.lang.Thread.run(Thread.java:834)
Caused by: java.lang.RuntimeException: SplitFetcher thread 0 received unexpected exception while polling the records
  at org.apache.flink.connector.base.source.reader.fetcher.SplitFetcher.runOnce(SplitFetcher.java:146)
  at org.apache.flink.connector.base.source.reader.fetcher.SplitFetcher.run(SplitFetcher.java:101)
  at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
  at java.util.concurrent.FutureTask.run(FutureTask.java:266)
  at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1147)
  at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:622)
  ... 1 more
Caused by: org.apache.kafka.connect.errors.ConnectException: An exception occurred in the change event producer. This connector will be stopped.
  at io.debezium.pipeline.ErrorHandler.setProducerThrowable(ErrorHandler.java:42)
  at io.debezium.connector.mysql.MySqlStreamingChangeEventSource$ReaderThreadLifecycleListener.onEventDeserializationFailure(MySqlStreamingChangeEventSource.java:1189)
  at com.github.shyiko.mysql.binlog.BinaryLogClient.listenForEventPackets(BinaryLogClient.java:958)
  at com.github.shyiko.mysql.binlog.BinaryLogClient.connect(BinaryLogClient.java:606)
  at com.github.shyiko.mysql.binlog.BinaryLogClient$7.run(BinaryLogClient.java:850)
  ... 1 more
Caused by: io.debezium.DebeziumException: Failed to deserialize data of EventHeaderV4{timestamp=1630293411000, eventType=UPDATE_ROWS, serverId=958258332, headerLength=19, dataLength=1119, nextPosition=2607515, flags=0}
  at io.debezium.connector.mysql.MySqlStreamingChangeEventSource.wrap(MySqlStreamingChangeEventSource.java:1142)
  ... 5 more
Caused by: com.github.shyiko.mysql.binlog.event.deserialization.EventDataDeserializationException: Failed to deserialize data of EventHeaderV4{timestamp=1630293411000, eventType=UPDATE_ROWS, serverId=958258332, headerLength=19, dataLength=1119, nextPosition=2607515, flags=0}
  at com.github.shyiko.mysql.binlog.event.deserialization.EventDeserializer.deserializeEventData(EventDeserializer.java:309)
  at com.github.shyiko.mysql.binlog.event.deserialization.EventDeserializer.nextEvent(EventDeserializer.java:232)
  at io.debezium.connector.mysql.MySqlStreamingChangeEventSource$1.nextEvent(MySqlStreamingChangeEventSource.java:233)
  at com.github.shyiko.mysql.binlog.BinaryLogClient.listenForEventPackets(BinaryLogClient.java:945)
  ... 3 more
Caused by: com.github.shyiko.mysql.binlog.event.deserialization.MissingTableMapEventException: No TableMapEventData has been found for table id:536. Usually that means that you have started reading binary log 'within the logical event group' (e.g. from WRITE_ROWS and not proceeding TABLE_MAP
  at com.github.shyiko.mysql.binlog.event.deserialization.AbstractRowsEventDataDeserializer.deserializeRow(AbstractRowsEventDataDeserializer.java:109)
  at com.github.shyiko.mysql.binlog.event.deserialization.UpdateRowsEventDataDeserializer.deserializeRows(UpdateRowsEventDataDeserializer.java:71)
  at com.github.shyiko.mysql.binlog.event.deserialization.UpdateRowsEventDataDeserializer.deserialize(UpdateRowsEventDataDeserializer.java:58)
  at com.github.shyiko.mysql.binlog.event.deserialization.UpdateRowsEventDataDeserializer.deserialize(UpdateRowsEventDataDeserializer.java:33)
  at com.github.shyiko.mysql.binlog.event.deserialization.EventDeserializer.deserializeEventData(EventDeserializer.java:303)
  ... 6 more

findczk commented 3 years ago

I will fix this one, guys

When will it be fixed?

leonardBang commented 3 years ago

@findczk Are you hitting this issue in a production environment?

findczk commented 3 years ago

@findczk Are you hitting this issue in a production environment?

Hi! We were preparing to use it in a production environment, but stopped because of this issue.

dczeeee commented 3 years ago

same problem

hunter-jin commented 3 years ago

same problem

findczk commented 3 years ago

I will fix this one, guys

Hi, when will it be fixed? We look forward to this issue being fixed.

wuchong commented 3 years ago

(screenshot attachment, not preserved)

leonardBang commented 3 years ago

(screenshot attachment, not preserved)

leonardBang commented 3 years ago

Caused by: com.github.shyiko.mysql.binlog.event.deserialization.MissingTableMapEventException: No TableMapEventData has been found for table id:203830. Usually that means that you have started reading binary log 'within the logical event group' (e.g. from WRITE_ROWS and not proceeding TABLE_MAP

peterjqy commented 3 years ago

Has this bug been fixed yet?

leonardBang commented 3 years ago

(screenshot attachment, not preserved)

iknowitisawar commented 2 years ago

(screenshot attachment, not preserved)

2.2.0 still has this problem... sigh...

leonardBang commented 2 years ago

2.2.0 still has this problem... sigh...

Could you share your MySQL version and configuration? It would be great if you could offer more details to reproduce the issue.

l1183479157 commented 2 years ago

Has this not been solved yet? I'm also hitting this problem on version 2.2.1.

duanminhui commented 2 years ago

Looking forward to a fix for this. In extreme cases, when MySQL is under heavy load, this problem always appears.

xuhaiL commented 2 years ago

After this exception occurs, data is lost. Has anyone else observed that?

enterwhat commented 2 years ago

I ran into this before; preliminary investigation pointed to a data read/write timeout on the source or sink side.


ChenShuai1981 commented 2 years ago

I hit the same problem with Flink 1.13.2 + CDC 2.1.0 consuming MySQL v5.7.26 binlog. After three successful checkpoints the job started restarting frequently; the error log is as follows:

2022-10-17 11:48:21,621 ERROR io.debezium.connector.mysql.MySqlStreamingChangeEventSource  [] - Error during binlog processing. Last offset stored = null, binlog reader near position = mysql-bin.009532/63197184
2022-10-17 11:48:22,308 ERROR io.debezium.pipeline.ErrorHandler                            [] - Producer failure
io.debezium.DebeziumException: Failed to deserialize data of EventHeaderV4{timestamp=1665978065000, eventType=UPDATE_ROWS, serverId=1940348705, headerLength=19, dataLength=2201, nextPosition=63199535, flags=0}
        at io.debezium.connector.mysql.MySqlStreamingChangeEventSource.wrap(MySqlStreamingChangeEventSource.java:1146) ~[blob_p-a536766096736050a4a547179c8b9e0478fae327-0ab3c26039e8cbf9fe0d0837d5972cc8:?]
        at io.debezium.connector.mysql.MySqlStreamingChangeEventSource$ReaderThreadLifecycleListener.onCommunicationFailure(MySqlStreamingChangeEventSource.java:1185) [blob_p-a536766096736050a4a547179c8b9e0478fae327-0ab3c26039e8cbf9fe0d0837d5972cc8:?]
        at com.github.shyiko.mysql.binlog.BinaryLogClient.listenForEventPackets(BinaryLogClient.java:973) [blob_p-a536766096736050a4a547179c8b9e0478fae327-0ab3c26039e8cbf9fe0d0837d5972cc8:?]
        at com.github.shyiko.mysql.binlog.BinaryLogClient.connect(BinaryLogClient.java:606) [blob_p-a536766096736050a4a547179c8b9e0478fae327-0ab3c26039e8cbf9fe0d0837d5972cc8:?]
        at com.github.shyiko.mysql.binlog.BinaryLogClient$7.run(BinaryLogClient.java:850) [blob_p-a536766096736050a4a547179c8b9e0478fae327-0ab3c26039e8cbf9fe0d0837d5972cc8:?]
        at java.lang.Thread.run(Thread.java:748) [?:1.8.0_312]
Caused by: com.github.shyiko.mysql.binlog.event.deserialization.EventDataDeserializationException: Failed to deserialize data of EventHeaderV4{timestamp=1665978065000, eventType=UPDATE_ROWS, serverId=1940348705, headerLength=19, dataLength=2201, nextPosition=63199535, flags=0}
        at com.github.shyiko.mysql.binlog.event.deserialization.EventDeserializer.deserializeEventData(EventDeserializer.java:309) ~[blob_p-a536766096736050a4a547179c8b9e0478fae327-0ab3c26039e8cbf9fe0d0837d5972cc8:?]
        at com.github.shyiko.mysql.binlog.event.deserialization.EventDeserializer.nextEvent(EventDeserializer.java:232) ~[blob_p-a536766096736050a4a547179c8b9e0478fae327-0ab3c26039e8cbf9fe0d0837d5972cc8:?]
        at io.debezium.connector.mysql.MySqlStreamingChangeEventSource$1.nextEvent(MySqlStreamingChangeEventSource.java:233) ~[blob_p-a536766096736050a4a547179c8b9e0478fae327-0ab3c26039e8cbf9fe0d0837d5972cc8:?]
        at com.github.shyiko.mysql.binlog.BinaryLogClient.listenForEventPackets(BinaryLogClient.java:945) ~[blob_p-a536766096736050a4a547179c8b9e0478fae327-0ab3c26039e8cbf9fe0d0837d5972cc8:?]
        ... 3 more
Caused by: java.io.EOFException
        at com.github.shyiko.mysql.binlog.io.ByteArrayInputStream.fill(ByteArrayInputStream.java:113) ~[blob_p-a536766096736050a4a547179c8b9e0478fae327-0ab3c26039e8cbf9fe0d0837d5972cc8:?]
        at com.github.shyiko.mysql.binlog.io.ByteArrayInputStream.read(ByteArrayInputStream.java:104) ~[blob_p-a536766096736050a4a547179c8b9e0478fae327-0ab3c26039e8cbf9fe0d0837d5972cc8:?]
        at io.debezium.connector.mysql.RowDeserializers.deserializeVarString(RowDeserializers.java:264) ~[blob_p-a536766096736050a4a547179c8b9e0478fae327-0ab3c26039e8cbf9fe0d0837d5972cc8:?]
        at io.debezium.connector.mysql.RowDeserializers$UpdateRowsDeserializer.deserializeVarString(RowDeserializers.java:130) ~[blob_p-a536766096736050a4a547179c8b9e0478fae327-0ab3c26039e8cbf9fe0d0837d5972cc8:?]
        at com.github.shyiko.mysql.binlog.event.deserialization.AbstractRowsEventDataDeserializer.deserializeCell(AbstractRowsEventDataDeserializer.java:189) ~[blob_p-a536766096736050a4a547179c8b9e0478fae327-0ab3c26039e8cbf9fe0d0837d5972cc8:?]
        at com.github.shyiko.mysql.binlog.event.deserialization.AbstractRowsEventDataDeserializer.deserializeRow(AbstractRowsEventDataDeserializer.java:143) ~[blob_p-a536766096736050a4a547179c8b9e0478fae327-0ab3c26039e8cbf9fe0d0837d5972cc8:?]
        at com.github.shyiko.mysql.binlog.event.deserialization.UpdateRowsEventDataDeserializer.deserializeRows(UpdateRowsEventDataDeserializer.java:72) ~[blob_p-a536766096736050a4a547179c8b9e0478fae327-0ab3c26039e8cbf9fe0d0837d5972cc8:?]
        at com.github.shyiko.mysql.binlog.event.deserialization.UpdateRowsEventDataDeserializer.deserialize(UpdateRowsEventDataDeserializer.java:58) ~[blob_p-a536766096736050a4a547179c8b9e0478fae327-0ab3c26039e8cbf9fe0d0837d5972cc8:?]
        at com.github.shyiko.mysql.binlog.event.deserialization.UpdateRowsEventDataDeserializer.deserialize(UpdateRowsEventDataDeserializer.java:33) ~[blob_p-a536766096736050a4a547179c8b9e0478fae327-0ab3c26039e8cbf9fe0d0837d5972cc8:?]
        at com.github.shyiko.mysql.binlog.event.deserialization.EventDeserializer.deserializeEventData(EventDeserializer.java:303) ~[blob_p-a536766096736050a4a547179c8b9e0478fae327-0ab3c26039e8cbf9fe0d0837d5972cc8:?]
        at com.github.shyiko.mysql.binlog.event.deserialization.EventDeserializer.nextEvent(EventDeserializer.java:232) ~[blob_p-a536766096736050a4a547179c8b9e0478fae327-0ab3c26039e8cbf9fe0d0837d5972cc8:?]
        at io.debezium.connector.mysql.MySqlStreamingChangeEventSource$1.nextEvent(MySqlStreamingChangeEventSource.java:233) ~[blob_p-a536766096736050a4a547179c8b9e0478fae327-0ab3c26039e8cbf9fe0d0837d5972cc8:?]
        at com.github.shyiko.mysql.binlog.BinaryLogClient.listenForEventPackets(BinaryLogClient.java:945) ~[blob_p-a536766096736050a4a547179c8b9e0478fae327-0ab3c26039e8cbf9fe0d0837d5972cc8:?]
        ... 3 more
2022-10-17 11:48:23,348 ERROR org.apache.flink.connector.base.source.reader.fetcher.SplitFetcherManager [] - Received uncaught exception.
java.lang.RuntimeException: SplitFetcher thread 1 received unexpected exception while polling the records
        at org.apache.flink.connector.base.source.reader.fetcher.SplitFetcher.runOnce(SplitFetcher.java:146) ~[flink-table-blink_2.12-1.13.2.jar:1.13.2]
        at org.apache.flink.connector.base.source.reader.fetcher.SplitFetcher.run(SplitFetcher.java:101) [flink-table-blink_2.12-1.13.2.jar:1.13.2]
        at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) [?:1.8.0_312]
        at java.util.concurrent.FutureTask.run(FutureTask.java:266) [?:1.8.0_312]
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) [?:1.8.0_312]
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) [?:1.8.0_312]
        at java.lang.Thread.run(Thread.java:748) [?:1.8.0_312]
Caused by: com.ververica.cdc.connectors.shaded.org.apache.kafka.connect.errors.ConnectException: An exception occurred in the change event producer. This connector will be stopped.
        at io.debezium.pipeline.ErrorHandler.setProducerThrowable(ErrorHandler.java:42) ~[blob_p-a536766096736050a4a547179c8b9e0478fae327-0ab3c26039e8cbf9fe0d0837d5972cc8:?]
        at com.ververica.cdc.connectors.mysql.debezium.task.context.MySqlErrorHandler.setProducerThrowable(MySqlErrorHandler.java:72) ~[blob_p-a536766096736050a4a547179c8b9e0478fae327-0ab3c26039e8cbf9fe0d0837d5972cc8:?]
        at io.debezium.connector.mysql.MySqlStreamingChangeEventSource$ReaderThreadLifecycleListener.onCommunicationFailure(MySqlStreamingChangeEventSource.java:1185) ~[blob_p-a536766096736050a4a547179c8b9e0478fae327-0ab3c26039e8cbf9fe0d0837d5972cc8:?]
        at com.github.shyiko.mysql.binlog.BinaryLogClient.listenForEventPackets(BinaryLogClient.java:973) ~[blob_p-a536766096736050a4a547179c8b9e0478fae327-0ab3c26039e8cbf9fe0d0837d5972cc8:?]
        at com.github.shyiko.mysql.binlog.BinaryLogClient.connect(BinaryLogClient.java:606) ~[blob_p-a536766096736050a4a547179c8b9e0478fae327-0ab3c26039e8cbf9fe0d0837d5972cc8:?]
        at com.github.shyiko.mysql.binlog.BinaryLogClient$7.run(BinaryLogClient.java:850) ~[blob_p-a536766096736050a4a547179c8b9e0478fae327-0ab3c26039e8cbf9fe0d0837d5972cc8:?]
        ... 1 more
Caused by: io.debezium.DebeziumException: Failed to deserialize data of EventHeaderV4{timestamp=1665978065000, eventType=UPDATE_ROWS, serverId=1940348705, headerLength=19, dataLength=2201, nextPosition=63199535, flags=0}
        at io.debezium.connector.mysql.MySqlStreamingChangeEventSource.wrap(MySqlStreamingChangeEventSource.java:1146) ~[blob_p-a536766096736050a4a547179c8b9e0478fae327-0ab3c26039e8cbf9fe0d0837d5972cc8:?]
        at io.debezium.connector.mysql.MySqlStreamingChangeEventSource$ReaderThreadLifecycleListener.onCommunicationFailure(MySqlStreamingChangeEventSource.java:1185) ~[blob_p-a536766096736050a4a547179c8b9e0478fae327-0ab3c26039e8cbf9fe0d0837d5972cc8:?]
        at com.github.shyiko.mysql.binlog.BinaryLogClient.listenForEventPackets(BinaryLogClient.java:973) ~[blob_p-a536766096736050a4a547179c8b9e0478fae327-0ab3c26039e8cbf9fe0d0837d5972cc8:?]
        at com.github.shyiko.mysql.binlog.BinaryLogClient.connect(BinaryLogClient.java:606) ~[blob_p-a536766096736050a4a547179c8b9e0478fae327-0ab3c26039e8cbf9fe0d0837d5972cc8:?]
        at com.github.shyiko.mysql.binlog.BinaryLogClient$7.run(BinaryLogClient.java:850) ~[blob_p-a536766096736050a4a547179c8b9e0478fae327-0ab3c26039e8cbf9fe0d0837d5972cc8:?]
        ... 1 more
Caused by: com.github.shyiko.mysql.binlog.event.deserialization.EventDataDeserializationException: Failed to deserialize data of EventHeaderV4{timestamp=1665978065000, eventType=UPDATE_ROWS, serverId=1940348705, headerLength=19, dataLength=2201, nextPosition=63199535, flags=0}
        at com.github.shyiko.mysql.binlog.event.deserialization.EventDeserializer.deserializeEventData(EventDeserializer.java:309) ~[blob_p-a536766096736050a4a547179c8b9e0478fae327-0ab3c26039e8cbf9fe0d0837d5972cc8:?]
        at com.github.shyiko.mysql.binlog.event.deserialization.EventDeserializer.nextEvent(EventDeserializer.java:232) ~[blob_p-a536766096736050a4a547179c8b9e0478fae327-0ab3c26039e8cbf9fe0d0837d5972cc8:?]
        at io.debezium.connector.mysql.MySqlStreamingChangeEventSource$1.nextEvent(MySqlStreamingChangeEventSource.java:233) ~[blob_p-a536766096736050a4a547179c8b9e0478fae327-0ab3c26039e8cbf9fe0d0837d5972cc8:?]
        at com.github.shyiko.mysql.binlog.BinaryLogClient.listenForEventPackets(BinaryLogClient.java:945) ~[blob_p-a536766096736050a4a547179c8b9e0478fae327-0ab3c26039e8cbf9fe0d0837d5972cc8:?]
        at com.github.shyiko.mysql.binlog.BinaryLogClient.connect(BinaryLogClient.java:606) ~[blob_p-a536766096736050a4a547179c8b9e0478fae327-0ab3c26039e8cbf9fe0d0837d5972cc8:?]
        at com.github.shyiko.mysql.binlog.BinaryLogClient$7.run(BinaryLogClient.java:850) ~[blob_p-a536766096736050a4a547179c8b9e0478fae327-0ab3c26039e8cbf9fe0d0837d5972cc8:?]
        ... 1 more
Caused by: java.io.EOFException
        at com.github.shyiko.mysql.binlog.io.ByteArrayInputStream.fill(ByteArrayInputStream.java:113) ~[blob_p-a536766096736050a4a547179c8b9e0478fae327-0ab3c26039e8cbf9fe0d0837d5972cc8:?]
        at com.github.shyiko.mysql.binlog.io.ByteArrayInputStream.read(ByteArrayInputStream.java:104) ~[blob_p-a536766096736050a4a547179c8b9e0478fae327-0ab3c26039e8cbf9fe0d0837d5972cc8:?]
        at io.debezium.connector.mysql.RowDeserializers.deserializeVarString(RowDeserializers.java:264) ~[blob_p-a536766096736050a4a547179c8b9e0478fae327-0ab3c26039e8cbf9fe0d0837d5972cc8:?]
        at io.debezium.connector.mysql.RowDeserializers$UpdateRowsDeserializer.deserializeVarString(RowDeserializers.java:130) ~[blob_p-a536766096736050a4a547179c8b9e0478fae327-0ab3c26039e8cbf9fe0d0837d5972cc8:?]
        at com.github.shyiko.mysql.binlog.event.deserialization.AbstractRowsEventDataDeserializer.deserializeCell(AbstractRowsEventDataDeserializer.java:189) ~[blob_p-a536766096736050a4a547179c8b9e0478fae327-0ab3c26039e8cbf9fe0d0837d5972cc8:?]
        at com.github.shyiko.mysql.binlog.event.deserialization.AbstractRowsEventDataDeserializer.deserializeRow(AbstractRowsEventDataDeserializer.java:143) ~[blob_p-a536766096736050a4a547179c8b9e0478fae327-0ab3c26039e8cbf9fe0d0837d5972cc8:?]
        at com.github.shyiko.mysql.binlog.event.deserialization.UpdateRowsEventDataDeserializer.deserializeRows(UpdateRowsEventDataDeserializer.java:72) ~[blob_p-a536766096736050a4a547179c8b9e0478fae327-0ab3c26039e8cbf9fe0d0837d5972cc8:?]
        at com.github.shyiko.mysql.binlog.event.deserialization.UpdateRowsEventDataDeserializer.deserialize(UpdateRowsEventDataDeserializer.java:58) ~[blob_p-a536766096736050a4a547179c8b9e0478fae327-0ab3c26039e8cbf9fe0d0837d5972cc8:?]
        at com.github.shyiko.mysql.binlog.event.deserialization.UpdateRowsEventDataDeserializer.deserialize(UpdateRowsEventDataDeserializer.java:33) ~[blob_p-a536766096736050a4a547179c8b9e0478fae327-0ab3c26039e8cbf9fe0d0837d5972cc8:?]
        at com.github.shyiko.mysql.binlog.event.deserialization.EventDeserializer.deserializeEventData(EventDeserializer.java:303) ~[blob_p-a536766096736050a4a547179c8b9e0478fae327-0ab3c26039e8cbf9fe0d0837d5972cc8:?]
        at com.github.shyiko.mysql.binlog.event.deserialization.EventDeserializer.nextEvent(EventDeserializer.java:232) ~[blob_p-a536766096736050a4a547179c8b9e0478fae327-0ab3c26039e8cbf9fe0d0837d5972cc8:?]
        at io.debezium.connector.mysql.MySqlStreamingChangeEventSource$1.nextEvent(MySqlStreamingChangeEventSource.java:233) ~[blob_p-a536766096736050a4a547179c8b9e0478fae327-0ab3c26039e8cbf9fe0d0837d5972cc8:?]
        at com.github.shyiko.mysql.binlog.BinaryLogClient.listenForEventPackets(BinaryLogClient.java:945) ~[blob_p-a536766096736050a4a547179c8b9e0478fae327-0ab3c26039e8cbf9fe0d0837d5972cc8:?]
        at com.github.shyiko.mysql.binlog.BinaryLogClient.connect(BinaryLogClient.java:606) ~[blob_p-a536766096736050a4a547179c8b9e0478fae327-0ab3c26039e8cbf9fe0d0837d5972cc8:?]
        at com.github.shyiko.mysql.binlog.BinaryLogClient$7.run(BinaryLogClient.java:850) ~[blob_p-a536766096736050a4a547179c8b9e0478fae327-0ab3c26039e8cbf9fe0d0837d5972cc8:?]
        ... 1 more
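Note that the deepest cause in this report differs from the earlier ones: instead of a missing TABLE_MAP, the row deserializer hits java.io.EOFException inside deserializeVarString, i.e. the event header promised more bytes (dataLength=2201) than were actually available when the buffer was read, which typically indicates a truncated or interrupted network read of the binlog packet. A toy sketch of that failure mode (simplified one-byte length prefix, not the real wire format):

```python
import io

def read_exact(stream, n):
    """Read exactly n bytes or raise EOFError, like ByteArrayInputStream.fill."""
    data = stream.read(n)
    if len(data) < n:
        raise EOFError(f"needed {n} bytes, got {len(data)}")
    return data

def deserialize_var_string(stream):
    """Toy var-string cell: one length byte followed by that many payload bytes."""
    length = read_exact(stream, 1)[0]
    return read_exact(stream, length)

# Complete packet: the cell decodes fine.
assert deserialize_var_string(io.BytesIO(b"\x05hello")) == b"hello"

# Truncated packet (e.g. connection dropped mid-event): EOF mid-cell.
try:
    deserialize_var_string(io.BytesIO(b"\x05he"))
    raise AssertionError("expected EOFError on truncated input")
except EOFError:
    pass
```

That pattern matches the timeout diagnosis above: under heavy load or slow network, the binlog stream is cut mid-event and the fixed-length read runs off the end of the buffer.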

polarbearmx commented 2 years ago

We ran into a similar problem:

2022-11-14 19:05:24,287 ERROR io.debezium.connector.mysql.MySqlStreamingChangeEventSource [] - Error during binlog processing. Last offset stored = null, binlog reader near position = binlog.003157/462016893
2022-11-14 19:05:24,287 ERROR io.debezium.pipeline.ErrorHandler [] - Producer failure
io.debezium.DebeziumException: Failed to deserialize data of EventHeaderV4{timestamp=1668423620000, eventType=EXT_WRITE_ROWS, serverId=246, headerLength=19, dataLength=256, nextPosition=462017354, flags=0}
        at io.debezium.connector.mysql.MySqlStreamingChangeEventSource.wrap(MySqlStreamingChangeEventSource.java:1146) ~[flink-sql-connector-mysql-cdc-2.2.0.jar:2.2.0]
        at io.debezium.connector.mysql.MySqlStreamingChangeEventSource$ReaderThreadLifecycleListener.onCommunicationFailure(MySqlStreamingChangeEventSource.java:1185) [flink-sql-connector-mysql-cdc-2.2.0.jar:2.2.0]
        at com.github.shyiko.mysql.binlog.BinaryLogClient.listenForEventPackets(BinaryLogClient.java:973) [flink-sql-connector-mysql-cdc-2.2.0.jar:2.2.0]
        at com.github.shyiko.mysql.binlog.BinaryLogClient.connect(BinaryLogClient.java:606) [flink-sql-connector-mysql-cdc-2.2.0.jar:2.2.0]
        at com.github.shyiko.mysql.binlog.BinaryLogClient$7.run(BinaryLogClient.java:850) [flink-sql-connector-mysql-cdc-2.2.0.jar:2.2.0]
        at java.lang.Thread.run(Thread.java:748) [?:1.8.0_231]
Caused by: com.github.shyiko.mysql.binlog.event.deserialization.EventDataDeserializationException: Failed to deserialize data of EventHeaderV4{timestamp=1668423620000, eventType=EXT_WRITE_ROWS, serverId=246, headerLength=19, dataLength=256, nextPosition=462017354, flags=0}
        at com.github.shyiko.mysql.binlog.event.deserialization.EventDeserializer.deserializeEventData(EventDeserializer.java:309) ~[flink-sql-connector-mysql-cdc-2.2.0.jar:2.2.0]
        at com.github.shyiko.mysql.binlog.event.deserialization.EventDeserializer.nextEvent(EventDeserializer.java:232) ~[flink-sql-connector-mysql-cdc-2.2.0.jar:2.2.0]
        at io.debezium.connector.mysql.MySqlStreamingChangeEventSource$1.nextEvent(MySqlStreamingChangeEventSource.java:233) ~[flink-sql-connector-mysql-cdc-2.2.0.jar:2.2.0]
        at com.github.shyiko.mysql.binlog.BinaryLogClient.listenForEventPackets(BinaryLogClient.java:945) ~[flink-sql-connector-mysql-cdc-2.2.0.jar:2.2.0]
        ... 3 more
Caused by: java.io.EOFException
        at com.github.shyiko.mysql.binlog.io.ByteArrayInputStream.fill(ByteArrayInputStream.java:113) ~[flink-sql-connector-mysql-cdc-2.2.0.jar:2.2.0]
        at com.github.shyiko.mysql.binlog.io.ByteArrayInputStream.read(ByteArrayInputStream.java:104) ~[flink-sql-connector-mysql-cdc-2.2.0.jar:2.2.0]
        at io.debezium.connector.mysql.RowDeserializers.deserializeVarString(RowDeserializers.java:264) ~[flink-sql-connector-mysql-cdc-2.2.0.jar:2.2.0]
        at io.debezium.connector.mysql.RowDeserializers$WriteRowsDeserializer.deserializeVarString(RowDeserializers.java:192) ~[flink-sql-connector-mysql-cdc-2.2.0.jar:2.2.0]
        at com.github.shyiko.mysql.binlog.event.deserialization.AbstractRowsEventDataDeserializer.deserializeCell(AbstractRowsEventDataDeserializer.java:189) ~[flink-sql-connector-mysql-cdc-2.2.0.jar:2.2.0]
        at com.github.shyiko.mysql.binlog.event.deserialization.AbstractRowsEventDataDeserializer.deserializeRow(AbstractRowsEventDataDeserializer.java:143) ~[flink-sql-connector-mysql-cdc-2.2.0.jar:2.2.0]
        at com.github.shyiko.mysql.binlog.event.deserialization.WriteRowsEventDataDeserializer.deserializeRows(WriteRowsEventDataDeserializer.java:64) ~[flink-sql-connector-mysql-cdc-2.2.0.jar:2.2.0]
        at com.github.shyiko.mysql.binlog.event.deserialization.WriteRowsEventDataDeserializer.deserialize(WriteRowsEventDataDeserializer.java:56) ~[flink-sql-connector-mysql-cdc-2.2.0.jar:2.2.0]
        at com.github.shyiko.mysql.binlog.event.deserialization.WriteRowsEventDataDeserializer.deserialize(WriteRowsEventDataDeserializer.java:32) ~[flink-sql-connector-mysql-cdc-2.2.0.jar:2.2.0]
        at com.github.shyiko.mysql.binlog.event.deserialization.EventDeserializer.deserializeEventData(EventDeserializer.java:303) ~[flink-sql-connector-mysql-cdc-2.2.0.jar:2.2.0]
        at com.github.shyiko.mysql.binlog.event.deserialization.EventDeserializer.nextEvent(EventDeserializer.java:232) ~[flink-sql-connector-mysql-cdc-2.2.0.jar:2.2.0]
        at io.debezium.connector.mysql.MySqlStreamingChangeEventSource$1.nextEvent(MySqlStreamingChangeEventSource.java:233) ~[flink-sql-connector-mysql-cdc-2.2.0.jar:2.2.0]
        at com.github.shyiko.mysql.binlog.BinaryLogClient.listenForEventPackets(BinaryLogClient.java:945) ~[flink-sql-connector-mysql-cdc-2.2.0.jar:2.2.0]
        ... 3 more

ShaddockWong commented 2 years ago

We also hit this problem with Flink 1.14.5 + mysql-cdc 2.2.1:

java.lang.RuntimeException: One or more fetchers have encountered exception
    at org.apache.flink.connector.base.source.reader.fetcher.SplitFetcherManager.checkErrors(SplitFetcherManager.java:225) ~[jbs-realtime-task-1.0-SNAPSHOT.jar:?]
    at org.apache.flink.connector.base.source.reader.SourceReaderBase.getNextFetch(SourceReaderBase.java:169) ~[jbs-realtime-task-1.0-SNAPSHOT.jar:?]
    at org.apache.flink.connector.base.source.reader.SourceReaderBase.pollNext(SourceReaderBase.java:130) ~[jbs-realtime-task-1.0-SNAPSHOT.jar:?]
    at org.apache.flink.streaming.api.operators.SourceOperator.emitNext(SourceOperator.java:354) ~[jbs-realtime-task-1.0-SNAPSHOT.jar:?]
    at org.apache.flink.streaming.runtime.io.StreamTaskSourceInput.emitNext(StreamTaskSourceInput.java:68) ~[jbs-realtime-task-1.0-SNAPSHOT.jar:?]
    at org.apache.flink.streaming.runtime.io.StreamOneInputProcessor.processInput(StreamOneInputProcessor.java:65) ~[jbs-realtime-task-1.0-SNAPSHOT.jar:?]
    at org.apache.flink.streaming.runtime.tasks.StreamTask.processInput(StreamTask.java:496) ~[jbs-realtime-task-1.0-SNAPSHOT.jar:?]
    at org.apache.flink.streaming.runtime.tasks.mailbox.MailboxProcessor.runMailboxLoop(MailboxProcessor.java:203) ~[jbs-realtime-task-1.0-SNAPSHOT.jar:?]
    at org.apache.flink.streaming.runtime.tasks.StreamTask.runMailboxLoop(StreamTask.java:809) ~[jbs-realtime-task-1.0-SNAPSHOT.jar:?]
    at org.apache.flink.streaming.runtime.tasks.StreamTask.invoke(StreamTask.java:761) ~[jbs-realtime-task-1.0-SNAPSHOT.jar:?]
    at org.apache.flink.runtime.taskmanager.Task.runWithSystemExitMonitoring(Task.java:958) ~[jbs-realtime-task-1.0-SNAPSHOT.jar:?]
    at org.apache.flink.runtime.taskmanager.Task.restoreAndInvoke(Task.java:937) ~[jbs-realtime-task-1.0-SNAPSHOT.jar:?]
    at org.apache.flink.runtime.taskmanager.Task.doRun(Task.java:766) ~[jbs-realtime-task-1.0-SNAPSHOT.jar:?]
    at org.apache.flink.runtime.taskmanager.Task.run(Task.java:575) ~[jbs-realtime-task-1.0-SNAPSHOT.jar:?]
    at java.lang.Thread.run(Thread.java:748) ~[?:1.8.0_181]
Caused by: java.lang.RuntimeException: SplitFetcher thread 0 received unexpected exception while polling the records
    at org.apache.flink.connector.base.source.reader.fetcher.SplitFetcher.runOnce(SplitFetcher.java:150) ~[jbs-realtime-task-1.0-SNAPSHOT.jar:?]
    at org.apache.flink.connector.base.source.reader.fetcher.SplitFetcher.run(SplitFetcher.java:105) ~[jbs-realtime-task-1.0-SNAPSHOT.jar:?]
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) ~[?:1.8.0_181]
    at java.util.concurrent.FutureTask.run(FutureTask.java:266) ~[?:1.8.0_181]
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) ~[?:1.8.0_181]
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) ~[?:1.8.0_181]
    ... 1 more
Caused by: com.ververica.cdc.connectors.shaded.org.apache.kafka.connect.errors.ConnectException: An exception occurred in the change event producer. This connector will be stopped.
    at io.debezium.pipeline.ErrorHandler.setProducerThrowable(ErrorHandler.java:42) ~[jbs-realtime-task-1.0-SNAPSHOT.jar:?]
    at io.debezium.connector.mysql.MySqlStreamingChangeEventSource$ReaderThreadLifecycleListener.onCommunicationFailure(MySqlStreamingChangeEventSource.java:1185) ~[jbs-realtime-task-1.0-SNAPSHOT.jar:?]
    at com.github.shyiko.mysql.binlog.BinaryLogClient.listenForEventPackets(BinaryLogClient.java:973) ~[jbs-realtime-task-1.0-SNAPSHOT.jar:?]
    at com.github.shyiko.mysql.binlog.BinaryLogClient.connect(BinaryLogClient.java:606) ~[jbs-realtime-task-1.0-SNAPSHOT.jar:?]
    at com.github.shyiko.mysql.binlog.BinaryLogClient$7.run(BinaryLogClient.java:850) ~[jbs-realtime-task-1.0-SNAPSHOT.jar:?]
    ... 1 more
Caused by: io.debezium.DebeziumException: Failed to deserialize data of EventHeaderV4{timestamp=1668586474000, eventType=WRITE_ROWS, serverId=1714314141, headerLength=19, dataLength=8007, nextPosition=441745462, flags=0}
    at io.debezium.connector.mysql.MySqlStreamingChangeEventSource.wrap(MySqlStreamingChangeEventSource.java:1146) ~[jbs-realtime-task-1.0-SNAPSHOT.jar:?]
    at io.debezium.connector.mysql.MySqlStreamingChangeEventSource$ReaderThreadLifecycleListener.onCommunicationFailure(MySqlStreamingChangeEventSource.java:1185) ~[jbs-realtime-task-1.0-SNAPSHOT.jar:?]
    at com.github.shyiko.mysql.binlog.BinaryLogClient.listenForEventPackets(BinaryLogClient.java:973) ~[jbs-realtime-task-1.0-SNAPSHOT.jar:?]
    at com.github.shyiko.mysql.binlog.BinaryLogClient.connect(BinaryLogClient.java:606) ~[jbs-realtime-task-1.0-SNAPSHOT.jar:?]
    at com.github.shyiko.mysql.binlog.BinaryLogClient$7.run(BinaryLogClient.java:850) ~[jbs-realtime-task-1.0-SNAPSHOT.jar:?]
    ... 1 more
Caused by: com.github.shyiko.mysql.binlog.event.deserialization.EventDataDeserializationException: Failed to deserialize data of EventHeaderV4{timestamp=1668586474000, eventType=WRITE_ROWS, serverId=1714314141, headerLength=19, dataLength=8007, nextPosition=441745462, flags=0}
    at com.github.shyiko.mysql.binlog.event.deserialization.EventDeserializer.deserializeEventData(EventDeserializer.java:309) ~[jbs-realtime-task-1.0-SNAPSHOT.jar:?]
    at com.github.shyiko.mysql.binlog.event.deserialization.EventDeserializer.nextEvent(EventDeserializer.java:232) ~[jbs-realtime-task-1.0-SNAPSHOT.jar:?]
    at io.debezium.connector.mysql.MySqlStreamingChangeEventSource$1.nextEvent(MySqlStreamingChangeEventSource.java:233) ~[jbs-realtime-task-1.0-SNAPSHOT.jar:?]
    at com.github.shyiko.mysql.binlog.BinaryLogClient.listenForEventPackets(BinaryLogClient.java:945) ~[jbs-realtime-task-1.0-SNAPSHOT.jar:?]
    at com.github.shyiko.mysql.binlog.BinaryLogClient.connect(BinaryLogClient.java:606) ~[jbs-realtime-task-1.0-SNAPSHOT.jar:?]
    at com.github.shyiko.mysql.binlog.BinaryLogClient$7.run(BinaryLogClient.java:850) ~[jbs-realtime-task-1.0-SNAPSHOT.jar:?]
    ... 1 more
Caused by: java.io.EOFException
    at com.github.shyiko.mysql.binlog.io.ByteArrayInputStream.fill(ByteArrayInputStream.java:113) ~[jbs-realtime-task-1.0-SNAPSHOT.jar:?]
    at com.github.shyiko.mysql.binlog.io.ByteArrayInputStream.read(ByteArrayInputStream.java:104) ~[jbs-realtime-task-1.0-SNAPSHOT.jar:?]
    at io.debezium.connector.mysql.RowDeserializers.deserializeVarString(RowDeserializers.java:264) ~[jbs-realtime-task-1.0-SNAPSHOT.jar:?]
    at io.debezium.connector.mysql.RowDeserializers$WriteRowsDeserializer.deserializeVarString(RowDeserializers.java:192) ~[jbs-realtime-task-1.0-SNAPSHOT.jar:?]
    at com.github.shyiko.mysql.binlog.event.deserialization.AbstractRowsEventDataDeserializer.deserializeCell(AbstractRowsEventDataDeserializer.java:189) ~[jbs-realtime-task-1.0-SNAPSHOT.jar:?]
    at com.github.shyiko.mysql.binlog.event.deserialization.AbstractRowsEventDataDeserializer.deserializeRow(AbstractRowsEventDataDeserializer.java:143) ~[jbs-realtime-task-1.0-SNAPSHOT.jar:?]
    at com.github.shyiko.mysql.binlog.event.deserialization.WriteRowsEventDataDeserializer.deserializeRows(WriteRowsEventDataDeserializer.java:64) ~[jbs-realtime-task-1.0-SNAPSHOT.jar:?]
    at com.github.shyiko.mysql.binlog.event.deserialization.WriteRowsEventDataDeserializer.deserialize(WriteRowsEventDataDeserializer.java:56) ~[jbs-realtime-task-1.0-SNAPSHOT.jar:?]
    at com.github.shyiko.mysql.binlog.event.deserialization.WriteRowsEventDataDeserializer.deserialize(WriteRowsEventDataDeserializer.java:32) ~[jbs-realtime-task-1.0-SNAPSHOT.jar:?]
    at com.github.shyiko.mysql.binlog.event.deserialization.EventDeserializer.deserializeEventData(EventDeserializer.java:303) ~[jbs-realtime-task-1.0-SNAPSHOT.jar:?]
    at com.github.shyiko.mysql.binlog.event.deserialization.EventDeserializer.nextEvent(EventDeserializer.java:232) ~[jbs-realtime-task-1.0-SNAPSHOT.jar:?]
    at io.debezium.connector.mysql.MySqlStreamingChangeEventSource$1.nextEvent(MySqlStreamingChangeEventSource.java:233) ~[jbs-realtime-task-1.0-SNAPSHOT.jar:?]
    at com.github.shyiko.mysql.binlog.BinaryLogClient.listenForEventPackets(BinaryLogClient.java:945) ~[jbs-realtime-task-1.0-SNAPSHOT.jar:?]
    at com.github.shyiko.mysql.binlog.BinaryLogClient.connect(BinaryLogClient.java:606) ~[jbs-realtime-task-1.0-SNAPSHOT.jar:?]
    at com.github.shyiko.mysql.binlog.BinaryLogClient$7.run(BinaryLogClient.java:850) ~[jbs-realtime-task-1.0-SNAPSHOT.jar:?]
    ... 1 more
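
A note on the trace above: the `EOFException` inside `deserializeVarString` means the binlog reader ran out of bytes in the middle of a row. In practice this usually points at a binlog file that was purged or expired under the running job, a table schema that changed out from under the connector, or reading that started mid way through a logical event group (the `MissingTableMapEventException` in the original report). Assuming you have a MySQL client with sufficient privileges, a few hedged server-side checks can narrow it down (these are diagnostics, not a fix):

```sql
-- Which binlog files still exist? If the file referenced by the failing
-- offset is gone, the job resumed from a checkpoint older than the
-- server's binlog retention window.
SHOW BINARY LOGS;

-- Retention window (MySQL 5.7 name; on MySQL 8.0 check
-- binlog_expire_logs_seconds instead).
SHOW VARIABLES LIKE 'expire_logs_days';

-- CDC expects full row images; a MINIMAL setting can produce row events
-- the connector cannot fully deserialize.
SHOW VARIABLES LIKE 'binlog_row_image';
```

If the binlog file behind the checkpointed offset has already been purged, the usual recovery is to start the job with a fresh snapshot rather than resume from the stale state.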
Mindfaker commented 1 year ago

Also hitting this problem with Flink 1.14.3 + oracle-cdc 2.2.1:

org.apache.flink.runtime.JobException: Recovery is suppressed by FixedDelayRestartBackoffTimeStrategy(maxNumberRestartAttempts=3, backoffTimeMS=30000)
    at org.apache.flink.runtime.executiongraph.failover.flip1.ExecutionFailureHandler.handleFailure(ExecutionFailureHandler.java:138) ~[flink-dist_2.12-1.14.3.jar:1.14.3]
    at org.apache.flink.runtime.executiongraph.failover.flip1.ExecutionFailureHandler.getFailureHandlingResult(ExecutionFailureHandler.java:82) ~[flink-dist_2.12-1.14.3.jar:1.14.3]
    at org.apache.flink.runtime.scheduler.DefaultScheduler.handleTaskFailure(DefaultScheduler.java:252) ~[flink-dist_2.12-1.14.3.jar:1.14.3]
    at org.apache.flink.runtime.scheduler.DefaultScheduler.maybeHandleTaskFailure(DefaultScheduler.java:242) ~[flink-dist_2.12-1.14.3.jar:1.14.3]
    at org.apache.flink.runtime.scheduler.DefaultScheduler.updateTaskExecutionStateInternal(DefaultScheduler.java:233) ~[flink-dist_2.12-1.14.3.jar:1.14.3]
    at org.apache.flink.runtime.scheduler.SchedulerBase.updateTaskExecutionState(SchedulerBase.java:684) ~[flink-dist_2.12-1.14.3.jar:1.14.3]
    at org.apache.flink.runtime.scheduler.SchedulerNG.updateTaskExecutionState(SchedulerNG.java:79) ~[flink-dist_2.12-1.14.3.jar:1.14.3]
    at org.apache.flink.runtime.jobmaster.JobMaster.updateTaskExecutionState(JobMaster.java:444) ~[flink-dist_2.12-1.14.3.jar:1.14.3]
    at sun.reflect.GeneratedMethodAccessor35.invoke(Unknown Source) ~[?:?]
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_202]
    at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_202]
    at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.lambda$handleRpcInvocation$1(AkkaRpcActor.java:316) ~[flink-rpc-akka_6c997f87-23fb-40a3-abdb-956b62cb8d21.jar:1.14.3]
    at org.apache.flink.runtime.concurrent.akka.ClassLoadingUtils.runWithContextClassLoader(ClassLoadingUtils.java:83) ~[flink-rpc-akka_6c997f87-23fb-40a3-abdb-956b62cb8d21.jar:1.14.3]
    at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRpcInvocation(AkkaRpcActor.java:314) ~[flink-rpc-akka_6c997f87-23fb-40a3-abdb-956b62cb8d21.jar:1.14.3]
    at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRpcMessage(AkkaRpcActor.java:217) ~[flink-rpc-akka_6c997f87-23fb-40a3-abdb-956b62cb8d21.jar:1.14.3]
    at org.apache.flink.runtime.rpc.akka.FencedAkkaRpcActor.handleRpcMessage(FencedAkkaRpcActor.java:78) ~[flink-rpc-akka_6c997f87-23fb-40a3-abdb-956b62cb8d21.jar:1.14.3]
    at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleMessage(AkkaRpcActor.java:163) ~[flink-rpc-akka_6c997f87-23fb-40a3-abdb-956b62cb8d21.jar:1.14.3]
    at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:24) [flink-rpc-akka_6c997f87-23fb-40a3-abdb-956b62cb8d21.jar:1.14.3]
    at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:20) [flink-rpc-akka_6c997f87-23fb-40a3-abdb-956b62cb8d21.jar:1.14.3]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:123) [flink-rpc-akka_6c997f87-23fb-40a3-abdb-956b62cb8d21.jar:1.14.3]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:122) [flink-rpc-akka_6c997f87-23fb-40a3-abdb-956b62cb8d21.jar:1.14.3]
    at akka.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:20) [flink-rpc-akka_6c997f87-23fb-40a3-abdb-956b62cb8d21.jar:1.14.3]
    at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:171) [flink-rpc-akka_6c997f87-23fb-40a3-abdb-956b62cb8d21.jar:1.14.3]
    at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:172) [flink-rpc-akka_6c997f87-23fb-40a3-abdb-956b62cb8d21.jar:1.14.3]
    at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:172) [flink-rpc-akka_6c997f87-23fb-40a3-abdb-956b62cb8d21.jar:1.14.3]
    at akka.actor.Actor.aroundReceive(Actor.scala:537) [flink-rpc-akka_6c997f87-23fb-40a3-abdb-956b62cb8d21.jar:1.14.3]
    at akka.actor.Actor.aroundReceive$(Actor.scala:535) [flink-rpc-akka_6c997f87-23fb-40a3-abdb-956b62cb8d21.jar:1.14.3]
    at akka.actor.AbstractActor.aroundReceive(AbstractActor.scala:220) [flink-rpc-akka_6c997f87-23fb-40a3-abdb-956b62cb8d21.jar:1.14.3]
    at akka.actor.ActorCell.receiveMessage(ActorCell.scala:580) [flink-rpc-akka_6c997f87-23fb-40a3-abdb-956b62cb8d21.jar:1.14.3]
    at akka.actor.ActorCell.invoke(ActorCell.scala:548) [flink-rpc-akka_6c997f87-23fb-40a3-abdb-956b62cb8d21.jar:1.14.3]
    at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:270) [flink-rpc-akka_6c997f87-23fb-40a3-abdb-956b62cb8d21.jar:1.14.3]
    at akka.dispatch.Mailbox.run(Mailbox.scala:231) [flink-rpc-akka_6c997f87-23fb-40a3-abdb-956b62cb8d21.jar:1.14.3]
    at akka.dispatch.Mailbox.exec(Mailbox.scala:243) [flink-rpc-akka_6c997f87-23fb-40a3-abdb-956b62cb8d21.jar:1.14.3]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:289) [?:1.8.0_202]
    at java.util.concurrent.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1056) [?:1.8.0_202]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1692) [?:1.8.0_202]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:157) [?:1.8.0_202]
Caused by: com.ververica.cdc.connectors.shaded.org.apache.kafka.connect.errors.ConnectException: An exception occurred in the change event producer. This connector will be stopped.
    at io.debezium.pipeline.ErrorHandler.setProducerThrowable(ErrorHandler.java:42) ~[?:?]
    at io.debezium.connector.oracle.logminer.LogMinerStreamingChangeEventSource.execute(LogMinerStreamingChangeEventSource.java:208) ~[?:?]
    at io.debezium.pipeline.ChangeEventSourceCoordinator.streamEvents(ChangeEventSourceCoordinator.java:152) ~[?:?]
    at io.debezium.pipeline.ChangeEventSourceCoordinator.lambda$start$0(ChangeEventSourceCoordinator.java:119) ~[?:?]
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) ~[?:1.8.0_202]
    at java.util.concurrent.FutureTask.run(FutureTask.java:266) ~[?:1.8.0_202]
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) ~[?:1.8.0_202]
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) ~[?:1.8.0_202]
    at java.lang.Thread.run(Thread.java:748) ~[?:1.8.0_202]
Caused by: java.sql.SQLException: ORA-04030: out of process memory when trying to allocate 65568 bytes (Logminer LCR c,krvxrib:buffer)

at oracle.jdbc.driver.T4CTTIoer11.processError(T4CTTIoer11.java:509) ~[?:?]
at oracle.jdbc.driver.T4CTTIoer11.processError(T4CTTIoer11.java:461) ~[?:?]
at oracle.jdbc.driver.T4C8Oall.processError(T4C8Oall.java:1104) ~[?:?]
at oracle.jdbc.driver.T4CTTIfun.receive(T4CTTIfun.java:550) ~[?:?]
at oracle.jdbc.driver.T4CTTIfun.doRPC(T4CTTIfun.java:268) ~[?:?]
at oracle.jdbc.driver.T4C8Oall.doOALL(T4C8Oall.java:655) ~[?:?]
at oracle.jdbc.driver.T4CPreparedStatement.doOall8(T4CPreparedStatement.java:270) ~[?:?]
at oracle.jdbc.driver.T4CPreparedStatement.doOall8(T4CPreparedStatement.java:91) ~[?:?]
at oracle.jdbc.driver.T4CPreparedStatement.executeForRows(T4CPreparedStatement.java:970) ~[?:?]
at oracle.jdbc.driver.OracleStatement.executeMaybeDescribe(OracleStatement.java:1012) ~[?:?]
at oracle.jdbc.driver.OracleStatement.doExecuteWithTimeout(OracleStatement.java:1168) ~[?:?]
at oracle.jdbc.driver.OraclePreparedStatement.executeInternal(OraclePreparedStatement.java:3666) ~[?:?]
at oracle.jdbc.driver.T4CPreparedStatement.executeInternal(T4CPreparedStatement.java:1426) ~[?:?]
at oracle.jdbc.driver.OraclePreparedStatement.executeQuery(OraclePreparedStatement.java:3713) ~[?:?]
at oracle.jdbc.driver.OraclePreparedStatementWrapper.executeQuery(OraclePreparedStatementWrapper.java:1167) ~[?:?]
at io.debezium.connector.oracle.logminer.LogMinerStreamingChangeEventSource.execute(LogMinerStreamingChangeEventSource.java:184) ~[?:?]
at io.debezium.pipeline.ChangeEventSourceCoordinator.streamEvents(ChangeEventSourceCoordinator.java:152) ~[?:?]
at io.debezium.pipeline.ChangeEventSourceCoordinator.lambda$start$0(ChangeEventSourceCoordinator.java:119) ~[?:?]
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) ~[?:1.8.0_202]
at java.util.concurrent.FutureTask.run(FutureTask.java:266) ~[?:1.8.0_202]
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) ~[?:1.8.0_202]
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) ~[?:1.8.0_202]
at java.lang.Thread.run(Thread.java:748) ~[?:1.8.0_202]

Caused by: oracle.jdbc.OracleDatabaseException: ORA-04030: out of process memory when trying to allocate 65568 bytes (Logminer LCR c,krvxrib:buffer)

at oracle.jdbc.driver.T4CTTIoer11.processError(T4CTTIoer11.java:513) ~[?:?]
at oracle.jdbc.driver.T4CTTIoer11.processError(T4CTTIoer11.java:461) ~[?:?]
at oracle.jdbc.driver.T4C8Oall.processError(T4C8Oall.java:1104) ~[?:?]
at oracle.jdbc.driver.T4CTTIfun.receive(T4CTTIfun.java:550) ~[?:?]
at oracle.jdbc.driver.T4CTTIfun.doRPC(T4CTTIfun.java:268) ~[?:?]
at oracle.jdbc.driver.T4C8Oall.doOALL(T4C8Oall.java:655) ~[?:?]
at oracle.jdbc.driver.T4CPreparedStatement.doOall8(T4CPreparedStatement.java:270) ~[?:?]
at oracle.jdbc.driver.T4CPreparedStatement.doOall8(T4CPreparedStatement.java:91) ~[?:?]
at oracle.jdbc.driver.T4CPreparedStatement.executeForRows(T4CPreparedStatement.java:970) ~[?:?]
at oracle.jdbc.driver.OracleStatement.executeMaybeDescribe(OracleStatement.java:1012) ~[?:?]
at oracle.jdbc.driver.OracleStatement.doExecuteWithTimeout(OracleStatement.java:1168) ~[?:?]
at oracle.jdbc.driver.OraclePreparedStatement.executeInternal(OraclePreparedStatement.java:3666) ~[?:?]
at oracle.jdbc.driver.T4CPreparedStatement.executeInternal(T4CPreparedStatement.java:1426) ~[?:?]
at oracle.jdbc.driver.OraclePreparedStatement.executeQuery(OraclePreparedStatement.java:3713) ~[?:?]
at oracle.jdbc.driver.OraclePreparedStatementWrapper.executeQuery(OraclePreparedStatementWrapper.java:1167) ~[?:?]
at io.debezium.connector.oracle.logminer.LogMinerStreamingChangeEventSource.execute(LogMinerStreamingChangeEventSource.java:184) ~[?:?]
at io.debezium.pipeline.ChangeEventSourceCoordinator.streamEvents(ChangeEventSourceCoordinator.java:152) ~[?:?]
at io.debezium.pipeline.ChangeEventSourceCoordinator.lambda$start$0(ChangeEventSourceCoordinator.java:119) ~[?:?]
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) ~[?:1.8.0_202]
at java.util.concurrent.FutureTask.run(FutureTask.java:266) ~[?:1.8.0_202]
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) ~[?:1.8.0_202]
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) ~[?:1.8.0_202]
at java.lang.Thread.run(Thread.java:748) ~[?:1.8.0_202]

2022-11-29 15:27:31,702 INFO org.apache.flink.runtime.checkpoint.CheckpointCoordinator [] - Stopping checkpoint coordinator for job 286631e3bd04d5a53a0977138634fee5.
2022-11-29 15:27:31,705 INFO org.apache.flink.runtime.dispatcher.StandaloneDispatcher [] - Job 286631e3bd04d5a53a0977138634fee5 reached terminal state FAILED.
org.apache.flink.runtime.JobException: Recovery is suppressed by FixedDelayRestartBackoffTimeStrategy(maxNumberRestartAttempts=3, backoffTimeMS=30000)
    at org.apache.flink.runtime.executiongraph.failover.flip1.ExecutionFailureHandler.handleFailure(ExecutionFailureHandler.java:138)
    at org.apache.flink.runtime.executiongraph.failover.flip1.ExecutionFailureHandler.getFailureHandlingResult(ExecutionFailureHandler.java:82)
    at org.apache.flink.runtime.scheduler.DefaultScheduler.handleTaskFailure(DefaultScheduler.java:252)
    at org.apache.flink.runtime.scheduler.DefaultScheduler.maybeHandleTaskFailure(DefaultScheduler.java:242)
    at org.apache.flink.runtime.scheduler.DefaultScheduler.updateTaskExecutionStateInternal(DefaultScheduler.java:233)
    at org.apache.flink.runtime.scheduler.SchedulerBase.updateTaskExecutionState(SchedulerBase.java:684)
    at org.apache.flink.runtime.scheduler.SchedulerNG.updateTaskExecutionState(SchedulerNG.java:79)
    at org.apache.flink.runtime.jobmaster.JobMaster.updateTaskExecutionState(JobMaster.java:444)
    at sun.reflect.GeneratedMethodAccessor35.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.lambda$handleRpcInvocation$1(AkkaRpcActor.java:316)
    at org.apache.flink.runtime.concurrent.akka.ClassLoadingUtils.runWithContextClassLoader(ClassLoadingUtils.java:83)
    at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRpcInvocation(AkkaRpcActor.java:314)
    at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRpcMessage(AkkaRpcActor.java:217)
    at org.apache.flink.runtime.rpc.akka.FencedAkkaRpcActor.handleRpcMessage(FencedAkkaRpcActor.java:78)
    at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleMessage(AkkaRpcActor.java:163)
    at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:24)
    at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:20)
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:123)
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:122)
    at akka.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:20)
    at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:171)
    at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:172)
    at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:172)
    at akka.actor.Actor.aroundReceive(Actor.scala:537)
    at akka.actor.Actor.aroundReceive$(Actor.scala:535)
    at akka.actor.AbstractActor.aroundReceive(AbstractActor.scala:220)
    at akka.actor.ActorCell.receiveMessage(ActorCell.scala:580)
    at akka.actor.ActorCell.invoke(ActorCell.scala:548)
    at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:270)
    at akka.dispatch.Mailbox.run(Mailbox.scala:231)
    at akka.dispatch.Mailbox.exec(Mailbox.scala:243)
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:289)
    at java.util.concurrent.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1056)
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1692)
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:157)
Caused by: com.ververica.cdc.connectors.shaded.org.apache.kafka.connect.errors.ConnectException: An exception occurred in the change event producer. This connector will be stopped.
    at io.debezium.pipeline.ErrorHandler.setProducerThrowable(ErrorHandler.java:42)
    at io.debezium.connector.oracle.logminer.LogMinerStreamingChangeEventSource.execute(LogMinerStreamingChangeEventSource.java:208)
    at io.debezium.pipeline.ChangeEventSourceCoordinator.streamEvents(ChangeEventSourceCoordinator.java:152)
    at io.debezium.pipeline.ChangeEventSourceCoordinator.lambda$start$0(ChangeEventSourceCoordinator.java:119)
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    at java.lang.Thread.run(Thread.java:748)
Caused by: java.sql.SQLException: ORA-04030: out of process memory when trying to allocate 65568 bytes (Logminer LCR c,krvxrib:buffer)

at oracle.jdbc.driver.T4CTTIoer11.processError(T4CTTIoer11.java:509)
at oracle.jdbc.driver.T4CTTIoer11.processError(T4CTTIoer11.java:461)
at oracle.jdbc.driver.T4C8Oall.processError(T4C8Oall.java:1104)
at oracle.jdbc.driver.T4CTTIfun.receive(T4CTTIfun.java:550)
at oracle.jdbc.driver.T4CTTIfun.doRPC(T4CTTIfun.java:268)
at oracle.jdbc.driver.T4C8Oall.doOALL(T4C8Oall.java:655)
at oracle.jdbc.driver.T4CPreparedStatement.doOall8(T4CPreparedStatement.java:270)
at oracle.jdbc.driver.T4CPreparedStatement.doOall8(T4CPreparedStatement.java:91)
at oracle.jdbc.driver.T4CPreparedStatement.executeForRows(T4CPreparedStatement.java:970)
at oracle.jdbc.driver.OracleStatement.executeMaybeDescribe(OracleStatement.java:1012)
at oracle.jdbc.driver.OracleStatement.doExecuteWithTimeout(OracleStatement.java:1168)
at oracle.jdbc.driver.OraclePreparedStatement.executeInternal(OraclePreparedStatement.java:3666)
at oracle.jdbc.driver.T4CPreparedStatement.executeInternal(T4CPreparedStatement.java:1426)
at oracle.jdbc.driver.OraclePreparedStatement.executeQuery(OraclePreparedStatement.java:3713)
at oracle.jdbc.driver.OraclePreparedStatementWrapper.executeQuery(OraclePreparedStatementWrapper.java:1167)
at io.debezium.connector.oracle.logminer.LogMinerStreamingChangeEventSource.execute(LogMinerStreamingChangeEventSource.java:184)
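
One observation on the trace above: ORA-04030 is raised by the Oracle server process itself (LogMiner exhausted its process memory while buffering change records), not by the connector or the Flink job. A hedged sketch of the server-side knobs typically examined for this error follows; the 4G/8G values are placeholders, not recommendations, and must be sized to the actual host:

```sql
-- Inspect current PGA settings (privileged SQL*Plus session assumed).
SHOW PARAMETER pga_aggregate_target;
SHOW PARAMETER pga_aggregate_limit;

-- Example only: raise the PGA target/ceiling so LogMiner sessions have
-- more process memory to work with.
ALTER SYSTEM SET pga_aggregate_target = 4G SCOPE = BOTH;
ALTER SYSTEM SET pga_aggregate_limit  = 8G SCOPE = BOTH;
```

The complementary mitigation on the connector side is to shrink the LogMiner mining window so each session scans fewer redo records per batch.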
ShaddockWong commented 1 year ago

We aren't using Oracle, but as for that problem: the maintainers said it was fixed once before, yet I've seen many people run into it again. For now all we can conclude is that it is likely a bug in the CDC connector itself; we worked around it by restarting the CDC job.

org.apache.flink.runtime.JobException: Recovery is suppressed by FixedDelayRestartBackoffTimeStrategy(maxNumberRestartAttempts=3, backoffTimeMS=30000) at org.apache.flink.runtime.executiongraph.failover.flip1.ExecutionFailureHandler.handleFailure(ExecutionFailureHandler.java:138) at org.apache.flink.runtime.executiongraph.failover.flip1.ExecutionFailureHandler.getFailureHandlingResult(ExecutionFailureHandler.java:82) at org.apache.flink.runtime.scheduler.DefaultScheduler.handleTaskFailure(DefaultScheduler.java:252) at org.apache.flink.runtime.scheduler.DefaultScheduler.maybeHandleTaskFailure(DefaultScheduler.java:242) at org.apache.flink.runtime.scheduler.DefaultScheduler.updateTaskExecutionStateInternal(DefaultScheduler.java:233) at org.apache.flink.runtime.scheduler.SchedulerBase.updateTaskExecutionState(SchedulerBase.java:684) at org.apache.flink.runtime.scheduler.SchedulerNG.updateTaskExecutionState(SchedulerNG.java:79) at org.apache.flink.runtime.jobmaster.JobMaster.updateTaskExecutionState(JobMaster.java:444) at sun.reflect.GeneratedMethodAccessor35.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.lambda$handleRpcInvocation$1(AkkaRpcActor.java:316) at org.apache.flink.runtime.concurrent.akka.ClassLoadingUtils.runWithContextClassLoader(ClassLoadingUtils.java:83) at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRpcInvocation(AkkaRpcActor.java:314) at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRpcMessage(AkkaRpcActor.java:217) at org.apache.flink.runtime.rpc.akka.FencedAkkaRpcActor.handleRpcMessage(FencedAkkaRpcActor.java:78) at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleMessage(AkkaRpcActor.java:163) at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:24) at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:20) at 
scala.PartialFunction.applyOrElse(PartialFunction.scala:123) at scala.PartialFunction.applyOrElse$(PartialFunction.scala:122) at akka.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:20) at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:171) at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:172) at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:172) at akka.actor.Actor.aroundReceive(Actor.scala:537) at akka.actor.Actor.aroundReceive$(Actor.scala:535) at akka.actor.AbstractActor.aroundReceive(AbstractActor.scala:220) at akka.actor.ActorCell.receiveMessage(ActorCell.scala:580) at akka.actor.ActorCell.invoke(ActorCell.scala:548) at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:270) at akka.dispatch.Mailbox.run(Mailbox.scala:231) at akka.dispatch.Mailbox.exec(Mailbox.scala:243) at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:289) at java.util.concurrent.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1056) at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1692) at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:157) Caused by: com.ververica.cdc.connectors.shaded.org.apache.kafka.connect.errors.ConnectException: An exception occurred in the change event producer. This connector will be stopped. 
at io.debezium.pipeline.ErrorHandler.setProducerThrowable(ErrorHandler.java:42) at io.debezium.connector.oracle.logminer.LogMinerStreamingChangeEventSource.execute(LogMinerStreamingChangeEventSource.java:208) at io.debezium.pipeline.ChangeEventSourceCoordinator.streamEvents(ChangeEventSourceCoordinator.java:152) at io.debezium.pipeline.ChangeEventSourceCoordinator.lambda$start$0(ChangeEventSourceCoordinator.java:119) at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) at java.util.concurrent.FutureTask.run(FutureTask.java:266) at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) at java.lang.Thread.run(Thread.java:748) Caused by: java.sql.SQLException: ORA-04030: out of process memory when trying to allocate 65568 bytes (Logminer LCR c,krvxrib:buffer) at oracle.jdbc.driver.T4CTTIoer11.processError(T4CTTIoer11.java:509) at oracle.jdbc.driver.T4CTTIoer11.processError(T4CTTIoer11.java:461) at oracle.jdbc.driver.T4C8Oall.processError(T4C8Oall.java:1104) at oracle.jdbc.driver.T4CTTIfun.receive(T4CTTIfun.java:550) at oracle.jdbc.driver.T4CTTIfun.doRPC(T4CTTIfun.java:268) at oracle.jdbc.driver.T4C8Oall.doOALL(T4C8Oall.java:655) at oracle.jdbc.driver.T4CPreparedStatement.doOall8(T4CPreparedStatement.java:270) at oracle.jdbc.driver.T4CPreparedStatement.doOall8(T4CPreparedStatement.java:91) at oracle.jdbc.driver.T4CPreparedStatement.executeForRows(T4CPreparedStatement.java:970) at oracle.jdbc.driver.OracleStatement.executeMaybeDescribe(OracleStatement.java:1012) at oracle.jdbc.driver.OracleStatement.doExecuteWithTimeout(OracleStatement.java:1168) at oracle.jdbc.driver.OraclePreparedStatement.executeInternal(OraclePreparedStatement.java:3666) at oracle.jdbc.driver.T4CPreparedStatement.executeInternal(T4CPreparedStatement.java:1426) at oracle.jdbc.driver.OraclePreparedStatement.executeQuery(OraclePreparedStatement.java:3713) at 
oracle.jdbc.driver.OraclePreparedStatementWrapper.executeQuery(OraclePreparedStatementWrapper.java:1167) at io.debezium.connector.oracle.logminer.LogMinerStreamingChangeEventSource.execute(LogMinerStreamingChangeEventSource.java:184)
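The ORA-04030 in the trace above is raised on the database side: the LogMiner session runs out of process (PGA) memory while buffering LCRs. One commonly suggested mitigation is to shrink the LogMiner batch size so each mining session holds less redo in memory. A hedged sketch of the relevant Debezium Oracle connector properties (the values below are illustrative assumptions, not recommendations; tune them against your redo volume):

```properties
# Sketch only: smaller LogMiner batches reduce per-session memory pressure.
# Property names are from the Debezium Oracle connector; values are guesses.
log.mining.batch.size.min=1000
log.mining.batch.size.default=5000
log.mining.batch.size.max=20000
```

In Flink CDC these would be passed through as Debezium properties (e.g. via `debeziumProperties(...)` in the DataStream API). Increasing the Oracle instance's PGA limits is the other common lever.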

ldwnt commented 1 year ago

I get this error after upgrading to cdc 2.3 and using specificOffset for startupOptions:

            MySqlSourceBuilder<String> builder = MySqlSource.<String>builder()
                .hostname(datasource.getConnect().getHost())
                .port(Integer.parseInt(datasource.getConnect().getPort()))
                .username(datasource.getConnect().getUsername())
                .password(datasource.getConnect().getPassword())
                .databaseList(datasource.getDatabase())
                .tableList(tableRegex)
                .scanNewlyAddedTableEnabled(true)
//                .serverTimeZone("SERVER")
                .startupOptions(StartupOptions.specificOffset("binlog.000017", 499573306))
                .includeSchemaChanges(true)
                .deserializer(new ChangJsonDeserializationSchema(true))
                .serverId(serverId)
                .debeziumProperties(properties);

log as below:

2022-12-05 16:02:06,396 INFO  [blc-mysql-sit.deepq.tech:3306] io.debezium.connector.mysql.MySqlStreamingChangeEventSource [1185] - Connected to MySQL binlog at mysql-sit.deepq.tech:3306, starting at MySqlOffsetContext [sourceInfoSchema=Schema{io.debezium.connector.mysql.Source:STRUCT}, sourceInfo=SourceInfo [currentGtid=null, currentBinlogFilename=binlog.000017, currentBinlogPosition=499573306, currentRowNumber=0, serverId=0, sourceTime=null, threadId=-1, currentQuery=null, tableIds=[], databaseName=null], partition={server=mysql_binlog_source}, snapshotCompleted=false, transactionContext=TransactionContext [currentTransactionId=null, perTableEventCount={}, totalEventCount=0], restartGtidSet=null, currentGtidSet=null, restartBinlogFilename=binlog.000017, restartBinlogPosition=499573306, restartRowsToSkip=0, restartEventsToSkip=0, currentEventLengthInBytes=0, inTransaction=false, transactionId=null, incrementalSnapshotContext =IncrementalSnapshotContext [windowOpened=false, chunkEndPosition=null, dataCollectionsToSnapshot=[], lastEventKeySent=null, maximumKey=null]]
2022-12-05 16:02:06,399 INFO  [debezium-reader-0] io.debezium.connector.mysql.MySqlStreamingChangeEventSource [917] - Waiting for keepalive thread to start
2022-12-05 16:02:06,399 INFO  [blc-mysql-sit.deepq.tech:3306] io.debezium.util.Threads [287] - Creating thread debezium-mysqlconnector-mysql_binlog_source-binlog-client
2022-12-05 16:02:06,400 INFO  [debezium-reader-0] io.debezium.connector.mysql.MySqlStreamingChangeEventSource [924] - Keepalive thread is running
2022-12-05 16:02:06,405 ERROR [blc-mysql-sit.deepq.tech:3306] io.debezium.connector.mysql.MySqlStreamingChangeEventSource [1054] - Error during binlog processing. Last offset stored = null, binlog reader near position = binlog.000017/499573306
2022-12-05 16:02:06,430 ERROR [blc-mysql-sit.deepq.tech:3306] io.debezium.pipeline.ErrorHandler [31] - Producer failure
io.debezium.DebeziumException: Failed to deserialize data of EventHeaderV4{timestamp=1670226722000, eventType=EXT_UPDATE_ROWS, serverId=1, headerLength=19, dataLength=31, nextPosition=499573356, flags=0}
    at io.debezium.connector.mysql.MySqlStreamingChangeEventSource.wrap(MySqlStreamingChangeEventSource.java:1154) ~[debezium-connector-mysql-1.6.4.Final.jar:1.6.4.Final]
    at io.debezium.connector.mysql.MySqlStreamingChangeEventSource$ReaderThreadLifecycleListener.onEventDeserializationFailure(MySqlStreamingChangeEventSource.java:1207) [debezium-connector-mysql-1.6.4.Final.jar:1.6.4.Final]
    at com.github.shyiko.mysql.binlog.BinaryLogClient.listenForEventPackets(BinaryLogClient.java:958) [mysql-binlog-connector-java-0.25.1.jar:0.25.1]
    at com.github.shyiko.mysql.binlog.BinaryLogClient.connect(BinaryLogClient.java:606) [mysql-binlog-connector-java-0.25.1.jar:0.25.1]
    at com.github.shyiko.mysql.binlog.BinaryLogClient$7.run(BinaryLogClient.java:850) [mysql-binlog-connector-java-0.25.1.jar:0.25.1]
    at java.lang.Thread.run(Thread.java:748) [?:1.8.0_312]
Caused by: com.github.shyiko.mysql.binlog.event.deserialization.EventDataDeserializationException: Failed to deserialize data of EventHeaderV4{timestamp=1670226722000, eventType=EXT_UPDATE_ROWS, serverId=1, headerLength=19, dataLength=31, nextPosition=499573356, flags=0}
    at com.github.shyiko.mysql.binlog.event.deserialization.EventDeserializer.deserializeEventData(EventDeserializer.java:309) ~[mysql-binlog-connector-java-0.25.1.jar:0.25.1]
    at com.github.shyiko.mysql.binlog.event.deserialization.EventDeserializer.nextEvent(EventDeserializer.java:232) ~[mysql-binlog-connector-java-0.25.1.jar:0.25.1]
    at io.debezium.connector.mysql.MySqlStreamingChangeEventSource$1.nextEvent(MySqlStreamingChangeEventSource.java:230) ~[debezium-connector-mysql-1.6.4.Final.jar:1.6.4.Final]
    at com.github.shyiko.mysql.binlog.BinaryLogClient.listenForEventPackets(BinaryLogClient.java:945) ~[mysql-binlog-connector-java-0.25.1.jar:0.25.1]
    ... 3 more
Caused by: com.github.shyiko.mysql.binlog.event.deserialization.MissingTableMapEventException: No TableMapEventData has been found for table id:905. Usually that means that you have started reading binary log 'within the logical event group' (e.g. from WRITE_ROWS and not proceeding TABLE_MAP
    at com.github.shyiko.mysql.binlog.event.deserialization.AbstractRowsEventDataDeserializer.deserializeRow(AbstractRowsEventDataDeserializer.java:109) ~[mysql-binlog-connector-java-0.25.1.jar:0.25.1]
    at com.github.shyiko.mysql.binlog.event.deserialization.UpdateRowsEventDataDeserializer.deserializeRows(UpdateRowsEventDataDeserializer.java:71) ~[mysql-binlog-connector-java-0.25.1.jar:0.25.1]
    at com.github.shyiko.mysql.binlog.event.deserialization.UpdateRowsEventDataDeserializer.deserialize(UpdateRowsEventDataDeserializer.java:58) ~[mysql-binlog-connector-java-0.25.1.jar:0.25.1]
    at com.github.shyiko.mysql.binlog.event.deserialization.UpdateRowsEventDataDeserializer.deserialize(UpdateRowsEventDataDeserializer.java:33) ~[mysql-binlog-connector-java-0.25.1.jar:0.25.1]
    at com.github.shyiko.mysql.binlog.event.deserialization.EventDeserializer.deserializeEventData(EventDeserializer.java:303) ~[mysql-binlog-connector-java-0.25.1.jar:0.25.1]
    at com.github.shyiko.mysql.binlog.event.deserialization.EventDeserializer.nextEvent(EventDeserializer.java:232) ~[mysql-binlog-connector-java-0.25.1.jar:0.25.1]
    at io.debezium.connector.mysql.MySqlStreamingChangeEventSource$1.nextEvent(MySqlStreamingChangeEventSource.java:230) ~[debezium-connector-mysql-1.6.4.Final.jar:1.6.4.Final]
    at com.github.shyiko.mysql.binlog.BinaryLogClient.listenForEventPackets(BinaryLogClient.java:945) ~[mysql-binlog-connector-java-0.25.1.jar:0.25.1]
    ... 3 more
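For context, the MissingTableMapEventException in the trace above is structural: in the binlog, each WRITE_ROWS/UPDATE_ROWS event is preceded by a TABLE_MAP event that maps the numeric table id to a schema, and the two form one logical event group. A specificOffset that points into the middle of such a group starts the reader after the TABLE_MAP, so the row event cannot be decoded. A toy illustration of that failure mode (not the real binlog protocol):

```python
# Toy model of binlog event groups: TABLE_MAP must precede row events.
# Illustration only -- this is not the actual binlog wire format.

class MissingTableMapEvent(Exception):
    pass

def read_events(events, start_index=0):
    """Decode row events from `events`, starting at an arbitrary offset."""
    table_map = {}   # table id -> column schema, rebuilt from TABLE_MAP events
    decoded = []
    for kind, table_id, payload in events[start_index:]:
        if kind == "TABLE_MAP":
            table_map[table_id] = payload
        elif kind in ("WRITE_ROWS", "UPDATE_ROWS"):
            if table_id not in table_map:
                raise MissingTableMapEvent(
                    f"No TableMapEventData has been found for table id:{table_id}")
            decoded.append((kind, table_map[table_id], payload))
    return decoded

binlog = [
    ("TABLE_MAP", 905, ["id", "name"]),
    ("UPDATE_ROWS", 905, {"id": 1, "name": "a"}),
]

# Starting at the group boundary decodes fine; starting inside the group fails,
# which is what an arbitrary specificOffset position can do.
ok = read_events(binlog, start_index=0)
try:
    read_events(binlog, start_index=1)
except MissingTableMapEvent as e:
    print(e)
```

This is why an offset taken from `SHOW BINARY LOGS` or a checkpoint (which always records group boundaries) is safe, while a hand-picked byte position may not be.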
ShaddockWong commented 1 year ago

We are not using Oracle, but as for that problem: the maintainers said it was fixed once before, yet I have seen many people run into it again. For now we can only conclude it is likely a bug in Flink CDC itself; we worked around it by restarting the CDC job.

------------------ 原始邮件 ------------------ 发件人: @.>; 发送时间: 2022年11月30日(星期三) 上午9:46 收件人: @.>; 抄送: "To @.>; @.>; 主题: Re: [ververica/flink-cdc-connectors] 请问出现这个错误(io.debezium.DebeziumException: Failed to deserialize data of EventHeaderV4{timestamp=1627541881000,)是什么原因? (#276)

flink 1.14.3+ oracle-cdc 2.2.1 也遇到这个问题

org.apache.flink.runtime.JobException: Recovery is suppressed by FixedDelayRestartBackoffTimeStrategy(maxNumberRestartAttempts=3, backoffTimeMS=30000) at org.apache.flink.runtime.executiongraph.failover.flip1.ExecutionFailureHandler.handleFailure(ExecutionFailureHandler.java:138) ~[flink-dist_2.12-1.14.3.jar:1.14.3] at org.apache.flink.runtime.executiongraph.failover.flip1.ExecutionFailureHandler.getFailureHandlingResult(ExecutionFailureHandler.java:82) ~[flink-dist_2.12-1.14.3.jar:1.14.3] at org.apache.flink.runtime.scheduler.DefaultScheduler.handleTaskFailure(DefaultScheduler.java:252) ~[flink-dist_2.12-1.14.3.jar:1.14.3] at org.apache.flink.runtime.scheduler.DefaultScheduler.maybeHandleTaskFailure(DefaultScheduler.java:242) ~[flink-dist_2.12-1.14.3.jar:1.14.3] at org.apache.flink.runtime.scheduler.DefaultScheduler.updateTaskExecutionStateInternal(DefaultScheduler.java:233) ~[flink-dist_2.12-1.14.3.jar:1.14.3] at org.apache.flink.runtime.scheduler.SchedulerBase.updateTaskExecutionState(SchedulerBase.java:684) ~[flink-dist_2.12-1.14.3.jar:1.14.3] at org.apache.flink.runtime.scheduler.SchedulerNG.updateTaskExecutionState(SchedulerNG.java:79) ~[flink-dist_2.12-1.14.3.jar:1.14.3] at org.apache.flink.runtime.jobmaster.JobMaster.updateTaskExecutionState(JobMaster.java:444) ~[flink-dist_2.12-1.14.3.jar:1.14.3] at sun.reflect.GeneratedMethodAccessor35.invoke(Unknown Source) ~[?:?] 
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_202] at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_202] at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.lambda$handleRpcInvocation$1(AkkaRpcActor.java:316) ~[flink-rpc-akka_6c997f87-23fb-40a3-abdb-956b62cb8d21.jar:1.14.3] at org.apache.flink.runtime.concurrent.akka.ClassLoadingUtils.runWithContextClassLoader(ClassLoadingUtils.java:83) ~[flink-rpc-akka_6c997f87-23fb-40a3-abdb-956b62cb8d21.jar:1.14.3] at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRpcInvocation(AkkaRpcActor.java:314) ~[flink-rpc-akka_6c997f87-23fb-40a3-abdb-956b62cb8d21.jar:1.14.3] at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRpcMessage(AkkaRpcActor.java:217) ~[flink-rpc-akka_6c997f87-23fb-40a3-abdb-956b62cb8d21.jar:1.14.3] at org.apache.flink.runtime.rpc.akka.FencedAkkaRpcActor.handleRpcMessage(FencedAkkaRpcActor.java:78) ~[flink-rpc-akka_6c997f87-23fb-40a3-abdb-956b62cb8d21.jar:1.14.3] at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleMessage(AkkaRpcActor.java:163) ~[flink-rpc-akka_6c997f87-23fb-40a3-abdb-956b62cb8d21.jar:1.14.3] at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:24) [flink-rpc-akka_6c997f87-23fb-40a3-abdb-956b62cb8d21.jar:1.14.3] at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:20) [flink-rpc-akka_6c997f87-23fb-40a3-abdb-956b62cb8d21.jar:1.14.3] at scala.PartialFunction.applyOrElse(PartialFunction.scala:123) [flink-rpc-akka_6c997f87-23fb-40a3-abdb-956b62cb8d21.jar:1.14.3] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:122) [flink-rpc-akka_6c997f87-23fb-40a3-abdb-956b62cb8d21.jar:1.14.3] at akka.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:20) [flink-rpc-akka_6c997f87-23fb-40a3-abdb-956b62cb8d21.jar:1.14.3] at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:171) [flink-rpc-akka_6c997f87-23fb-40a3-abdb-956b62cb8d21.jar:1.14.3] at 
scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:172) [flink-rpc-akka_6c997f87-23fb-40a3-abdb-956b62cb8d21.jar:1.14.3] at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:172) [flink-rpc-akka_6c997f87-23fb-40a3-abdb-956b62cb8d21.jar:1.14.3] at akka.actor.Actor.aroundReceive(Actor.scala:537) [flink-rpc-akka_6c997f87-23fb-40a3-abdb-956b62cb8d21.jar:1.14.3] at akka.actor.Actor.aroundReceive$(Actor.scala:535) [flink-rpc-akka_6c997f87-23fb-40a3-abdb-956b62cb8d21.jar:1.14.3] at akka.actor.AbstractActor.aroundReceive(AbstractActor.scala:220) [flink-rpc-akka_6c997f87-23fb-40a3-abdb-956b62cb8d21.jar:1.14.3] at akka.actor.ActorCell.receiveMessage(ActorCell.scala:580) [flink-rpc-akka_6c997f87-23fb-40a3-abdb-956b62cb8d21.jar:1.14.3] at akka.actor.ActorCell.invoke(ActorCell.scala:548) [flink-rpc-akka_6c997f87-23fb-40a3-abdb-956b62cb8d21.jar:1.14.3] at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:270) [flink-rpc-akka_6c997f87-23fb-40a3-abdb-956b62cb8d21.jar:1.14.3] at akka.dispatch.Mailbox.run(Mailbox.scala:231) [flink-rpc-akka_6c997f87-23fb-40a3-abdb-956b62cb8d21.jar:1.14.3] at akka.dispatch.Mailbox.exec(Mailbox.scala:243) [flink-rpc-akka_6c997f87-23fb-40a3-abdb-956b62cb8d21.jar:1.14.3] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:289) [?:1.8.0_202] at java.util.concurrent.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1056) [?:1.8.0_202] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1692) [?:1.8.0_202] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:157) [?:1.8.0_202] Caused by: com.ververica.cdc.connectors.shaded.org.apache.kafka.connect.errors.ConnectException: An exception occurred in the change event producer. This connector will be stopped. at io.debezium.pipeline.ErrorHandler.setProducerThrowable(ErrorHandler.java:42) ~[?:?] at io.debezium.connector.oracle.logminer.LogMinerStreamingChangeEventSource.execute(LogMinerStreamingChangeEventSource.java:208) ~[?:?] 
at io.debezium.pipeline.ChangeEventSourceCoordinator.streamEvents(ChangeEventSourceCoordinator.java:152) ~[?:?] at io.debezium.pipeline.ChangeEventSourceCoordinator.lambda$start$0(ChangeEventSourceCoordinator.java:119) ~[?:?] at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) ~[?:1.8.0_202] at java.util.concurrent.FutureTask.run(FutureTask.java:266) ~[?:1.8.0_202] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) ~[?:1.8.0_202] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) ~[?:1.8.0_202] at java.lang.Thread.run(Thread.java:748) ~[?:1.8.0_202] Caused by: java.sql.SQLException: ORA-04030: out of process memory when trying to allocate 65568 bytes (Logminer LCR c,krvxrib:buffer) at oracle.jdbc.driver.T4CTTIoer11.processError(T4CTTIoer11.java:509) ~[?:?] at oracle.jdbc.driver.T4CTTIoer11.processError(T4CTTIoer11.java:461) ~[?:?] at oracle.jdbc.driver.T4C8Oall.processError(T4C8Oall.java:1104) ~[?:?] at oracle.jdbc.driver.T4CTTIfun.receive(T4CTTIfun.java:550) ~[?:?] at oracle.jdbc.driver.T4CTTIfun.doRPC(T4CTTIfun.java:268) ~[?:?] at oracle.jdbc.driver.T4C8Oall.doOALL(T4C8Oall.java:655) ~[?:?] at oracle.jdbc.driver.T4CPreparedStatement.doOall8(T4CPreparedStatement.java:270) ~[?:?] at oracle.jdbc.driver.T4CPreparedStatement.doOall8(T4CPreparedStatement.java:91) ~[?:?] at oracle.jdbc.driver.T4CPreparedStatement.executeForRows(T4CPreparedStatement.java:970) ~[?:?] at oracle.jdbc.driver.OracleStatement.executeMaybeDescribe(OracleStatement.java:1012) ~[?:?] at oracle.jdbc.driver.OracleStatement.doExecuteWithTimeout(OracleStatement.java:1168) ~[?:?] at oracle.jdbc.driver.OraclePreparedStatement.executeInternal(OraclePreparedStatement.java:3666) ~[?:?] at oracle.jdbc.driver.T4CPreparedStatement.executeInternal(T4CPreparedStatement.java:1426) ~[?:?] at oracle.jdbc.driver.OraclePreparedStatement.executeQuery(OraclePreparedStatement.java:3713) ~[?:?] 
at oracle.jdbc.driver.OraclePreparedStatementWrapper.executeQuery(OraclePreparedStatementWrapper.java:1167) ~[?:?] at io.debezium.connector.oracle.logminer.LogMinerStreamingChangeEventSource.execute(LogMinerStreamingChangeEventSource.java:184) ~[?:?] at io.debezium.pipeline.ChangeEventSourceCoordinator.streamEvents(ChangeEventSourceCoordinator.java:152) ~[?:?] at io.debezium.pipeline.ChangeEventSourceCoordinator.lambda$start$0(ChangeEventSourceCoordinator.java:119) ~[?:?] at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) ~[?:1.8.0_202] at java.util.concurrent.FutureTask.run(FutureTask.java:266) ~[?:1.8.0_202] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) ~[?:1.8.0_202] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) ~[?:1.8.0_202] at java.lang.Thread.run(Thread.java:748) ~[?:1.8.0_202] Caused by: oracle.jdbc.OracleDatabaseException: ORA-04030: out of process memory when trying to allocate 65568 bytes (Logminer LCR c,krvxrib:buffer) at oracle.jdbc.driver.T4CTTIoer11.processError(T4CTTIoer11.java:513) ~[?:?] at oracle.jdbc.driver.T4CTTIoer11.processError(T4CTTIoer11.java:461) ~[?:?] at oracle.jdbc.driver.T4C8Oall.processError(T4C8Oall.java:1104) ~[?:?] at oracle.jdbc.driver.T4CTTIfun.receive(T4CTTIfun.java:550) ~[?:?] at oracle.jdbc.driver.T4CTTIfun.doRPC(T4CTTIfun.java:268) ~[?:?] at oracle.jdbc.driver.T4C8Oall.doOALL(T4C8Oall.java:655) ~[?:?] at oracle.jdbc.driver.T4CPreparedStatement.doOall8(T4CPreparedStatement.java:270) ~[?:?] at oracle.jdbc.driver.T4CPreparedStatement.doOall8(T4CPreparedStatement.java:91) ~[?:?] at oracle.jdbc.driver.T4CPreparedStatement.executeForRows(T4CPreparedStatement.java:970) ~[?:?] at oracle.jdbc.driver.OracleStatement.executeMaybeDescribe(OracleStatement.java:1012) ~[?:?] at oracle.jdbc.driver.OracleStatement.doExecuteWithTimeout(OracleStatement.java:1168) ~[?:?] 
at oracle.jdbc.driver.OraclePreparedStatement.executeInternal(OraclePreparedStatement.java:3666) ~[?:?] at oracle.jdbc.driver.T4CPreparedStatement.executeInternal(T4CPreparedStatement.java:1426) ~[?:?] at oracle.jdbc.driver.OraclePreparedStatement.executeQuery(OraclePreparedStatement.java:3713) ~[?:?] at oracle.jdbc.driver.OraclePreparedStatementWrapper.executeQuery(OraclePreparedStatementWrapper.java:1167) ~[?:?] at io.debezium.connector.oracle.logminer.LogMinerStreamingChangeEventSource.execute(LogMinerStreamingChangeEventSource.java:184) ~[?:?] at io.debezium.pipeline.ChangeEventSourceCoordinator.streamEvents(ChangeEventSourceCoordinator.java:152) ~[?:?] at io.debezium.pipeline.ChangeEventSourceCoordinator.lambda$start$0(ChangeEventSourceCoordinator.java:119) ~[?:?] at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) ~[?:1.8.0_202] at java.util.concurrent.FutureTask.run(FutureTask.java:266) ~[?:1.8.0_202] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) ~[?:1.8.0_202] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) ~[?:1.8.0_202] at java.lang.Thread.run(Thread.java:748) ~[?:1.8.0_202] 2022-11-29 15:27:31,702 INFO org.apache.flink.runtime.checkpoint.CheckpointCoordinator [] - Stopping checkpoint coordinator for job 286631e3bd04d5a53a0977138634fee5. 2022-11-29 15:27:31,705 INFO org.apache.flink.runtime.dispatcher.StandaloneDispatcher [] - Job 286631e3bd04d5a53a0977138634fee5 reached terminal state FAILED. 
org.apache.flink.runtime.JobException: Recovery is suppressed by FixedDelayRestartBackoffTimeStrategy(maxNumberRestartAttempts=3, backoffTimeMS=30000) at org.apache.flink.runtime.executiongraph.failover.flip1.ExecutionFailureHandler.handleFailure(ExecutionFailureHandler.java:138) at org.apache.flink.runtime.executiongraph.failover.flip1.ExecutionFailureHandler.getFailureHandlingResult(ExecutionFailureHandler.java:82) at org.apache.flink.runtime.scheduler.DefaultScheduler.handleTaskFailure(DefaultScheduler.java:252) at org.apache.flink.runtime.scheduler.DefaultScheduler.maybeHandleTaskFailure(DefaultScheduler.java:242) at org.apache.flink.runtime.scheduler.DefaultScheduler.updateTaskExecutionStateInternal(DefaultScheduler.java:233) at org.apache.flink.runtime.scheduler.SchedulerBase.updateTaskExecutionState(SchedulerBase.java:684) at org.apache.flink.runtime.scheduler.SchedulerNG.updateTaskExecutionState(SchedulerNG.java:79) at org.apache.flink.runtime.jobmaster.JobMaster.updateTaskExecutionState(JobMaster.java:444) at sun.reflect.GeneratedMethodAccessor35.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.lambda$handleRpcInvocation$1(AkkaRpcActor.java:316) at org.apache.flink.runtime.concurrent.akka.ClassLoadingUtils.runWithContextClassLoader(ClassLoadingUtils.java:83) at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRpcInvocation(AkkaRpcActor.java:314) at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRpcMessage(AkkaRpcActor.java:217) at org.apache.flink.runtime.rpc.akka.FencedAkkaRpcActor.handleRpcMessage(FencedAkkaRpcActor.java:78) at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleMessage(AkkaRpcActor.java:163) at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:24) at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:20) at 
scala.PartialFunction.applyOrElse(PartialFunction.scala:123)
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:122)
    at akka.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:20)
    at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:171)
    at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:172)
    at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:172)
    at akka.actor.Actor.aroundReceive(Actor.scala:537)
    at akka.actor.Actor.aroundReceive$(Actor.scala:535)
    at akka.actor.AbstractActor.aroundReceive(AbstractActor.scala:220)
    at akka.actor.ActorCell.receiveMessage(ActorCell.scala:580)
    at akka.actor.ActorCell.invoke(ActorCell.scala:548)
    at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:270)
    at akka.dispatch.Mailbox.run(Mailbox.scala:231)
    at akka.dispatch.Mailbox.exec(Mailbox.scala:243)
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:289)
    at java.util.concurrent.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1056)
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1692)
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:157)
Caused by: com.ververica.cdc.connectors.shaded.org.apache.kafka.connect.errors.ConnectException: An exception occurred in the change event producer. This connector will be stopped.
    at io.debezium.pipeline.ErrorHandler.setProducerThrowable(ErrorHandler.java:42)
    at io.debezium.connector.oracle.logminer.LogMinerStreamingChangeEventSource.execute(LogMinerStreamingChangeEventSource.java:208)
    at io.debezium.pipeline.ChangeEventSourceCoordinator.streamEvents(ChangeEventSourceCoordinator.java:152)
    at io.debezium.pipeline.ChangeEventSourceCoordinator.lambda$start$0(ChangeEventSourceCoordinator.java:119)
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    at java.lang.Thread.run(Thread.java:748)
Caused by: java.sql.SQLException: ORA-04030: out of process memory when trying to allocate 65568 bytes (Logminer LCR c,krvxrib:buffer)
    at oracle.jdbc.driver.T4CTTIoer11.processError(T4CTTIoer11.java:509)
    at oracle.jdbc.driver.T4CTTIoer11.processError(T4CTTIoer11.java:461)
    at oracle.jdbc.driver.T4C8Oall.processError(T4C8Oall.java:1104)
    at oracle.jdbc.driver.T4CTTIfun.receive(T4CTTIfun.java:550)
    at oracle.jdbc.driver.T4CTTIfun.doRPC(T4CTTIfun.java:268)
    at oracle.jdbc.driver.T4C8Oall.doOALL(T4C8Oall.java:655)
    at oracle.jdbc.driver.T4CPreparedStatement.doOall8(T4CPreparedStatement.java:270)
    at oracle.jdbc.driver.T4CPreparedStatement.doOall8(T4CPreparedStatement.java:91)
    at oracle.jdbc.driver.T4CPreparedStatement.executeForRows(T4CPreparedStatement.java:970)
    at oracle.jdbc.driver.OracleStatement.executeMaybeDescribe(OracleStatement.java:1012)
    at oracle.jdbc.driver.OracleStatement.doExecuteWithTimeout(OracleStatement.java:1168)
    at oracle.jdbc.driver.OraclePreparedStatement.executeInternal(OraclePreparedStatement.java:3666)
    at oracle.jdbc.driver.T4CPreparedStatement.executeInternal(T4CPreparedStatement.java:1426)
    at oracle.jdbc.driver.OraclePreparedStatement.executeQuery(OraclePreparedStatement.java:3713)
    at oracle.jdbc.driver.OraclePreparedStatementWrapper.executeQuery(OraclePreparedStatementWrapper.java:1167)
    at io.debezium.connector.oracle.logminer.LogMinerStreamingChangeEventSource.execute(LogMinerStreamingChangeEventSource.java:184)

anotherZmm commented 1 year ago

flink-cdc 2.0.0, Flink 1.13.0, MySQL 8.0

duanminhui commented 1 year ago

Adjusting the MySQL unique keys makes the data in Flink's snapshot chunks more uniform.
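A hedged sketch of the suggestion above: when the unique key's values are skewed, the incremental-snapshot splitter produces uneven chunks. Besides redesigning the key itself, the mysql-cdc Table connector exposes options for the chunk key and chunk size (option names from the connector docs; the column name `id` is a placeholder, and availability depends on your connector version). Shown here as a plain option map so the snippet runs without Flink on the classpath:

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class ChunkOptions {
    public static Map<String, String> options() {
        Map<String, String> opts = new LinkedHashMap<>();
        // Split chunks on a more uniformly distributed column than the default primary key.
        opts.put("scan.incremental.snapshot.chunk.key-column", "id");
        // Smaller chunks smooth out skew at the cost of more splits (8096 is the documented default).
        opts.put("scan.incremental.snapshot.chunk.size", "8096");
        return opts;
    }

    public static void main(String[] args) {
        System.out.println(options().get("scan.incremental.snapshot.chunk.size"));
    }
}
```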


I had the same problem on April 1 and April 11. A large amount of data was written during those time periods, and then the exception happened. Versions: Flink 1.14.6, flink-connector-mysql-cdc 2.3.0, MySQL 5.7.37.

exception on April 1:

java.lang.RuntimeException: One or more fetchers have encountered exception
    at org.apache.flink.connector.base.source.reader.fetcher.SplitFetcherManager.checkErrors(SplitFetcherManager.java:225) ~[klook-dp-data-integration-rds2kafka-1.0-SNAPSHOT-jar-with-dependencies.jar:?]
    at org.apache.flink.connector.base.source.reader.SourceReaderBase.getNextFetch(SourceReaderBase.java:169) ~[klook-dp-data-integration-rds2kafka-1.0-SNAPSHOT-jar-with-dependencies.jar:?]
    at org.apache.flink.connector.base.source.reader.SourceReaderBase.pollNext(SourceReaderBase.java:130) ~[klook-dp-data-integration-rds2kafka-1.0-SNAPSHOT-jar-with-dependencies.jar:?]
    at org.apache.flink.streaming.api.operators.SourceOperator.emitNext(SourceOperator.java:354) ~[flink-dist_2.12-1.14.6.jar:1.14.6]
    at org.apache.flink.streaming.runtime.io.StreamTaskSourceInput.emitNext(StreamTaskSourceInput.java:68) ~[flink-dist_2.12-1.14.6.jar:1.14.6]
    at org.apache.flink.streaming.runtime.io.StreamOneInputProcessor.processInput(StreamOneInputProcessor.java:65) ~[flink-dist_2.12-1.14.6.jar:1.14.6]
    at org.apache.flink.streaming.runtime.tasks.StreamTask.processInput(StreamTask.java:495) ~[flink-dist_2.12-1.14.6.jar:1.14.6]
    at org.apache.flink.streaming.runtime.tasks.mailbox.MailboxProcessor.runMailboxLoop(MailboxProcessor.java:203) ~[flink-dist_2.12-1.14.6.jar:1.14.6]
    at org.apache.flink.streaming.runtime.tasks.StreamTask.runMailboxLoop(StreamTask.java:806) ~[flink-dist_2.12-1.14.6.jar:1.14.6]
    at org.apache.flink.streaming.runtime.tasks.StreamTask.invoke(StreamTask.java:758) ~[flink-dist_2.12-1.14.6.jar:1.14.6]
    at org.apache.flink.runtime.taskmanager.Task.runWithSystemExitMonitoring(Task.java:958) ~[flink-dist_2.12-1.14.6.jar:1.14.6]
    at org.apache.flink.runtime.taskmanager.Task.restoreAndInvoke(Task.java:937) ~[flink-dist_2.12-1.14.6.jar:1.14.6]
    at org.apache.flink.runtime.taskmanager.Task.doRun(Task.java:766) ~[flink-dist_2.12-1.14.6.jar:1.14.6]
    at org.apache.flink.runtime.taskmanager.Task.run(Task.java:575) ~[flink-dist_2.12-1.14.6.jar:1.14.6]
    at java.lang.Thread.run(Thread.java:750) ~[?:1.8.0_352]
Caused by: java.lang.RuntimeException: SplitFetcher thread 0 received unexpected exception while polling the records
    at org.apache.flink.connector.base.source.reader.fetcher.SplitFetcher.runOnce(SplitFetcher.java:150) ~[klook-dp-data-integration-rds2kafka-1.0-SNAPSHOT-jar-with-dependencies.jar:?]
    at org.apache.flink.connector.base.source.reader.fetcher.SplitFetcher.run(SplitFetcher.java:105) ~[klook-dp-data-integration-rds2kafka-1.0-SNAPSHOT-jar-with-dependencies.jar:?]
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) ~[?:1.8.0_352]
    at java.util.concurrent.FutureTask.run(FutureTask.java:266) ~[?:1.8.0_352]
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) ~[?:1.8.0_352]
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) ~[?:1.8.0_352]
    ... 1 more
Caused by: org.apache.kafka.connect.errors.ConnectException: An exception occurred in the change event producer. This connector will be stopped.
    at io.debezium.pipeline.ErrorHandler.setProducerThrowable(ErrorHandler.java:42) ~[klook-dp-data-integration-rds2kafka-1.0-SNAPSHOT-jar-with-dependencies.jar:?]
    at com.ververica.cdc.connectors.mysql.debezium.task.context.MySqlErrorHandler.setProducerThrowable(MySqlErrorHandler.java:89) ~[klook-dp-data-integration-rds2kafka-1.0-SNAPSHOT-jar-with-dependencies.jar:?]
    at io.debezium.connector.mysql.MySqlStreamingChangeEventSource$ReaderThreadLifecycleListener.onCommunicationFailure(MySqlStreamingChangeEventSource.java:1472) ~[klook-dp-data-integration-rds2kafka-1.0-SNAPSHOT-jar-with-dependencies.jar:?]
    at com.github.shyiko.mysql.binlog.BinaryLogClient.listenForEventPackets(BinaryLogClient.java:980) ~[klook-dp-data-integration-rds2kafka-1.0-SNAPSHOT-jar-with-dependencies.jar:?]
    at com.github.shyiko.mysql.binlog.BinaryLogClient.connect(BinaryLogClient.java:599) ~[klook-dp-data-integration-rds2kafka-1.0-SNAPSHOT-jar-with-dependencies.jar:?]
    at com.github.shyiko.mysql.binlog.BinaryLogClient$7.run(BinaryLogClient.java:857) ~[klook-dp-data-integration-rds2kafka-1.0-SNAPSHOT-jar-with-dependencies.jar:?]
    ... 1 more
Caused by: io.debezium.DebeziumException: Failed to deserialize data of EventHeaderV4{timestamp=1680373836000, eventType=DELETE_ROWS, serverId=265994456, headerLength=19, dataLength=8101, nextPosition=139705197, flags=0}
    at io.debezium.connector.mysql.MySqlStreamingChangeEventSource.wrap(MySqlStreamingChangeEventSource.java:1416) ~[klook-dp-data-integration-rds2kafka-1.0-SNAPSHOT-jar-with-dependencies.jar:?]
    at io.debezium.connector.mysql.MySqlStreamingChangeEventSource$ReaderThreadLifecycleListener.onCommunicationFailure(MySqlStreamingChangeEventSource.java:1472) ~[klook-dp-data-integration-rds2kafka-1.0-SNAPSHOT-jar-with-dependencies.jar:?]
    at com.github.shyiko.mysql.binlog.BinaryLogClient.listenForEventPackets(BinaryLogClient.java:980) ~[klook-dp-data-integration-rds2kafka-1.0-SNAPSHOT-jar-with-dependencies.jar:?]
    at com.github.shyiko.mysql.binlog.BinaryLogClient.connect(BinaryLogClient.java:599) ~[klook-dp-data-integration-rds2kafka-1.0-SNAPSHOT-jar-with-dependencies.jar:?]
    at com.github.shyiko.mysql.binlog.BinaryLogClient$7.run(BinaryLogClient.java:857) ~[klook-dp-data-integration-rds2kafka-1.0-SNAPSHOT-jar-with-dependencies.jar:?]
    ... 1 more
Caused by: com.github.shyiko.mysql.binlog.event.deserialization.EventDataDeserializationException: Failed to deserialize data of EventHeaderV4{timestamp=1680373836000, eventType=DELETE_ROWS, serverId=265994456, headerLength=19, dataLength=8101, nextPosition=139705197, flags=0}
    at com.github.shyiko.mysql.binlog.event.deserialization.EventDeserializer.deserializeEventData(EventDeserializer.java:309) ~[klook-dp-data-integration-rds2kafka-1.0-SNAPSHOT-jar-with-dependencies.jar:?]
    at com.github.shyiko.mysql.binlog.event.deserialization.EventDeserializer.nextEvent(EventDeserializer.java:232) ~[klook-dp-data-integration-rds2kafka-1.0-SNAPSHOT-jar-with-dependencies.jar:?]
    at io.debezium.connector.mysql.MySqlStreamingChangeEventSource$1.nextEvent(MySqlStreamingChangeEventSource.java:257) ~[klook-dp-data-integration-rds2kafka-1.0-SNAPSHOT-jar-with-dependencies.jar:?]
    at com.github.shyiko.mysql.binlog.BinaryLogClient.listenForEventPackets(BinaryLogClient.java:952) ~[klook-dp-data-integration-rds2kafka-1.0-SNAPSHOT-jar-with-dependencies.jar:?]
    at com.github.shyiko.mysql.binlog.BinaryLogClient.connect(BinaryLogClient.java:599) ~[klook-dp-data-integration-rds2kafka-1.0-SNAPSHOT-jar-with-dependencies.jar:?]
    at com.github.shyiko.mysql.binlog.BinaryLogClient$7.run(BinaryLogClient.java:857) ~[klook-dp-data-integration-rds2kafka-1.0-SNAPSHOT-jar-with-dependencies.jar:?]
    ... 1 more
Caused by: java.io.EOFException
    at com.github.shyiko.mysql.binlog.io.ByteArrayInputStream.fill(ByteArrayInputStream.java:113) ~[klook-dp-data-integration-rds2kafka-1.0-SNAPSHOT-jar-with-dependencies.jar:?]
    at com.github.shyiko.mysql.binlog.io.ByteArrayInputStream.read(ByteArrayInputStream.java:104) ~[klook-dp-data-integration-rds2kafka-1.0-SNAPSHOT-jar-with-dependencies.jar:?]
    at com.github.shyiko.mysql.binlog.event.deserialization.AbstractRowsEventDataDeserializer.deserializeBlob(AbstractRowsEventDataDeserializer.java:403) ~[klook-dp-data-integration-rds2kafka-1.0-SNAPSHOT-jar-with-dependencies.jar:?]
    at com.github.shyiko.mysql.binlog.event.deserialization.AbstractRowsEventDataDeserializer.deserializeCell(AbstractRowsEventDataDeserializer.java:191) ~[klook-dp-data-integration-rds2kafka-1.0-SNAPSHOT-jar-with-dependencies.jar:?]
    at com.github.shyiko.mysql.binlog.event.deserialization.AbstractRowsEventDataDeserializer.deserializeRow(AbstractRowsEventDataDeserializer.java:143) ~[klook-dp-data-integration-rds2kafka-1.0-SNAPSHOT-jar-with-dependencies.jar:?]
    at com.github.shyiko.mysql.binlog.event.deserialization.DeleteRowsEventDataDeserializer.deserializeRows(DeleteRowsEventDataDeserializer.java:64) ~[klook-dp-data-integration-rds2kafka-1.0-SNAPSHOT-jar-with-dependencies.jar:?]
    at com.github.shyiko.mysql.binlog.event.deserialization.DeleteRowsEventDataDeserializer.deserialize(DeleteRowsEventDataDeserializer.java:56) ~[klook-dp-data-integration-rds2kafka-1.0-SNAPSHOT-jar-with-dependencies.jar:?]
    at com.github.shyiko.mysql.binlog.event.deserialization.DeleteRowsEventDataDeserializer.deserialize(DeleteRowsEventDataDeserializer.java:32) ~[klook-dp-data-integration-rds2kafka-1.0-SNAPSHOT-jar-with-dependencies.jar:?]
    at com.github.shyiko.mysql.binlog.event.deserialization.EventDeserializer.deserializeEventData(EventDeserializer.java:303) ~[klook-dp-data-integration-rds2kafka-1.0-SNAPSHOT-jar-with-dependencies.jar:?]
    at com.github.shyiko.mysql.binlog.event.deserialization.EventDeserializer.nextEvent(EventDeserializer.java:232) ~[klook-dp-data-integration-rds2kafka-1.0-SNAPSHOT-jar-with-dependencies.jar:?]
    at io.debezium.connector.mysql.MySqlStreamingChangeEventSource$1.nextEvent(MySqlStreamingChangeEventSource.java:257) ~[klook-dp-data-integration-rds2kafka-1.0-SNAPSHOT-jar-with-dependencies.jar:?]
    at com.github.shyiko.mysql.binlog.BinaryLogClient.listenForEventPackets(BinaryLogClient.java:952) ~[klook-dp-data-integration-rds2kafka-1.0-SNAPSHOT-jar-with-dependencies.jar:?]
    at com.github.shyiko.mysql.binlog.BinaryLogClient.connect(BinaryLogClient.java:599) ~[klook-dp-data-integration-rds2kafka-1.0-SNAPSHOT-jar-with-dependencies.jar:?]
    at com.github.shyiko.mysql.binlog.BinaryLogClient$7.run(BinaryLogClient.java:857) ~[klook-dp-data-integration-rds2kafka-1.0-SNAPSHOT-jar-with-dependencies.jar:?]
    ... 1 more

exception on April 10:

java.lang.RuntimeException: One or more fetchers have encountered exception
    at org.apache.flink.connector.base.source.reader.fetcher.SplitFetcherManager.checkErrors(SplitFetcherManager.java:225) ~[klook-dp-data-integration-rds2kafka-1.0-SNAPSHOT-jar-with-dependencies.jar:?]
    at org.apache.flink.connector.base.source.reader.SourceReaderBase.getNextFetch(SourceReaderBase.java:169) ~[klook-dp-data-integration-rds2kafka-1.0-SNAPSHOT-jar-with-dependencies.jar:?]
    at org.apache.flink.connector.base.source.reader.SourceReaderBase.pollNext(SourceReaderBase.java:130) ~[klook-dp-data-integration-rds2kafka-1.0-SNAPSHOT-jar-with-dependencies.jar:?]
    at org.apache.flink.connector.base.source.reader.SourceReaderBase.pollNext(SourceReaderBase.java:156) ~[klook-dp-data-integration-rds2kafka-1.0-SNAPSHOT-jar-with-dependencies.jar:?]
    at org.apache.flink.streaming.api.operators.SourceOperator.emitNext(SourceOperator.java:354) ~[flink-dist_2.12-1.14.6.jar:1.14.6]
    at org.apache.flink.streaming.runtime.io.StreamTaskSourceInput.emitNext(StreamTaskSourceInput.java:68) ~[flink-dist_2.12-1.14.6.jar:1.14.6]
    at org.apache.flink.streaming.runtime.io.StreamOneInputProcessor.processInput(StreamOneInputProcessor.java:65) ~[flink-dist_2.12-1.14.6.jar:1.14.6]
    at org.apache.flink.streaming.runtime.tasks.StreamTask.processInput(StreamTask.java:495) ~[flink-dist_2.12-1.14.6.jar:1.14.6]
    at org.apache.flink.streaming.runtime.tasks.mailbox.MailboxProcessor.runMailboxLoop(MailboxProcessor.java:203) ~[flink-dist_2.12-1.14.6.jar:1.14.6]
    at org.apache.flink.streaming.runtime.tasks.StreamTask.runMailboxLoop(StreamTask.java:806) ~[flink-dist_2.12-1.14.6.jar:1.14.6]
    at org.apache.flink.streaming.runtime.tasks.StreamTask.invoke(StreamTask.java:758) ~[flink-dist_2.12-1.14.6.jar:1.14.6]
    at org.apache.flink.runtime.taskmanager.Task.runWithSystemExitMonitoring(Task.java:958) ~[flink-dist_2.12-1.14.6.jar:1.14.6]
    at org.apache.flink.runtime.taskmanager.Task.restoreAndInvoke(Task.java:937) ~[flink-dist_2.12-1.14.6.jar:1.14.6]
    at org.apache.flink.runtime.taskmanager.Task.doRun(Task.java:766) ~[flink-dist_2.12-1.14.6.jar:1.14.6]
    at org.apache.flink.runtime.taskmanager.Task.run(Task.java:575) ~[flink-dist_2.12-1.14.6.jar:1.14.6]
    at java.lang.Thread.run(Thread.java:750) ~[?:1.8.0_352]
Caused by: java.lang.RuntimeException: SplitFetcher thread 0 received unexpected exception while polling the records
    at org.apache.flink.connector.base.source.reader.fetcher.SplitFetcher.runOnce(SplitFetcher.java:150) ~[klook-dp-data-integration-rds2kafka-1.0-SNAPSHOT-jar-with-dependencies.jar:?]
    at org.apache.flink.connector.base.source.reader.fetcher.SplitFetcher.run(SplitFetcher.java:105) ~[klook-dp-data-integration-rds2kafka-1.0-SNAPSHOT-jar-with-dependencies.jar:?]
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) ~[?:1.8.0_352]
    at java.util.concurrent.FutureTask.run(FutureTask.java:266) ~[?:1.8.0_352]
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) ~[?:1.8.0_352]
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) ~[?:1.8.0_352]
    ... 1 more
Caused by: org.apache.kafka.connect.errors.ConnectException: An exception occurred in the change event producer. This connector will be stopped.
    at io.debezium.pipeline.ErrorHandler.setProducerThrowable(ErrorHandler.java:42) ~[klook-dp-data-integration-rds2kafka-1.0-SNAPSHOT-jar-with-dependencies.jar:?]
    at com.ververica.cdc.connectors.mysql.debezium.task.context.MySqlErrorHandler.setProducerThrowable(MySqlErrorHandler.java:89) ~[klook-dp-data-integration-rds2kafka-1.0-SNAPSHOT-jar-with-dependencies.jar:?]
    at io.debezium.connector.mysql.MySqlStreamingChangeEventSource$ReaderThreadLifecycleListener.onCommunicationFailure(MySqlStreamingChangeEventSource.java:1472) ~[klook-dp-data-integration-rds2kafka-1.0-SNAPSHOT-jar-with-dependencies.jar:?]
    at com.github.shyiko.mysql.binlog.BinaryLogClient.listenForEventPackets(BinaryLogClient.java:980) ~[klook-dp-data-integration-rds2kafka-1.0-SNAPSHOT-jar-with-dependencies.jar:?]
    at com.github.shyiko.mysql.binlog.BinaryLogClient.connect(BinaryLogClient.java:599) ~[klook-dp-data-integration-rds2kafka-1.0-SNAPSHOT-jar-with-dependencies.jar:?]
    at com.github.shyiko.mysql.binlog.BinaryLogClient$7.run(BinaryLogClient.java:857) ~[klook-dp-data-integration-rds2kafka-1.0-SNAPSHOT-jar-with-dependencies.jar:?]
    ... 1 more
Caused by: io.debezium.DebeziumException: Failed to deserialize data of EventHeaderV4{timestamp=1681151437000, eventType=DELETE_ROWS, serverId=265994456, headerLength=19, dataLength=8124, nextPosition=156286535, flags=0}
    at io.debezium.connector.mysql.MySqlStreamingChangeEventSource.wrap(MySqlStreamingChangeEventSource.java:1416) ~[klook-dp-data-integration-rds2kafka-1.0-SNAPSHOT-jar-with-dependencies.jar:?]
    at io.debezium.connector.mysql.MySqlStreamingChangeEventSource$ReaderThreadLifecycleListener.onCommunicationFailure(MySqlStreamingChangeEventSource.java:1472) ~[klook-dp-data-integration-rds2kafka-1.0-SNAPSHOT-jar-with-dependencies.jar:?]
    at com.github.shyiko.mysql.binlog.BinaryLogClient.listenForEventPackets(BinaryLogClient.java:980) ~[klook-dp-data-integration-rds2kafka-1.0-SNAPSHOT-jar-with-dependencies.jar:?]
    at com.github.shyiko.mysql.binlog.BinaryLogClient.connect(BinaryLogClient.java:599) ~[klook-dp-data-integration-rds2kafka-1.0-SNAPSHOT-jar-with-dependencies.jar:?]
    at com.github.shyiko.mysql.binlog.BinaryLogClient$7.run(BinaryLogClient.java:857) ~[klook-dp-data-integration-rds2kafka-1.0-SNAPSHOT-jar-with-dependencies.jar:?]
    ... 1 more
Caused by: com.github.shyiko.mysql.binlog.event.deserialization.EventDataDeserializationException: Failed to deserialize data of EventHeaderV4{timestamp=1681151437000, eventType=DELETE_ROWS, serverId=265994456, headerLength=19, dataLength=8124, nextPosition=156286535, flags=0}
    at com.github.shyiko.mysql.binlog.event.deserialization.EventDeserializer.deserializeEventData(EventDeserializer.java:309) ~[klook-dp-data-integration-rds2kafka-1.0-SNAPSHOT-jar-with-dependencies.jar:?]
    at com.github.shyiko.mysql.binlog.event.deserialization.EventDeserializer.nextEvent(EventDeserializer.java:232) ~[klook-dp-data-integration-rds2kafka-1.0-SNAPSHOT-jar-with-dependencies.jar:?]
    at io.debezium.connector.mysql.MySqlStreamingChangeEventSource$1.nextEvent(MySqlStreamingChangeEventSource.java:257) ~[klook-dp-data-integration-rds2kafka-1.0-SNAPSHOT-jar-with-dependencies.jar:?]
    at com.github.shyiko.mysql.binlog.BinaryLogClient.listenForEventPackets(BinaryLogClient.java:952) ~[klook-dp-data-integration-rds2kafka-1.0-SNAPSHOT-jar-with-dependencies.jar:?]
    at com.github.shyiko.mysql.binlog.BinaryLogClient.connect(BinaryLogClient.java:599) ~[klook-dp-data-integration-rds2kafka-1.0-SNAPSHOT-jar-with-dependencies.jar:?]
    at com.github.shyiko.mysql.binlog.BinaryLogClient$7.run(BinaryLogClient.java:857) ~[klook-dp-data-integration-rds2kafka-1.0-SNAPSHOT-jar-with-dependencies.jar:?]
    ... 1 more
Caused by: java.io.EOFException
    at com.github.shyiko.mysql.binlog.io.ByteArrayInputStream.fill(ByteArrayInputStream.java:113) ~[klook-dp-data-integration-rds2kafka-1.0-SNAPSHOT-jar-with-dependencies.jar:?]
    at com.github.shyiko.mysql.binlog.io.ByteArrayInputStream.read(ByteArrayInputStream.java:104) ~[klook-dp-data-integration-rds2kafka-1.0-SNAPSHOT-jar-with-dependencies.jar:?]
    at com.github.shyiko.mysql.binlog.event.deserialization.AbstractRowsEventDataDeserializer.deserializeBlob(AbstractRowsEventDataDeserializer.java:403) ~[klook-dp-data-integration-rds2kafka-1.0-SNAPSHOT-jar-with-dependencies.jar:?]
    at com.github.shyiko.mysql.binlog.event.deserialization.AbstractRowsEventDataDeserializer.deserializeCell(AbstractRowsEventDataDeserializer.java:191) ~[klook-dp-data-integration-rds2kafka-1.0-SNAPSHOT-jar-with-dependencies.jar:?]
    at com.github.shyiko.mysql.binlog.event.deserialization.AbstractRowsEventDataDeserializer.deserializeRow(AbstractRowsEventDataDeserializer.java:143) ~[klook-dp-data-integration-rds2kafka-1.0-SNAPSHOT-jar-with-dependencies.jar:?]
    at com.github.shyiko.mysql.binlog.event.deserialization.DeleteRowsEventDataDeserializer.deserializeRows(DeleteRowsEventDataDeserializer.java:64) ~[klook-dp-data-integration-rds2kafka-1.0-SNAPSHOT-jar-with-dependencies.jar:?]
    at com.github.shyiko.mysql.binlog.event.deserialization.DeleteRowsEventDataDeserializer.deserialize(DeleteRowsEventDataDeserializer.java:56) ~[klook-dp-data-integration-rds2kafka-1.0-SNAPSHOT-jar-with-dependencies.jar:?]
    at com.github.shyiko.mysql.binlog.event.deserialization.DeleteRowsEventDataDeserializer.deserialize(DeleteRowsEventDataDeserializer.java:32) ~[klook-dp-data-integration-rds2kafka-1.0-SNAPSHOT-jar-with-dependencies.jar:?]
    at com.github.shyiko.mysql.binlog.event.deserialization.EventDeserializer.deserializeEventData(EventDeserializer.java:303) ~[klook-dp-data-integration-rds2kafka-1.0-SNAPSHOT-jar-with-dependencies.jar:?]
    at com.github.shyiko.mysql.binlog.event.deserialization.EventDeserializer.nextEvent(EventDeserializer.java:232) ~[klook-dp-data-integration-rds2kafka-1.0-SNAPSHOT-jar-with-dependencies.jar:?]
    at io.debezium.connector.mysql.MySqlStreamingChangeEventSource$1.nextEvent(MySqlStreamingChangeEventSource.java:257) ~[klook-dp-data-integration-rds2kafka-1.0-SNAPSHOT-jar-with-dependencies.jar:?]
    at com.github.shyiko.mysql.binlog.BinaryLogClient.listenForEventPackets(BinaryLogClient.java:952) ~[klook-dp-data-integration-rds2kafka-1.0-SNAPSHOT-jar-with-dependencies.jar:?]
    at com.github.shyiko.mysql.binlog.BinaryLogClient.connect(BinaryLogClient.java:599) ~[klook-dp-data-integration-rds2kafka-1.0-SNAPSHOT-jar-with-dependencies.jar:?]
    at com.github.shyiko.mysql.binlog.BinaryLogClient$7.run(BinaryLogClient.java:857) ~[klook-dp-data-integration-rds2kafka-1.0-SNAPSHOT-jar-with-dependencies.jar:?]
    ... 1 more


EmuuGrass commented 1 year ago

I hit the same problem. I found it only occurred when the job sinks to Kafka configured with 'DeliveryGuarantee.EXACTLY_ONCE'. After changing it to 'DeliveryGuarantee.AT_LEAST_ONCE', the problem was solved. It must be a bug.
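For reference, a minimal sketch of the change described above. In the DataStream API the real fix is one builder call on `KafkaSink`; the snippet below mirrors it as the equivalent Table connector option so it runs without Flink on the classpath (the option name is from the Flink Kafka connector docs; verify for your version):

```java
import java.util.Properties;

public class KafkaSinkGuarantee {
    // DataStream API form of the change (commented out, needs flink-connector-kafka):
    //   KafkaSink.<String>builder()
    //       .setDeliveryGuarantee(DeliveryGuarantee.AT_LEAST_ONCE) // was EXACTLY_ONCE
    //       ...
    //       .build();

    // Table/SQL connector equivalent, as plain properties:
    public static Properties sinkOptions() {
        Properties p = new Properties();
        p.setProperty("sink.delivery-guarantee", "at-least-once"); // was "exactly-once"
        return p;
    }

    public static void main(String[] args) {
        System.out.println(sinkOptions().getProperty("sink.delivery-guarantee"));
    }
}
```

Note that AT_LEAST_ONCE trades the transactional Kafka writes of EXACTLY_ONCE for possible duplicates after a failure, so downstream consumers should be idempotent.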

Lordeath commented 1 year ago
io.debezium.DebeziumException: Failed to deserialize data of EventHeaderV4{timestamp=1689302858000, eventType=EXT_WRITE_ROWS, serverId=1, headerLength=19, dataLength=8172, nextPosition=38377148, flags=0}
    at io.debezium.connector.mysql.MySqlStreamingChangeEventSource.wrap(MySqlStreamingChangeEventSource.java:1383) ~[blob_p-f629484234b4985de4500370a5f1469ae0dd1ece-8a023802348066a979970ab6ec8a3ba5:?]
    at io.debezium.connector.mysql.MySqlStreamingChangeEventSource$ReaderThreadLifecycleListener.onEventDeserializationFailure(MySqlStreamingChangeEventSource.java:1448) [blob_p-f629484234b4985de4500370a5f1469ae0dd1ece-8a023802348066a979970ab6ec8a3ba5:?]
    at com.github.shyiko.mysql.binlog.BinaryLogClient.listenForEventPackets(BinaryLogClient.java:965) [blob_p-f629484234b4985de4500370a5f1469ae0dd1ece-8a023802348066a979970ab6ec8a3ba5:?]
    at com.github.shyiko.mysql.binlog.BinaryLogClient.connect(BinaryLogClient.java:599) [blob_p-f629484234b4985de4500370a5f1469ae0dd1ece-8a023802348066a979970ab6ec8a3ba5:?]
    at com.github.shyiko.mysql.binlog.BinaryLogClient$7.run(BinaryLogClient.java:857) [blob_p-f629484234b4985de4500370a5f1469ae0dd1ece-8a023802348066a979970ab6ec8a3ba5:?]
    at java.lang.Thread.run(Thread.java:829) [?:?]
Caused by: com.github.shyiko.mysql.binlog.event.deserialization.EventDataDeserializationException: Failed to deserialize data of EventHeaderV4{timestamp=1689302858000, eventType=EXT_WRITE_ROWS, serverId=1, headerLength=19, dataLength=8172, nextPosition=38377148, flags=0}
    at com.github.shyiko.mysql.binlog.event.deserialization.EventDeserializer.deserializeEventData(EventDeserializer.java:309) ~[blob_p-f629484234b4985de4500370a5f1469ae0dd1ece-8a023802348066a979970ab6ec8a3ba5:?]
    at com.github.shyiko.mysql.binlog.event.deserialization.EventDeserializer.nextEvent(EventDeserializer.java:232) ~[blob_p-f629484234b4985de4500370a5f1469ae0dd1ece-8a023802348066a979970ab6ec8a3ba5:?]
    at io.debezium.connector.mysql.MySqlStreamingChangeEventSource$1.nextEvent(MySqlStreamingChangeEventSource.java:256) ~[blob_p-f629484234b4985de4500370a5f1469ae0dd1ece-8a023802348066a979970ab6ec8a3ba5:?]
    at com.github.shyiko.mysql.binlog.BinaryLogClient.listenForEventPackets(BinaryLogClient.java:952) ~[blob_p-f629484234b4985de4500370a5f1469ae0dd1ece-8a023802348066a979970ab6ec8a3ba5:?]
    ... 3 more
Caused by: com.github.shyiko.mysql.binlog.event.deserialization.MissingTableMapEventException: No TableMapEventData has been found for table id:102. Usually that means that you have started reading binary log 'within the logical event group' (e.g. from WRITE_ROWS and not proceeding TABLE_MAP
    at com.github.shyiko.mysql.binlog.event.deserialization.AbstractRowsEventDataDeserializer.deserializeRow(AbstractRowsEventDataDeserializer.java:109) ~[blob_p-f629484234b4985de4500370a5f1469ae0dd1ece-8a023802348066a979970ab6ec8a3ba5:?]
    at com.github.shyiko.mysql.binlog.event.deserialization.WriteRowsEventDataDeserializer.deserializeRows(WriteRowsEventDataDeserializer.java:64) ~[blob_p-f629484234b4985de4500370a5f1469ae0dd1ece-8a023802348066a979970ab6ec8a3ba5:?]
    at com.github.shyiko.mysql.binlog.event.deserialization.WriteRowsEventDataDeserializer.deserialize(WriteRowsEventDataDeserializer.java:56) ~[blob_p-f629484234b4985de4500370a5f1469ae0dd1ece-8a023802348066a979970ab6ec8a3ba5:?]
    at com.github.shyiko.mysql.binlog.event.deserialization.WriteRowsEventDataDeserializer.deserialize(WriteRowsEventDataDeserializer.java:32) ~[blob_p-f629484234b4985de4500370a5f1469ae0dd1ece-8a023802348066a979970ab6ec8a3ba5:?]
    at com.github.shyiko.mysql.binlog.event.deserialization.EventDeserializer.deserializeEventData(EventDeserializer.java:303) ~[blob_p-f629484234b4985de4500370a5f1469ae0dd1ece-8a023802348066a979970ab6ec8a3ba5:?]
    at com.github.shyiko.mysql.binlog.event.deserialization.EventDeserializer.nextEvent(EventDeserializer.java:232) ~[blob_p-f629484234b4985de4500370a5f1469ae0dd1ece-8a023802348066a979970ab6ec8a3ba5:?]
    at io.debezium.connector.mysql.MySqlStreamingChangeEventSource$1.nextEvent(MySqlStreamingChangeEventSource.java:256) ~[blob_p-f629484234b4985de4500370a5f1469ae0dd1ece-8a023802348066a979970ab6ec8a3ba5:?]
    at com.github.shyiko.mysql.binlog.BinaryLogClient.listenForEventPackets(BinaryLogClient.java:952) ~[blob_p-f629484234b4985de4500370a5f1469ae0dd1ece-8a023802348066a979970ab6ec8a3ba5:?]
    ... 3 more

This can happen while reading the binlog: if a single transaction updates a large amount of data, the MySQL CDC source may fail to read it.

EventDeserializer(292): tableMapEventByTableId.put(tableMapEvent.getTableId(), tableMapEvent); this line may be the problem, because tableMapEventByTableId is not a thread-safe object. In my setup the data was read by multiple threads in parallel, so while the map was being resized a reader could get back nothing. For now I changed the source to parallelism 1, and the problem has not reappeared. I suspect this is the cause of the intermittently occurring bug.
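The race described above (a plain `HashMap` shared across threads losing or hiding entries during a resize) is easy to reproduce in principle and has a standard stdlib remedy. This is only an illustration of the failure class, not the connector's actual code: a `ConcurrentHashMap` keeps the table-id lookups safe under concurrent writers.

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;

public class TableMapDemo {
    public static void main(String[] args) throws InterruptedException {
        // Stand-in for EventDeserializer's tableMapEventByTableId. With a plain
        // HashMap, concurrent put()s during a resize can corrupt the table or
        // make get() return null for a present key; ConcurrentHashMap does not.
        Map<Long, String> tableMapEventByTableId = new ConcurrentHashMap<>();

        ExecutorService pool = Executors.newFixedThreadPool(4);
        for (long id = 0; id < 1000; id++) {
            final long tableId = id;
            pool.submit(() -> tableMapEventByTableId.put(tableId, "table-" + tableId));
        }
        pool.shutdown();
        pool.awaitTermination(10, TimeUnit.SECONDS);

        // All 1000 entries survive the concurrent writes.
        System.out.println(tableMapEventByTableId.size());
    }
}
```

This also explains why forcing the source to parallelism 1 makes the symptom disappear: with a single reader thread there is no concurrent access to the map at all.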

jmzwcn commented 1 year ago

Looking forward to a fix for this. In extreme cases, when the MySQL server is under heavy load, this problem reliably appears.

Right. For example, if the source table deletes one million rows at once, the problem shows up.
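When the failure correlates with huge transactions or heavy server load (the EOFException variants above mean the server closed the binlog connection mid-event), a mitigation people commonly try is to make the binlog connection more tolerant via Debezium pass-through options. The option names below are documented Debezium MySQL connector settings, but whether they help depends on why the connection broke; in the `MySqlSource` builder they would be supplied through `.debeziumProperties(...)`:

```java
import java.util.Properties;

public class BinlogConnectionProps {
    public static Properties debeziumProps() {
        Properties p = new Properties();
        // Allow more time to (re)establish the binlog connection (default 30s).
        p.setProperty("connect.timeout.ms", "60000");
        // Run the binlog client's keep-alive thread so idle/loaded periods
        // are less likely to drop the connection.
        p.setProperty("connect.keep.alive", "true");
        return p;
    }

    public static void main(String[] args) {
        System.out.println(debeziumProps().getProperty("connect.timeout.ms"));
    }
}
```

Server-side `net_read_timeout` / `net_write_timeout` are the complementary knobs on the MySQL side and are worth checking with the DBA in the same investigation.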

hehuiyuan commented 1 year ago

This can happen while reading the binlog: if a single transaction updates a large amount of data, the MySQL CDC source may fail to read it.

EventDeserializer(292): tableMapEventByTableId.put(tableMapEvent.getTableId(), tableMapEvent); this line may be the problem, because tableMapEventByTableId is not a thread-safe object. In my setup the data was read by multiple threads in parallel, so while the map was being resized a reader could get back nothing. For now I changed the source to parallelism 1, and the problem has not reappeared. I suspect this is the cause of the intermittently occurring bug.

Isn't there only one task reading during the MySQL CDC incremental (binlog) phase? How did you end up with parallelism there?

RainbowWS commented 7 months ago

Flink 1.15.3 + CDC 2.3.0. The job runs with 4 TaskManagers and 1 JobManager; the same problem occurred.


    at org.apache.flink.connector.base.source.reader.fetcher.SplitFetcherManager.checkErrors(SplitFetcherManager.java:225) ~[flink-connector-base-1.15.3.jar:1.15.3]
    at org.apache.flink.connector.base.source.reader.SourceReaderBase.getNextFetch(SourceReaderBase.java:169) ~[flink-connector-base-1.15.3.jar:1.15.3]
    at org.apache.flink.connector.base.source.reader.SourceReaderBase.pollNext(SourceReaderBase.java:130) ~[flink-connector-base-1.15.3.jar:1.15.3]
    at org.apache.flink.streaming.api.operators.SourceOperator.emitNext(SourceOperator.java:385) ~[flink-streaming-java-1.15.3.jar:1.15.3]
    at org.apache.flink.streaming.runtime.io.StreamTaskSourceInput.emitNext(StreamTaskSourceInput.java:68) ~[flink-streaming-java-1.15.3.jar:1.15.3]
    at org.apache.flink.streaming.runtime.io.StreamOneInputProcessor.processInput(StreamOneInputProcessor.java:65) ~[flink-streaming-java-1.15.3.jar:1.15.3]
    at org.apache.flink.streaming.runtime.tasks.StreamTask.processInput(StreamTask.java:519) ~[flink-streaming-java-1.15.3.jar:1.15.3]
    at org.apache.flink.streaming.runtime.tasks.mailbox.MailboxProcessor.runMailboxLoop(MailboxProcessor.java:203) ~[flink-streaming-java-1.15.3.jar:1.15.3]
    at org.apache.flink.streaming.runtime.tasks.StreamTask.runMailboxLoop(StreamTask.java:807) ~[flink-streaming-java-1.15.3.jar:1.15.3]
    at org.apache.flink.streaming.runtime.tasks.StreamTask.invoke(StreamTask.java:756) ~[flink-streaming-java-1.15.3.jar:1.15.3]
    at org.apache.flink.runtime.taskmanager.Task.runWithSystemExitMonitoring(Task.java:948) ~[flink-runtime-1.15.3.jar:1.15.3]
    at org.apache.flink.runtime.taskmanager.Task.restoreAndInvoke(Task.java:927) ~[flink-runtime-1.15.3.jar:1.15.3]
    at org.apache.flink.runtime.taskmanager.Task.doRun(Task.java:741) ~[flink-runtime-1.15.3.jar:1.15.3]
    at org.apache.flink.runtime.taskmanager.Task.run(Task.java:563) ~[flink-runtime-1.15.3.jar:1.15.3]
    at java.lang.Thread.run(Thread.java:745) ~[?:1.8.0_101]
Caused by: java.lang.RuntimeException: SplitFetcher thread 0 received unexpected exception while polling the records
    at org.apache.flink.connector.base.source.reader.fetcher.SplitFetcher.runOnce(SplitFetcher.java:150) ~[flink-connector-base-1.15.3.jar:1.15.3]
    at org.apache.flink.connector.base.source.reader.fetcher.SplitFetcher.run(SplitFetcher.java:105) ~[flink-connector-base-1.15.3.jar:1.15.3]
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) ~[?:1.8.0_101]
    at java.util.concurrent.FutureTask.run(FutureTask.java:266) ~[?:1.8.0_101]
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) ~[?:1.8.0_101]
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) ~[?:1.8.0_101]
    ... 1 more
Caused by: com.ververica.cdc.connectors.shaded.org.apache.kafka.connect.errors.ConnectException: An exception occurred in the change event producer. This connector will be stopped.
    at io.debezium.pipeline.ErrorHandler.setProducerThrowable(ErrorHandler.java:42) ~[flink-sql-connector-mongodb-cdc-2.3.0.jar:2.3.0]
    at com.ververica.cdc.connectors.mysql.debezium.task.context.MySqlErrorHandler.setProducerThrowable(MySqlErrorHandler.java:89) ~[flink-sql-connector-mysql-cdc-2.3.0.jar:2.3.0]
    at io.debezium.connector.mysql.MySqlStreamingChangeEventSource$ReaderThreadLifecycleListener.onCommunicationFailure(MySqlStreamingChangeEventSource.java:1439) ~[flink-sql-connector-mysql-cdc-2.3.0.jar:2.3.0]
    at com.github.shyiko.mysql.binlog.BinaryLogClient.listenForEventPackets(BinaryLogClient.java:980) ~[flink-sql-connector-mysql-cdc-2.3.0.jar:2.3.0]
    at com.github.shyiko.mysql.binlog.BinaryLogClient.connect(BinaryLogClient.java:599) ~[flink-sql-connector-mysql-cdc-2.3.0.jar:2.3.0]
    at com.github.shyiko.mysql.binlog.BinaryLogClient$7.run(BinaryLogClient.java:857) ~[flink-sql-connector-mysql-cdc-2.3.0.jar:2.3.0]
    ... 1 more
Caused by: io.debezium.DebeziumException: Failed to deserialize data of EventHeaderV4{timestamp=1711672921000, eventType=TABLE_MAP, serverId=xx, headerLength=19, dataLength=242, nextPosition=349913650, flags=0}
    at io.debezium.connector.mysql.MySqlStreamingChangeEventSource.wrap(MySqlStreamingChangeEventSource.java:1383) ~[flink-sql-connector-mysql-cdc-2.3.0.jar:2.3.0]
    at io.debezium.connector.mysql.MySqlStreamingChangeEventSource$ReaderThreadLifecycleListener.onCommunicationFailure(MySqlStreamingChangeEventSource.java:1439) ~[flink-sql-connector-mysql-cdc-2.3.0.jar:2.3.0]
    at com.github.shyiko.mysql.binlog.BinaryLogClient.listenForEventPackets(BinaryLogClient.java:980) ~[flink-sql-connector-mysql-cdc-2.3.0.jar:2.3.0]
    at com.github.shyiko.mysql.binlog.BinaryLogClient.connect(BinaryLogClient.java:599) ~[flink-sql-connector-mysql-cdc-2.3.0.jar:2.3.0]
    at com.github.shyiko.mysql.binlog.BinaryLogClient$7.run(BinaryLogClient.java:857) ~[flink-sql-connector-mysql-cdc-2.3.0.jar:2.3.0]
    ... 1 more
Caused by: com.github.shyiko.mysql.binlog.event.deserialization.EventDataDeserializationException: Failed to deserialize data of EventHeaderV4{timestamp=1711672921000, eventType=TABLE_MAP, serverId=xxx, headerLength=19, dataLength=242, nextPosition=349913650, flags=0}
    at com.github.shyiko.mysql.binlog.event.deserialization.EventDeserializer.deserializeEventData(EventDeserializer.java:309) ~[flink-sql-connector-mysql-cdc-2.3.0.jar:2.3.0]
    at com.github.shyiko.mysql.binlog.event.deserialization.EventDeserializer.deserializeTableMapEventData(EventDeserializer.java:281) ~[flink-sql-connector-mysql-cdc-2.3.0.jar:2.3.0]
    at com.github.shyiko.mysql.binlog.event.deserialization.EventDeserializer.nextEvent(EventDeserializer.java:228) ~[flink-sql-connector-mysql-cdc-2.3.0.jar:2.3.0]
    at io.debezium.connector.mysql.MySqlStreamingChangeEventSource$1.nextEvent(MySqlStreamingChangeEventSource.java:256) ~[flink-sql-connector-mysql-cdc-2.3.0.jar:2.3.0]
    at com.github.shyiko.mysql.binlog.BinaryLogClient.listenForEventPackets(BinaryLogClient.java:952) ~[flink-sql-connector-mysql-cdc-2.3.0.jar:2.3.0]
    at com.github.shyiko.mysql.binlog.BinaryLogClient.connect(BinaryLogClient.java:599) ~[flink-sql-connector-mysql-cdc-2.3.0.jar:2.3.0]
    at com.github.shyiko.mysql.binlog.BinaryLogClient$7.run(BinaryLogClient.java:857) ~[flink-sql-connector-mysql-cdc-2.3.0.jar:2.3.0]
    ... 1 more
Caused by: java.io.EOFException
    at com.github.shyiko.mysql.binlog.io.ByteArrayInputStream.read(ByteArrayInputStream.java:209) ~[flink-sql-connector-mysql-cdc-2.3.0.jar:2.3.0]
    at com.github.shyiko.mysql.binlog.io.ByteArrayInputStream.readZeroTerminatedString(ByteArrayInputStream.java:96) ~[flink-sql-connector-mysql-cdc-2.3.0.jar:2.3.0]
    at com.github.shyiko.mysql.binlog.event.deserialization.TableMapEventDataDeserializer.deserialize(TableMapEventDataDeserializer.java:38) ~[flink-sql-connector-mysql-cdc-2.3.0.jar:2.3.0]
    at com.github.shyiko.mysql.binlog.event.deserialization.TableMapEventDataDeserializer.deserialize(TableMapEventDataDeserializer.java:27) ~[flink-sql-connector-mysql-cdc-2.3.0.jar:2.3.0]
    at com.github.shyiko.mysql.binlog.event.deserialization.EventDeserializer.deserializeEventData(EventDeserializer.java:303) ~[flink-sql-connector-mysql-cdc-2.3.0.jar:2.3.0]
    at com.github.shyiko.mysql.binlog.event.deserialization.EventDeserializer.deserializeTableMapEventData(EventDeserializer.java:281) ~[flink-sql-connector-mysql-cdc-2.3.0.jar:2.3.0]
    at com.github.shyiko.mysql.binlog.event.deserialization.EventDeserializer.nextEvent(EventDeserializer.java:228) ~[flink-sql-connector-mysql-cdc-2.3.0.jar:2.3.0]
    at io.debezium.connector.mysql.MySqlStreamingChangeEventSource$1.nextEvent(MySqlStreamingChangeEventSource.java:256) ~[flink-sql-connector-mysql-cdc-2.3.0.jar:2.3.0]
    at com.github.shyiko.mysql.binlog.BinaryLogClient.listenForEventPackets(BinaryLogClient.java:952) ~[flink-sql-connector-mysql-cdc-2.3.0.jar:2.3.0]
    at com.github.shyiko.mysql.binlog.BinaryLogClient.connect(BinaryLogClient.java:599) ~[flink-sql-connector-mysql-cdc-2.3.0.jar:2.3.0]
    at com.github.shyiko.mysql.binlog.BinaryLogClient$7.run(BinaryLogClient.java:857) ~[flink-sql-connector-mysql-cdc-2.3.0.jar:2.3.0]
    ... 1 more

kendn1993 commented 3 months ago

After this exception occurred, data was lost. Have you observed this as well?

Yes.

kendn1993 commented 3 months ago

Flink 1.17.1 + CDC 2.4.1 still hits this problem.

Mindfaker commented 3 months ago

Flink 1.17.1 + CDC 2.4.1 still hits this problem.

Try upgrading to 2.4.2. I recently investigated this with our DBA: in versions before 2.4.2, a fairly large batch of inserts/updates/deletes on the source database can occasionally trigger this problem — connections to the source keep being opened without being released, and after a while the error is thrown. The 2.4.2 release notes mention this bug was fixed. I plan to switch our test environment to 2.4.2 next month when I have time; I suggest you try it too, and if you verify the fix, please @ me back so I can skip my own verification.

kendn1993 commented 3 months ago

Flink 1.17.1 + CDC 2.4.1 still hits this problem.

Try upgrading to 2.4.2. I recently investigated this with our DBA: in versions before 2.4.2, a fairly large batch of inserts/updates/deletes on the source database can occasionally trigger this problem — connections to the source keep being opened without being released, and after a while the error is thrown. The 2.4.2 release notes mention this bug was fixed. I plan to switch our test environment to 2.4.2 next month when I have time; I suggest you try it too, and if you verify the fix, please @ me back so I can skip my own verification.

I have now upgraded to Flink 1.18.1 + CDC 3.1.1 and it still fails; this error tends to appear when the data volume spikes.

[screenshot]

Also, after its retries complete, it skips records that it considers already processed, causing data loss.

kendn1993 commented 3 months ago

[screenshot] After the deserialization failure, some records are skipped even though they were never actually written to the sink.

anynone commented 3 months ago

Flink 1.17.2 with flink-sql-connector-mysql-cdc 2.3.0: this problem appears after a large volume of data is loaded into the database.

java.lang.RuntimeException: One or more fetchers have encountered exception
    at org.apache.flink.connector.base.source.reader.fetcher.SplitFetcherManager.checkErrors(SplitFetcherManager.java:261)
    at org.apache.flink.connector.base.source.reader.SourceReaderBase.getNextFetch(SourceReaderBase.java:169)
    at org.apache.flink.connector.base.source.reader.SourceReaderBase.pollNext(SourceReaderBase.java:131)
    at org.apache.flink.connector.base.source.reader.SourceReaderBase.pollNext(SourceReaderBase.java:157)
    at org.apache.flink.streaming.api.operators.SourceOperator.emitNext(SourceOperator.java:419)
    at org.apache.flink.streaming.runtime.io.StreamTaskSourceInput.emitNext(StreamTaskSourceInput.java:68)
    at org.apache.flink.streaming.runtime.io.StreamOneInputProcessor.processInput(StreamOneInputProcessor.java:65)
    at org.apache.flink.streaming.runtime.tasks.StreamTask.processInput(StreamTask.java:550)
    at org.apache.flink.streaming.runtime.tasks.mailbox.MailboxProcessor.runMailboxLoop(MailboxProcessor.java:231)
    at org.apache.flink.streaming.runtime.tasks.StreamTask.runMailboxLoop(StreamTask.java:839)
    at org.apache.flink.streaming.runtime.tasks.StreamTask.invoke(StreamTask.java:788)
    at org.apache.flink.runtime.taskmanager.Task.runWithSystemExitMonitoring(Task.java:952)
    at org.apache.flink.runtime.taskmanager.Task.restoreAndInvoke(Task.java:931)
    at org.apache.flink.runtime.taskmanager.Task.doRun(Task.java:745)
    at org.apache.flink.runtime.taskmanager.Task.run(Task.java:562)
    at java.base/java.lang.Thread.run(Unknown Source)
Caused by: java.lang.RuntimeException: SplitFetcher thread 0 received unexpected exception while polling the records
    at org.apache.flink.connector.base.source.reader.fetcher.SplitFetcher.runOnce(SplitFetcher.java:165)
    at org.apache.flink.connector.base.source.reader.fetcher.SplitFetcher.run(SplitFetcher.java:114)
    at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Unknown Source)
    at java.base/java.util.concurrent.FutureTask.run(Unknown Source)
    at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(Unknown Source)
    at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(Unknown Source)
    ... 1 more
Caused by: com.ververica.cdc.connectors.shaded.org.apache.kafka.connect.errors.ConnectException: An exception occurred in the change event producer. This connector will be stopped.
    at io.debezium.pipeline.ErrorHandler.setProducerThrowable(ErrorHandler.java:42)
    at com.ververica.cdc.connectors.mysql.debezium.task.context.MySqlErrorHandler.setProducerThrowable(MySqlErrorHandler.java:89)
    at io.debezium.connector.mysql.MySqlStreamingChangeEventSource$ReaderThreadLifecycleListener.onEventDeserializationFailure(MySqlStreamingChangeEventSource.java:1448)
    at com.github.shyiko.mysql.binlog.BinaryLogClient.listenForEventPackets(BinaryLogClient.java:965)
    at com.github.shyiko.mysql.binlog.BinaryLogClient.connect(BinaryLogClient.java:599)
    at com.github.shyiko.mysql.binlog.BinaryLogClient$7.run(BinaryLogClient.java:857)
    ... 1 more
Caused by: io.debezium.DebeziumException: Failed to deserialize data of EventHeaderV4{timestamp=1723045209000, eventType=EXT_UPDATE_ROWS, serverId=7436, headerLength=19, dataLength=6337, nextPosition=94510829, flags=0}
    at io.debezium.connector.mysql.MySqlStreamingChangeEventSource.wrap(MySqlStreamingChangeEventSource.java:1383)
    ... 5 more
Caused by: com.github.shyiko.mysql.binlog.event.deserialization.EventDataDeserializationException: Failed to deserialize data of EventHeaderV4{timestamp=1723045209000, eventType=EXT_UPDATE_ROWS, serverId=7436, headerLength=19, dataLength=6337, nextPosition=94510829, flags=0}
    at com.github.shyiko.mysql.binlog.event.deserialization.EventDeserializer.deserializeEventData(EventDeserializer.java:309)
    at com.github.shyiko.mysql.binlog.event.deserialization.EventDeserializer.nextEvent(EventDeserializer.java:232)
    at io.debezium.connector.mysql.MySqlStreamingChangeEventSource$1.nextEvent(MySqlStreamingChangeEventSource.java:256)
    at com.github.shyiko.mysql.binlog.BinaryLogClient.listenForEventPackets(BinaryLogClient.java:952)
    ... 3 more
Caused by: com.github.shyiko.mysql.binlog.event.deserialization.MissingTableMapEventException: No TableMapEventData has been found for table id:206. Usually that means that you have started reading binary log 'within the logical event group' (e.g. from WRITE_ROWS and not proceeding TABLE_MAP
    at com.github.shyiko.mysql.binlog.event.deserialization.AbstractRowsEventDataDeserializer.deserializeRow(AbstractRowsEventDataDeserializer.java:109)
    at com.github.shyiko.mysql.binlog.event.deserialization.UpdateRowsEventDataDeserializer.deserializeRows(UpdateRowsEventDataDeserializer.java:71)
    at com.github.shyiko.mysql.binlog.event.deserialization.UpdateRowsEventDataDeserializer.deserialize(UpdateRowsEventDataDeserializer.java:58)
    at com.github.shyiko.mysql.binlog.event.deserialization.UpdateRowsEventDataDeserializer.deserialize(UpdateRowsEventDataDeserializer.java:33)

patatosky commented 3 months ago

Flink 1.18.1 + Flink CDC 3.0.1 still has this problem.

kendn1993 commented 3 months ago

I tried increasing my program's parallelism, and the problem disappeared.

Mindfaker commented 3 months ago

I tried increasing my program's parallelism, and the problem disappeared.

Parallelism above 1 is going to cause problems, isn't it? DML may be written to the target database out of binlog order, and the data will end up inconsistent.

nortrom-lh commented 2 months ago

While waiting for the community maintainers to look at this, I found a similar error described on https://debezium.io/ — has anyone verified it by adjusting the MySQL parameters? From the Debezium FAQ: What is causing intermittent EventDataDeserializationExceptions with the MySQL connector? When you run into intermittent deserialization exceptions around 1 minute after starting the connector, with a root cause of type EOFException or java.net.SocketException: Connection reset:

Caused by: com.github.shyiko.mysql.binlog.event.deserialization.EventDataDeserializationException: Failed to deserialize data of EventHeaderV4{timestamp=1542193955000, eventType=GTID, serverId=91111, headerLength=19, dataLength=46, nextPosition=1058898202, flags=0}
Caused by: java.lang.RuntimeException: com.github.shyiko.mysql.binlog.event.deserialization.EventDataDeserializationException: Failed to deserialize data of EventHeaderV4{timestamp=1542193955000, eventType=GTID, serverId=91111, headerLength=19, dataLength=46, nextPosition=1058898202, flags=0}
Caused by: java.io.EOFException

or

Caused by: java.net.SocketException: Connection reset

Then updating these MySQL server global properties like this will fix it:

set global slave_net_timeout = 120; -- (default was 30 sec)
set global thread_pool_idle_timeout = 120;
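If those SET GLOBAL values do help, note that they only last until the server restarts. A sketch of making them persistent in the server configuration follows (same values as above; note that thread_pool_idle_timeout exists only when the thread-pool plugin is available, e.g. on MySQL Enterprise, Percona, or MariaDB builds — verify against your server version):

```ini
# my.cnf fragment -- mirrors the SET GLOBAL workaround quoted above.
[mysqld]
slave_net_timeout        = 120
# Only valid when the thread-pool plugin is enabled:
thread_pool_idle_timeout = 120
```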

nortrom-lh commented 2 months ago


Tested; the problem still recurs.

njalan commented 1 month ago

Why was this issue closed? With Flink 1.18 + Flink CDC 3.2, the problem appears a few minutes after restoring from a savepoint. Small tables are fine, but tables at the billion-row scale and above hit the error.