alibaba / canal

Alibaba MySQL binlog incremental subscription & consumption component
Apache License 2.0

canal.instance.filter.regex filter not taking effect #2429

Closed netrous closed 4 years ago

netrous commented 4 years ago

I enabled Kafka mode: canal.serverMode = kafka

This is my instance.properties configuration:
canal.instance.filter.regex=user_center\\..*
canal.instance.filter.black.regex=
canal.mq.topic=user_center
canal.mq.dynamicTopic=user_center\\..*
canal.mq.partition=0

The filter expressions printed in the startup log:

WARN c.a.o.canal.parse.inbound.mysql.dbsync.LogEventConvert - --> init table filter : ^user_center\..*$
WARN c.a.o.canal.parse.inbound.mysql.dbsync.LogEventConvert - --> init table black filter :

But the messages sent to Kafka all look like the one below and are not logs from the user_center database. Also, only the user_center topic was created in Kafka; no per-table topics were created.

DEBUG com.alibaba.otter.canal.kafka.CanalKafkaProducer - Send message to kafka topic: [user_center], packet: ProducerRecord(topic=user_center, partition=0, headers=RecordHeaders(headers = [], isReadOnly = true), key=null, value={"data":null,"database":"","es":1574427377000,"id":26,"isDdl":false,"mysqlType":null,"old":null,"pkNames":null,"sql":"UPDATE DTS_JOB_QRTZ_TRIGGERS SET TRIGGER_STATE = 'ACQUIRED' WHERE SCHED_NAME = 'quartzScheduler' AND TRIGGER_NAME = '201703171914390653' AND TRIGGER_GROUP = 'test' AND TRIGGER_STATE = 'WAITING'","sqlType":null,"table":"DTS_JOB_QRTZ_TRIGGERS","ts":1574668186110,"type":"QUERY"}, timestamp=null)

wyoldfour commented 4 years ago

You have binlog_rows_query_log_events enabled. That event is not subject to the whitelist filter, which is why you receive messages with type QUERY.
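
For anyone checking the same thing on their own instance, a minimal sketch of how to confirm this on the MySQL side (assumes a user allowed to read global variables):

-- Check whether MySQL writes Rows_query events to the binlog.
-- When this is ON, canal also produces QUERY-type messages that are not
-- filtered by the canal.instance.filter.regex whitelist.
SHOW GLOBAL VARIABLES LIKE 'binlog_rows_query_log_events';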

asn417 commented 4 years ago

How did you solve it? My filter is not taking effect either.

wyoldfour commented 4 years ago

Is binlog_rows_query_log_events enabled on your database? If it is, messages from that event do not go through the whitelist filter. For now I solved it by modifying the source code and repackaging.
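
As a rough way to see where these QUERY messages come from, you can list the events in a binlog file directly; the file name below is only a placeholder, pick a real one from SHOW BINARY LOGS:

-- With binlog_rows_query_log_events = ON, each row change is preceded by a
-- Rows_query event carrying the original SQL text; canal appears to surface
-- these as the type=QUERY messages shown above.
SHOW BINARY LOGS;
SHOW BINLOG EVENTS IN 'mysql-bin.000001' LIMIT 20;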

asn417 commented 4 years ago

Thank you. I checked and binlog_rows_query_log_events is ON. Do I need to set it to OFF for the filter to take effect?

wyoldfour commented 4 years ago

Yes, that works, as long as you don't need that event.
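
For reference, a sketch of turning it off on the MySQL side (SET GLOBAL applies to new sessions only and is lost on restart, so also persist it in my.cnf if you want it to stick):

-- Stop writing Rows_query events to the binlog for new sessions.
SET GLOBAL binlog_rows_query_log_events = OFF;
-- Verify the change took effect.
SHOW GLOBAL VARIABLES LIKE 'binlog_rows_query_log_events';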
