zhp8341 / flink-streaming-platform-web

A web platform for Flink-based real-time stream computing
MIT License

How do I set a savepoint for a SQL task? #61

Closed eaglesinchina closed 3 years ago

eaglesinchina commented 3 years ago

I have a job that reads from debezium-mysql-kafka and writes to HBase, but every time I stop the job and restart it, it re-reads all of the Kafka data. I don't know how to avoid reading from the beginning again:

```sql
CREATE TABLE topic_test2_employee (
  id int,
  dept_id int,
  name string
) WITH (
  'connector' = 'kafka',
  'topic' = 'dbz2.test2.employee',
  'properties.bootstrap.servers' = 'localhost:9092',
  'scan.startup.mode' = 'earliest-offset',
  'format' = 'debezium-avro-confluent',
  'debezium-avro-confluent.schema-registry.url' = 'http://localhost:9081'
);
```

At first I did want it to read Kafka from the beginning, but the job has now been running for a while and may be stopped partway through; after that I want it to resume from where it last consumed. How should this be configured?
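For context, `'scan.startup.mode' = 'earliest-offset'` only decides where reading starts when the job comes up without any restored state; when a job is restored from a savepoint or checkpoint, the Kafka source resumes from the offsets stored in that state instead. An alternative that does not depend on savepoints is to start from the consumer group's committed offsets. A minimal sketch, assuming checkpointing is enabled (so offsets are committed back to Kafka on completed checkpoints) and using a made-up consumer group id `my_group`:

```sql
-- Sketch only: 'my_group' is a hypothetical consumer group id.
CREATE TABLE topic_test2_employee (
  id INT,
  dept_id INT,
  name STRING
) WITH (
  'connector' = 'kafka',
  'topic' = 'dbz2.test2.employee',
  'properties.bootstrap.servers' = 'localhost:9092',
  'properties.group.id' = 'my_group',
  -- 'group-offsets' resumes from this group's committed offsets
  -- instead of re-reading the topic from the beginning on a fresh start.
  'scan.startup.mode' = 'group-offsets',
  'format' = 'debezium-avro-confluent',
  'debezium-avro-confluent.schema-registry.url' = 'http://localhost:9081'
);
```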

zhp8341 commented 3 years ago

By default the web system already takes savepoints, and the job can be restarted from a savepoint.
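For illustration only: outside of this web platform, for example when submitting the same SQL through the plain Flink SQL client, restoring from a savepoint would look roughly like the sketch below. The sink table name `hbase_employee` and the savepoint path are made up; once restored, the Kafka source continues from the offsets stored in the savepoint, and `'scan.startup.mode'` no longer applies.

```sql
-- Hypothetical sketch (Flink SQL client, not the web platform):
-- point the next submission at an existing savepoint...
SET 'execution.savepoint.path' = 'hdfs:///flink/savepoints/savepoint-xxxxxx';

-- ...then re-submit the same job; it resumes from the saved Kafka offsets.
-- 'hbase_employee' is an assumed HBase sink table name.
INSERT INTO hbase_employee
SELECT id, dept_id, name FROM topic_test2_employee;
```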