DTStack / chunjun

A data integration framework
https://dtstack.github.io/chunjun/
Apache License 2.0

[Bug] [sqlservercdc] Data not synchronized when the job is submitted to standalone Flink #1240

Open frank451209123 opened 2 years ago

frank451209123 commented 2 years ago

Search before asking

What happened

The job is submitted to Flink normally and runs without any exceptions, but the data is not synchronized to PostgreSQL. The job configuration is as follows:

CREATE TABLE source (
    id bigint NOT NULL,
    parent_id bigint NOT NULL,
    project_no varchar(50) NOT NULL,
    sales_order varchar(50) NOT NULL,
    plant_code varchar(50) NULL,
    product_line varchar(50) NULL,
    sales_order_item varchar(50) NOT NULL,
    bay_id varchar(50) NULL,
    order_no varchar(50) NOT NULL,
    order_status int NULL,
    order_level int NOT NULL,
    sap_order_status varchar(50) NULL,
    material_no varchar(50) NOT NULL,
    material_descr varchar(255) NULL,
    final_material varchar(50) NULL,
    init_production_center varchar(50) NULL,
    final_production_center varchar(50) NULL,
    module_code varchar(50) NULL,
    material_no_fellow varchar(50) NULL,
    order_no_fellow varchar(50) NULL,
    item_component_list varchar(50) NULL,
    reservation varchar(50) NULL,
    reservation_item varchar(50) NULL,
    material_picking1_times int NULL,
    material_picking2_times int NULL,
    first_date timestamp NULL,
    basic_start_time timestamp NULL,
    basic_finish_time timestamp NULL,
    product_start_time timestamp NULL,
    mes_plan_start_time timestamp NULL,
    mes_plan_end_time timestamp NULL,
    mes_task_exec_time timestamp NULL,
    mes_plan_u_time timestamp NULL,
    is_deleted int NULL,
    is_modify int NULL,
    create_user varchar(50) NULL,
    update_user varchar(50) NULL,
    create_time timestamp NULL,
    update_time timestamp NOT NULL,
    PRIMARY KEY (id) NOT ENFORCED
) WITH (
    'connector' = 'sqlservercdc-x',
    'username' = 'sa',
    'password' = 'sa@123',
    'cat' = 'insert,delete,update',
    'url' = 'jdbc:sqlserver://192.168.8.212:1433;databaseName=CDCTest',
    'table' = 'dbo.ETO_PDM_Header',
    'timestamp-format.standard' = 'SQL',
    'database' = 'CDCTest',
    'poll-interval' = '1000'
);
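Editor's note: since the job starts cleanly, one thing worth ruling out first is whether SQL Server is actually producing CDC change rows for this table at all. A minimal sketch of T-SQL checks against the CDCTest database (names taken from the config above; this is a suggested diagnostic, not part of the original report):

-- Is CDC enabled at the database level?
SELECT name, is_cdc_enabled FROM sys.databases WHERE name = 'CDCTest';

-- Is the source table tracked by CDC?
SELECT s.name AS schema_name, t.name AS table_name, t.is_tracked_by_cdc
FROM sys.tables t
JOIN sys.schemas s ON t.schema_id = s.schema_id
WHERE t.name = 'ETO_PDM_Header';

-- List capture instances and their change tables
EXEC sys.sp_cdc_help_change_data_capture;

Note that the CDC capture job runs under SQL Server Agent, so the Agent service must be running before any change rows appear in the change tables.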

CREATE TABLE sink (
    id numeric NOT NULL,
    parent_id numeric NOT NULL,
    project_no varchar(50) NOT NULL,
    sales_order varchar(50) NOT NULL,
    plant_code varchar(50) NULL,
    product_line varchar(50) NULL,
    sales_order_item varchar(50) NOT NULL,
    bay_id varchar(50) NULL,
    order_no varchar(50) NOT NULL,
    order_status int NULL,
    order_level int NOT NULL,
    sap_order_status varchar(50) NULL,
    material_no varchar(50) NOT NULL,
    material_descr varchar(255) NULL,
    final_material varchar(50) NULL,
    init_production_center varchar(50) NULL,
    final_production_center varchar(50) NULL,
    module_code varchar(50) NULL,
    material_no_fellow varchar(50) NULL,
    order_no_fellow varchar(50) NULL,
    item_component_list varchar(50) NULL,
    reservation varchar(50) NULL,
    reservation_item varchar(50) NULL,
    material_picking1_times int NULL,
    material_picking2_times int NULL,
    first_date timestamp NULL,
    basic_start_time timestamp NULL,
    basic_finish_time timestamp NULL,
    product_start_time timestamp NULL,
    mes_plan_start_time timestamp NULL,
    mes_plan_end_time timestamp NULL,
    mes_task_exec_time timestamp NULL,
    mes_plan_u_time timestamp NULL,
    is_deleted int NULL,
    is_modify int NULL,
    create_user varchar(50) NULL,
    update_user varchar(50) NULL,
    create_time timestamp NULL,
    update_time timestamp NULL,
    PRIMARY KEY (id) NOT ENFORCED
) WITH (
    'connector' = 'postgresql-x',
    'url' = 'jdbc:postgresql://192.168.0.116:52345/postgres',
    'table-name' = 'siemens2_ETO_PDM_Header',
    'username' = 'postgres',
    'password' = 'P@stgres123',
    'sink.buffer-flush.max-rows' = '1024',
    'sink.buffer-flush.interval' = '1000',
    'sink.all-replace' = 'true',
    'sink.parallelism' = '1'
);
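Editor's note: on the sink side, the TaskManager log below shows the connector issuing INSERT ... ON CONFLICT against the quoted identifier "siemens2_ETO_PDM_Header", which is case-sensitive in PostgreSQL. A quick check that the target table exists under exactly that name (a suggested probe, assuming the default public schema):

-- Quoted identifiers are case-sensitive in PostgreSQL; this must match the sink 'table-name' exactly
SELECT count(*) FROM "siemens2_ETO_PDM_Header";

-- If the query above fails with "relation does not exist", list candidate tables
SELECT table_schema, table_name
FROM information_schema.tables
WHERE lower(table_name) = lower('siemens2_ETO_PDM_Header');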

insert into sink select * from source u;

Job submission script:
sh bin/chunjun-standalone.sh -jobName sqlserverCDCToPostgresql -job sqlserver_cdc_pg.sql

After the job is submitted successfully, Flink runs normally, but the data is simply not synchronized. CDC on the SQL Server database was configured according to the chunjun documentation.
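Editor's note: one way to probe the pipeline end to end is to modify a row in the source table and check whether the change reaches the CDC change table the connector polls. With the default capture instance, the change table for dbo.ETO_PDM_Header would be cdc.dbo_ETO_PDM_Header_CT; that name is an assumption based on SQL Server's default naming and should be verified with sys.sp_cdc_help_change_data_capture:

-- Touch a row so CDC has something to capture
UPDATE dbo.ETO_PDM_Header
SET update_time = GETDATE()
WHERE id = (SELECT TOP 1 id FROM dbo.ETO_PDM_Header);

-- If the SQL Server Agent capture job is running, a change row should appear here
-- (cdc.dbo_ETO_PDM_Header_CT is the assumed default change-table name)
SELECT TOP 10 *
FROM cdc.dbo_ETO_PDM_Header_CT
ORDER BY __$start_lsn DESC;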

What you expected to happen

Data is not synchronized.

How to reproduce

Follow the steps described in the problem description above.

Anything else

No response

Version

master

Are you willing to submit PR?

Code of Conduct

frank451209123 commented 2 years ago

The Flink TaskManager log is shown below:

2022-09-08 18:09:49,974 INFO org.apache.flink.runtime.taskexecutor.TaskManagerRunner [] - -------------------------------------------------------------------------------- 2022-09-08 18:09:49,977 INFO org.apache.flink.runtime.taskexecutor.TaskManagerRunner [] - Preconfiguration: 2022-09-08 18:09:49,977 INFO org.apache.flink.runtime.taskexecutor.TaskManagerRunner [] -

TM_RESOURCE_PARAMS extraction logs: jvm_params: -Xmx536870902 -Xms536870902 -XX:MaxDirectMemorySize=268435458 -XX:MaxMetaspaceSize=268435456 dynamic_configs: -D taskmanager.memory.framework.off-heap.size=134217728b -D taskmanager.memory.network.max=134217730b -D taskmanager.memory.network.min=134217730b -D taskmanager.memory.framework.heap.size=134217728b -D taskmanager.memory.managed.size=536870920b -D taskmanager.cpu.cores=4.0 -D taskmanager.memory.task.heap.size=402653174b -D taskmanager.memory.task.off-heap.size=0b -D taskmanager.memory.jvm-metaspace.size=268435456b -D taskmanager.memory.jvm-overhead.max=201326592b -D taskmanager.memory.jvm-overhead.min=201326592b logs: INFO [] - Loading configuration property: jobmanager.rpc.address, localhost INFO [] - Loading configuration property: jobmanager.rpc.port, 6123 INFO [] - Loading configuration property: jobmanager.memory.process.size, 1600m INFO [] - Loading configuration property: taskmanager.memory.process.size, 1728m INFO [] - Loading configuration property: taskmanager.numberOfTaskSlots, 4 INFO [] - Loading configuration property: parallelism.default, 1 INFO [] - Loading configuration property: jobmanager.execution.failover-strategy, region INFO [] - The derived from fraction jvm overhead memory (172.800mb (181193935 bytes)) is less than its min value 192.000mb (201326592 bytes), min value will be used instead INFO [] - Final TaskExecutor Memory configuration: INFO [] - Total Process Memory: 1.688gb (1811939328 bytes) INFO [] - Total Flink Memory: 1.250gb (1342177280 bytes) INFO [] - Total JVM Heap Memory: 512.000mb (536870902 bytes) INFO [] - Framework: 128.000mb (134217728 bytes) INFO [] - Task: 384.000mb (402653174 bytes) INFO [] - Total Off-heap Memory: 768.000mb (805306378 bytes) INFO [] - Managed: 512.000mb (536870920 bytes) INFO [] - Total JVM Direct Memory: 256.000mb (268435458 bytes) INFO [] - Framework: 128.000mb (134217728 bytes) INFO [] - Task: 0 bytes INFO [] - Network: 128.000mb (134217730 bytes) INFO [] - JVM Metaspace: 256.000mb (268435456 bytes) INFO [] - JVM Overhead: 192.000mb (201326592 bytes)

2022-09-08 18:09:49,977 INFO org.apache.flink.runtime.taskexecutor.TaskManagerRunner [] - -------------------------------------------------------------------------------- 2022-09-08 18:09:49,977 INFO org.apache.flink.runtime.taskexecutor.TaskManagerRunner [] - Starting TaskManager (Version: 1.12.7, Scala: 2.12, Rev:88d9950, Date:2021-12-14T23:39:33+01:00) 2022-09-08 18:09:49,977 INFO org.apache.flink.runtime.taskexecutor.TaskManagerRunner [] - OS current user: root 2022-09-08 18:09:49,978 INFO org.apache.flink.runtime.taskexecutor.TaskManagerRunner [] - Current Hadoop/Kerberos user: 2022-09-08 18:09:49,978 INFO org.apache.flink.runtime.taskexecutor.TaskManagerRunner [] - JVM: OpenJDK 64-Bit Server VM - Oracle Corporation - 1.8/25.40-b25 2022-09-08 18:09:49,978 INFO org.apache.flink.runtime.taskexecutor.TaskManagerRunner [] - Maximum heap size: 512 MiBytes 2022-09-08 18:09:49,978 INFO org.apache.flink.runtime.taskexecutor.TaskManagerRunner [] - JAVA_HOME: /usr/local/tools/java-se-8u40-ri 2022-09-08 18:09:49,978 INFO org.apache.flink.runtime.taskexecutor.TaskManagerRunner [] - No Hadoop Dependency available 2022-09-08 18:09:49,978 INFO org.apache.flink.runtime.taskexecutor.TaskManagerRunner [] - JVM Options: 2022-09-08 18:09:49,978 INFO org.apache.flink.runtime.taskexecutor.TaskManagerRunner [] - -XX:+UseG1GC 2022-09-08 18:09:49,979 INFO org.apache.flink.runtime.taskexecutor.TaskManagerRunner [] - -Xmx536870902 2022-09-08 18:09:49,979 INFO org.apache.flink.runtime.taskexecutor.TaskManagerRunner [] - -Xms536870902 2022-09-08 18:09:49,979 INFO org.apache.flink.runtime.taskexecutor.TaskManagerRunner [] - -XX:MaxDirectMemorySize=268435458 2022-09-08 18:09:49,979 INFO org.apache.flink.runtime.taskexecutor.TaskManagerRunner [] - -XX:MaxMetaspaceSize=268435456 2022-09-08 18:09:49,979 INFO org.apache.flink.runtime.taskexecutor.TaskManagerRunner [] - -Dlog.file=/home/flink-1.12.7/log/flink-root-taskexecutor-1-hadoop51.log 2022-09-08 18:09:49,979 INFO org.apache.flink.runtime.taskexecutor.TaskManagerRunner [] - -Dlog4j.configuration=file:/home/flink-1.12.7/conf/log4j.properties 2022-09-08 18:09:49,979 INFO org.apache.flink.runtime.taskexecutor.TaskManagerRunner [] - -Dlog4j.configurationFile=file:/home/flink-1.12.7/conf/log4j.properties 2022-09-08 18:09:49,979 INFO org.apache.flink.runtime.taskexecutor.TaskManagerRunner [] - -Dlogback.configurationFile=file:/home/flink-1.12.7/conf/logback.xml 2022-09-08 18:09:49,979 INFO org.apache.flink.runtime.taskexecutor.TaskManagerRunner [] - Program Arguments: 2022-09-08 18:09:49,981 INFO org.apache.flink.runtime.taskexecutor.TaskManagerRunner [] - --configDir 2022-09-08 18:09:49,981 INFO org.apache.flink.runtime.taskexecutor.TaskManagerRunner [] - /home/flink-1.12.7/conf 2022-09-08 18:09:49,981 INFO org.apache.flink.runtime.taskexecutor.TaskManagerRunner [] - -D 2022-09-08 18:09:49,981 INFO org.apache.flink.runtime.taskexecutor.TaskManagerRunner [] - taskmanager.memory.framework.off-heap.size=134217728b 2022-09-08 18:09:49,981 INFO org.apache.flink.runtime.taskexecutor.TaskManagerRunner [] - -D 2022-09-08 18:09:49,981 INFO org.apache.flink.runtime.taskexecutor.TaskManagerRunner [] - taskmanager.memory.network.max=134217730b 2022-09-08 18:09:49,981 INFO org.apache.flink.runtime.taskexecutor.TaskManagerRunner [] - -D 2022-09-08 18:09:49,981 INFO org.apache.flink.runtime.taskexecutor.TaskManagerRunner [] - taskmanager.memory.network.min=134217730b 2022-09-08 18:09:49,981 INFO org.apache.flink.runtime.taskexecutor.TaskManagerRunner [] - -D 2022-09-08 18:09:49,981 
INFO org.apache.flink.runtime.taskexecutor.TaskManagerRunner [] - taskmanager.memory.framework.heap.size=134217728b 2022-09-08 18:09:49,982 INFO org.apache.flink.runtime.taskexecutor.TaskManagerRunner [] - -D 2022-09-08 18:09:49,982 INFO org.apache.flink.runtime.taskexecutor.TaskManagerRunner [] - taskmanager.memory.managed.size=536870920b 2022-09-08 18:09:49,982 INFO org.apache.flink.runtime.taskexecutor.TaskManagerRunner [] - -D 2022-09-08 18:09:49,982 INFO org.apache.flink.runtime.taskexecutor.TaskManagerRunner [] - taskmanager.cpu.cores=4.0 2022-09-08 18:09:49,982 INFO org.apache.flink.runtime.taskexecutor.TaskManagerRunner [] - -D 2022-09-08 18:09:49,982 INFO org.apache.flink.runtime.taskexecutor.TaskManagerRunner [] - taskmanager.memory.task.heap.size=402653174b 2022-09-08 18:09:49,982 INFO org.apache.flink.runtime.taskexecutor.TaskManagerRunner [] - -D 2022-09-08 18:09:49,982 INFO org.apache.flink.runtime.taskexecutor.TaskManagerRunner [] - taskmanager.memory.task.off-heap.size=0b 2022-09-08 18:09:49,982 INFO org.apache.flink.runtime.taskexecutor.TaskManagerRunner [] - -D 2022-09-08 18:09:49,982 INFO org.apache.flink.runtime.taskexecutor.TaskManagerRunner [] - taskmanager.memory.jvm-metaspace.size=268435456b 2022-09-08 18:09:49,983 INFO org.apache.flink.runtime.taskexecutor.TaskManagerRunner [] - -D 2022-09-08 18:09:49,983 INFO org.apache.flink.runtime.taskexecutor.TaskManagerRunner [] - taskmanager.memory.jvm-overhead.max=201326592b 2022-09-08 18:09:49,983 INFO org.apache.flink.runtime.taskexecutor.TaskManagerRunner [] - -D 2022-09-08 18:09:49,983 INFO org.apache.flink.runtime.taskexecutor.TaskManagerRunner [] - taskmanager.memory.jvm-overhead.min=201326592b 2022-09-08 18:09:49,983 INFO org.apache.flink.runtime.taskexecutor.TaskManagerRunner [] - Classpath: /home/flink-1.12.7/lib/flink-csv-1.12.7.jar:/home/flink-1.12.7/lib/flink-json-1.12.7.jar:/home/flink-1.12.7/lib/flink-shaded-zookeeper-3.4.14.jar:/home/flink-1.12.7/lib/flink-table_2.12-1.12.7.jar:/home/flink-1.12.7/lib/flink-table-blink_2.12-1.12.7.jar:/home/flink-1.12.7/lib/guava-18.0.jar:/home/flink-1.12.7/lib/log4j-1.2-api-2.16.0.jar:/home/flink-1.12.7/lib/log4j-api-2.16.0.jar:/home/flink-1.12.7/lib/log4j-core-2.16.0.jar:/home/flink-1.12.7/lib/log4j-slf4j-impl-2.16.0.jar:/home/flink-1.12.7/lib/flink-dist_2.12-1.12.7.jar::/home/hadoop/hadoop-3.1.3/etc/hadoop::/usr/local/hbase/hbase-2.1.7/conf 2022-09-08 18:09:49,983 INFO org.apache.flink.runtime.taskexecutor.TaskManagerRunner [] - -------------------------------------------------------------------------------- 2022-09-08 18:09:49,984 INFO org.apache.flink.runtime.taskexecutor.TaskManagerRunner [] - Registered UNIX signal handlers for [TERM, HUP, INT] 2022-09-08 18:09:49,988 INFO org.apache.flink.runtime.taskexecutor.TaskManagerRunner [] - Maximum number of open file descriptors is 655360. 
2022-09-08 18:09:49,999 INFO org.apache.flink.configuration.GlobalConfiguration [] - Loading configuration property: jobmanager.rpc.address, localhost 2022-09-08 18:09:50,000 INFO org.apache.flink.configuration.GlobalConfiguration [] - Loading configuration property: jobmanager.rpc.port, 6123 2022-09-08 18:09:50,000 INFO org.apache.flink.configuration.GlobalConfiguration [] - Loading configuration property: jobmanager.memory.process.size, 1600m 2022-09-08 18:09:50,000 INFO org.apache.flink.configuration.GlobalConfiguration [] - Loading configuration property: taskmanager.memory.process.size, 1728m 2022-09-08 18:09:50,000 INFO org.apache.flink.configuration.GlobalConfiguration [] - Loading configuration property: taskmanager.numberOfTaskSlots, 4 2022-09-08 18:09:50,000 INFO org.apache.flink.configuration.GlobalConfiguration [] - Loading configuration property: parallelism.default, 1 2022-09-08 18:09:50,001 INFO org.apache.flink.configuration.GlobalConfiguration [] - Loading configuration property: jobmanager.execution.failover-strategy, region 2022-09-08 18:09:50,060 INFO org.apache.flink.core.fs.FileSystem [] - Hadoop is not in the classpath/dependencies. The extended set of supported File Systems via Hadoop is not available. 2022-09-08 18:09:50,111 INFO org.apache.flink.runtime.security.modules.HadoopModuleFactory [] - Cannot create Hadoop Security Module because Hadoop cannot be found in the Classpath. 2022-09-08 18:09:50,128 INFO org.apache.flink.runtime.security.modules.JaasModule [] - Jaas file will be created as /tmp/jaas-5880000321688009128.conf. 2022-09-08 18:09:50,135 INFO org.apache.flink.runtime.security.contexts.HadoopSecurityContextFactory [] - Cannot install HadoopSecurityContext because Hadoop cannot be found in the Classpath. 2022-09-08 18:09:50,209 INFO org.apache.flink.configuration.Configuration [] - Config uses fallback configuration key 'jobmanager.rpc.address' instead of key 'rest.address' 2022-09-08 18:09:50,221 INFO org.apache.flink.runtime.util.LeaderRetrievalUtils [] - Trying to select the network interface and address to use by connecting to the leading JobManager. 
2022-09-08 18:09:50,221 INFO org.apache.flink.runtime.util.LeaderRetrievalUtils [] - TaskManager will try to connect for PT10S before falling back to heuristics 2022-09-08 18:09:50,790 INFO org.apache.flink.runtime.net.ConnectionUtils [] - Trying to connect to address localhost/127.0.0.1:6123 2022-09-08 18:09:50,791 INFO org.apache.flink.runtime.net.ConnectionUtils [] - Failed to connect from address 'hadoop51/192.168.0.51': Connection refused 2022-09-08 18:09:50,791 INFO org.apache.flink.runtime.net.ConnectionUtils [] - Failed to connect from address '/127.0.0.1': Connection refused 2022-09-08 18:09:50,791 INFO org.apache.flink.runtime.net.ConnectionUtils [] - Failed to connect from address '/192.168.122.1': Connection refused 2022-09-08 18:09:50,792 INFO org.apache.flink.runtime.net.ConnectionUtils [] - Failed to connect from address '/fe80:0:0:0:5871:f3c7:66d5:861e%em4': Network is unreachable 2022-09-08 18:09:50,792 INFO org.apache.flink.runtime.net.ConnectionUtils [] - Failed to connect from address '/192.168.0.51': Connection refused 2022-09-08 18:09:50,792 INFO org.apache.flink.runtime.net.ConnectionUtils [] - Failed to connect from address '/0:0:0:0:0:0:0:1%lo': Network is unreachable 2022-09-08 18:09:50,793 INFO org.apache.flink.runtime.net.ConnectionUtils [] - Failed to connect from address '/127.0.0.1': Connection refused 2022-09-08 18:09:50,793 INFO org.apache.flink.runtime.net.ConnectionUtils [] - Failed to connect from address '/192.168.122.1': Connection refused 2022-09-08 18:09:50,793 INFO org.apache.flink.runtime.net.ConnectionUtils [] - Failed to connect from address '/fe80:0:0:0:5871:f3c7:66d5:861e%em4': Network is unreachable 2022-09-08 18:09:50,793 INFO org.apache.flink.runtime.net.ConnectionUtils [] - Failed to connect from address '/192.168.0.51': Connection refused 2022-09-08 18:09:50,794 INFO org.apache.flink.runtime.net.ConnectionUtils [] - Failed to connect from address '/0:0:0:0:0:0:0:1%lo': Network is unreachable 2022-09-08 18:09:50,794 INFO org.apache.flink.runtime.net.ConnectionUtils [] - Failed to connect from address '/127.0.0.1': Connection refused 2022-09-08 18:09:51,194 INFO org.apache.flink.runtime.net.ConnectionUtils [] - Trying to connect to address localhost/127.0.0.1:6123 2022-09-08 18:09:51,195 INFO org.apache.flink.runtime.net.ConnectionUtils [] - Failed to connect from address 'hadoop51/192.168.0.51': Connection refused 2022-09-08 18:09:51,196 INFO org.apache.flink.runtime.net.ConnectionUtils [] - Failed to connect from address '/127.0.0.1': Connection refused 2022-09-08 18:09:51,197 INFO org.apache.flink.runtime.net.ConnectionUtils [] - Failed to connect from address '/192.168.122.1': Connection refused 2022-09-08 18:09:51,197 INFO org.apache.flink.runtime.net.ConnectionUtils [] - Failed to connect from address '/fe80:0:0:0:5871:f3c7:66d5:861e%em4': Network is unreachable 2022-09-08 18:09:51,198 INFO org.apache.flink.runtime.net.ConnectionUtils [] - Failed to connect from address '/192.168.0.51': Connection refused 2022-09-08 18:09:51,198 INFO org.apache.flink.runtime.net.ConnectionUtils [] - Failed to connect from address '/0:0:0:0:0:0:0:1%lo': Network is unreachable 2022-09-08 18:09:51,199 INFO org.apache.flink.runtime.net.ConnectionUtils [] - Failed to connect from address '/127.0.0.1': Connection refused 2022-09-08 18:09:51,199 INFO org.apache.flink.runtime.net.ConnectionUtils [] - Failed to connect from address '/192.168.122.1': Connection refused 2022-09-08 18:09:51,199 INFO org.apache.flink.runtime.net.ConnectionUtils [] - Failed to 
connect from address '/fe80:0:0:0:5871:f3c7:66d5:861e%em4': Network is unreachable 2022-09-08 18:09:51,200 INFO org.apache.flink.runtime.net.ConnectionUtils [] - Failed to connect from address '/192.168.0.51': Connection refused 2022-09-08 18:09:51,200 INFO org.apache.flink.runtime.net.ConnectionUtils [] - Failed to connect from address '/0:0:0:0:0:0:0:1%lo': Network is unreachable 2022-09-08 18:09:51,200 INFO org.apache.flink.runtime.net.ConnectionUtils [] - Failed to connect from address '/127.0.0.1': Connection refused 2022-09-08 18:09:52,001 INFO org.apache.flink.runtime.net.ConnectionUtils [] - Trying to connect to address localhost/127.0.0.1:6123 2022-09-08 18:09:52,002 INFO org.apache.flink.runtime.taskexecutor.TaskManagerRunner [] - TaskManager will use hostname/address 'hadoop51' (192.168.0.51) for communication. 2022-09-08 18:09:52,020 INFO org.apache.flink.runtime.rpc.akka.AkkaRpcServiceUtils [] - Trying to start actor system, external address 192.168.0.51:0, bind address 0.0.0.0:0. 2022-09-08 18:09:52,913 INFO akka.event.slf4j.Slf4jLogger [] - Slf4jLogger started 2022-09-08 18:09:52,949 INFO akka.remote.Remoting [] - Starting remoting 2022-09-08 18:09:53,114 INFO akka.remote.Remoting [] - Remoting started; listening on addresses :[akka.tcp://flink@192.168.0.51:42188] 2022-09-08 18:09:53,302 INFO org.apache.flink.runtime.rpc.akka.AkkaRpcServiceUtils [] - Actor system started at akka.tcp://flink@192.168.0.51:42188 2022-09-08 18:09:53,329 INFO org.apache.flink.runtime.metrics.MetricRegistryImpl [] - No metrics reporter configured, no metrics will be exposed/reported. 2022-09-08 18:09:53,334 INFO org.apache.flink.runtime.rpc.akka.AkkaRpcServiceUtils [] - Trying to start actor system, external address 192.168.0.51:0, bind address 0.0.0.0:0. 2022-09-08 18:09:53,351 INFO akka.event.slf4j.Slf4jLogger [] - Slf4jLogger started 2022-09-08 18:09:53,355 INFO akka.remote.Remoting [] - Starting remoting 2022-09-08 18:09:53,365 INFO akka.remote.Remoting [] - Remoting started; listening on addresses :[akka.tcp://flink-metrics@192.168.0.51:43631] 2022-09-08 18:09:53,379 INFO org.apache.flink.runtime.rpc.akka.AkkaRpcServiceUtils [] - Actor system started at akka.tcp://flink-metrics@192.168.0.51:43631 2022-09-08 18:09:53,396 INFO org.apache.flink.runtime.rpc.akka.AkkaRpcService [] - Starting RPC endpoint for org.apache.flink.runtime.metrics.dump.MetricQueryService at akka://flink-metrics/user/rpc/MetricQueryService_192.168.0.51:42188-6425e2 . 
2022-09-08 18:09:53,410 INFO org.apache.flink.runtime.blob.PermanentBlobCache [] - Created BLOB cache storage directory /tmp/blobStore-f4ccfde8-3434-453f-8a0d-7b8e6199adf5 2022-09-08 18:09:53,413 INFO org.apache.flink.runtime.blob.TransientBlobCache [] - Created BLOB cache storage directory /tmp/blobStore-912d96c9-be81-4fde-8a2c-fc7a06232e26 2022-09-08 18:09:53,416 INFO org.apache.flink.runtime.externalresource.ExternalResourceUtils [] - Enabled external resources: [] 2022-09-08 18:09:53,416 INFO org.apache.flink.runtime.externalresource.ExternalResourceUtils [] - Enabled external resources: [] 2022-09-08 18:09:53,416 INFO org.apache.flink.runtime.taskexecutor.TaskManagerRunner [] - Starting TaskManager with ResourceID: 192.168.0.51:42188-6425e2 2022-09-08 18:09:53,440 INFO org.apache.flink.runtime.taskexecutor.TaskManagerServices [] - Temporary file directory '/tmp': total 49 GB, usable 12 GB (24.49% usable) 2022-09-08 18:09:53,444 INFO org.apache.flink.runtime.io.disk.FileChannelManagerImpl [] - FileChannelManager uses directory /tmp/flink-io-79087611-019f-4b4c-a408-39ed66dc113f for spill files. 2022-09-08 18:09:53,454 INFO org.apache.flink.runtime.io.network.netty.NettyConfig [] - NettyConfig [server address: /0.0.0.0, server port: 0, ssl enabled: false, memory segment size (bytes): 32768, transport type: AUTO, number of server threads: 4 (manual), number of client threads: 4 (manual), server connect backlog: 0 (use Netty's default), client connect timeout (sec): 120, send/receive buffer size (bytes): 0 (use Netty's default)] 2022-09-08 18:09:53,458 INFO org.apache.flink.runtime.io.disk.FileChannelManagerImpl [] - FileChannelManager uses directory /tmp/flink-netty-shuffle-f6a957bd-7dc8-4db4-9240-fe1a5df92525 for spill files. 2022-09-08 18:09:53,659 INFO org.apache.flink.runtime.io.network.buffer.NetworkBufferPool [] - Allocated 128 MB for network buffer pool (number of memory segments: 4096, bytes per segment: 32768). 2022-09-08 18:09:53,674 INFO org.apache.flink.runtime.io.network.NettyShuffleEnvironment [] - Starting the network environment and its components. 2022-09-08 18:09:53,766 INFO org.apache.flink.runtime.io.network.netty.NettyClient [] - Transport type 'auto': using EPOLL. 2022-09-08 18:09:53,768 INFO org.apache.flink.runtime.io.network.netty.NettyClient [] - Successful initialization (took 94 ms). 2022-09-08 18:09:53,775 INFO org.apache.flink.runtime.io.network.netty.NettyServer [] - Transport type 'auto': using EPOLL. 2022-09-08 18:09:53,819 INFO org.apache.flink.runtime.io.network.netty.NettyServer [] - Successful initialization (took 49 ms). Listening on SocketAddress /0:0:0:0:0:0:0:0%0:38257. 2022-09-08 18:09:53,821 INFO org.apache.flink.runtime.taskexecutor.KvStateService [] - Starting the kvState service and its components. 2022-09-08 18:09:53,847 INFO org.apache.flink.runtime.rpc.akka.AkkaRpcService [] - Starting RPC endpoint for org.apache.flink.runtime.taskexecutor.TaskExecutor at akka://flink/user/rpc/taskmanager0 . 2022-09-08 18:09:53,921 INFO org.apache.flink.runtime.taskexecutor.DefaultJobLeaderService [] - Start job leader service. 2022-09-08 18:09:53,923 INFO org.apache.flink.runtime.filecache.FileCache [] - User file cache uses directory /tmp/flink-dist-cache-b501d934-9dbf-4cca-b4d1-a0d7ae35e54c 2022-09-08 18:09:53,925 INFO org.apache.flink.runtime.taskexecutor.TaskExecutor [] - Connecting to ResourceManager akka.tcp://flink@localhost:6123/user/rpc/resourcemanager(00000000000000000000000000000000). 
2022-09-08 18:09:54,200 INFO org.apache.flink.runtime.taskexecutor.TaskExecutor [] - Resolved ResourceManager address, beginning registration 2022-09-08 18:09:54,307 INFO org.apache.flink.runtime.taskexecutor.TaskExecutor [] - Successful registration at resource manager akka.tcp://flink@localhost:6123/user/rpc/resourcemanager_ under registration id c06fb216045dd202b10ccb78b11e22ba. 2022-09-08 18:10:16,807 INFO org.apache.flink.runtime.taskexecutor.TaskExecutor [] - Receive slot request d7d358ae1ec9a0ef253dff422887843a for job 47dcb1c312ebbbd369e3e2c3a4d92c05 from resource manager with leader id 00000000000000000000000000000000. 2022-09-08 18:10:16,814 INFO org.apache.flink.runtime.taskexecutor.TaskExecutor [] - Allocated slot for d7d358ae1ec9a0ef253dff422887843a. 2022-09-08 18:10:16,815 INFO org.apache.flink.runtime.taskexecutor.DefaultJobLeaderService [] - Add job 47dcb1c312ebbbd369e3e2c3a4d92c05 for job leader monitoring. 2022-09-08 18:10:16,817 INFO org.apache.flink.runtime.taskexecutor.DefaultJobLeaderService [] - Try to register at job manager akka.tcp://flink@localhost:6123/user/rpc/jobmanager_2 with leader id 00000000-0000-0000-0000-000000000000. 2022-09-08 18:10:16,833 INFO org.apache.flink.runtime.taskexecutor.DefaultJobLeaderService [] - Resolved JobManager address, beginning registration 2022-09-08 18:10:16,853 INFO org.apache.flink.runtime.taskexecutor.DefaultJobLeaderService [] - Successful registration at job manager akka.tcp://flink@localhost:6123/user/rpc/jobmanager_2 for job 47dcb1c312ebbbd369e3e2c3a4d92c05. 2022-09-08 18:10:16,854 INFO org.apache.flink.runtime.taskexecutor.TaskExecutor [] - Establish JobManager connection for job 47dcb1c312ebbbd369e3e2c3a4d92c05. 2022-09-08 18:10:16,858 INFO org.apache.flink.runtime.taskexecutor.TaskExecutor [] - Offer reserved slots to the leader of job 47dcb1c312ebbbd369e3e2c3a4d92c05. 2022-09-08 18:10:16,881 INFO org.apache.flink.runtime.taskexecutor.slot.TaskSlotTableImpl [] - Activate slot d7d358ae1ec9a0ef253dff422887843a. 2022-09-08 18:10:16,887 INFO org.apache.flink.runtime.taskexecutor.slot.TaskSlotTableImpl [] - Activate slot d7d358ae1ec9a0ef253dff422887843a. 
2022-09-08 18:10:16,922 INFO org.apache.flink.runtime.taskexecutor.TaskExecutor [] - Received task Source: TableSourceScan(table=[[default_catalog, default_database, source]], fields=[id, parent_id, project_no, sales_order, plant_code, product_line, sales_order_item, bay_id, order_no, order_status, order_level, sap_order_status, material_no, material_descr, final_material, init_production_center, final_production_center, module_code, material_no_fellow, order_no_fellow, item_component_list, reservation, reservation_item, material_picking1_times, material_picking2_times, first_date, basic_start_time, basic_finish_time, product_start_time, mes_plan_start_time, mes_plan_end_time, mes_task_exec_time, mes_plan_u_time, is_deleted, is_modify, create_user, update_user, create_time, update_time]) -> Calc(select=[CAST(id) AS id, CAST(parent_id) AS parent_id, CAST(project_no) AS project_no, CAST(sales_order) AS sales_order, plant_code, product_line, CAST(sales_order_item) AS sales_order_item, bay_id, CAST(order_no) AS order_no, order_status, CAST(order_level) AS order_level, sap_order_status, CAST(material_no) AS material_no, material_descr, final_material, init_production_center, final_production_center, module_code, material_no_fellow, order_no_fellow, item_component_list, reservation, reservation_item, material_picking1_times, material_picking2_times, first_date, basic_start_time, basic_finish_time, product_start_time, mes_plan_start_time, mes_plan_end_time, mes_task_exec_time, mes_plan_u_time, is_deleted, is_modify, create_user, update_user, create_time, CAST(update_time) AS update_time]) -> Sink: Sink(table=[default_catalog.default_database.sink], fields=[id, parent_id, project_no, sales_order, plant_code, product_line, sales_order_item, bay_id, order_no, order_status, order_level, sap_order_status, material_no, material_descr, final_material, init_production_center, final_production_center, module_code, material_no_fellow, order_no_fellow, item_component_list, reservation, reservation_item, material_picking1_times, material_picking2_times, first_date, basic_start_time, basic_finish_time, product_start_time, mes_plan_start_time, mes_plan_end_time, mes_task_exec_time, mes_plan_u_time, is_deleted, is_modify, create_user, update_user, create_time, update_time]) (1/1)#0 (be99fd45957068b61784e457308cf99e), deploy into slot with allocation id d7d358ae1ec9a0ef253dff422887843a. 
2022-09-08 18:10:16,923 INFO org.apache.flink.runtime.taskmanager.Task [] - Source: TableSourceScan(table=[[default_catalog, default_database, source]], fields=[id, parent_id, project_no, sales_order, plant_code, product_line, sales_order_item, bay_id, order_no, order_status, order_level, sap_order_status, material_no, material_descr, final_material, init_production_center, final_production_center, module_code, material_no_fellow, order_no_fellow, item_component_list, reservation, reservation_item, material_picking1_times, material_picking2_times, first_date, basic_start_time, basic_finish_time, product_start_time, mes_plan_start_time, mes_plan_end_time, mes_task_exec_time, mes_plan_u_time, is_deleted, is_modify, create_user, update_user, create_time, update_time]) -> Calc(select=[CAST(id) AS id, CAST(parent_id) AS parent_id, CAST(project_no) AS project_no, CAST(sales_order) AS sales_order, plant_code, product_line, CAST(sales_order_item) AS sales_order_item, bay_id, CAST(order_no) AS order_no, order_status, CAST(order_level) AS order_level, sap_order_status, CAST(material_no) AS material_no, material_descr, final_material, init_production_center, final_production_center, module_code, material_no_fellow, order_no_fellow, item_component_list, reservation, reservation_item, material_picking1_times, material_picking2_times, first_date, basic_start_time, basic_finish_time, product_start_time, mes_plan_start_time, mes_plan_end_time, mes_task_exec_time, mes_plan_u_time, is_deleted, is_modify, create_user, update_user, create_time, CAST(update_time) AS update_time]) -> Sink: Sink(table=[default_catalog.default_database.sink], fields=[id, parent_id, project_no, sales_order, plant_code, product_line, sales_order_item, bay_id, order_no, order_status, order_level, sap_order_status, material_no, material_descr, final_material, init_production_center, final_production_center, module_code, material_no_fellow, order_no_fellow, item_component_list, reservation, reservation_item, material_picking1_times, material_picking2_times, first_date, basic_start_time, basic_finish_time, product_start_time, mes_plan_start_time, mes_plan_end_time, mes_task_exec_time, mes_plan_u_time, is_deleted, is_modify, create_user, update_user, create_time, update_time]) (1/1)#0 (be99fd45957068b61784e457308cf99e) switched from CREATED to DEPLOYING. 
2022-09-08 18:10:16,926 INFO org.apache.flink.runtime.taskmanager.Task [] - Loading JAR files for task Source: TableSourceScan(table=[[default_catalog, default_database, source]], fields=[id, parent_id, project_no, sales_order, plant_code, product_line, sales_order_item, bay_id, order_no, order_status, order_level, sap_order_status, material_no, material_descr, final_material, init_production_center, final_production_center, module_code, material_no_fellow, order_no_fellow, item_component_list, reservation, reservation_item, material_picking1_times, material_picking2_times, first_date, basic_start_time, basic_finish_time, product_start_time, mes_plan_start_time, mes_plan_end_time, mes_task_exec_time, mes_plan_u_time, is_deleted, is_modify, create_user, update_user, create_time, update_time]) -> Calc(select=[CAST(id) AS id, CAST(parent_id) AS parent_id, CAST(project_no) AS project_no, CAST(sales_order) AS sales_order, plant_code, product_line, CAST(sales_order_item) AS sales_order_item, bay_id, CAST(order_no) AS order_no, order_status, CAST(order_level) AS order_level, sap_order_status, CAST(material_no) AS material_no, material_descr, final_material, init_production_center, final_production_center, module_code, material_no_fellow, order_no_fellow, item_component_list, reservation, reservation_item, material_picking1_times, material_picking2_times, first_date, basic_start_time, basic_finish_time, product_start_time, mes_plan_start_time, mes_plan_end_time, mes_task_exec_time, mes_plan_u_time, is_deleted, is_modify, create_user, update_user, create_time, CAST(update_time) AS update_time]) -> Sink: Sink(table=[default_catalog.default_database.sink], fields=[id, parent_id, project_no, sales_order, plant_code, product_line, sales_order_item, bay_id, order_no, order_status, order_level, sap_order_status, material_no, material_descr, final_material, init_production_center, final_production_center, module_code, material_no_fellow, order_no_fellow, item_component_list, reservation, reservation_item, material_picking1_times, material_picking2_times, first_date, basic_start_time, basic_finish_time, product_start_time, mes_plan_start_time, mes_plan_end_time, mes_task_exec_time, mes_plan_u_time, is_deleted, is_modify, create_user, update_user, create_time, update_time]) (1/1)#0 (be99fd45957068b61784e457308cf99e) [DEPLOYING]. 
2022-09-08 18:10:16,931 INFO org.apache.flink.runtime.blob.BlobClient [] - Downloading 47dcb1c312ebbbd369e3e2c3a4d92c05/p-1e5f7cb8f16ee2b15f49200098e655906da5b1d4-c118eff94db89f8129729eb499dd7625 from localhost/127.0.0.1:34063 2022-09-08 18:10:17,047 INFO org.apache.flink.runtime.taskmanager.Task [] - Registering task at network: Source: TableSourceScan(table=[[default_catalog, default_database, source]], fields=[id, parent_id, project_no, sales_order, plant_code, product_line, sales_order_item, bay_id, order_no, order_status, order_level, sap_order_status, material_no, material_descr, final_material, init_production_center, final_production_center, module_code, material_no_fellow, order_no_fellow, item_component_list, reservation, reservation_item, material_picking1_times, material_picking2_times, first_date, basic_start_time, basic_finish_time, product_start_time, mes_plan_start_time, mes_plan_end_time, mes_task_exec_time, mes_plan_u_time, is_deleted, is_modify, create_user, update_user, create_time, update_time]) -> Calc(select=[CAST(id) AS id, CAST(parent_id) AS parent_id, CAST(project_no) AS project_no, CAST(sales_order) AS sales_order, plant_code, product_line, CAST(sales_order_item) AS sales_order_item, bay_id, CAST(order_no) AS order_no, order_status, CAST(order_level) AS order_level, sap_order_status, CAST(material_no) AS material_no, material_descr, final_material, init_production_center, final_production_center, module_code, material_no_fellow, order_no_fellow, item_component_list, reservation, reservation_item, material_picking1_times, material_picking2_times, first_date, basic_start_time, basic_finish_time, product_start_time, mes_plan_start_time, mes_plan_end_time, mes_task_exec_time, mes_plan_u_time, is_deleted, is_modify, create_user, update_user, create_time, CAST(update_time) AS update_time]) -> Sink: Sink(table=[default_catalog.default_database.sink], fields=[id, parent_id, project_no, sales_order, plant_code, product_line, sales_order_item, bay_id, order_no, order_status, order_level, sap_order_status, material_no, material_descr, final_material, init_production_center, final_production_center, module_code, material_no_fellow, order_no_fellow, item_component_list, reservation, reservation_item, material_picking1_times, material_picking2_times, first_date, basic_start_time, basic_finish_time, product_start_time, mes_plan_start_time, mes_plan_end_time, mes_task_exec_time, mes_plan_u_time, is_deleted, is_modify, create_user, update_user, create_time, update_time]) (1/1)#0 (be99fd45957068b61784e457308cf99e) [DEPLOYING]. 2022-09-08 18:10:17,049 INFO org.apache.flink.runtime.taskmanager.Task [] - Obtaining local cache file for 'class_path_2'. 2022-09-08 18:10:17,054 INFO org.apache.flink.runtime.taskmanager.Task [] - Obtaining local cache file for 'class_path_1'. 2022-09-08 18:10:17,055 INFO org.apache.flink.runtime.taskmanager.Task [] - Obtaining local cache file for 'class_path_0'. 
2022-09-08 18:10:17,055 INFO org.apache.flink.runtime.blob.BlobClient [] - Downloading 47dcb1c312ebbbd369e3e2c3a4d92c05/p-e7c57bc8a3d7173eab68d17eb9a58f9cd303e6e7-5402d0a87a8774605d00a510ee95b014 from localhost/127.0.0.1:34063 2022-09-08 18:10:17,056 INFO org.apache.flink.runtime.blob.BlobClient [] - Downloading 47dcb1c312ebbbd369e3e2c3a4d92c05/p-cd2f8171f8678a059d71abd099b5a343013c93b3-313b1496653fdbfdbafec9ec83581408 from localhost/127.0.0.1:34063 2022-09-08 18:10:17,056 INFO org.apache.flink.runtime.blob.BlobClient [] - Downloading 47dcb1c312ebbbd369e3e2c3a4d92c05/p-aa9a45374ffd87171858a620e49c29157ab3a313-bc5e900e71df04b2d9a299442e4dc31a from localhost/127.0.0.1:34063 2022-09-08 18:10:17,112 INFO org.apache.flink.streaming.runtime.tasks.StreamTask [] - No state backend has been configured, using default (Memory / JobManager) MemoryStateBackend (data in heap memory / checkpoints to JobManager) (checkpoints: 'null', savepoints: 'null', asynchronous: TRUE, maxStateSize: 5242880) 2022-09-08 18:10:17,123 INFO org.apache.flink.runtime.taskmanager.Task [] - Source: TableSourceScan(table=[[default_catalog, default_database, source]], fields=[id, parent_id, project_no, sales_order, plant_code, product_line, sales_order_item, bay_id, order_no, order_status, order_level, sap_order_status, material_no, material_descr, final_material, init_production_center, final_production_center, module_code, material_no_fellow, order_no_fellow, item_component_list, reservation, reservation_item, material_picking1_times, material_picking2_times, first_date, basic_start_time, basic_finish_time, product_start_time, mes_plan_start_time, mes_plan_end_time, mes_task_exec_time, mes_plan_u_time, is_deleted, is_modify, create_user, update_user, create_time, update_time]) -> Calc(select=[CAST(id) AS id, CAST(parent_id) AS parent_id, CAST(project_no) AS project_no, CAST(sales_order) AS sales_order, plant_code, product_line, CAST(sales_order_item) AS sales_order_item, bay_id, CAST(order_no) AS order_no, order_status, CAST(order_level) AS order_level, sap_order_status, CAST(material_no) AS material_no, material_descr, final_material, init_production_center, final_production_center, module_code, material_no_fellow, order_no_fellow, item_component_list, reservation, reservation_item, material_picking1_times, material_picking2_times, first_date, basic_start_time, basic_finish_time, product_start_time, mes_plan_start_time, mes_plan_end_time, mes_task_exec_time, mes_plan_u_time, is_deleted, is_modify, create_user, update_user, create_time, CAST(update_time) AS update_time]) -> Sink: Sink(table=[default_catalog.default_database.sink], fields=[id, parent_id, project_no, sales_order, plant_code, product_line, sales_order_item, bay_id, order_no, order_status, order_level, sap_order_status, material_no, material_descr, final_material, init_production_center, final_production_center, module_code, material_no_fellow, order_no_fellow, item_component_list, reservation, reservation_item, material_picking1_times, material_picking2_times, first_date, basic_start_time, basic_finish_time, product_start_time, mes_plan_start_time, mes_plan_end_time, mes_task_exec_time, mes_plan_u_time, is_deleted, is_modify, create_user, update_user, create_time, update_time]) (1/1)#0 (be99fd45957068b61784e457308cf99e) switched from DEPLOYING to RUNNING. 
2022-09-08 18:10:17,388 WARN org.apache.flink.metrics.MetricGroup [] - The operator name Sink: Sink(table=[default_catalog.default_database.sink], fields=[id, parent_id, project_no, sales_order, plant_code, product_line, sales_order_item, bay_id, order_no, order_status, order_level, sap_order_status, material_no, material_descr, final_material, init_production_center, final_production_center, module_code, material_no_fellow, order_no_fellow, item_component_list, reservation, reservation_item, material_picking1_times, material_picking2_times, first_date, basic_start_time, basic_finish_time, product_start_time, mes_plan_start_time, mes_plan_end_time, mes_task_exec_time, mes_plan_u_time, is_deleted, is_modify, create_user, update_user, create_time, update_time]) exceeded the 80 characters length limit and was truncated. 2022-09-08 18:10:17,768 WARN org.apache.flink.metrics.MetricGroup [] - The operator name Calc(select=[CAST(id) AS id, CAST(parent_id) AS parent_id, CAST(project_no) AS project_no, CAST(sales_order) AS sales_order, plant_code, product_line, CAST(sales_order_item) AS sales_order_item, bay_id, CAST(order_no) AS order_no, order_status, CAST(order_level) AS order_level, sap_order_status, CAST(material_no) AS material_no, material_descr, final_material, init_production_center, final_production_center, module_code, material_no_fellow, order_no_fellow, item_component_list, reservation, reservation_item, material_picking1_times, material_picking2_times, first_date, basic_start_time, basic_finish_time, product_start_time, mes_plan_start_time, mes_plan_end_time, mes_task_exec_time, mes_plan_u_time, is_deleted, is_modify, create_user, update_user, create_time, CAST(update_time) AS update_time]) exceeded the 80 characters length limit and was truncated. 2022-09-08 18:10:17,771 WARN org.apache.flink.metrics.MetricGroup [] - The operator name Source: TableSourceScan(table=[[default_catalog, default_database, source]], fields=[id, parent_id, project_no, sales_order, plant_code, product_line, sales_order_item, bay_id, order_no, order_status, order_level, sap_order_status, material_no, material_descr, final_material, init_production_center, final_production_center, module_code, material_no_fellow, order_no_fellow, item_component_list, reservation, reservation_item, material_picking1_times, material_picking2_times, first_date, basic_start_time, basic_finish_time, product_start_time, mes_plan_start_time, mes_plan_end_time, mes_task_exec_time, mes_plan_u_time, is_deleted, is_modify, create_user, update_user, create_time, update_time]) exceeded the 80 characters length limit and was truncated. 
2022-09-08 18:10:17,856 INFO com.dtstack.chunjun.sink.DtOutputFormatSinkFunction [] - Start initialize output format state 2022-09-08 18:10:17,913 INFO com.dtstack.chunjun.sink.DtOutputFormatSinkFunction [] - Is restored:false 2022-09-08 18:10:17,913 INFO com.dtstack.chunjun.sink.DtOutputFormatSinkFunction [] - End initialize output format state 2022-09-08 18:10:18,252 INFO com.dtstack.chunjun.connector.jdbc.sink.JdbcOutputFormat [] - initTimingSubmitTask() ,initialDelay:1000, delay:1000, MILLISECONDS 2022-09-08 18:10:18,457 INFO com.dtstack.chunjun.connector.jdbc.sink.JdbcOutputFormat [] - write sql:INSERT INTO "siemens2_ETO_PDM_Header"("id", "parent_id", "project_no", "sales_order", "plant_code", "product_line", "sales_order_item", "bay_id", "order_no", "order_status", "order_level", "sap_order_status", "material_no", "material_descr", "final_material", "init_production_center", "final_production_center", "module_code", "material_no_fellow", "order_no_fellow", "item_component_list", "reservation", "reservation_item", "material_picking1_times", "material_picking2_times", "first_date", "basic_start_time", "basic_finish_time", "product_start_time", "mes_plan_start_time", "mes_plan_end_time", "mes_task_exec_time", "mes_plan_u_time", "is_deleted", "is_modify", "create_user", "update_user", "create_time", "update_time") VALUES (:id, :parent_id, :project_no, :sales_order, :plant_code, :product_line, :sales_order_item, :bay_id, :order_no, :order_status, :order_level, :sap_order_status, :material_no, :material_descr, :final_material, :init_production_center, :final_production_center, :module_code, :material_no_fellow, :order_no_fellow, :item_component_list, :reservation, :reservation_item, :material_picking1_times, :material_picking2_times, :first_date, :basic_start_time, :basic_finish_time, :product_start_time, :mes_plan_start_time, :mes_plan_end_time, :mes_task_exec_time, :mes_plan_u_time, :is_deleted, :is_modify, :create_user, :update_user, :create_time, :update_time) ON CONFLICT ("id") DO UPDATE SET "parent_id"=EXCLUDED."parent_id", "project_no"=EXCLUDED."project_no", "sales_order"=EXCLUDED."sales_order", "plant_code"=EXCLUDED."plant_code", "product_line"=EXCLUDED."product_line", "sales_order_item"=EXCLUDED."sales_order_item", "bay_id"=EXCLUDED."bay_id", "order_no"=EXCLUDED."order_no", "order_status"=EXCLUDED."order_status", "order_level"=EXCLUDED."order_level", "sap_order_status"=EXCLUDED."sap_order_status", "material_no"=EXCLUDED."material_no", "material_descr"=EXCLUDED."material_descr", "final_material"=EXCLUDED."final_material", "init_production_center"=EXCLUDED."init_production_center", "final_production_center"=EXCLUDED."final_production_center", "module_code"=EXCLUDED."module_code", "material_no_fellow"=EXCLUDED."material_no_fellow", "order_no_fellow"=EXCLUDED."order_no_fellow", "item_component_list"=EXCLUDED."item_component_list", "reservation"=EXCLUDED."reservation", "reservation_item"=EXCLUDED."reservation_item", "material_picking1_times"=EXCLUDED."material_picking1_times", "material_picking2_times"=EXCLUDED."material_picking2_times", "first_date"=EXCLUDED."first_date", "basic_start_time"=EXCLUDED."basic_start_time", "basic_finish_time"=EXCLUDED."basic_finish_time", "product_start_time"=EXCLUDED."product_start_time", "mes_plan_start_time"=EXCLUDED."mes_plan_start_time", "mes_plan_end_time"=EXCLUDED."mes_plan_end_time", "mes_task_exec_time"=EXCLUDED."mes_task_exec_time", "mes_plan_u_time"=EXCLUDED."mes_plan_u_time", "is_deleted"=EXCLUDED."is_deleted", 
"is_modify"=EXCLUDED."is_modify", "create_user"=EXCLUDED."create_user", "update_user"=EXCLUDED."update_user", "create_time"=EXCLUDED."create_time", "update_time"=EXCLUDED."update_time" 2022-09-08 18:10:18,542 INFO com.dtstack.chunjun.connector.jdbc.sink.JdbcOutputFormat [] - subTask[0}] wait finished 2022-09-08 18:10:18,669 INFO com.dtstack.chunjun.connector.jdbc.sink.JdbcOutputFormat [] - [JdbcOutputFormat] open successfully, checkpointMode = AT_LEAST_ONCE, checkpointEnabled = false, flushIntervalMills = 1000, batchSize = 1024, [JdbcConf]: { "semantic" : "at-least-once", "errorRecord" : 0, "checkFormat" : true, "parallelism" : 1, "executeDdlAble" : false, "pollingInterval" : 5000, "increment" : false, "flushIntervalMills" : 1000, "polling" : false, "mode" : "UPDATE", "password" : "**", "restoreColumnIndex" : -1, "connection" : [ { "table" : [ "siemens2_ETO_PDM_Header" ], "jdbcUrl" : "jdbc:postgresql://192.168.0.116:52345/postgres", "allReplace" : true } ], "table" : "siemens2_ETO_PDM_Header", "queryTimeOut" : 0, "fetchSize" : 0, "useMaxFunc" : false, "uniqueKey" : [ "id" ], "column" : [ { "name" : "id", "type" : "DECIMAL(10, 0) NOT NULL", "index" : 0, "notNull" : false, "part" : false }, { "name" : "parent_id", "type" : "DECIMAL(10, 0) NOT NULL", "index" : 1, "notNull" : false, "part" : false }, { "name" : "project_no", "type" : "VARCHAR(50) NOT NULL", "index" : 2, "notNull" : false, "part" : false }, { "name" : "sales_order", "type" : "VARCHAR(50) NOT NULL", "index" : 3, "notNull" : false, "part" : false }, { "name" : "plant_code", "type" : "VARCHAR(50)", "index" : 4, "notNull" : false, "part" : false }, { "name" : "product_line", "type" : "VARCHAR(50)", "index" : 5, "notNull" : false, "part" : false }, { "name" : "sales_order_item", "type" : "VARCHAR(50) NOT NULL", "index" : 6, "notNull" : false, "part" : false }, { "name" : "bay_id", "type" : "VARCHAR(50)", "index" : 7, "notNull" : false, "part" : false }, { "name" : "order_no", "type" : "VARCHAR(50) NOT NULL", "index" : 8, "notNull" : false, "part" : false }, { "name" : "order_status", "type" : "INT", "index" : 9, "notNull" : false, "part" : false }, { "name" : "order_level", "type" : "INT NOT NULL", "index" : 10, "notNull" : false, "part" : false }, { "name" : "sap_order_status", "type" : "VARCHAR(50)", "index" : 11, "notNull" : false, "part" : false }, { "name" : "material_no", "type" : "VARCHAR(50) NOT NULL", "index" : 12, "notNull" : false, "part" : false }, { "name" : "material_descr", "type" : "VARCHAR(255)", "index" : 13, "notNull" : false, "part" : false }, { "name" : "final_material", "type" : "VARCHAR(50)", "index" : 14, "notNull" : false, "part" : false }, { "name" : "init_production_center", "type" : "VARCHAR(50)", "index" : 15, "notNull" : false, "part" : false }, { "name" : "final_production_center", "type" : "VARCHAR(50)", "index" : 16, "notNull" : false, "part" : false }, { "name" : "module_code", "type" : "VARCHAR(50)", "index" : 17, "notNull" : false, "part" : false }, { "name" : "material_no_fellow", "type" : "VARCHAR(50)", "index" : 18, "notNull" : false, "part" : false }, { "name" : "order_no_fellow", "type" : "VARCHAR(50)", "index" : 19, "notNull" : false, "part" : false }, { "name" : "item_component_list", "type" : "VARCHAR(50)", "index" : 20, "notNull" : false, "part" : false }, { "name" : "reservation", "type" : "VARCHAR(50)", "index" : 21, "notNull" : false, "part" : false }, { "name" : "reservation_item", "type" : "VARCHAR(50)", "index" : 22, "notNull" : false, "part" : false }, { "name" : 
"material_picking1_times", "type" : "INT", "index" : 23, "notNull" : false, "part" : false }, { "name" : "material_picking2_times", "type" : "INT", "index" : 24, "notNull" : false, "part" : false }, { "name" : "first_date", "type" : "TIMESTAMP(6)", "index" : 25, "notNull" : false, "part" : false }, { "name" : "basic_start_time", "type" : "TIMESTAMP(6)", "index" : 26, "notNull" : false, "part" : false }, { "name" : "basic_finish_time", "type" : "TIMESTAMP(6)", "index" : 27, "notNull" : false, "part" : false }, { "name" : "product_start_time", "type" : "TIMESTAMP(6)", "index" : 28, "notNull" : false, "part" : false }, { "name" : "mes_plan_start_time", "type" : "TIMESTAMP(6)", "index" : 29, "notNull" : false, "part" : false }, { "name" : "mes_plan_end_time", "type" : "TIMESTAMP(6)", "index" : 30, "notNull" : false, "part" : false }, { "name" : "mes_task_exec_time", "type" : "TIMESTAMP(6)", "index" : 31, "notNull" : false, "part" : false }, { "name" : "mes_plan_u_time", "type" : "TIMESTAMP(6)", "index" : 32, "notNull" : false, "part" : false }, { "name" : "is_deleted", "type" : "INT", "index" : 33, "notNull" : false, "part" : false }, { "name" : "is_modify", "type" : "INT", "index" : 34, "notNull" : false, "part" : false }, { "name" : "create_user", "type" : "VARCHAR(50)", "index" : 35, "notNull" : false, "part" : false }, { "name" : "update_user", "type" : "VARCHAR(50)", "index" : 36, "notNull" : false, "part" : false }, { "name" : "create_time", "type" : "TIMESTAMP(6)", "index" : 37, "notNull" : false, "part" : false }, { "name" : "update_time", "type" : "TIMESTAMP(6)", "index" : 38, "notNull" : false, "part" : false } ], "errorPercentage" : -1, "fieldNameList" : [ ], "withNoLock" : false, "increColumnIndex" : -1, "allReplace" : true, "initReporter" : true, "jdbcUrl" : "jdbc:postgresql://192.168.0.116:52345/postgres", "connectTimeOut" : 0, "batchSize" : 1024, "speedBytes" : 0, "rowSizeCalculatorType" : "objectSizeCalculator", "metricPluginName" : "prometheus", "username" : "postgres" } 2022-09-08 18:10:18,671 INFO com.dtstack.chunjun.source.DtInputFormatSourceFunction [] - Start initialize input format state, is restored:false 2022-09-08 18:10:18,673 INFO com.dtstack.chunjun.source.DtInputFormatSourceFunction [] - End initialize input format state 2022-09-08 18:10:18,693 INFO com.dtstack.chunjun.connector.sqlservercdc.inputFormat.SqlServerCdcInputFormat [] - sqlServer cdc openInternal split number:0 start... 2022-09-08 18:10:18,959 INFO com.dtstack.chunjun.connector.sqlservercdc.inputFormat.SqlServerCdcInputFormat [] - SqlserverCdcInputFormat[sqlserverCDCToPostgresql]open: end 2022-09-08 18:10:18,960 INFO com.dtstack.chunjun.connector.sqlservercdc.listener.SqlServerCdcListener [] - SqlServerCdcListener start running..... 
2022-09-08 18:10:18,974 INFO com.dtstack.chunjun.connector.sqlservercdc.inputFormat.SqlServerCdcInputFormat [] - [SqlServerCdcInputFormat] open successfully, inputSplit = GenericSplit (0/1), [SqlServerCdcConf]: { "semantic" : "at-least-once", "databaseName" : "CDCTest", "errorRecord" : 0, "checkFormat" : true, "parallelism" : 1, "executeDdlAble" : false, "errorPercentage" : -1, "flushIntervalMills" : 10000, "fieldNameList" : [ ], "url" : "jdbc:sqlserver://192.168.8.212:1433;databaseName=CDCTest", "pavingData" : true, "password" : "**", "pollInterval" : 1000, "splitUpdate" : false, "cat" : "insert,delete,update", "tableList" : [ "dbo.ETO_PDM_Header" ], "timestampFormat" : "sql", "autoResetConnection" : false, "batchSize" : 1, "autoCommit" : false, "speedBytes" : 0, "rowSizeCalculatorType" : "objectSizeCalculator", "metricPluginName" : "prometheus", "username" : "sa" } 2022-09-08 19:26:11,736 INFO org.apache.flink.runtime.taskmanager.Task [] - Attempting to cancel task Source: TableSourceScan(table=[[default_catalog, default_database, source]], fields=[id, parent_id, project_no, sales_order, plant_code, product_line, sales_order_item, bay_id, order_no, order_status, order_level, sap_order_status, material_no, material_descr, final_material, init_production_center, final_production_center, module_code, material_no_fellow, order_no_fellow, item_component_list, reservation, reservation_item, material_picking1_times, material_picking2_times, first_date, basic_start_time, basic_finish_time, product_start_time, mes_plan_start_time, mes_plan_end_time, mes_task_exec_time, mes_plan_u_time, is_deleted, is_modify, create_user, update_user, create_time, update_time]) -> Calc(select=[CAST(id) AS id, CAST(parent_id) AS parent_id, CAST(project_no) AS project_no, CAST(sales_order) AS sales_order, plant_code, product_line, CAST(sales_order_item) AS sales_order_item, bay_id, CAST(order_no) AS order_no, order_status, CAST(order_level) AS order_level, sap_order_status, CAST(material_no) AS material_no, material_descr, final_material, init_production_center, final_production_center, module_code, material_no_fellow, order_no_fellow, item_component_list, reservation, reservation_item, material_picking1_times, material_picking2_times, first_date, basic_start_time, basic_finish_time, product_start_time, mes_plan_start_time, mes_plan_end_time, mes_task_exec_time, mes_plan_u_time, is_deleted, is_modify, create_user, update_user, create_time, CAST(update_time) AS update_time]) -> Sink: Sink(table=[default_catalog.default_database.sink], fields=[id, parent_id, project_no, sales_order, plant_code, product_line, sales_order_item, bay_id, order_no, order_status, order_level, sap_order_status, material_no, material_descr, final_material, init_production_center, final_production_center, module_code, material_no_fellow, order_no_fellow, item_component_list, reservation, reservation_item, material_picking1_times, material_picking2_times, first_date, basic_start_time, basic_finish_time, product_start_time, mes_plan_start_time, mes_plan_end_time, mes_task_exec_time, mes_plan_u_time, is_deleted, is_modify, create_user, update_user, create_time, update_time]) (1/1)#0 (be99fd45957068b61784e457308cf99e). 
2022-09-08 19:26:11,738 INFO org.apache.flink.runtime.taskmanager.Task [] - Source: TableSourceScan(table=[[default_catalog, default_database, source]], fields=[id, parent_id, project_no, sales_order, plant_code, product_line, sales_order_item, bay_id, order_no, order_status, order_level, sap_order_status, material_no, material_descr, final_material, init_production_center, final_production_center, module_code, material_no_fellow, order_no_fellow, item_component_list, reservation, reservation_item, material_picking1_times, material_picking2_times, first_date, basic_start_time, basic_finish_time, product_start_time, mes_plan_start_time, mes_plan_end_time, mes_task_exec_time, mes_plan_u_time, is_deleted, is_modify, create_user, update_user, create_time, update_time]) -> Calc(select=[CAST(id) AS id, CAST(parent_id) AS parent_id, CAST(project_no) AS project_no, CAST(sales_order) AS sales_order, plant_code, product_line, CAST(sales_order_item) AS sales_order_item, bay_id, CAST(order_no) AS order_no, order_status, CAST(order_level) AS order_level, sap_order_status, CAST(material_no) AS material_no, material_descr, final_material, init_production_center, final_production_center, module_code, material_no_fellow, order_no_fellow, item_component_list, reservation, reservation_item, material_picking1_times, material_picking2_times, first_date, basic_start_time, basic_finish_time, product_start_time, mes_plan_start_time, mes_plan_end_time, mes_task_exec_time, mes_plan_u_time, is_deleted, is_modify, create_user, update_user, create_time, CAST(update_time) AS update_time]) -> Sink: Sink(table=[default_catalog.default_database.sink], fields=[id, parent_id, project_no, sales_order, plant_code, product_line, sales_order_item, bay_id, order_no, order_status, order_level, sap_order_status, material_no, material_descr, final_material, init_production_center, final_production_center, module_code, material_no_fellow, order_no_fellow, item_component_list, reservation, reservation_item, material_picking1_times, material_picking2_times, first_date, basic_start_time, basic_finish_time, product_start_time, mes_plan_start_time, mes_plan_end_time, mes_task_exec_time, mes_plan_u_time, is_deleted, is_modify, create_user, update_user, create_time, update_time]) (1/1)#0 (be99fd45957068b61784e457308cf99e) switched from RUNNING to CANCELING. 
2022-09-08 19:26:11,738 INFO org.apache.flink.runtime.taskmanager.Task [] - Triggering cancellation of task code Source: TableSourceScan(table=[[default_catalog, default_database, source]], fields=[id, parent_id, project_no, sales_order, plant_code, product_line, sales_order_item, bay_id, order_no, order_status, order_level, sap_order_status, material_no, material_descr, final_material, init_production_center, final_production_center, module_code, material_no_fellow, order_no_fellow, item_component_list, reservation, reservation_item, material_picking1_times, material_picking2_times, first_date, basic_start_time, basic_finish_time, product_start_time, mes_plan_start_time, mes_plan_end_time, mes_task_exec_time, mes_plan_u_time, is_deleted, is_modify, create_user, update_user, create_time, update_time]) -> Calc(select=[CAST(id) AS id, CAST(parent_id) AS parent_id, CAST(project_no) AS project_no, CAST(sales_order) AS sales_order, plant_code, product_line, CAST(sales_order_item) AS sales_order_item, bay_id, CAST(order_no) AS order_no, order_status, CAST(order_level) AS order_level, sap_order_status, CAST(material_no) AS material_no, material_descr, final_material, init_production_center, final_production_center, module_code, material_no_fellow, order_no_fellow, item_component_list, reservation, reservation_item, material_picking1_times, material_picking2_times, first_date, basic_start_time, basic_finish_time, product_start_time, mes_plan_start_time, mes_plan_end_time, mes_task_exec_time, mes_plan_u_time, is_deleted, is_modify, create_user, update_user, create_time, CAST(update_time) AS update_time]) -> Sink: Sink(table=[default_catalog.default_database.sink], fields=[id, parent_id, project_no, sales_order, plant_code, product_line, sales_order_item, bay_id, order_no, order_status, order_level, sap_order_status, material_no, material_descr, final_material, init_production_center, final_production_center, module_code, material_no_fellow, order_no_fellow, item_component_list, reservation, reservation_item, material_picking1_times, material_picking2_times, first_date, basic_start_time, basic_finish_time, product_start_time, mes_plan_start_time, mes_plan_end_time, mes_task_exec_time, mes_plan_u_time, is_deleted, is_modify, create_user, update_user, create_time, update_time]) (1/1)#0 (be99fd45957068b61784e457308cf99e). 
2022-09-08 19:26:11,748 ERROR com.dtstack.chunjun.connector.sqlservercdc.inputFormat.SqlServerCdcInputFormat [] - takeEvent interrupted error:
java.lang.InterruptedException
	at java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.reportInterruptAfterWait(AbstractQueuedSynchronizer.java:2014)
	at java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.awaitNanos(AbstractQueuedSynchronizer.java:2088)
	at java.util.concurrent.LinkedBlockingDeque.pollFirst(LinkedBlockingDeque.java:522)
	at java.util.concurrent.LinkedBlockingDeque.poll(LinkedBlockingDeque.java:684)
	at com.dtstack.chunjun.connector.sqlservercdc.inputFormat.SqlServerCdcInputFormat.nextRecordInternal(SqlServerCdcInputFormat.java:127)
	at com.dtstack.chunjun.source.format.BaseRichInputFormat.nextRecord(BaseRichInputFormat.java:197)
	at com.dtstack.chunjun.source.format.BaseRichInputFormat.nextRecord(BaseRichInputFormat.java:67)
	at com.dtstack.chunjun.source.DtInputFormatSourceFunction.run(DtInputFormatSourceFunction.java:133)
	at org.apache.flink.streaming.api.operators.StreamSource.run(StreamSource.java:110)
	at org.apache.flink.streaming.api.operators.StreamSource.run(StreamSource.java:66)
	at org.apache.flink.streaming.runtime.tasks.SourceStreamTask$LegacySourceFunctionThread.run(SourceStreamTask.java:267)

2022-09-08 19:26:11,758 WARN com.dtstack.chunjun.dirty.log.LogDirtyDataCollector [] -
====================Dirty Data=====================
DirtyDataEntry[jobId='47dcb1c312ebbbd369e3e2c3a4d92c05', jobName='sqlserverCDCToPostgresql', operatorName='Source: TableSourceScan(table=[[default_catalog, default_database, source]], fie', dirtyContent='null', errorMessage='com.dtstack.chunjun.throwable.ReadRecordException: takeEvent interrupted error java.lang.InterruptedException
	at com.dtstack.chunjun.connector.sqlservercdc.inputFormat.SqlServerCdcInputFormat.nextRecordInternal(SqlServerCdcInputFormat.java:130)
	at com.dtstack.chunjun.source.format.BaseRichInputFormat.nextRecord(BaseRichInputFormat.java:197)
	at com.dtstack.chunjun.source.format.BaseRichInputFormat.nextRecord(BaseRichInputFormat.java:67)
	at com.dtstack.chunjun.source.DtInputFormatSourceFunction.run(DtInputFormatSourceFunction.java:133)
	at org.apache.flink.streaming.api.operators.StreamSource.run(StreamSource.java:110)
	at org.apache.flink.streaming.api.operators.StreamSource.run(StreamSource.java:66)
	at org.apache.flink.streaming.runtime.tasks.SourceStreamTask$LegacySourceFunctionThread.run(SourceStreamTask.java:267)
Caused by: java.lang.InterruptedException
	at java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.reportInterruptAfterWait(AbstractQueuedSynchronizer.java:2014)
	at java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.awaitNanos(AbstractQueuedSynchronizer.java:2088)
	at java.util.concurrent.LinkedBlockingDeque.pollFirst(LinkedBlockingDeque.java:522)
	at java.util.concurrent.LinkedBlockingDeque.poll(LinkedBlockingDeque.java:684)
	at com.dtstack.chunjun.connector.sqlservercdc.inputFormat.SqlServerCdcInputFormat.nextRecordInternal(SqlServerCdcInputFormat.java:127)
	... 6 more
', fieldName='null', createTime=2022-09-08 19:26:11.749]

===================================================
2022-09-08 19:26:11,757 ERROR com.dtstack.chunjun.source.DtInputFormatSourceFunction [] - Exception happened, start to close format
com.dtstack.chunjun.throwable.NoRestartException: The dirty consumer shutdown, due to the consumed count exceed the max-consumed [0]
	at com.dtstack.chunjun.dirty.consumer.DirtyDataCollector.addConsumed(DirtyDataCollector.java:105) ~[blob_p-1e5f7cb8f16ee2b15f49200098e655906da5b1d4-c118eff94db89f8129729eb499dd7625:?]
	at com.dtstack.chunjun.dirty.consumer.DirtyDataCollector.offer(DirtyDataCollector.java:79) ~[blob_p-1e5f7cb8f16ee2b15f49200098e655906da5b1d4-c118eff94db89f8129729eb499dd7625:?]
	at com.dtstack.chunjun.dirty.manager.DirtyManager.collect(DirtyManager.java:140) ~[blob_p-1e5f7cb8f16ee2b15f49200098e655906da5b1d4-c118eff94db89f8129729eb499dd7625:?]
	at com.dtstack.chunjun.source.format.BaseRichInputFormat.nextRecord(BaseRichInputFormat.java:199) ~[blob_p-1e5f7cb8f16ee2b15f49200098e655906da5b1d4-c118eff94db89f8129729eb499dd7625:?]
	at com.dtstack.chunjun.source.format.BaseRichInputFormat.nextRecord(BaseRichInputFormat.java:67) ~[blob_p-1e5f7cb8f16ee2b15f49200098e655906da5b1d4-c118eff94db89f8129729eb499dd7625:?]
	at com.dtstack.chunjun.source.DtInputFormatSourceFunction.run(DtInputFormatSourceFunction.java:133) [blob_p-1e5f7cb8f16ee2b15f49200098e655906da5b1d4-c118eff94db89f8129729eb499dd7625:?]
	at org.apache.flink.streaming.api.operators.StreamSource.run(StreamSource.java:110) [flink-dist_2.12-1.12.7.jar:1.12.7]
	at org.apache.flink.streaming.api.operators.StreamSource.run(StreamSource.java:66) [flink-dist_2.12-1.12.7.jar:1.12.7]
	at org.apache.flink.streaming.runtime.tasks.SourceStreamTask$LegacySourceFunctionThread.run(SourceStreamTask.java:267) [flink-dist_2.12-1.12.7.jar:1.12.7]
2022-09-08 19:26:11,763 WARN com.dtstack.chunjun.connector.sqlservercdc.inputFormat.SqlServerCdcInputFormat [] - shutdown SqlServerCdcListener......
2022-09-08 19:26:11,763 INFO com.dtstack.chunjun.dirty.log.LogDirtyDataCollector [] - Print consumer closed.
2022-09-08 19:26:31,764 INFO com.dtstack.chunjun.connector.sqlservercdc.inputFormat.SqlServerCdcInputFormat [] - subtask input close finished
2022-09-08 19:26:31,773 INFO com.dtstack.chunjun.connector.jdbc.sink.JdbcOutputFormat [] - taskNumber[0] close()
2022-09-08 19:26:51,775 INFO com.dtstack.chunjun.dirty.log.LogDirtyDataCollector [] - Print consumer closed.
2022-09-08 19:26:51,775 INFO com.dtstack.chunjun.connector.jdbc.sink.JdbcOutputFormat [] - subtask[0}] close() finished 2022-09-08 19:26:51,785 INFO org.apache.flink.runtime.taskmanager.Task [] - Source: TableSourceScan(table=[[default_catalog, default_database, source]], fields=[id, parent_id, project_no, sales_order, plant_code, product_line, sales_order_item, bay_id, order_no, order_status, order_level, sap_order_status, material_no, material_descr, final_material, init_production_center, final_production_center, module_code, material_no_fellow, order_no_fellow, item_component_list, reservation, reservation_item, material_picking1_times, material_picking2_times, first_date, basic_start_time, basic_finish_time, product_start_time, mes_plan_start_time, mes_plan_end_time, mes_task_exec_time, mes_plan_u_time, is_deleted, is_modify, create_user, update_user, create_time, update_time]) -> Calc(select=[CAST(id) AS id, CAST(parent_id) AS parent_id, CAST(project_no) AS project_no, CAST(sales_order) AS sales_order, plant_code, product_line, CAST(sales_order_item) AS sales_order_item, bay_id, CAST(order_no) AS order_no, order_status, CAST(order_level) AS order_level, sap_order_status, CAST(material_no) AS material_no, material_descr, final_material, init_production_center, final_production_center, module_code, material_no_fellow, order_no_fellow, item_component_list, reservation, reservation_item, material_picking1_times, material_picking2_times, first_date, basic_start_time, basic_finish_time, product_start_time, mes_plan_start_time, mes_plan_end_time, mes_task_exec_time, mes_plan_u_time, is_deleted, is_modify, create_user, update_user, create_time, CAST(update_time) AS update_time]) -> Sink: Sink(table=[default_catalog.default_database.sink], fields=[id, parent_id, project_no, sales_order, plant_code, product_line, sales_order_item, bay_id, order_no, order_status, order_level, sap_order_status, material_no, material_descr, final_material, init_production_center, final_production_center, module_code, material_no_fellow, order_no_fellow, item_component_list, reservation, reservation_item, material_picking1_times, material_picking2_times, first_date, basic_start_time, basic_finish_time, product_start_time, mes_plan_start_time, mes_plan_end_time, mes_task_exec_time, mes_plan_u_time, is_deleted, is_modify, create_user, update_user, create_time, update_time]) (1/1)#0 (be99fd45957068b61784e457308cf99e) switched from CANCELING to CANCELED. 
2022-09-08 19:26:51,785 INFO org.apache.flink.runtime.taskmanager.Task [] - Freeing task resources for Source: TableSourceScan(table=[[default_catalog, default_database, source]], fields=[id, parent_id, project_no, sales_order, plant_code, product_line, sales_order_item, bay_id, order_no, order_status, order_level, sap_order_status, material_no, material_descr, final_material, init_production_center, final_production_center, module_code, material_no_fellow, order_no_fellow, item_component_list, reservation, reservation_item, material_picking1_times, material_picking2_times, first_date, basic_start_time, basic_finish_time, product_start_time, mes_plan_start_time, mes_plan_end_time, mes_task_exec_time, mes_plan_u_time, is_deleted, is_modify, create_user, update_user, create_time, update_time]) -> Calc(select=[CAST(id) AS id, CAST(parent_id) AS parent_id, CAST(project_no) AS project_no, CAST(sales_order) AS sales_order, plant_code, product_line, CAST(sales_order_item) AS sales_order_item, bay_id, CAST(order_no) AS order_no, order_status, CAST(order_level) AS order_level, sap_order_status, CAST(material_no) AS material_no, material_descr, final_material, init_production_center, final_production_center, module_code, material_no_fellow, order_no_fellow, item_component_list, reservation, reservation_item, material_picking1_times, material_picking2_times, first_date, basic_start_time, basic_finish_time, product_start_time, mes_plan_start_time, mes_plan_end_time, mes_task_exec_time, mes_plan_u_time, is_deleted, is_modify, create_user, update_user, create_time, CAST(update_time) AS update_time]) -> Sink: Sink(table=[default_catalog.default_database.sink], fields=[id, parent_id, project_no, sales_order, plant_code, product_line, sales_order_item, bay_id, order_no, order_status, order_level, sap_order_status, material_no, material_descr, final_material, init_production_center, final_production_center, module_code, material_no_fellow, order_no_fellow, item_component_list, reservation, reservation_item, material_picking1_times, material_picking2_times, first_date, basic_start_time, basic_finish_time, product_start_time, mes_plan_start_time, mes_plan_end_time, mes_task_exec_time, mes_plan_u_time, is_deleted, is_modify, create_user, update_user, create_time, update_time]) (1/1)#0 (be99fd45957068b61784e457308cf99e). 
2022-09-08 19:26:51,790 INFO org.apache.flink.runtime.taskexecutor.TaskExecutor [] - Un-registering task and sending final execution state CANCELED to JobManager for task Source: TableSourceScan(table=[[default_catalog, default_database, source]], fields=[id, parent_id, project_no, sales_order, plant_code, product_line, sales_order_item, bay_id, order_no, order_status, order_level, sap_order_status, material_no, material_descr, final_material, init_production_center, final_production_center, module_code, material_no_fellow, order_no_fellow, item_component_list, reservation, reservation_item, material_picking1_times, material_picking2_times, first_date, basic_start_time, basic_finish_time, product_start_time, mes_plan_start_time, mes_plan_end_time, mes_task_exec_time, mes_plan_u_time, is_deleted, is_modify, create_user, update_user, create_time, update_time]) -> Calc(select=[CAST(id) AS id, CAST(parent_id) AS parent_id, CAST(project_no) AS project_no, CAST(sales_order) AS sales_order, plant_code, product_line, CAST(sales_order_item) AS sales_order_item, bay_id, CAST(order_no) AS order_no, order_status, CAST(order_level) AS order_level, sap_order_status, CAST(material_no) AS material_no, material_descr, final_material, init_production_center, final_production_center, module_code, material_no_fellow, order_no_fellow, item_component_list, reservation, reservation_item, material_picking1_times, material_picking2_times, first_date, basic_start_time, basic_finish_time, product_start_time, mes_plan_start_time, mes_plan_end_time, mes_task_exec_time, mes_plan_u_time, is_deleted, is_modify, create_user, update_user, create_time, CAST(update_time) AS update_time]) -> Sink: Sink(table=[default_catalog.default_database.sink], fields=[id, parent_id, project_no, sales_order, plant_code, product_line, sales_order_item, bay_id, order_no, order_status, order_level, sap_order_status, material_no, material_descr, final_material, init_production_center, final_production_center, module_code, material_no_fellow, order_no_fellow, item_component_list, reservation, reservation_item, material_picking1_times, material_picking2_times, first_date, basic_start_time, basic_finish_time, product_start_time, mes_plan_start_time, mes_plan_end_time, mes_task_exec_time, mes_plan_u_time, is_deleted, is_modify, create_user, update_user, create_time, update_time]) (1/1)#0 be99fd45957068b61784e457308cf99e. 2022-09-08 19:26:51,853 INFO org.apache.flink.runtime.taskexecutor.slot.TaskSlotTableImpl [] - Free slot TaskSlot(index:0, state:ACTIVE, resource profile: ResourceProfile{cpuCores=1.0000000000000000, taskHeapMemory=96.000mb (100663293 bytes), taskOffHeapMemory=0 bytes, managedMemory=128.000mb (134217730 bytes), networkMemory=32.000mb (33554432 bytes)}, allocationId: d7d358ae1ec9a0ef253dff422887843a, jobId: 47dcb1c312ebbbd369e3e2c3a4d92c05). 2022-09-08 19:26:51,856 INFO org.apache.flink.runtime.taskexecutor.DefaultJobLeaderService [] - Remove job 47dcb1c312ebbbd369e3e2c3a4d92c05 from job leader monitoring. 2022-09-08 19:26:51,857 INFO org.apache.flink.runtime.taskexecutor.TaskExecutor [] - Close JobManager connection for job 47dcb1c312ebbbd369e3e2c3a4d92c05. 2022-09-08 19:27:03,888 INFO org.apache.flink.runtime.taskexecutor.TaskExecutor [] - Receive slot request 47efabd84e62dba67f8ed58327637411 for job e1f46d26224b42aa18b94370510e5cd3 from resource manager with leader id 00000000000000000000000000000000. 
2022-09-08 19:27:03,889 INFO org.apache.flink.runtime.taskexecutor.TaskExecutor [] - Allocated slot for 47efabd84e62dba67f8ed58327637411. 2022-09-08 19:27:03,889 INFO org.apache.flink.runtime.taskexecutor.DefaultJobLeaderService [] - Add job e1f46d26224b42aa18b94370510e5cd3 for job leader monitoring. 2022-09-08 19:27:03,889 INFO org.apache.flink.runtime.taskexecutor.DefaultJobLeaderService [] - Try to register at job manager akka.tcp://flink@localhost:6123/user/rpc/jobmanager_3 with leader id 00000000-0000-0000-0000-000000000000. 2022-09-08 19:27:03,896 INFO org.apache.flink.runtime.taskexecutor.DefaultJobLeaderService [] - Resolved JobManager address, beginning registration 2022-09-08 19:27:03,904 INFO org.apache.flink.runtime.taskexecutor.DefaultJobLeaderService [] - Successful registration at job manager akka.tcp://flink@localhost:6123/user/rpc/jobmanager_3 for job e1f46d26224b42aa18b94370510e5cd3. 2022-09-08 19:27:03,904 INFO org.apache.flink.runtime.taskexecutor.TaskExecutor [] - Establish JobManager connection for job e1f46d26224b42aa18b94370510e5cd3. 2022-09-08 19:27:03,904 INFO org.apache.flink.runtime.taskexecutor.TaskExecutor [] - Offer reserved slots to the leader of job e1f46d26224b42aa18b94370510e5cd3. 2022-09-08 19:27:03,910 INFO org.apache.flink.runtime.taskexecutor.slot.TaskSlotTableImpl [] - Activate slot 47efabd84e62dba67f8ed58327637411. 2022-09-08 19:27:03,915 INFO org.apache.flink.runtime.taskexecutor.slot.TaskSlotTableImpl [] - Activate slot 47efabd84e62dba67f8ed58327637411. 2022-09-08 19:27:03,916 INFO org.apache.flink.runtime.taskexecutor.TaskExecutor [] - Received task Source: TableSourceScan(table=[[default_catalog, default_database, source]], fields=[id, parent_id, project_no, sales_order, plant_code, product_line, sales_order_item, bay_id, order_no, order_status, order_level, sap_order_status, material_no, material_descr, final_material, init_production_center, final_production_center, module_code, material_no_fellow, order_no_fellow, item_component_list, reservation, reservation_item, material_picking1_times, material_picking2_times, first_date, basic_start_time, basic_finish_time, product_start_time, mes_plan_start_time, mes_plan_end_time, mes_task_exec_time, mes_plan_u_time, is_deleted, is_modify, create_user, update_user, create_time, update_time]) -> Calc(select=[CAST(id) AS id, CAST(parent_id) AS parent_id, CAST(project_no) AS project_no, CAST(sales_order) AS sales_order, plant_code, product_line, CAST(sales_order_item) AS sales_order_item, bay_id, CAST(order_no) AS order_no, order_status, CAST(order_level) AS order_level, sap_order_status, CAST(material_no) AS material_no, material_descr, final_material, init_production_center, final_production_center, module_code, material_no_fellow, order_no_fellow, item_component_list, reservation, reservation_item, material_picking1_times, material_picking2_times, first_date, basic_start_time, basic_finish_time, product_start_time, mes_plan_start_time, mes_plan_end_time, mes_task_exec_time, mes_plan_u_time, is_deleted, is_modify, create_user, update_user, create_time, CAST(update_time) AS update_time]) -> Sink: Sink(table=[default_catalog.default_database.sink], fields=[id, parent_id, project_no, sales_order, plant_code, product_line, sales_order_item, bay_id, order_no, order_status, order_level, sap_order_status, material_no, material_descr, final_material, init_production_center, final_production_center, module_code, material_no_fellow, order_no_fellow, item_component_list, reservation, reservation_item, 
material_picking1_times, material_picking2_times, first_date, basic_start_time, basic_finish_time, product_start_time, mes_plan_start_time, mes_plan_end_time, mes_task_exec_time, mes_plan_u_time, is_deleted, is_modify, create_user, update_user, create_time, update_time]) (1/1)#0 (842d4576a79911446754e60eda36d2be), deploy into slot with allocation id 47efabd84e62dba67f8ed58327637411. 2022-09-08 19:27:03,917 INFO org.apache.flink.runtime.taskmanager.Task [] - Source: TableSourceScan(table=[[default_catalog, default_database, source]], fields=[id, parent_id, project_no, sales_order, plant_code, product_line, sales_order_item, bay_id, order_no, order_status, order_level, sap_order_status, material_no, material_descr, final_material, init_production_center, final_production_center, module_code, material_no_fellow, order_no_fellow, item_component_list, reservation, reservation_item, material_picking1_times, material_picking2_times, first_date, basic_start_time, basic_finish_time, product_start_time, mes_plan_start_time, mes_plan_end_time, mes_task_exec_time, mes_plan_u_time, is_deleted, is_modify, create_user, update_user, create_time, update_time]) -> Calc(select=[CAST(id) AS id, CAST(parent_id) AS parent_id, CAST(project_no) AS project_no, CAST(sales_order) AS sales_order, plant_code, product_line, CAST(sales_order_item) AS sales_order_item, bay_id, CAST(order_no) AS order_no, order_status, CAST(order_level) AS order_level, sap_order_status, CAST(material_no) AS material_no, material_descr, final_material, init_production_center, final_production_center, module_code, material_no_fellow, order_no_fellow, item_component_list, reservation, reservation_item, material_picking1_times, material_picking2_times, first_date, basic_start_time, basic_finish_time, product_start_time, mes_plan_start_time, mes_plan_end_time, mes_task_exec_time, mes_plan_u_time, is_deleted, is_modify, create_user, update_user, create_time, CAST(update_time) AS update_time]) -> Sink: Sink(table=[default_catalog.default_database.sink], fields=[id, parent_id, project_no, sales_order, plant_code, product_line, sales_order_item, bay_id, order_no, order_status, order_level, sap_order_status, material_no, material_descr, final_material, init_production_center, final_production_center, module_code, material_no_fellow, order_no_fellow, item_component_list, reservation, reservation_item, material_picking1_times, material_picking2_times, first_date, basic_start_time, basic_finish_time, product_start_time, mes_plan_start_time, mes_plan_end_time, mes_task_exec_time, mes_plan_u_time, is_deleted, is_modify, create_user, update_user, create_time, update_time]) (1/1)#0 (842d4576a79911446754e60eda36d2be) switched from CREATED to DEPLOYING. 
2022-09-08 19:27:03,918 INFO org.apache.flink.runtime.taskmanager.Task [] - Loading JAR files for task Source: TableSourceScan(table=[[default_catalog, default_database, source]], fields=[id, parent_id, project_no, sales_order, plant_code, product_line, sales_order_item, bay_id, order_no, order_status, order_level, sap_order_status, material_no, material_descr, final_material, init_production_center, final_production_center, module_code, material_no_fellow, order_no_fellow, item_component_list, reservation, reservation_item, material_picking1_times, material_picking2_times, first_date, basic_start_time, basic_finish_time, product_start_time, mes_plan_start_time, mes_plan_end_time, mes_task_exec_time, mes_plan_u_time, is_deleted, is_modify, create_user, update_user, create_time, update_time]) -> Calc(select=[CAST(id) AS id, CAST(parent_id) AS parent_id, CAST(project_no) AS project_no, CAST(sales_order) AS sales_order, plant_code, product_line, CAST(sales_order_item) AS sales_order_item, bay_id, CAST(order_no) AS order_no, order_status, CAST(order_level) AS order_level, sap_order_status, CAST(material_no) AS material_no, material_descr, final_material, init_production_center, final_production_center, module_code, material_no_fellow, order_no_fellow, item_component_list, reservation, reservation_item, material_picking1_times, material_picking2_times, first_date, basic_start_time, basic_finish_time, product_start_time, mes_plan_start_time, mes_plan_end_time, mes_task_exec_time, mes_plan_u_time, is_deleted, is_modify, create_user, update_user, create_time, CAST(update_time) AS update_time]) -> Sink: Sink(table=[default_catalog.default_database.sink], fields=[id, parent_id, project_no, sales_order, plant_code, product_line, sales_order_item, bay_id, order_no, order_status, order_level, sap_order_status, material_no, material_descr, final_material, init_production_center, final_production_center, module_code, material_no_fellow, order_no_fellow, item_component_list, reservation, reservation_item, material_picking1_times, material_picking2_times, first_date, basic_start_time, basic_finish_time, product_start_time, mes_plan_start_time, mes_plan_end_time, mes_task_exec_time, mes_plan_u_time, is_deleted, is_modify, create_user, update_user, create_time, update_time]) (1/1)#0 (842d4576a79911446754e60eda36d2be) [DEPLOYING]. 
2022-09-08 19:27:03,918 INFO org.apache.flink.runtime.blob.BlobClient [] - Downloading e1f46d26224b42aa18b94370510e5cd3/p-1e5f7cb8f16ee2b15f49200098e655906da5b1d4-9e5182b22a4d5cb6a6815ecfa4d21796 from localhost/127.0.0.1:34063 2022-09-08 19:27:03,980 INFO org.apache.flink.runtime.taskmanager.Task [] - Registering task at network: Source: TableSourceScan(table=[[default_catalog, default_database, source]], fields=[id, parent_id, project_no, sales_order, plant_code, product_line, sales_order_item, bay_id, order_no, order_status, order_level, sap_order_status, material_no, material_descr, final_material, init_production_center, final_production_center, module_code, material_no_fellow, order_no_fellow, item_component_list, reservation, reservation_item, material_picking1_times, material_picking2_times, first_date, basic_start_time, basic_finish_time, product_start_time, mes_plan_start_time, mes_plan_end_time, mes_task_exec_time, mes_plan_u_time, is_deleted, is_modify, create_user, update_user, create_time, update_time]) -> Calc(select=[CAST(id) AS id, CAST(parent_id) AS parent_id, CAST(project_no) AS project_no, CAST(sales_order) AS sales_order, plant_code, product_line, CAST(sales_order_item) AS sales_order_item, bay_id, CAST(order_no) AS order_no, order_status, CAST(order_level) AS order_level, sap_order_status, CAST(material_no) AS material_no, material_descr, final_material, init_production_center, final_production_center, module_code, material_no_fellow, order_no_fellow, item_component_list, reservation, reservation_item, material_picking1_times, material_picking2_times, first_date, basic_start_time, basic_finish_time, product_start_time, mes_plan_start_time, mes_plan_end_time, mes_task_exec_time, mes_plan_u_time, is_deleted, is_modify, create_user, update_user, create_time, CAST(update_time) AS update_time]) -> Sink: Sink(table=[default_catalog.default_database.sink], fields=[id, parent_id, project_no, sales_order, plant_code, product_line, sales_order_item, bay_id, order_no, order_status, order_level, sap_order_status, material_no, material_descr, final_material, init_production_center, final_production_center, module_code, material_no_fellow, order_no_fellow, item_component_list, reservation, reservation_item, material_picking1_times, material_picking2_times, first_date, basic_start_time, basic_finish_time, product_start_time, mes_plan_start_time, mes_plan_end_time, mes_task_exec_time, mes_plan_u_time, is_deleted, is_modify, create_user, update_user, create_time, update_time]) (1/1)#0 (842d4576a79911446754e60eda36d2be) [DEPLOYING]. 2022-09-08 19:27:03,981 INFO org.apache.flink.runtime.taskmanager.Task [] - Obtaining local cache file for 'class_path_2'. 2022-09-08 19:27:03,981 INFO org.apache.flink.runtime.taskmanager.Task [] - Obtaining local cache file for 'class_path_1'. 2022-09-08 19:27:03,982 INFO org.apache.flink.runtime.taskmanager.Task [] - Obtaining local cache file for 'class_path_0'. 
2022-09-08 19:27:03,982 INFO org.apache.flink.runtime.blob.BlobClient [] - Downloading e1f46d26224b42aa18b94370510e5cd3/p-e7c57bc8a3d7173eab68d17eb9a58f9cd303e6e7-cc002bf7f965c5cd9385c93c518c2347 from localhost/127.0.0.1:34063 2022-09-08 19:27:03,982 INFO org.apache.flink.streaming.runtime.tasks.StreamTask [] - No state backend has been configured, using default (Memory / JobManager) MemoryStateBackend (data in heap memory / checkpoints to JobManager) (checkpoints: 'null', savepoints: 'null', asynchronous: TRUE, maxStateSize: 5242880) 2022-09-08 19:27:03,982 INFO org.apache.flink.runtime.blob.BlobClient [] - Downloading e1f46d26224b42aa18b94370510e5cd3/p-cd2f8171f8678a059d71abd099b5a343013c93b3-1ef11b219d8865157948395948db6a6e from localhost/127.0.0.1:34063 2022-09-08 19:27:03,982 INFO org.apache.flink.runtime.taskmanager.Task [] - Source: TableSourceScan(table=[[default_catalog, default_database, source]], fields=[id, parent_id, project_no, sales_order, plant_code, product_line, sales_order_item, bay_id, order_no, order_status, order_level, sap_order_status, material_no, material_descr, final_material, init_production_center, final_production_center, module_code, material_no_fellow, order_no_fellow, item_component_list, reservation, reservation_item, material_picking1_times, material_picking2_times, first_date, basic_start_time, basic_finish_time, product_start_time, mes_plan_start_time, mes_plan_end_time, mes_task_exec_time, mes_plan_u_time, is_deleted, is_modify, create_user, update_user, create_time, update_time]) -> Calc(select=[CAST(id) AS id, CAST(parent_id) AS parent_id, CAST(project_no) AS project_no, CAST(sales_order) AS sales_order, plant_code, product_line, CAST(sales_order_item) AS sales_order_item, bay_id, CAST(order_no) AS order_no, order_status, CAST(order_level) AS order_level, sap_order_status, CAST(material_no) AS material_no, material_descr, final_material, init_production_center, final_production_center, module_code, material_no_fellow, order_no_fellow, item_component_list, reservation, reservation_item, material_picking1_times, material_picking2_times, first_date, basic_start_time, basic_finish_time, product_start_time, mes_plan_start_time, mes_plan_end_time, mes_task_exec_time, mes_plan_u_time, is_deleted, is_modify, create_user, update_user, create_time, CAST(update_time) AS update_time]) -> Sink: Sink(table=[default_catalog.default_database.sink], fields=[id, parent_id, project_no, sales_order, plant_code, product_line, sales_order_item, bay_id, order_no, order_status, order_level, sap_order_status, material_no, material_descr, final_material, init_production_center, final_production_center, module_code, material_no_fellow, order_no_fellow, item_component_list, reservation, reservation_item, material_picking1_times, material_picking2_times, first_date, basic_start_time, basic_finish_time, product_start_time, mes_plan_start_time, mes_plan_end_time, mes_task_exec_time, mes_plan_u_time, is_deleted, is_modify, create_user, update_user, create_time, update_time]) (1/1)#0 (842d4576a79911446754e60eda36d2be) switched from DEPLOYING to RUNNING. 
2022-09-08 19:27:03,982 INFO org.apache.flink.runtime.blob.BlobClient [] - Downloading e1f46d26224b42aa18b94370510e5cd3/p-aa9a45374ffd87171858a620e49c29157ab3a313-c8e075564c4bd1c52d36f58e1dc59627 from localhost/127.0.0.1:34063 2022-09-08 19:27:04,047 WARN org.apache.flink.metrics.MetricGroup [] - The operator name Sink: Sink(table=[default_catalog.default_database.sink], fields=[id, parent_id, project_no, sales_order, plant_code, product_line, sales_order_item, bay_id, order_no, order_status, order_level, sap_order_status, material_no, material_descr, final_material, init_production_center, final_production_center, module_code, material_no_fellow, order_no_fellow, item_component_list, reservation, reservation_item, material_picking1_times, material_picking2_times, first_date, basic_start_time, basic_finish_time, product_start_time, mes_plan_start_time, mes_plan_end_time, mes_task_exec_time, mes_plan_u_time, is_deleted, is_modify, create_user, update_user, create_time, update_time]) exceeded the 80 characters length limit and was truncated. 2022-09-08 19:27:04,118 WARN org.apache.flink.metrics.MetricGroup [] - The operator name Calc(select=[CAST(id) AS id, CAST(parent_id) AS parent_id, CAST(project_no) AS project_no, CAST(sales_order) AS sales_order, plant_code, product_line, CAST(sales_order_item) AS sales_order_item, bay_id, CAST(order_no) AS order_no, order_status, CAST(order_level) AS order_level, sap_order_status, CAST(material_no) AS material_no, material_descr, final_material, init_production_center, final_production_center, module_code, material_no_fellow, order_no_fellow, item_component_list, reservation, reservation_item, material_picking1_times, material_picking2_times, first_date, basic_start_time, basic_finish_time, product_start_time, mes_plan_start_time, mes_plan_end_time, mes_task_exec_time, mes_plan_u_time, is_deleted, is_modify, create_user, update_user, create_time, CAST(update_time) AS update_time]) exceeded the 80 characters length limit and was truncated. 2022-09-08 19:27:04,120 WARN org.apache.flink.metrics.MetricGroup [] - The operator name Source: TableSourceScan(table=[[default_catalog, default_database, source]], fields=[id, parent_id, project_no, sales_order, plant_code, product_line, sales_order_item, bay_id, order_no, order_status, order_level, sap_order_status, material_no, material_descr, final_material, init_production_center, final_production_center, module_code, material_no_fellow, order_no_fellow, item_component_list, reservation, reservation_item, material_picking1_times, material_picking2_times, first_date, basic_start_time, basic_finish_time, product_start_time, mes_plan_start_time, mes_plan_end_time, mes_task_exec_time, mes_plan_u_time, is_deleted, is_modify, create_user, update_user, create_time, update_time]) exceeded the 80 characters length limit and was truncated. 
2022-09-08 19:27:04,120 INFO com.dtstack.chunjun.sink.DtOutputFormatSinkFunction [] - Start initialize output format state 2022-09-08 19:27:04,123 INFO com.dtstack.chunjun.sink.DtOutputFormatSinkFunction [] - Is restored:false 2022-09-08 19:27:04,123 INFO com.dtstack.chunjun.sink.DtOutputFormatSinkFunction [] - End initialize output format state 2022-09-08 19:27:04,305 INFO com.dtstack.chunjun.connector.jdbc.sink.JdbcOutputFormat [] - initTimingSubmitTask() ,initialDelay:1000, delay:1000, MILLISECONDS 2022-09-08 19:27:04,457 INFO com.dtstack.chunjun.connector.jdbc.sink.JdbcOutputFormat [] - write sql:INSERT INTO "siemens2_ETO_PDM_Header"("id", "parent_id", "project_no", "sales_order", "plant_code", "product_line", "sales_order_item", "bay_id", "order_no", "order_status", "order_level", "sap_order_status", "material_no", "material_descr", "final_material", "init_production_center", "final_production_center", "module_code", "material_no_fellow", "order_no_fellow", "item_component_list", "reservation", "reservation_item", "material_picking1_times", "material_picking2_times", "first_date", "basic_start_time", "basic_finish_time", "product_start_time", "mes_plan_start_time", "mes_plan_end_time", "mes_task_exec_time", "mes_plan_u_time", "is_deleted", "is_modify", "create_user", "update_user", "create_time", "update_time") VALUES (:id, :parent_id, :project_no, :sales_order, :plant_code, :product_line, :sales_order_item, :bay_id, :order_no, :order_status, :order_level, :sap_order_status, :material_no, :material_descr, :final_material, :init_production_center, :final_production_center, :module_code, :material_no_fellow, :order_no_fellow, :item_component_list, :reservation, :reservation_item, :material_picking1_times, :material_picking2_times, :first_date, :basic_start_time, :basic_finish_time, :product_start_time, :mes_plan_start_time, :mes_plan_end_time, :mes_task_exec_time, :mes_plan_u_time, :is_deleted, :is_modify, :create_user, :update_user, :create_time, :update_time) ON CONFLICT ("id") DO UPDATE SET "parent_id"=EXCLUDED."parent_id", "project_no"=EXCLUDED."project_no", "sales_order"=EXCLUDED."sales_order", "plant_code"=EXCLUDED."plant_code", "product_line"=EXCLUDED."product_line", "sales_order_item"=EXCLUDED."sales_order_item", "bay_id"=EXCLUDED."bay_id", "order_no"=EXCLUDED."order_no", "order_status"=EXCLUDED."order_status", "order_level"=EXCLUDED."order_level", "sap_order_status"=EXCLUDED."sap_order_status", "material_no"=EXCLUDED."material_no", "material_descr"=EXCLUDED."material_descr", "final_material"=EXCLUDED."final_material", "init_production_center"=EXCLUDED."init_production_center", "final_production_center"=EXCLUDED."final_production_center", "module_code"=EXCLUDED."module_code", "material_no_fellow"=EXCLUDED."material_no_fellow", "order_no_fellow"=EXCLUDED."order_no_fellow", "item_component_list"=EXCLUDED."item_component_list", "reservation"=EXCLUDED."reservation", "reservation_item"=EXCLUDED."reservation_item", "material_picking1_times"=EXCLUDED."material_picking1_times", "material_picking2_times"=EXCLUDED."material_picking2_times", "first_date"=EXCLUDED."first_date", "basic_start_time"=EXCLUDED."basic_start_time", "basic_finish_time"=EXCLUDED."basic_finish_time", "product_start_time"=EXCLUDED."product_start_time", "mes_plan_start_time"=EXCLUDED."mes_plan_start_time", "mes_plan_end_time"=EXCLUDED."mes_plan_end_time", "mes_task_exec_time"=EXCLUDED."mes_task_exec_time", "mes_plan_u_time"=EXCLUDED."mes_plan_u_time", "is_deleted"=EXCLUDED."is_deleted", 
"is_modify"=EXCLUDED."is_modify", "create_user"=EXCLUDED."create_user", "update_user"=EXCLUDED."update_user", "create_time"=EXCLUDED."create_time", "update_time"=EXCLUDED."update_time" 2022-09-08 19:27:04,487 INFO com.dtstack.chunjun.connector.jdbc.sink.JdbcOutputFormat [] - subTask[0}] wait finished 2022-09-08 19:27:04,517 INFO com.dtstack.chunjun.connector.jdbc.sink.JdbcOutputFormat [] - [JdbcOutputFormat] open successfully, checkpointMode = AT_LEAST_ONCE, checkpointEnabled = false, flushIntervalMills = 1000, batchSize = 1024, [JdbcConf]: { "semantic" : "at-least-once", "errorRecord" : 0, "checkFormat" : true, "parallelism" : 1, "executeDdlAble" : false, "pollingInterval" : 5000, "increment" : false, "flushIntervalMills" : 1000, "polling" : false, "mode" : "UPDATE", "password" : "**", "restoreColumnIndex" : -1, "connection" : [ { "table" : [ "siemens2_ETO_PDM_Header" ], "jdbcUrl" : "jdbc:postgresql://192.168.0.116:52345/postgres", "allReplace" : true } ], "table" : "siemens2_ETO_PDM_Header", "queryTimeOut" : 0, "fetchSize" : 0, "useMaxFunc" : false, "uniqueKey" : [ "id" ], "column" : [ { "name" : "id", "type" : "DECIMAL(10, 0) NOT NULL", "index" : 0, "notNull" : false, "part" : false }, { "name" : "parent_id", "type" : "DECIMAL(10, 0) NOT NULL", "index" : 1, "notNull" : false, "part" : false }, { "name" : "project_no", "type" : "VARCHAR(50) NOT NULL", "index" : 2, "notNull" : false, "part" : false }, { "name" : "sales_order", "type" : "VARCHAR(50) NOT NULL", "index" : 3, "notNull" : false, "part" : false }, { "name" : "plant_code", "type" : "VARCHAR(50)", "index" : 4, "notNull" : false, "part" : false }, { "name" : "product_line", "type" : "VARCHAR(50)", "index" : 5, "notNull" : false, "part" : false }, { "name" : "sales_order_item", "type" : "VARCHAR(50) NOT NULL", "index" : 6, "notNull" : false, "part" : false }, { "name" : "bay_id", "type" : "VARCHAR(50)", "index" : 7, "notNull" : false, "part" : false }, { "name" : "order_no", "type" : "VARCHAR(50) NOT NULL", "index" : 8, "notNull" : false, "part" : false }, { "name" : "order_status", "type" : "INT", "index" : 9, "notNull" : false, "part" : false }, { "name" : "order_level", "type" : "INT NOT NULL", "index" : 10, "notNull" : false, "part" : false }, { "name" : "sap_order_status", "type" : "VARCHAR(50)", "index" : 11, "notNull" : false, "part" : false }, { "name" : "material_no", "type" : "VARCHAR(50) NOT NULL", "index" : 12, "notNull" : false, "part" : false }, { "name" : "material_descr", "type" : "VARCHAR(255)", "index" : 13, "notNull" : false, "part" : false }, { "name" : "final_material", "type" : "VARCHAR(50)", "index" : 14, "notNull" : false, "part" : false }, { "name" : "init_production_center", "type" : "VARCHAR(50)", "index" : 15, "notNull" : false, "part" : false }, { "name" : "final_production_center", "type" : "VARCHAR(50)", "index" : 16, "notNull" : false, "part" : false }, { "name" : "module_code", "type" : "VARCHAR(50)", "index" : 17, "notNull" : false, "part" : false }, { "name" : "material_no_fellow", "type" : "VARCHAR(50)", "index" : 18, "notNull" : false, "part" : false }, { "name" : "order_no_fellow", "type" : "VARCHAR(50)", "index" : 19, "notNull" : false, "part" : false }, { "name" : "item_component_list", "type" : "VARCHAR(50)", "index" : 20, "notNull" : false, "part" : false }, { "name" : "reservation", "type" : "VARCHAR(50)", "index" : 21, "notNull" : false, "part" : false }, { "name" : "reservation_item", "type" : "VARCHAR(50)", "index" : 22, "notNull" : false, "part" : false }, { "name" : 
"material_picking1_times", "type" : "INT", "index" : 23, "notNull" : false, "part" : false }, { "name" : "material_picking2_times", "type" : "INT", "index" : 24, "notNull" : false, "part" : false }, { "name" : "first_date", "type" : "TIMESTAMP(6)", "index" : 25, "notNull" : false, "part" : false }, { "name" : "basic_start_time", "type" : "TIMESTAMP(6)", "index" : 26, "notNull" : false, "part" : false }, { "name" : "basic_finish_time", "type" : "TIMESTAMP(6)", "index" : 27, "notNull" : false, "part" : false }, { "name" : "product_start_time", "type" : "TIMESTAMP(6)", "index" : 28, "notNull" : false, "part" : false }, { "name" : "mes_plan_start_time", "type" : "TIMESTAMP(6)", "index" : 29, "notNull" : false, "part" : false }, { "name" : "mes_plan_end_time", "type" : "TIMESTAMP(6)", "index" : 30, "notNull" : false, "part" : false }, { "name" : "mes_task_exec_time", "type" : "TIMESTAMP(6)", "index" : 31, "notNull" : false, "part" : false }, { "name" : "mes_plan_u_time", "type" : "TIMESTAMP(6)", "index" : 32, "notNull" : false, "part" : false }, { "name" : "is_deleted", "type" : "INT", "index" : 33, "notNull" : false, "part" : false }, { "name" : "is_modify", "type" : "INT", "index" : 34, "notNull" : false, "part" : false }, { "name" : "create_user", "type" : "VARCHAR(50)", "index" : 35, "notNull" : false, "part" : false }, { "name" : "update_user", "type" : "VARCHAR(50)", "index" : 36, "notNull" : false, "part" : false }, { "name" : "create_time", "type" : "TIMESTAMP(6)", "index" : 37, "notNull" : false, "part" : false }, { "name" : "update_time", "type" : "TIMESTAMP(6)", "index" : 38, "notNull" : false, "part" : false } ], "errorPercentage" : -1, "fieldNameList" : [ ], "withNoLock" : false, "increColumnIndex" : -1, "allReplace" : true, "initReporter" : true, "jdbcUrl" : "jdbc:postgresql://192.168.0.116:52345/postgres", "connectTimeOut" : 0, "batchSize" : 1024, "speedBytes" : 0, "rowSizeCalculatorType" : "objectSizeCalculator", "metricPluginName" : "prometheus", "username" : "postgres" } 2022-09-08 19:27:04,518 INFO com.dtstack.chunjun.source.DtInputFormatSourceFunction [] - Start initialize input format state, is restored:false 2022-09-08 19:27:04,520 INFO com.dtstack.chunjun.source.DtInputFormatSourceFunction [] - End initialize input format state 2022-09-08 19:27:04,531 INFO com.dtstack.chunjun.connector.sqlservercdc.inputFormat.SqlServerCdcInputFormat [] - sqlServer cdc openInternal split number:0 start... 2022-09-08 19:27:04,677 INFO com.dtstack.chunjun.connector.sqlservercdc.inputFormat.SqlServerCdcInputFormat [] - SqlserverCdcInputFormat[sqlserverCDCToPostgresql2]open: end 2022-09-08 19:27:04,677 INFO com.dtstack.chunjun.connector.sqlservercdc.listener.SqlServerCdcListener [] - SqlServerCdcListener start running..... 
2022-09-08 19:27:04,685 INFO com.dtstack.chunjun.connector.sqlservercdc.inputFormat.SqlServerCdcInputFormat [] - [SqlServerCdcInputFormat] open successfully, inputSplit = GenericSplit (0/1), [SqlServerCdcConf]: { "semantic" : "at-least-once", "databaseName" : "CDCTest", "errorRecord" : 0, "checkFormat" : true, "parallelism" : 1, "executeDdlAble" : false, "errorPercentage" : -1, "flushIntervalMills" : 10000, "fieldNameList" : [ ], "url" : "jdbc:sqlserver://192.168.8.212:1433;databaseName=CDCTest", "pavingData" : true, "password" : "**", "pollInterval" : 1000, "splitUpdate" : false, "cat" : "insert,delete,update", "tableList" : [ "dbo.ETO_PDM_Header" ], "timestampFormat" : "sql", "autoResetConnection" : false, "batchSize" : 1, "autoCommit" : false, "speedBytes" : 0, "rowSizeCalculatorType" : "objectSizeCalculator", "metricPluginName" : "prometheus", "username" : "sa" }
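The logs show the CDC source and the JDBC sink both opening without errors, yet nothing arrives in PostgreSQL, so the SQL Server change stream itself may simply be empty. The following is a minimal diagnostic sketch (not part of the job above); it assumes CDC was enabled on CDCTest with the default capture instance name dbo_ETO_PDM_Header:

-- Is CDC enabled for the database and for the source table?
SELECT name, is_cdc_enabled FROM sys.databases WHERE name = 'CDCTest';
SELECT name, is_tracked_by_cdc FROM sys.tables WHERE name = 'ETO_PDM_Header';

-- Which capture instances exist and which columns do they capture?
EXEC sys.sp_cdc_help_change_data_capture;

-- Are the CDC capture/cleanup jobs present and is the log scan making progress?
-- (the capture job runs under SQL Server Agent, which must be running)
EXEC sys.sp_cdc_help_jobs;
SELECT * FROM sys.dm_cdc_log_scan_sessions;

-- Have any change rows actually been written?
-- (assumed default capture instance name: dbo_ETO_PDM_Header)
SELECT TOP 10 * FROM cdc.dbo_ETO_PDM_Header_CT;

If the change table stays empty after inserts or updates on dbo.ETO_PDM_Header, the problem is upstream of chunjun (CDC not enabled for the table, or the SQL Server Agent capture job not running). If change rows do appear there, the sink side can be sanity-checked with SELECT count(*) FROM "siemens2_ETO_PDM_Header"; in PostgreSQL, keeping in mind that the sink DDL uses a quoted, case-sensitive table name.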