Search before asking
[X] I had searched in the issues and found no similar issues.
What happened
Reading Kafka data in real time, incremental data writing to Hive cannot be achieved: after a task is started, only one file is ever written to HDFS through the Hive metastore service.
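The actual job config is not attached; below is only a minimal sketch of the kind of streaming Kafka-to-Hive job being described, with placeholder broker address, topic, schema, Hive table, and metastore URI (option names follow the 2.3.x Kafka source and Hive sink connector docs, and may need adjusting for the Flink engine):

```hocon
env {
  # Streaming mode so the Kafka topic is consumed continuously
  job.mode = "STREAMING"
  parallelism = 1
  checkpoint.interval = 10000
}

source {
  Kafka {
    bootstrap.servers = "kafka-broker:9092"   # placeholder broker address
    topic = "events_topic"                    # placeholder topic
    consumer.group = "seatunnel_hive_sink"
    format = json
    result_table_name = "kafka_events"
    schema = {
      fields {
        id = bigint
        name = string
        event_time = string
      }
    }
  }
}

sink {
  Hive {
    source_table_name = "kafka_events"
    table_name = "default.events"                    # placeholder Hive table
    metastore_uri = "thrift://hive-metastore:9083"   # placeholder metastore URI
  }
}
```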
SeaTunnel Version
2.3.5
SeaTunnel Config
Running Command
Error Exception
Zeta or Flink or Spark Version
Flink 1.15
Java or Scala Version
No response
Screenshots
No response
Are you willing to submit PR?
Code of Conduct