Closed · Guanpx closed this issue 2 years ago
What is the precision of the decimal? Do you mean that a decimal written by Spark cannot be read with Flink?
Sorry for the late reply. The column precision is DECIMAL(20,10), and we always use Flink to write the data. The exception occurred when upgrading from Hudi 0.10.1 to Hudi 0.11.0.
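For reference, DECIMAL(20,10) allows at most 20 significant digits in total, 10 of them after the decimal point. A minimal Python sketch of that constraint (purely illustrative, not Hudi or Flink code; `fits_decimal` is a hypothetical helper):

```python
from decimal import Decimal

def fits_decimal(value: Decimal, precision: int = 20, scale: int = 10) -> bool:
    """Check whether value fits DECIMAL(precision, scale) without loss."""
    # Quantize to exactly `scale` fractional digits.
    quantized = value.quantize(Decimal(1).scaleb(-scale))
    # Count total significant digits of the quantized value.
    digits = len(quantized.as_tuple().digits)
    return digits <= precision and quantized == value

# 10 integer digits + 10 fractional digits = 20 total: fits DECIMAL(20,10).
assert fits_decimal(Decimal("1234567890.0123456789"))
# 11 integer digits + 10 fractional digits = 21 total: overflows.
assert not fits_decimal(Decimal("12345678901.0123456789"))
```

A writer and reader that disagree on precision/scale metadata for the same column can therefore reject or misread values at this boundary.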
@Guanpx: do you still face the issue, or did you get it resolved? If resolved, can you post how you fixed it? Can you also try 0.12 and let us know if it is still an issue? Any updates would be appreciated.
I could not find the root cause; cleaning the historical data and reloading fixed it. The record key may have contained a comma (,).
Describe the problem you faced
Upgraded Hudi 0.10.1 to Hudi 0.11.0, writing a COW table with Flink.
To Reproduce
Steps to reproduce the behavior:
Environment Description
Hudi version : 0.11.0
Flink version : 1.13.2
Hive version : 2.1.1
Hadoop version : 3.0.0
Storage (HDFS/S3/GCS..) : HDFS
Running on Docker? (yes/no) : no
Additional context
Stacktrace