Closed: BruceKellan closed this issue 2 years ago
Can someone help me? Can I create partitions in advance?
@BruceKellan It is not very clear what the issue is. What do you mean by this?
The next day, dwd_data's max time was '2022-03-08 23:59:59.000'. It seems the job cannot read new data in day=2022-03-09.
@xushiyan
When I start the Flink application on day 2022-03-08, the partition day=2022-03-09/type=Login does not exist. The next day (2022-03-09), the partition day=2022-03-09/type=Login is generated in the table ods_data. The expected behavior is that the Flink application can read the new data for 2022-03-09, but currently it cannot.
I have some ideas for optimization and will open a new issue. Thanks for your reply.
Hi, I encountered this problem again with Flink 1.16.1 and Hudi 0.13.1. Have you solved this problem?
Yeah, 0.14.0 supports this feature.
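For reference, streaming reads in Hudi's Flink SQL connector are driven by table options like the following. This is a sketch based on Hudi's documented Flink options; the exact configuration (and the table path and schema) is not shown in the thread and is assumed here:

```sql
-- Sketch: Hudi source table configured for Flink streaming read.
-- Path, schema, and table name are hypothetical.
CREATE TABLE ods_data_src (
  id BIGINT,
  ts TIMESTAMP(3),
  `day` STRING,
  `type` STRING
) PARTITIONED BY (`day`, `type`) WITH (
  'connector' = 'hudi',
  'path' = 'hdfs:///tmp/ods_data',          -- assumed storage path
  'table.type' = 'MERGE_ON_READ',
  'read.streaming.enabled' = 'true',        -- continuous incremental read
  'read.streaming.check-interval' = '60',   -- seconds between polls for new commits
  'read.start-commit' = 'earliest'          -- consume from the earliest commit
);
```

With `read.streaming.enabled`, the source polls the Hudi timeline for new commits at the configured interval; data written to newly created partitions arrives through new commits, which is the behavior this thread is about.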
Thanks!
I have run into a new problem these days; could you please help me? I have posted it on GitHub: https://github.com/apache/hudi/issues/11090
To Reproduce

1. Create table ods_data, partitioned by day, type, and streaming insert into it.
2. Streaming read dwd_data from ods_data, with SQL like this:
3. The next day, dwd_data's max time was '2022-03-08 23:59:59.000'. It seems the job cannot read new data in day=2022-03-09.
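The reporter's SQL was not captured in the thread. A minimal sketch of the pipeline described above, assuming the schema and source names (kafka_source and all columns are hypothetical, not the reporter's code):

```sql
-- Step 1 (sketch): streaming insert into the partitioned Hudi table ods_data
INSERT INTO ods_data
SELECT id, ts, `day`, `type` FROM kafka_source;

-- Step 2 (sketch): streaming read from ods_data into dwd_data,
-- using a Flink SQL dynamic table options hint to enable streaming read
INSERT INTO dwd_data
SELECT id, ts, `day`, `type`
FROM ods_data /*+ OPTIONS('read.streaming.enabled' = 'true') */;
```

The issue is that once the streaming-read job in step 2 is running, a partition such as day=2022-03-09/type=Login created after job start is not picked up.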
Expected behavior

Flink SQL + Hudi can discover new partitions dynamically, and the job will automatically read the new data in day=2022-03-09.

Environment Description