-
Hello,
We have been experimenting with a multi-writer setup and have confirmed that it works perfectly with two writers. The image below shows our sample setup:
![image](https://github.com/apach…
-
**Describe the problem you faced**
The Hoodie MAGIC marker was written twice to a log file, which led to an "invalid block byte type found" error when the file was read during compaction.
**Environment Description**
* H…
-
**Describe the problem you faced**
One of my Spark jobs is reading data from a Hudi COPY_ON_WRITE table using the snapshot query type. The job runs once a day, but the table is updated every hour. …
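For a table that is updated hourly but read once a day, the Hudi docs describe an incremental query type that reads only the commits since a given instant, instead of a full snapshot. A minimal sketch of the read options, taken from the Hudi Spark datasource configuration reference; the instant-time values are placeholders, not from the issue:

```properties
# Read only commits in a time range instead of the full snapshot
hoodie.datasource.query.type=incremental
# Placeholder commit timestamps (format: yyyyMMddHHmmss)
hoodie.datasource.read.begin.instanttime=20240101000000
hoodie.datasource.read.end.instanttime=20240102000000
```

Whether incremental reads fit depends on the job's semantics; a snapshot query (the default, `hoodie.datasource.query.type=snapshot`) always returns the latest committed view of the whole table.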
-
When can I use Hoodie? Will users see the data I store in the browser?
-
First of all, thank you all for such an amazing tool 💯
I'm building an app that I want to be resilient under adverse network conditions, and I think Hoodie is the right choice, so I've b…
-
Hi.
I noticed that the tutorial/doc misleads people to the old doc, which I think differs significantly from the current Hoodie. I spent time reading through the old sample/doc and noticed someone el…
-
We have many Hudi tables at version 0.6.0 and want to upgrade to version 0.14.1 or 0.15.0, so we ran some tests. When we write to a 0.6.0 table with a 0.15.0 client, an error occurs.
**To Re…
-
**Describe the problem you faced**
Hi, I am trying a use case with multiple writers writing data into different partitions on version 0.14. I found a Medium article which says I can do multi writ…
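For reference, concurrent writers in Hudi require optimistic concurrency control and an external lock provider. A minimal sketch of the writer settings from the Hudi concurrency-control docs, assuming a ZooKeeper-based lock provider; the ZooKeeper endpoint values are placeholders, not from this issue:

```properties
# Enable optimistic concurrency control for multi-writer ingestion
hoodie.write.concurrency.mode=optimistic_concurrency_control
# Lazy cleaning so one writer's failure does not roll back another's inflight commit
hoodie.cleaner.policy.failed.writes=LAZY
# External lock provider (ZooKeeper shown; placeholder endpoint values)
hoodie.write.lock.provider=org.apache.hudi.client.transaction.lock.ZookeeperBasedLockProvider
hoodie.write.lock.zookeeper.url=zk-host
hoodie.write.lock.zookeeper.port=2181
hoodie.write.lock.zookeeper.lock_key=my_table
hoodie.write.lock.zookeeper.base_path=/hudi/locks
```

Writers targeting disjoint partitions still need these settings, since conflict detection and table-service coordination happen at commit time.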
-
**_Tips before filing an issue_**
- Have you gone through our [FAQs](https://hudi.apache.org/learn/faq/)?
- Join the mailing list to engage in conversations and get faster support at dev-subscribe@h…
-
**Problem**
We were running `Spark 3.2.1` along with `HUDI 0.11.1`. The jar link is: https://repo1.maven.org/maven2/org/apache/hudi/hudi-spark3.2-bundle_2.12/0.11.1/hudi-spark3.2-bundle_2.12-0.11.1…