-
### What is the problem the feature request solves?
Comet currently supports only Spark's built-in data sources and Iceberg ([WIP](https://github.com/apache/iceberg/pull/9841)). We should also …
-
Any plans to support Delta Lake? Keep the CDM-specific manifests, metadata, etc. in ADLS Gen 2 and the data in Delta. This also removes a lot of operational burden, including partitioning.
I like…
-
Support transactions that, for example, insert data concurrently, or that modify data within disjoint data sets (e.g., partitions).
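For illustration, the check at the heart of such a scheme can be sketched as follows. This is a toy model only; the function name and the partition-string encoding are hypothetical, not any engine's actual API:

```python
def can_commit_concurrently(written_a, written_b):
    """Return True when two transactions' write sets touch disjoint
    partitions, so neither invalidates the snapshot the other read."""
    return not (set(written_a) & set(written_b))

# Disjoint partitions: both transactions may commit.
can_commit_concurrently({"date=2024-01-01"}, {"date=2024-01-02"})  # True

# Overlapping partition: the later committer must retry or fail.
can_commit_concurrently({"date=2024-01-01"}, {"date=2024-01-01"})  # False
```

A real optimistic-concurrency implementation would compare each transaction's read and write sets against the log entries committed since its snapshot, but the disjointness test above is the core idea.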
-
## Bug
### Describe the problem
The [PROTOCOL](https://github.com/delta-io/delta/blob/master/PROTOCOL.md) doesn't describe the CRC files in the delta_log directory; is this intentional?
Example: 00000000000…
-
Does MyDuck Server support data persistence to object storage (e.g., S3, COS)?
-
Hi folks, thanks for building such a great tool and extension.
I'm curious where write support falls on the priority list. I'm working on a data lakehouse + ETL architecture, and the write step is qu…
-
One thing that would be really nice to have is Delta Lake support (https://delta.io/) for Spark. It is widely used right now (pushed by Databricks).
Is there any good way to start looking at…
-
## Feature request
#### Which Delta project/connector is this regarding?
- [x] Spark
- [ ] Standalone
- [ ] Flink
- [ ] Kernel
- [ ] Other (fill in here)
### Overview
It's not possib…
-
I updated my Delta Lake to version 3.0.0 and use PySpark 3.5.0.
I use this code to save my df to Delta Lake:
```
df\
.write\
.format("delta")\
.saveAsTable(
…
```
-
The Delta connector currently collects statistics only for NDVs (using an HLL) and column sizes. This was done because, in most situations, the transaction log contains the rest of the statistics we nee…
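For background, NDV (number of distinct values) estimation with an HLL works roughly as below. This is a minimal self-contained sketch of the HyperLogLog technique, not the connector's actual implementation; the class name and parameters are illustrative:

```python
import hashlib
import math

class HyperLogLog:
    """Minimal HyperLogLog sketch for NDV estimation."""

    def __init__(self, b=10):
        self.b = b               # number of index bits
        self.m = 1 << b          # number of registers (here 1024)
        self.registers = [0] * self.m

    def add(self, value):
        # 64-bit hash: top b bits choose a register, the rest feed the rank.
        h = int.from_bytes(hashlib.sha1(str(value).encode()).digest()[:8], "big")
        idx = h >> (64 - self.b)
        rest = h & ((1 << (64 - self.b)) - 1)
        rank = (64 - self.b) - rest.bit_length() + 1  # leading zeros + 1
        self.registers[idx] = max(self.registers[idx], rank)

    def estimate(self):
        # Harmonic mean of register values with the standard bias correction.
        alpha = 0.7213 / (1 + 1.079 / self.m)
        raw = alpha * self.m * self.m / sum(2.0 ** -r for r in self.registers)
        zeros = self.registers.count(0)
        if raw <= 2.5 * self.m and zeros:  # small-range (linear counting) correction
            return self.m * math.log(self.m / zeros)
        return raw
```

With 2^10 registers the standard error is about 1.04 / sqrt(1024) ≈ 3%, which is why an HLL can summarize a column's NDV in a few kilobytes regardless of cardinality.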