-
## Feature request
#### Which Delta project/connector is this regarding?
- [x] Spark
- [ ] Standalone
- [ ] Flink
- [ ] Kernel
- [ ] Other (fill in here)
### Overview
We are hopi…
-
We are using Spark RAPIDS + Spark Thrift Server to serve SQL requests on Spark 3.3.0 and RAPIDS 23.10, and we have a Delta table partitioned by a column, runName.
We executed a SQL query `SELECT DIST…
-
Follow-up of https://github.com/trinodb/trino/pull/21052
Also, we should verify the count of file listing when addressing this issue.
-
**Describe the bug**
Unable to write to SeaweedFS in Delta format using the HDFS client libraries; Parquet format works.
```python
#%%
import pyspark
#%% when not using spark submit
builder = pysp…
-
Currently, to support lakeFS over a lakehouse (which basically means being able to work with lakeFS and Delta Lake tables on Databricks), it is necessary to deactivate multi c…
-
### Short description
Delta Lake (https://delta.io/) provides a transactional storage layer on top of data lakes that could be used to stream data to and from S3-compatible storage.
### Deta…
-
### Checks
- [X] I have checked that this issue has not already been reported.
- [X] I have confirmed this bug exists on the [latest version](https://pypi.org/project/polars/) of Polars.
##…
-
Today, Iceberg writes support only merge-on-read mode. Copy-on-write mode is a frequent ask from users who want better file layout without needing to run compactions frequently.
Technically this co…
-
## Description
Add SLTs for Delta Lake, using Unity Catalog. We can probably just use the AWS Databricks deployment, but there might be some setup/teardown involved in ensuring we query the rig…
-
Hi,
Delta Lake provides ACID transactions for Parquet files.
AWS provides a Delta Lake connector for its commercial Presto.
Starburst recently blogged that they are in the process of doing the same.
Is there a…