-
**Describe the bug**
I am trying to convert the results obtained from a checkpoint run into a JSON-normalized format and read them into a PySpark DataFrame. I would appreciate your help with this.
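A minimal sketch of one possible approach (the checkpoint payload shape and key names below are illustrative assumptions, not taken from the original report): recursively flatten the nested result dict into dotted keys, the same shape `pandas.json_normalize` produces, then hand the list of flat dicts to `spark.createDataFrame`.

```python
def flatten(record, parent_key="", sep="."):
    """Recursively flatten a nested dict into dotted keys,
    similar in spirit to pandas.json_normalize (sketch only)."""
    items = {}
    for key, value in record.items():
        new_key = f"{parent_key}{sep}{key}" if parent_key else key
        if isinstance(value, dict):
            items.update(flatten(value, new_key, sep=sep))
        else:
            items[new_key] = value
    return items

# Hypothetical checkpoint result -- adjust to your actual payload.
result = {
    "run_id": "2024-01-01T00:00:00",
    "statistics": {"evaluated": 10, "successful": 9},
}
rows = [flatten(result)]
print(rows[0]["statistics.successful"])  # 9
```

Once you have `rows`, `spark.createDataFrame(rows)` should give you the DataFrame, assuming all records share the same flattened keys.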
[Environment Set up…
-
Under "Lakehouse > Building Data Lake > Hive" there are two segments that contain a "TODO" placeholder, which indicates incomplete documentation.
See the following link: https://doris.apache.org/docs/lakeh…
-
## Tell us about the problem you're trying to solve
**Edit: the GCS connector 'Bucket Path' setup has similar (but incomplete) functionality to the S3 connector but is undocumented. Critically, the S3…
-
Creating this issue to track findings on optimization scope in the Iceberg connector.
Presto calls Iceberg `Scan::planFiles` (https://github.com/apache/iceberg/blob/apache-iceberg-1.3.1/api/src/main/ja…
-
Hello all,
I'm trying to get the guardrail information for Direct Lake for a set of specific lakehouses. The idea is to have a report that will allow us to follow the guardrails when the d…
-
## Feature request
#### Which Delta project/connector is this regarding?
- [x] Spark
- [x] Standalone
- [ ] Flink
- [ ] Kernel
- [ ] Other
### Overview
What I want to propose is an _e…
-
**Describe the bug**
I was trying to use daft to query a Hudi-based lakehouse. An example is shown below. It prints the schema successfully, but it fails when converting to a pandas DataFrame with a…
-
**Describe the bug**
What I am trying to do is create a blank semantic model, connect it to a lakehouse in the workspace, define the tables and columns that I need from the lakehouse, and ad…
-
Dremio has evolved in recent years, adopting project and catalog perspectives as a lakehouse platform. This creates a need for a new connector in Cube that helps connect to Dremio Cloud, using the…
-
I originally had an Oct 2023 version of bc2adls. Unfortunately, BC was upgraded to V24 and I experienced the following issue when exporting the schema:
"Unable to write content to request stream; cont…