-
Hi!
I have tried to follow this to create a Linked Service connecting to a Fabric Lakehouse, but so far without success.
Firstly, it was not possible to find my SPN (app registration) directly - I…
-
### Issues Policy acknowledgement
- [X] I have read and agree to submit bug reports in accordance with the [issues policy](https://www.github.com/mlflow/mlflow/blob/master/ISSUE_POLICY.md)
### Where…
-
Creating this issue around findings for the optimization scope in the Iceberg connector:
Presto calls Iceberg `Scan::planFiles` (https://github.com/apache/iceberg/blob/apache-iceberg-1.3.1/api/src/main/ja…
-
### Is there an existing issue for this?
- [X] I have searched the existing issues and did not find a match.
### Who can help?
_No response_
### What are you working on?
I am trying to optimize w…
-
## New Features List for 4 Core Products:
* Mohaymen-ICT/star-protocols#14
* Mohaymen-ICT/star-protocols#15
* Mohaymen-ICT/star-protocols#16
* Mohaymen-ICT/star-protocols#17
## Technica…
-
This is more FYI than an issue, but it may be nice for other people to know.
We are using BC2ADLS in the Fabric version to copy data from BC to OneLake in Microsoft Fabric.
This worked really well…
-
**What**
That is a requirement for `pg_lakehouse`, and we should probably do it for `pg_search` too, while we're at it. Specifically we need the UUID file to not be written as a file to the filesyste…
-
Is there any way to capture which lakehouse or warehouse a semantic model is built on top of?
-
Hello,
I was trying to trigger the creation of a file in the "reset" folder that is used by the "CopyBusinessCentral" notebook:
`folder_path_reset = '/lakehouse/default/Files/reset/'`
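A trigger file in that folder can be created from plain Python in a notebook cell. This is only a sketch based on the path above; the helper name `create_reset_trigger` and the file name `reset.txt` are hypothetical placeholders, not anything the notebook itself defines:

```python
import os

def create_reset_trigger(folder_path='/lakehouse/default/Files/reset/',
                         filename='reset.txt'):
    """Create an empty trigger file in the reset folder.

    The default path mirrors `folder_path_reset` above; the file name
    'reset.txt' is a hypothetical placeholder.
    """
    os.makedirs(folder_path, exist_ok=True)   # create the folder if it is missing
    trigger_path = os.path.join(folder_path, filename)
    with open(trigger_path, 'w'):             # an empty file is enough to act as a flag
        pass
    return trigger_path
```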
```
%%pyspark
…
```
-
Hi everyone,
I want to raise a discussion about the current behavior in Drill regarding Parquet timestamps.
Drill uses `INT64` for timestamps and you can switch to `INT96` by setting `store.par…