-
## Description
In ETL pipelines, loading transformed data into various data warehouses is a critical requirement. Currently, the `ibis.TableDataset` connector in Kedro does not support data insertio…
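The distinction being requested is between replacing a table's contents on every save and inserting new rows into the existing table. A minimal sketch of those two semantics, using stdlib `sqlite3` as a stand-in for an Ibis backend (the `mode` names and table schema here are assumptions for illustration, not Kedro API):

```python
import sqlite3

def save(con, table, rows, mode="overwrite"):
    """Sketch of 'overwrite' vs. 'append' save semantics for a table dataset."""
    con.execute(f"CREATE TABLE IF NOT EXISTS {table} (id INTEGER, val TEXT)")
    if mode == "overwrite":
        con.execute(f"DELETE FROM {table}")  # replace existing contents
    con.executemany(f"INSERT INTO {table} VALUES (?, ?)", rows)

con = sqlite3.connect(":memory:")
save(con, "t", [(1, "a")])                 # initial load (overwrite)
save(con, "t", [(2, "b")], mode="append")  # insert without dropping existing rows
print(con.execute("SELECT COUNT(*) FROM t").fetchone()[0])  # → 2
```

With only overwrite semantics, the second save would have left one row; an append/insert mode preserves both.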
-
### Description
Right now, when we load 500 rows in the SQL runner, we do this by wrapping the SQL query in a CTE with a `limit` clause.
The issue with this is that the ordering isn't gu…
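The wrapping described above can be sketched as follows. Without an explicit `ORDER BY`, SQL engines are free to return rows in any order, so the 500-row preview can differ between runs; pinning the order inside the wrapper makes it deterministic. A stdlib `sqlite3` sketch (table and column names are hypothetical):

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE events (id INTEGER, name TEXT)")
con.executemany("INSERT INTO events VALUES (?, ?)", [(3, "c"), (1, "a"), (2, "b")])

user_sql = "SELECT * FROM events"

# Current approach: wrap the user's query in a CTE plus LIMIT. Row order is
# whatever the engine happens to return, since no ORDER BY is specified.
preview = f"WITH q AS ({user_sql}) SELECT * FROM q LIMIT 2"

# Adding an explicit ORDER BY inside the wrapper pins the ordering.
stable = f"WITH q AS ({user_sql}) SELECT * FROM q ORDER BY id LIMIT 2"
print(con.execute(stable).fetchall())  # → [(1, 'a'), (2, 'b')]
```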
-
### Description:
UAI : Example JSONs - Prepare example JSONs based on the UAI use case.
UAI : Gather JSONs from network participants (NPs) to review.
UAI : Review Beckn Compliance for NPs.
[Link](htt…
-
PDF:[p1249-cohen.pdf](https://github.com/mrdrivingduck/paper-outline/files/7095526/p1249-cohen.pdf)
This paper describes how large-scale data warehouses scale out. Because scale-out inevitably involves dumping and reloading data, it is a complex and error-prone process. The core of the paper is proposing robust, transactionally consistent primitives that enable efficient data…
-
## Description
In ETL pipelines, updating the existing records in data warehouses is a critical requirement. Currently, the `ibis.TableDataset` connector in Kedro does not support `upsert()` into Ib…
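An upsert inserts a row when the key is new and updates it in place when the key already exists. A minimal sketch of the requested semantics using SQLite's `INSERT ... ON CONFLICT` as a stand-in backend (table and column names are hypothetical, not Kedro or Ibis API):

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE dim (id INTEGER PRIMARY KEY, val TEXT)")
con.execute("INSERT INTO dim VALUES (1, 'old')")

# Upsert keyed on id: existing row 1 is updated rather than duplicated.
con.execute(
    "INSERT INTO dim (id, val) VALUES (?, ?) "
    "ON CONFLICT(id) DO UPDATE SET val = excluded.val",
    (1, "new"),
)
print(con.execute("SELECT val FROM dim WHERE id = 1").fetchone()[0])  # → new
```

A plain insert would have raised a primary-key violation here; the `ON CONFLICT` clause is what turns it into an update.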
-
**Description**
When I call the `set_workspace_warehouse_config` method from the `WarehousesAPI`, I get the exception
`databricks.sdk.core.DatabricksError: enable_serverless_compute is required`. But th…
-
### Information about bug
The Aging Stock report is not returning correct data. Stock aging is supposed to be calculated based on the GRN date, but it is being calculated based on the warehouse transfer date. E.g., we have one com…
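The expected behavior can be sketched as follows: age is measured from the GRN (goods receipt) date and should not change when stock is later moved between warehouses. Field names here are hypothetical, not ERPNext internals:

```python
from datetime import date

def stock_age_days(as_of, grn_date, transfer_date=None):
    """Expected aging: days since GRN; the transfer date is intentionally ignored."""
    return (as_of - grn_date).days

as_of = date(2024, 6, 1)
print(stock_age_days(as_of,
                     grn_date=date(2024, 1, 1),
                     transfer_date=date(2024, 5, 20)))  # → 152
```

The reported bug is equivalent to measuring from `transfer_date` instead, which would report only 12 days of age in this example.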
-
### Use Cases or Problem Statement
A new privilege [MANAGE SHARE TARGET](https://docs.snowflake.com/en/release-notes/bcr-bundles/2024_07/bcr-1734) was added to Snowflake. Trying to manage this throug…
-
### Component(s)
exporter/file
### Is your feature request related to a problem? Please describe.
Parquet Format:
Parquet is a columnar storage file format optimized for big data processing framew…
-
In 2.0 (I haven't validated 1.1 or earlier), Robotics grants the robot frames only. To unlock passive provider and storage chests, either Construction Robotics or Logistic Robotics must then be resear…