-
Write a Task that executes the DS Docker container to upload and catalog local data.
Eventually this Task needs to be part of the new Application Package DAG and be invoked after execution of the pro…
-
Help us understand your request (check below):
- [x] other
**Describe what you're trying to do**
https://github.com/ArctosDB/code-table-work/issues/87 has introduced a new term. I'm not sure…
-
### Describe the problem
As discussed, we want to shift away from the catalogs repo being our source of truth so we can include more of the dynamic info that comes from our data pipeline.
### Propo…
-
1) Use `scalar_one()` over `scalar()`, since the expectation is to return exactly one user/list
2) Extra `}` in `@router.delete("/users/{user_id}/lists/{list_id}}")`
3) In `def get_stores():` the `try` has the db transact…
-
### What happened
I am running table compaction using Spark Actions. My Spark action code is:
```scala
sparkActions
.rewriteDataFiles(table)
.option(RewriteDataFiles.PARTI…
-
Collaborators at CUAHSI will work on this to extend the record registration to the data catalog.
#JIRA=CAM-54
-
Hello,
I was confused that the following snippet returned a "normal" `xarray` object instead of one backed by dask arrays:
```py
cat = intake.open_catalog("https://data.nextgems-h2020.eu/catalo…
-
Write a DAG/Task that invokes the DS Docker container to stage-in data from the DS catalog.
Eventually this Task needs to be executed as part of the new Application Package CWL DAG, and followed by th…
-
As a Data Engineer, I would like to use the Unity Catalog in the Parking solution sample so that I understand how it works and don't need to work with the legacy Hive Metastore
**DoD**
- Successfully dep…
-
## Enhancement
We're using Iceberg/S3 as a backing store and are looking at StarRocks for compute (**i.e. ingestion into Iceberg outside of SR, but using a SR external Iceberg catalog for read-only comp…