-
This is a proposed alternative to how types are defined in the 3D Metadata spec, inspired by [basic datatypes in USD](https://graphics.pixar.com/usd/docs/api/_usd__page__datatypes.html).
# Proposed…
-
Hi, I have the following MariaDB table in my TRIADB project, and I would like to construct a similar one in ClickHouse. I also want to use arrays for the composite indexes.
```sql
CREATE TABLE DAT…
```
-
The [Apache Parquet](http://parquet.apache.org/) format is an open-source columnar storage format used in high-performance data analysis systems. The STOQS UI offers several Measured Parameter Data Ac…
-
Is there any reason why you did not go with the [Apache Arrow](https://arrow.apache.org/) format from the beginning?
It would at least be nice if you allowed `to_arrow_table` and `from_arrow_table` co…
-
**Describe the bug**
A calculated measure with a filter, which worked with external pre-aggregation using Postgres, now throws a divide-by-zero error using Cubestore.
**To Reproduce**
Create tw…
-
**Describe the bug**
Running tpcds query fails with
```
Caused by: org.apache.spark.SparkException:
Job aborted due to stage failure: ClassNotFound with classloader:
scala.tools.nsc.interprete…
-
Hi,
I have built the Citus extension from the GitHub master for Mac.
The server is PostgreSQL 13.2 on Mac using "Postgresapp.com"'s Mac App.
I have done a little performance testing based on the follow…
-
From what I understand, Parquet is for storage and Arrow is for in-memory querying. Are you planning to offer this on the JS side, or is that project mainly for learning only?
Similarly, it seems in th…
-
I'm trying to load a big database into Spark using spark_read_csv, but I'm getting the following error as output:
> Error: org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 …
-
Support the Apache Arrow data format in Arcon. It will require #45 to be completed first.