Not sure what I was expecting. It seems that when scanning a Postgres native table, all the time is spent reading from it, and - at least in this simple case - that cost dwarfs any benefit from DuckDB's vectorized execution.
Feel free to close if this is expected.
I created a simple table:
=> \d x
       Table "x"
 Column |  Type   | Collation | Nullable | Default
--------+---------+-----------+----------+---------
 y      | numeric |           |          |
 z      | numeric |           |          |

=> select count(*) from x;
  count
----------
 16777216
(1 row)
=> analyze x;
ANALYZE
=> set max_parallel_workers=0;
SET
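For reference, a setup along these lines should reproduce a comparable table (an assumption on my side - the original creation and population statements aren't shown above; with `random()` values the sum of 16777216 rows lands near 8.39M, consistent with the results below):

```sql
-- Hypothetical reproduction: 16,777,216 rows of random numeric values.
CREATE TABLE x (y numeric, z numeric);
INSERT INTO x SELECT random(), random() FROM generate_series(1, 16777216);
ANALYZE x;
SET max_parallel_workers = 0;
```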
Then:
=> select sum(z) from x;
              sum
--------------------------------
 8385930.2158780710128370101138
(1 row)
Time: 3167.987 ms (00:03.168)
=> set duckdb.execution to true;
SET
=> select sum(z) from x;
        sum
-------------------
 8385930.215876569
(1 row)
Time: 6474.084 ms (00:06.474)
So all time is spent in the Postgres scan.
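One way to confirm that, shown here as a sketch (the exact plan node names depend on the pg_duckdb version, and `EXPLAIN ANALYZE` itself adds some measurement overhead):

```sql
-- Compare where the time is spent with and without DuckDB execution.
SET duckdb.execution TO false;
EXPLAIN (ANALYZE, BUFFERS) SELECT sum(z) FROM x;

SET duckdb.execution TO true;
EXPLAIN ANALYZE SELECT sum(z) FROM x;
```

If the scan node accounts for nearly all of the runtime in both plans, the aggregation itself leaves little room for vectorization to help.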