QGIS3 natively handles PostgreSQL arrays: they are converted to QList and then to Python lists by PyQt.
However, this conversion appears to be very slow.
Here is a short benchmark:
1/ Create a database with the three following tables:
create table t_array as select 1 as id, array_agg(random()) as values from generate_series(1,100000);
create table t_string as select 1 as id, array_to_string(array_agg(random()),',') as values from generate_series(1,100000);
-- simulate a bytea of 100000 8-bytes floating point numbers
create table t_bytea as select 1 as id, decode(repeat('0102030405060708', 100000), 'hex') as values;
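The bytea layout can be checked in plain Python, independently of QGIS and PostgreSQL: 100000 repetitions of the 8-byte hex pattern decode to exactly 100000 doubles (a standalone sketch of what `decode(repeat(...), 'hex')` produces):

```python
import array
import binascii

# Reproduce the payload built by decode(repeat('0102030405060708', 100000), 'hex'):
# 100000 repetitions of an 8-byte pattern, i.e. 800000 bytes in total.
raw = binascii.unhexlify('0102030405060708' * 100000)
assert len(raw) == 800000

# Interpreted as IEEE-754 doubles, that is exactly 100000 values.
values = array.array('d', raw)
print(len(values))  # 100000
```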
2/ Load them in QGIS and run the following script:
import time
import array
l_array = QgsProject.instance().mapLayersByName("t_array")[0]
l_string = QgsProject.instance().mapLayersByName("t_string")[0]
l_bytea = QgsProject.instance().mapLayersByName("t_bytea")[0]
start = time.time()
start = time.time()
for f in l_array.getFeatures():
    assert len(f["values"]) == 100000
print("array", time.time() - start)

start = time.time()
for f in l_string.getFeatures():
    tt = [None if x == 'NULL' else float(x) for x in f["values"].split(",")]
    assert len(tt) == 100000
print("string", time.time() - start)

start = time.time()
for f in l_bytea.getFeatures():
    tt = array.array("d", f["values"].data())
    assert len(tt) == 100000
print("bytea", time.time() - start)
I have the following results (in seconds):