Hi,

Besides modeling matters, I've used IRIStbx intensively for managing time series databases, and there is no doubt that the toolbox is very good at it. However, I have found that when working with a very large database (e.g. >100,000 series), functions like db2array seem to be very slow.
Is there any way to speed up db2array()?
Most of the tasks I've done just involve converting a database containing a bunch of simple single-column time series objects. Compared with the simple piece of code below…
range = mm(2002,1):mm(2017,12);          % monthly range, Jan 2002 to Dec 2017
list = dbnames(db);                      % names of all series in the database
x = zeros(length(range), length(list));
for i = 1:length(list)
    % index each tseries by the date range to get its numeric values
    x(:,i) = db.(list{i})(range);
end
it runs about 4x faster than db2array(db). I know that db2array does much more complicated work (e.g. handling n-dimensional arrays), but for this kind of simple task, can we tell db2array to perform only the necessary steps? Or, if it's not feasible to alter db2array, it would be good to have another function written specifically for this purpose (maybe db2array2D), along the lines of the sketch below.
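Just to illustrate what I mean, here is a rough sketch of such a helper. The name db2array2D is purely hypothetical, and it assumes every field of the database is a single-column tseries (no n-dimensional handling at all), which is exactly my use case:

function [x, list] = db2array2D(db, range)
% DB2ARRAY2D  Hypothetical helper: convert a database of single-column
% tseries objects into a plain numeric array (dates down the rows,
% one column per series). No support for multi-column series.
list = dbnames(db);                      % series names = column order of x
x = nan(length(range), length(list));    % NaN where a series has no data
for i = 1:length(list)
    x(:,i) = db.(list{i})(range);        % numeric values over the range
end
end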
Best, Tos