Hello,
I am currently experimenting with tspDB and have a scenario comparable to use case 3 in your demo. I have 50 years of data on a monthly timestep, and the columns are correlated (they actually represent geographical areas). These areas (or "stocks," in the terminology of use case 3) number over 65,000, which exceeds PostgreSQL's dimensional limits. In practice, I found that the maximum number of columns I can use is about 1,000 (PostgreSQL's hard limit is 1,600). Is there some way (perhaps via a multi-index or something similar) to make use of all of the data to train the model?
Thanks a lot in advance!