Closed mtomic-quant closed 1 year ago
@mtomic-quant Thank you for raising the issue.
I've released a new beta version that addresses it and improves performance when downloading data for long periods. You can test it by installing the specific version 1.31.3-beta.1
npm install dukascopy-node@1.31.3-beta.1
Let me know how it goes. More info here: https://github.com/Leo4815162342/dukascopy-node/pull/119
dukascopy-node@1.31.3-beta.2
- a few small fixes and updates
@Leo4815162342 great stuff!
released stable: dukascopy-node@1.31.3
- https://github.com/Leo4815162342/dukascopy-node/releases/tag/v1.31.3
I use dukascopy-node via the CLI, mainly to download tick data for large periods of time, such as the last 10 years.
I noticed I get a
JavaScript heap out of memory
error, which I assume means there was not enough RAM to process the data after the download. However, I have 128 GB of RAM, and I sometimes get this error even with only 2 years of tick data. My questions are:
Could I download 10 years of tick data with a single command (at the moment I download one year at a time), but have the data processed in batches so I do not run out of RAM?
Once the download progress bar reaches 100%, it gets stuck there for some time, which I assume is the data-processing phase. Is there any way to speed this up and utilise all cores?
Thank you
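Until in-library batching lands, one workaround is to split a long range into per-year CLI runs so each process's memory stays bounded. Below is a minimal sketch; the instrument (`eurusd`), the flag set (`-i`, `-from`, `-to`, `-t`, `-f`), the year range, and the 8 GB heap value are all assumptions to adjust for your setup, and the script defaults to a dry run that only prints the commands.

```shell
# Hypothetical batching sketch: one dukascopy-node CLI call per year
# instead of a single 10-year request. Set DRY_RUN=0 to actually download.
DRY_RUN="${DRY_RUN:-1}"

for year in $(seq 2014 2023); do
  from="${year}-01-01"
  to="$((year + 1))-01-01"
  # Assumed flag names; check `npx dukascopy-node --help` for your version.
  cmd="npx dukascopy-node -i eurusd -from $from -to $to -t tick -f csv"
  if [ "$DRY_RUN" = "1" ]; then
    echo "$cmd"
  else
    # Optionally raise Node's heap ceiling (value in MB) for each run.
    NODE_OPTIONS="--max-old-space-size=8192" $cmd
  fi
done
```

Raising `--max-old-space-size` alone can also get you past the heap error for moderately larger ranges, but chunking the date range is what keeps peak memory flat regardless of the total period.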