Closed marcusfechner closed 3 years ago
Hi @marcusfechner, for long date ranges we recommend lowering the batch size and increasing the batch pause. See: https://github.com/Leo4815162342/dukascopy-tools/tree/master/packages/dukascopy-node/examples/with-custom-batching#keep-in-mind For your case I'd recommend sticking to the default values (batch size of 10, and batch pause of 1000 ms):
npx dukascopy-cli -i aaplususd -v -from 1980-06-02 -to 2021-06-06 -t m1 -f csv -bs 10 -bp 1000
Or break your queries down into smaller chunks, either by month or by year, since the resulting CSV file can get very large.
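A minimal sketch of the per-year chunking approach, reusing the same CLI flags shown above (instrument, timeframe, and batch values are taken from this thread; the `echo` prefix only previews each command and would be removed to actually run the downloads):

```shell
#!/usr/bin/env bash
# Generate one dukascopy-cli invocation per year instead of a single
# 1980-2021 request, so each CSV stays small and server load per request is low.
gen_commands() {
  for year in $(seq 1980 2021); do
    # Remove the leading `echo` to actually execute each download.
    echo npx dukascopy-cli -i aaplususd -v \
      -from "${year}-01-01" -to "${year}-12-31" \
      -t m1 -f csv -bs 10 -bp 1000
  done
}
gen_commands
```

Running the fetches sequentially like this also keeps the default pause between batches, which is the behavior recommended above.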
Thanks Leo, so the servers are actually the bottleneck here, since too many requests are performed at once. Got it. I had just thought I could speed up the process, as I am creating a dataset with the history of all symbols.
Thanks for clarifying the issue.
Hey Leo, first of all, thanks for the nice script. It is the only one of all the dukascopy libraries that actually fetches the same data as I would get from the original homepage.
Now to the problem: I am using dukascopy-cli, and whenever I increase the batch size to more than 40, I always get the error...
... after the progressbar reaches 100%
The exact command I used:
Are you experiencing the same issue with higher batch sizes? If you need additional information, please reach out :)