Leo4815162342 / dukascopy-node

✨ Download historical price tick data for Crypto, Stocks, ETFs, CFDs, Forex via CLI and Node.js ✨
https://dukascopy-node.app
MIT License

Error in data stream, when increasing the default batch-size #41

Closed marcusfechner closed 3 years ago

marcusfechner commented 3 years ago

Hey Leo, first of all thanks for the nice script; it is the only one of all the dukascopy libraries that actually fetches the same data I would get from the original homepage.

Now to the problem: I am using dukascopy-cli, and when I increase the batch size to more than 40, I always get the error...

Something went wrong: "Error in data stream"

... after the progress bar reaches 100%.

The exact command I used:

npx dukascopy-cli -i aaplususd -v -from 1980-06-02 -to 2021-06-06 -t m1 -f csv -bs 40

Are you experiencing the same issue with higher batch sizes? If you need additional information, please reach out :)

Leo4815162342 commented 3 years ago

Hi @marcusfechner, for long selected periods of time we recommend lowering the batch size and increasing the batch pause. See: https://github.com/Leo4815162342/dukascopy-tools/tree/master/packages/dukascopy-node/examples/with-custom-batching#keep-in-mind

For your case I'd recommend sticking to the default values (batch size of 10, and batch pause of 1000 ms):

npx dukascopy-cli -i aaplususd -v -from 1980-06-02 -to 2021-06-06 -t m1 -f csv -bs 10 -bp 1000

Or break your queries down into smaller chunks, either by month or by year, since the resulting CSV file can be very big.
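The per-year breakdown can be scripted with a small shell loop. This is just a sketch, assuming the same dukascopy-cli flags used earlier in this thread; the `echo` makes it a dry run that only prints the commands, so remove it to actually run the downloads.

```shell
# Sketch: split one long query into per-year chunks so each CSV stays
# manageable and each run stays within the recommended default batching
# (batch size 10, batch pause 1000 ms).
# The leading "echo" makes this a dry run; remove it to execute.
for year in $(seq 1980 2021); do
  echo npx dukascopy-cli -i aaplususd -v -from "${year}-01-01" -to "${year}-12-31" -t m1 -f csv -bs 10 -bp 1000
done
```

Each iteration produces its own CSV, which also makes it easy to re-run just the years that fail.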

marcusfechner commented 3 years ago

Thanks Leo, so the servers are actually the limiting factor here, since too many requests are performed; got it. I just thought I could speed up the process, as I am creating a dataset with the history of all symbols.

Thanks for clarifying the issue.