Closed mokhtarabadi closed 3 years ago
As mentioned in Discord, I will have a look but it will take a couple of days.
I think writing a new script that simply converts the downloaded data into passivbot's data frame format is enough. The archive isn't fully up to date — data is only available up to the previous month — so the current month can still be fetched via the API as before.
Also, I wrote a simple Java application to convert the downloaded data into 100,000-trade chunks, but it doesn't work in passivbot and the bot removed the files. What is the historical schema?
I planned to use the daily data because it's more fine-grained and you can get data up until the day before you start. Then you only need to fetch it for the current day via the API. And I wanted to integrate it into the downloader because it should be automatic.
Regarding the schema: trades are chunked by trade id in blocks of 100k, e.g. 100000 to 199999 is one chunk. The headers are trade_id, price, qty, timestamp, is_buyer_maker, and the filename is made up of the first trade id, last trade id, first timestamp, and last timestamp.
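That chunking and naming scheme can be sketched roughly like this — a minimal illustration, not passivbot's actual code; the underscore separator in the filename and the `.csv` extension are assumptions, since the comment only lists the four name components:

```python
import csv
import io

# Headers of passivbot's historical trade cache, as described above.
HEADERS = ["trade_id", "price", "qty", "timestamp", "is_buyer_maker"]

CHUNK_SIZE = 100_000  # every 100k trade ids form one chunk, e.g. 100000..199999


def chunk_start(trade_id: int) -> int:
    # First trade id of the chunk that contains trade_id.
    return (trade_id // CHUNK_SIZE) * CHUNK_SIZE


def chunk_filename(trades: list) -> str:
    # Name components: first trade id, last trade id, first timestamp,
    # last timestamp. The "_" separator is an assumption.
    first, last = trades[0], trades[-1]
    return (
        f"{first['trade_id']}_{last['trade_id']}_"
        f"{first['timestamp']}_{last['timestamp']}.csv"
    )


def write_chunk(trades: list) -> str:
    # Serialize one chunk as CSV with the header row above.
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=HEADERS)
    writer.writeheader()
    writer.writerows(trades)
    return buf.getvalue()
```

A converter that produces files with the wrong chunk boundaries or filename format would likely be discarded by the bot, which may explain the removed files mentioned above.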
I will probably implement it today.
PR #123 includes the improvements.
When I tested the new downloader, I sometimes saw a 404 Not Found error. Is that normal?
Yes, that's expected. That can happen if the current day does not exist yet as an archive. In this case, fetching will be done through the API.
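The fallback logic described here can be sketched as a small selector — a hedged example, not the downloader's actual implementation; the archive URL pattern is an assumption based on the binance-public-data repository linked below:

```python
import datetime

# Assumed daily archive URL pattern for USDT-margined futures aggTrades,
# based on the binance-public-data repo; the exact path may differ.
ARCHIVE_URL = (
    "https://data.binance.vision/data/futures/um/daily/aggTrades/"
    "{symbol}/{symbol}-aggTrades-{date}.zip"
)


def source_for(date: datetime.date, today: datetime.date, symbol: str = "BTCUSDT"):
    # Archives for the current day don't exist yet (the server returns 404),
    # so fall back to the REST API for today; use the daily archive otherwise.
    if date >= today:
        return ("api", symbol)
    return ("archive", ARCHIVE_URL.format(symbol=symbol, date=date.isoformat()))
```

In practice the downloader would also treat an unexpected 404 on a past date as a signal to fall back to the API rather than fail.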
Now everything works; close if needed.
I think we can use the Binance Futures archive for backtesting instead of downloading aggTrades from the Binance API; this would speed up downloading many times over.
Useful links:
https://www.binance.com/en/landing/data
https://github.com/binance/binance-public-data/
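The conversion step this proposal implies — mapping a raw archive CSV into passivbot's columns — can be sketched like this. The raw column order is an assumption based on the binance-public-data docs (archive files may also lack a header row), and the target columns come from the schema described later in the thread:

```python
import csv
import io

# Assumed raw Binance aggTrades archive columns (order per binance-public-data
# docs; verify against an actual file before relying on it).
RAW_COLUMNS = [
    "agg_trade_id", "price", "quantity",
    "first_trade_id", "last_trade_id", "transact_time", "is_buyer_maker",
]
# Target columns used by passivbot's historical cache.
OUT_COLUMNS = ["trade_id", "price", "qty", "timestamp", "is_buyer_maker"]


def convert(raw_csv: str) -> list:
    # Map raw archive rows to the passivbot column layout.
    reader = csv.DictReader(io.StringIO(raw_csv), fieldnames=RAW_COLUMNS)
    out = []
    for row in reader:
        out.append({
            "trade_id": int(row["agg_trade_id"]),
            "price": float(row["price"]),
            "qty": float(row["quantity"]),
            "timestamp": int(row["transact_time"]),
            "is_buyer_maker": row["is_buyer_maker"].strip().lower() == "true",
        })
    return out
```

Reading a whole day from a local zip this way avoids thousands of paginated API requests, which is where the speedup would come from.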