firepol opened this issue 6 years ago
A workaround I used: install MongoDB on a virtual instance and change the configuration so it binds to the machine's IP instead of localhost (this allows external connections).
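For reference, roughly what that change looks like (example IP/paths, assumed defaults; firewall the port or enable auth before exposing it):

```
# by default mongod listens only on 127.0.0.1; bind it to the LAN IP (or 0.0.0.0)
# so the other machines can connect
mongod --dbpath /var/lib/mongodb --bind_ip 0.0.0.0 --port 27017

# or set net.bindIp in /etc/mongod.conf and restart the service
```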
On any virtual machine or bare-metal box I want to run Zenbot on, I change the config to point to the one MongoDB instance that I configured for remote access.
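On the Zenbot side, a sketch assuming the Mongo settings in Zenbot 4's conf.js (copy conf-sample.js to conf.js and check the exact key names there):

```
# hypothetical values - in conf.js on each machine, point Mongo at the DB box:
#   c.mongo.host = '192.168.1.50'   // IP of the MongoDB machine, not 'localhost'
#   c.mongo.port = 27017
#   c.mongo.db   = 'zenbot4'
# then verify the machine can actually reach it:
mongo --host 192.168.1.50 --port 27017 --eval 'db.runCommand({ ping: 1 })' zenbot4
```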
For example, I trade on Bittrex, but Bittrex doesn't allow downloading trade history. So I have one VM running the noop strategy on bittrex.ltc-usd with a 1-minute period and just let it go, and on another machine I trade bittrex.ltc-usd on a 1-hour period. This way I have 1-minute trade history stored, even though I'm only calculating per hour.
And on yet another machine I'm backfilling gdax.ltc-usd --days=15.
All instances use the same database, so all history is stored.
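Roughly, the commands for that setup would look like this (a sketch based on the description above; flag names can vary between Zenbot versions, so check ./zenbot.sh trade --help):

```
# machine 1: record 1-minute trade history without trading (noop strategy, paper mode)
./zenbot.sh trade bittrex.ltc-usd --paper --strategy noop --period 1m

# machine 2: actual trading on a 1-hour period, against the same shared DB
./zenbot.sh trade bittrex.ltc-usd --period 1h   # plus your usual strategy/options

# machine 3: backfilling GDAX history into the same DB
./zenbot.sh backfill gdax.ltc-usd --days=15
```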
That's a cool workaround, but it would be even better if you could export the data I don't have and share it with the community, so that e.g. I could import it into my fresh/empty DB (I only started using Zenbot recently). That was the idea: if we could agree on a place to store this data, it would be a big plus and really useful.
On the DB PC
mongodump --db=zenbot4 --gzip --archive=<filename>
Receiving PC:
mongorestore --gzip --archive=<filename>
If someone in the community wanted to host a massive collection of data, it could be done a couple of "quick" ways.
Mongo is pretty sweet like that. There is a way to export to .CSV but I don't know that off the top of my head.
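For the CSV route it's mongoexport; a sketch assuming Zenbot keeps its trades in a `trades` collection with fields roughly like these (verify the actual field names with db.trades.findOne() first):

```
# mongoexport needs an explicit field list when writing CSV
mongoexport --db=zenbot4 --collection=trades --type=csv \
  --fields=time,selector,trade_id,price,size,side \
  --out=zenbot4_trades.csv
```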
I'm currently doing something similar, but all the backtesting PCs are running as "replicaSet" members of the "tracker" PC. I had the database on a single PC and was having trouble reading the data fast enough to give the processor a full load.
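For anyone wanting to copy that setup, the bare-bones version of a replica set looks roughly like this (example hostnames and set name; a real setup should also handle auth/keyfiles per the MongoDB docs):

```
# start every member with the same replica set name
mongod --dbpath /var/lib/mongodb --replSet rs0 --bind_ip 0.0.0.0

# on the "tracker" PC, initiate the set and add the backtesting PCs
mongo --eval 'rs.initiate()'
mongo --eval 'rs.add("backtest-pc-1:27017")'
mongo --eval 'rs.add("backtest-pc-2:27017")'
```

The backtesting machines then read their own local copy of the data instead of all hammering the single tracker PC.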
As I wrote in #1155, I backfill data for many cryptocurrencies that I follow. For now I've dedicated this task to a low-consumption (and slow) PC... but I'd like to run the simulations on a faster PC.
The thing is, now I need to backfill all the data again on my faster PC.
It would be really awesome to be able to export all the backfilled data (a separate file per selector), transfer the files to my other machine, and then import them.
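With stock Mongo tools, a per-selector export could look something like this (assuming the trades collection stores the selector in a `selector` field; check the exact stored value and case with db.trades.findOne()):

```
# dump a single selector's trades into its own compressed archive
mongodump --db=zenbot4 --collection=trades \
  --query='{ "selector": "gdax.BTC-USD" }' \
  --gzip --archive=gdax.btc-usd.trades.archive.gz

# on the faster PC, restore it into the same database
mongorestore --gzip --archive=gdax.btc-usd.trades.archive.gz
```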
This would also enable other things: for example, some exchanges don't allow backfill, so imagine if somebody set up a Zenbot instance to monitor and save the data for those exchanges in real time and then shared the results, so that all Zenbot users could benefit by downloading and importing them.
Even better, there could be an option for this: specify a URL (it could be a static page hosting tar.gz files, zip files or whatever compression format suits the purpose) and let Zenbot try to fetch the exported data from there (if it exists) for these exchanges, instead of backfilling.
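The fetch-and-import step itself would be trivial; something like this (hypothetical URL, assuming the shared files are gzipped mongodump archives like the ones above):

```
curl -L -o bittrex.ltc-usd.trades.archive.gz \
  https://example.com/zenbot-data/bittrex.ltc-usd.trades.archive.gz
mongorestore --gzip --archive=bittrex.ltc-usd.trades.archive.gz
```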