ronaldlam / Autotrageur

Automated arbitrageur

Analytics: Move minute scraping to a Linux machine and automate backup #248

Open jaonewguy opened 5 years ago

jaonewguy commented 5 years ago

The minute scrapers are running locally right now and require a manual backup to Google Drive.

This is not a feasible long-term solution, and we should investigate automating the process, including backup to something like S3.
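For a sense of how small the automated version could be, a single cron entry driving `aws s3 sync` would cover the backup half. The schedule, bucket name, and local path below are placeholders, not decisions:

```
# Hypothetical nightly backup: mirror the local minute-scrape dumps to S3 at 04:00.
# Bucket name and source path are made up; assumes the AWS CLI is installed
# and credentials are configured on the scrape machine.
0 4 * * * aws s3 sync /data/minute-scrapes s3://autotrageur-backups/minute-scrapes
```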

ronaldlam commented 5 years ago

So now that we have a little 30 GB GCP VM, there are a few options:

  1. Keep the existing setup and run periodic cron jobs to back up the binary logs: https://www.techrepublic.com/article/how-to-set-up-auto-rsync-backups-using-ssh/ (see the first sketch after this list)
    • Can use either worker1 (existing) or worker2 (GCP) for forex and exchange minute data
    • Pros: easier
    • Cons: explained below
  2. Set up a replication server: https://mariadb.com/kb/en/library/setting-up-replication/ (see the second sketch after this list)
    • Can have all scripts run on worker1
    • Pros: DB immediately available on GCP, extensible
    • Cons: more difficult to set up
  3. Swap out the production server: worker1 takes scrape/backup duty, worker2 runs the bot
    • The workload splits described above still apply
    • Pros: better networking (we don't have to deal with Shaw), and servers can be relocated for better performance
    • Cons: API keys live on an external network
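
For option 1, a minimal sketch of the cron + rsync approach from the linked article. The hostnames, `backup` user, and paths are assumptions, and passphrase-less SSH keys between the two machines are assumed to already be in place:

```
# crontab on worker1: push the MariaDB binlog directory to worker2 nightly at 03:10.
# /var/log/mysql and /backups/worker1-binlogs are placeholder paths; adjust to ours.
10 3 * * * rsync -az /var/log/mysql/ backup@worker2:/backups/worker1-binlogs/
```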
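For option 2, the core of the MariaDB KB setup is: enable the binlog on the master, create a replication user, then point the replica at the master's current binlog coordinates. A sketch assuming worker1 as master and worker2 as replica; the `repl` user, password, and log file/position are placeholders to be read off `SHOW MASTER STATUS`:

```
# worker1 (master): first enable binary logging in the MariaDB config and restart:
#   [mariadb]
#   server_id=1
#   log_bin=mariadb-bin
mysql -u root -p -e "CREATE USER 'repl'@'%' IDENTIFIED BY '<password>';
                     GRANT REPLICATION SLAVE ON *.* TO 'repl'@'%';"
mysql -u root -p -e "SHOW MASTER STATUS;"   # note File and Position for below

# worker2 (replica, server_id=2): point at worker1 and start replicating.
mysql -u root -p -e "CHANGE MASTER TO MASTER_HOST='worker1', MASTER_USER='repl',
                     MASTER_PASSWORD='<password>', MASTER_LOG_FILE='mariadb-bin.000001',
                     MASTER_LOG_POS=4; START SLAVE;"
mysql -u root -p -e "SHOW SLAVE STATUS\G"   # Slave_IO_Running/Slave_SQL_Running should be Yes
```

Once the replica has caught up, backups can be taken from worker2 without touching the live DB on worker1, which is what makes this the more extensible option.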