dzfranklin closed this 1 month ago
Keeping the database up-to-date using minutely/hourly/daily diffs costs roughly 0.6 GB per week, whereas the clone download is 300–560 GB. That seems fairly wasteful, and certainly not an intended use of cloning, even without parallel downloads. For a start, you can apply the minutely diffs once per hour, or even once per day.
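Applying diffs on a coarser schedule can be as simple as a daily cron entry. A hypothetical sketch (the tool choice, pyosmium's `pyosmium-up-to-date`, and the file path are assumptions, not something this project prescribes):

```shell
# Hypothetical crontab entry: catch up on a day's worth of minutely
# diffs at 03:00. The path and tool are illustrative assumptions.
0 3 * * * pyosmium-up-to-date /data/planet.osm.pbf
```

Each run downloads only the changesets published since the last update, which is where the ~0.6 GB/week figure comes from.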
Would it be an acceptable use of your server's resources for me to modify download_clone.sh to download in parallel?
(For context I am considering running a batch job every week that downloads a clone, generates areas, and then copies the database files to my server. I think this will be simpler and less resource-intensive for me than following minutely updates.)
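For reference, a minimal sketch of what "download in parallel" could look like, using `xargs -P` to cap concurrency plus `curl`. This is an assumption about the approach, not the actual download_clone.sh; the demo below builds local `file://` URLs so it runs self-contained, whereas the real script would feed the clone's part URLs into the same pipeline:

```shell
set -e
work=$(mktemp -d)

# Build a stand-in "remote": two local files addressed as file:// URLs.
# In real use, parts.txt would list the clone's part URLs instead.
echo hello > "$work/a.txt"
echo world > "$work/b.txt"
printf 'file://%s/a.txt\nfile://%s/b.txt\n' "$work" "$work" > "$work/parts.txt"

mkdir "$work/out"
cd "$work/out"

# -P 4: at most four concurrent transfers, to stay polite to the server;
# --fail aborts on HTTP error responses; --remote-name keeps each part's
# original filename; --retry 3 retries transient failures.
xargs -P 4 -n 1 curl -sS --fail --remote-name --retry 3 < "$work/parts.txt"
```

Keeping `-P` small matters here: the question is precisely whether extra concurrent connections are an acceptable load on the server.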