I tried to use:
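(For context, a single-stream copy of one such file into a GCS bucket looks roughly like the sketch below. This is purely illustrative: gsutil is only an assumption about the tooling, and the filename and bucket path are placeholders, not the actual command that was run.)

```bash
# Purely illustrative; gsutil, the filename, and the bucket path are assumptions/placeholders.
gsutil cp ./sample-01.subreads.bam gs://submission-area/sample-01/
```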
But it took >12 hours to upload a single *.subreads.bam file (~750 GB). When I use the AWS CLI to copy 3 TB of similar data, it takes about 2 hours (roughly 400 MB/s, versus under ~17 MB/s here).
This use case was specific, but it will likely come up again: we have ~30 samples, each with ~3 TB of subreads files, that we want to preserve but that no one actively uses. We would like to back them up outside of the submission area, so that we don't have to pay to keep this data in Terra/GCP but still retain it.
It would be nice to issue a command like:
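(The sketch below is only meant to show the shape of the request; the command name, flag, and bucket paths are hypothetical placeholders, not an existing interface.)

```bash
# Hypothetical sketch: command name, flag, and bucket paths are placeholders.
backup-tool cp --no-checksum-recompute \
    gs://submission-area/sample-01/ \
    gs://cold-storage-bucket/sample-01/
```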
Where the copy command does not recalculate checksums.
I have written the command as `cp` rather than `mv` because we will want to back up the entire directory and then delete the huge files in place. This is simpler than moving specific files, and there will likely be cases where an actual copy is needed (as may be the case when creating "releases").
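As an interim workaround under current tooling, the copy-then-delete pattern could be approximated with gsutil, assuming that is acceptable here; the bucket names and the `*.subreads.bam` glob below are placeholders, and `-m` enables parallel transfers:

```bash
# Hypothetical workaround sketch; bucket paths and globs are placeholders.
# 1) Copy the whole sample directory to a cheaper backup bucket, in parallel.
gsutil -m cp -r gs://submission-area/sample-01/ gs://cold-storage-bucket/sample-01/

# 2) After verifying the copy, delete only the huge subreads files in place.
gsutil -m rm "gs://submission-area/sample-01/**/*.subreads.bam"
```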