spinalcordtoolbox / exvivo-template

High resolution ex-vivo MRI template of the cervical spinal cord.

max download bandwidth reached #1

Closed · jcohenadad closed this issue 3 years ago

jcohenadad commented 3 years ago

Copied from a Slack message by @charleygros:

git clone https://github.com/sct-data/exvivo-template.git
leads to:
Cloning into 'exvivo-template'...
remote: Enumerating objects: 61, done.
remote: Counting objects: 100% (61/61), done.
remote: Compressing objects: 100% (46/46), done.
remote: Total 61 (delta 19), reused 46 (delta 11), pack-reused 0
Unpacking objects: 100% (61/61), done.
Checking connectivity... done.
Downloading template/label_spinalsegments.nii.gz (2.2 MB)
Error downloading object: template/label_spinalsegments.nii.gz (f23c66e): Smudge error: Error downloading template/label_spinalsegments.nii.gz (f23c66e5bbc6c9417d34dea501e7b45c3de03902b5bd4f6cd9c275633cdd8908): batch response: This repository is over its data quota. Account responsible for LFS bandwidth should purchase more data packs to restore access.
Errors logged to /home/charley/Downloads/template_regi_test/exvivo-template/.git/lfs/logs/20210304T140208.350098739.log
Use `git lfs logs last` to view the log.
error: external filter git-lfs smudge -- %f failed 2
error: external filter git-lfs smudge -- %f failed
fatal: template/label_spinalsegments.nii.gz: smudge filter lfs failed
warning: Clone succeeded, but checkout failed.
You can inspect what was checked out with 'git status'
and retry the checkout with 'git checkout -f HEAD'

This is indeed a known issue; it is being discussed in https://github.com/neuropoly/spinalcordtoolbox/issues/3214.
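
For reference, one standard Git LFS workaround in this situation (not part of the original message) is to clone with smudging disabled, so the checkout succeeds with small pointer files instead of failing on the quota, and to fetch the real files once bandwidth is available:

# Clone without downloading LFS objects; checkout writes pointer files instead
GIT_LFS_SKIP_SMUDGE=1 git clone https://github.com/sct-data/exvivo-template.git
cd exvivo-template

# Later, replace the pointers with the actual files
git lfs pull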

jcohenadad commented 3 years ago

Suggestions:

jcohenadad commented 3 years ago

OK, since this is urgent, I paid for it:

(screenshot: Screen Shot 2021-03-04 at 9:20 AM)

Can someone please try again to see if the download works?

In the meantime, @ajora agreed to find a solution for splitting the data into <100 MB chunks.
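
For illustration, a minimal sketch of the splitting idea using coreutils split; template.nii.gz here is a hypothetical file standing in for whichever files exceed the limit:

# Split into 90 MB chunks (template.nii.gz.part.aa, .ab, ...), each under GitHub's 100 MB limit
split -b 90M template.nii.gz template.nii.gz.part.

# Users reassemble the original file after cloning
cat template.nii.gz.part.* > template.nii.gz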

dyt811 commented 3 years ago

I looked into alternatives such as GitLab and Azure DevOps Repos; both can serve as an additional upstream/remote/mirror for the repo (especially for data-only repos). GitLab caps storage at 10 GB (with no apparent bandwidth limit), and Azure DevOps appears to have no limit at all. I recommend trying both while we keep exploring other options. Amazon S3 would be a last resort if it comes to that.

If someone could paste a Dropbox/Google Drive/FTP link to this repo's file contents, I can start setting up mirrors on those services. I would avoid any unnecessary git clone until this issue is fully addressed, since GitHub is SOOOOOOOO stingy with their bandwidth. Then, once the mirrors are up, we can issue PRs to any repos that rely on these files, to build in redundancy.
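
As a rough sketch of the mirroring step (the GitLab URL below is hypothetical), a second remote can be added and all refs pushed to it, with LFS objects pushed explicitly for good measure:

# Add the mirror as a second remote (hypothetical URL)
git remote add gitlab https://gitlab.com/sct-data/exvivo-template.git

# Push all branches and tags to the mirror
git push --mirror gitlab

# Ensure all LFS objects exist on the mirror as well
git lfs push --all gitlab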

jcohenadad commented 3 years ago

UPDATE: I should have purchased a data pack, not an account upgrade: (screenshot)

Drulex commented 3 years ago

Created https://github.com/Drulex/ghsplit

> In the meantime, @Ajora agreed to find a solution for splitting the data into <100 MB chunks.

@jcohenadad I think you meant @Drulex