balerion closed this issue 3 years ago
The smallest data pack is $5/month for 50 GB. This would solve the issue, in my opinion, but we need to agree on payment responsibility. My university has an odd policy against paying for monthly subscriptions; perhaps I can push a little, but they are usually strict.
Is it the case that you currently can't pull those files? I had seen this before, but I was under the impression that they don't enforce this limit for open repositories.
Edit: it is indeed the case that I can't push the changes:
batch response: This repository is over its data quota. Account responsible for LFS bandwidth should purchase more data packs to restore access.
The FLASH DAQ HDF repository also makes use of Git LFS. They are using a DESY-hosted GitLab, but perhaps we can ask them for assistance and store the LFS objects on DESY servers rather than on GitHub.
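If we go that route, the git remote itself can stay on GitHub while only LFS points elsewhere: Git LFS reads a committed `.lfsconfig` file for its endpoint. A minimal sketch, assuming a DESY GitLab endpoint (the URL and group name below are placeholders, not a real endpoint):

```shell
# Write the LFS endpoint override into .lfsconfig at the repo root.
# The URL is a placeholder for whatever DESY's GitLab actually exposes.
git config -f .lfsconfig lfs.url "https://gitlab.desy.de/<group>/hextof-data.git/info/lfs"

# Commit it so every clone picks up the same endpoint automatically
git add .lfsconfig
git commit -m "Point Git LFS at DESY-hosted endpoint"
```

Anyone cloning the GitHub repo would then fetch LFS objects from the DESY server transparently, without touching GitHub's LFS bandwidth quota.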
OK, in any case, there could be a hextof-data repo. If you decide to use the DESY service in the end, you can list the datasets on hextof-data as links; the same applies to any dataset deposited to Zenodo. I think hextof-data is sufficient for storing calibration datasets.
It was challenging to move the LFS files from GitHub to the DESY GitLab, but I did manage to do it (in a hextof-data repo there). Please check whether forking the data works as expected and suggest how to continue. Shall I remove the data files from the repository here? Please also give me the usernames of accounts that need ownership of that repository.
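For anyone who needs to repeat this later, the migration can be sketched roughly as below. The remote URL and group name are placeholders, not the actual ones used:

```shell
# Make sure every LFS object reachable from any ref is present locally
git lfs fetch origin --all

# Repoint the remote at the DESY GitLab (placeholder URL)
git remote set-url origin https://gitlab.desy.de/<group>/hextof-data.git

# Push all branches and tags, then push every LFS object to the new host
git push origin --all
git push origin --tags
git lfs push origin --all
```

The `--all` on `git lfs fetch`/`git lfs push` is the important part; without it only the objects referenced by the current checkout get transferred.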
My DESY username is agustsss, you can add me to that repo.
Did the LFS cleanup work? If so, we don't need to delete the data from hextof-processor, but it is also pointless to have it duplicated across the different repos. I would remove it. @balerion @RealPolitiX, what do you think?
In my earlier reply here, I mentioned that I cannot push changes (for the LFS cleanup) due to quota restrictions. We'd need to buy a plan or wait until the quota resets. I thought using the DESY GitLab for the data made sense, since I didn't see any quota restrictions there.
Sorry, I missed that. I agree that using the DESY GitLab for LFS is the better option, as licenses might be a problem to finance (even cheap ones).
If the data is available in the GitLab repository, and even mirrored in a GitHub one through links (though I feel that's redundant), I see no reason to keep it in hextof-processor, so if possible I would just remove it all. If, however, the LFS quota blocks us from doing so, let's set a reminder to do this once we have quota again.
@steinnymir https://github.com/momentoscope/hextof-processor/issues/71#issuecomment-765343803 Sure, I think it is a good idea. I would also remove it. Let's wait for the counter to reset.
I am unclear why the quota hasn't reset. In any case, we should remove the LFS files so that commits can be pushed to the repository again. Should I contact GitHub support, or will someone else do it?
Yes, you can do it. I don't understand why it didn't unlock either, but I guess if you explain to them that we need the quota unlocked in order to remove our usage of this feature, it should not be too much of a problem.
They reset the quota, so I used BFG to clean the .h5 files from the repo's history and to remove the Git LFS integration. The repo should now be free of data files. If you can confirm this, please close this issue.
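For reference, a cleanup along these lines with BFG Repo-Cleaner would look roughly as follows. The jar filename is illustrative (BFG ships as `bfg-<version>.jar`), and the exact steps I used may have differed slightly:

```shell
# BFG works on a bare mirror clone so it can rewrite every ref
git clone --mirror https://github.com/momentoscope/hextof-processor.git

# Strip all .h5 blobs from history (BFG protects the tip commit's files
# by default, so any .h5 still in HEAD must be deleted and committed first)
java -jar bfg.jar --delete-files '*.h5' hextof-processor.git

# Expire the reflog and repack so the removed blobs are truly gone
cd hextof-processor.git
git reflog expire --expire=now --all
git gc --prune=now --aggressive

# Publish the rewritten history
git push

# In a normal working clone: drop the LFS tracking rules and hooks
git lfs untrack '*.h5'
git lfs uninstall
```

Note that everyone with an existing clone has to re-clone (or hard-reset onto the rewritten refs) afterwards, since the commit hashes change.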
Cloning a fresh copy of the repo now results in a 37 MB folder, and I see no trace of LFS. Thanks for solving this; I will close the issue now.
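For anyone double-checking the same thing later, a quick way to confirm there is no LFS trace left (assuming git-lfs is installed locally):

```shell
# Empty output means no file in any ref is stored as an LFS pointer
git lfs ls-files --all

# No 'filter=lfs' rules should remain in the attributes file
grep 'filter=lfs' .gitattributes || echo "no LFS rules left"
```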
So there appears to be a 1 GB LFS bandwidth quota:
This means that, right now, I cannot even do an LFS fetch. Is this the right approach after all?
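As a stopgap while the bandwidth quota is exhausted, git-lfs can clone without downloading the large objects at all; only the tiny pointer files come down, which doesn't consume LFS bandwidth. A sketch (the `--include` path is hypothetical):

```shell
# Clone with LFS smudging disabled: working tree contains pointer files only
GIT_LFS_SKIP_SMUDGE=1 git clone https://github.com/momentoscope/hextof-processor.git

# Later, once quota is available, pull just the objects actually needed
cd hextof-processor
git lfs pull --include="path/to/needed_file.h5"
```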