Open ilhamv opened 3 months ago
Look for `__ptxcache__` files, and specifically for `.o` and `.ptx` files, per @braxtoncuneo.
You can use `du -sh *` in a directory for a human-readable list of how large each item in the directory is.
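As a quick illustration (the `/tmp/du_demo` scratch directory and file names are made up for the demo), piping `du -sh` through `sort -h` puts the largest entries last, which makes the offenders easy to spot:

```shell
# Build a scratch directory with files of known size (illustration only)
mkdir -p /tmp/du_demo
dd if=/dev/zero of=/tmp/du_demo/big.bin bs=1M count=5 2>/dev/null
dd if=/dev/zero of=/tmp/du_demo/small.bin bs=1K count=10 2>/dev/null

# Human-readable sizes, sorted so the largest item is last
du -sh /tmp/du_demo/* | sort -h
```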
The large files all seem to be due to the inf_shem361 examples: the `answer.h5` and data `.npz` files.
One possible way to handle that: replace the infinite-medium 361-group problem with an infinite-medium few-group problem (probably the 7-group c5g7 data).
The largest disk usage seems to come from:

```
 68M  .git/objects/b3
142M  .git/objects/pack
```
Now I'm less sure that the ~4 MB 361-group data is actually the culprit. I'll try https://rtyley.github.io/bfg-repo-cleaner/, which may give us more info.
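Before rewriting history, it can also help to confirm which blobs actually dominate the pack using plain Git plumbing. A self-contained sketch (the `/tmp/size_demo` repo and `blob.bin` file are invented for illustration; in practice you'd run the last two commands inside the real repo):

```shell
# Set up a throwaway repo containing one deliberately large blob
git -c init.defaultBranch=main init -q /tmp/size_demo
cd /tmp/size_demo
dd if=/dev/zero of=blob.bin bs=1M count=2 2>/dev/null
git add blob.bin
git -c user.name=demo -c user.email=demo@example.com commit -qm "add large blob"
git gc -q   # pack loose objects so sizes reflect what lives in .git/objects/pack

# List every object with its size; sorting numerically puts the biggest last
git rev-list --objects --all |
  git cat-file --batch-check='%(objecttype) %(objectsize) %(rest)' |
  sort -k2 -n | tail -3
```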
So, the BFG report shows:

```
Deleted files
-------------
Filename                              Git id
-------------------------------------------------------
Miniconda3-latest-Linux-ppc64le.sh  | cdb26f99 (94.9 MB)
analytic.zip                        | b3859ac8 (92.5 MB)
```
Now the `.git/objects` folder is 44M. More reasonable!
However, the next step (quoting the BFG docs) is:

> Finally, once you're happy with the updated state of your repo, push it back up (note that because your clone command used the `--mirror` flag, this push will update all refs on your remote server):
>
> ```
> $ git push
> ```
>
> At this point, you're ready for everyone to ditch their old copies of the repo and do fresh clones of the nice, new pristine data. It's best to delete all old clones, as they'll have dirty history that you don't want to risk pushing back into your newly cleaned repo.
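A minimal local dry run of that final step, with `/tmp` paths standing in for the real GitHub remote (all names here are hypothetical):

```shell
# A local bare repo plays the role of the remote server
git -c init.defaultBranch=main init -q --bare /tmp/remote.git

# Seed it with one commit, as a stand-in for the project's history
git clone -q /tmp/remote.git /tmp/seed 2>/dev/null
cd /tmp/seed
git checkout -q -b main 2>/dev/null || true   # make the branch name explicit
git -c user.name=demo -c user.email=demo@example.com \
    commit -q --allow-empty -m "initial"
git push -q origin main

# The cleaning workflow operates on a --mirror clone...
git clone -q --mirror /tmp/remote.git /tmp/cleaned.git
# ...(the BFG rewrite would happen here)... then pushes every ref back:
git -C /tmp/cleaned.git push -q   # --mirror clones push all refs by default

# Finally, everyone discards old clones and re-clones fresh:
git clone -q /tmp/remote.git /tmp/fresh
```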
Any thoughts? @clemekay @jpmorgan98
We may be able to reduce the size further once we remove the SHEM361 test problems and examples; I'll rerun the repo cleaner then. Nevertheless, we still need to think about the final cleaning step I mentioned in the previous comment.
Can we use this to remove unwanted files that were accidentally tracked in the Git history?
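It should: BFG's `--delete-files` option targets a named file across all of history. As a dependency-free sketch of the same effect using plain `git filter-branch` (the `/tmp/purge_demo` repo and `installer.sh` file are invented for the demo; BFG is faster on real repos):

```shell
# Identity for the demo commits only
export GIT_AUTHOR_NAME=demo GIT_AUTHOR_EMAIL=demo@example.com \
       GIT_COMMITTER_NAME=demo GIT_COMMITTER_EMAIL=demo@example.com

# Accidentally track a large file, then "remove" it the naive way
git -c init.defaultBranch=main init -q /tmp/purge_demo
cd /tmp/purge_demo
dd if=/dev/zero of=installer.sh bs=1M count=1 2>/dev/null   # stray large file
git add installer.sh && git commit -qm "oops: track installer"
git rm -q --cached installer.sh && git commit -qm "untrack installer"

# 'git rm' alone leaves the blob in history; rewrite every commit to drop it
FILTER_BRANCH_SQUELCH_WARNING=1 git filter-branch -f --index-filter \
  'git rm --cached --ignore-unmatch -q installer.sh' -- --all

# Delete backup refs and expire reflogs so gc can actually reclaim the space
git for-each-ref --format='%(refname)' refs/original |
  xargs -r -n1 git update-ref -d
git reflog expire --expire=now --all
git gc -q --prune=now
```

The last three commands matter: without removing `refs/original` and expiring the reflogs, the old objects stay reachable and `gc` cannot prune them.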