Open rhaas80 opened 1 year ago
This has become urgent now since it prevents tests from running: the runs failed due to out-of-disk-space errors (the failures just before were actual test failures).
As a stopgap measure I have removed some of the old versions from the checked-out data:

```
git rm -r version_?? version_1?? version_2?? version_3??
```

in commit b8f63046f5 ("temoprarily remove some records files to free space") of the tests repository, which can be reverted if and when we have a proper fix (using the `--no-checkout` option alluded to in the description).
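The stopgap and its eventual revert can be sketched in a throwaway repository (directory names here are illustrative, not the real tests layout):

```shell
set -e
cd "$(mktemp -d)"
git init -q demo && cd demo
git config user.email ci@example.com
git config user.name CI

mkdir version_42 version_142
touch version_42/data version_142/data
git add . && git commit -qm "add old version records"

# the stopgap: drop old version directories via shell globs
git rm -qr version_?? version_1??
git commit -qm "temporarily remove some records files to free space"

# once a proper fix lands, the stopgap commit can simply be reverted
git revert -n HEAD
git commit -qm "restore removed records"
test -f version_42/data && echo "records restored"
```

Because the removal is a single commit, `git revert` restores all the removed record files in one step once the disk-space fix is in place.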
For some reason, I hadn't added the `fetch-depth: 1` fix yet. I've added it in a PR. I will work on the `--no-checkout` fix now.
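For reference, the command-line analogue of the checkout action's `fetch-depth: 1` setting is a shallow clone; a sketch in a throwaway repository shows that only the tip commit is fetched:

```shell
# A shallow clone fetches only the most recent commit's objects,
# which is what actions/checkout does with `fetch-depth: 1`.
set -e
cd "$(mktemp -d)"
git init -q origin-repo && cd origin-repo
git config user.email ci@example.com
git config user.name CI
for i in 1 2 3; do
  echo "$i" > file
  git add file && git commit -qm "commit $i"
done
cd ..

# file:// is needed so git honours --depth for a local clone
git clone -q --depth 1 "file://$PWD/origin-repo" shallow
cd shallow
git rev-list --count HEAD    # only one commit is present
```

This shrinks the `.git` history but does not help with the size of the checked-out working tree itself.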
So the disk space warning is indeed back in: https://github.com/EinsteinToolkit/tests/actions/runs/4585746865 and that run actually died. One wonders which repository has the largest impact.
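One way to answer that is to compare the working tree against the `.git` object store with `du`; a sketch (file contents here are illustrative):

```shell
# Compare the checked-out files against the git metadata to see
# where the space actually goes.
set -e
cd "$(mktemp -d)"
git init -q demo && cd demo
git config user.email ci@example.com
git config user.name CI

# a 1 MiB file of zeros: large in the working tree, tiny once
# compressed into .git
head -c 1048576 /dev/zero > big.dat
git add big.dat && git commit -qm "add data"

du -hs -- *      # per-entry sizes of the checked-out files
du -hs .git      # size of the git metadata/object store
```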
If it is `gh-pages`, then there are in principle ways to reduce its on-disk size significantly, though I rather suspect the culprit is at least partially the compiled ET code. Running `du -hs *` shows how large the different checkouts are. So `gh-pages` is actually sizable, and the majority is in the checked-out files rather than in the `.git` objects directory, which is only 486MB.

Since there are only very few files we actually modify (other than adding new files), we should be able to start with a "checkout" that has nothing actually checked out (using git clone's
`--no-checkout` option) and operate on that after manually checking out the couple of files we do modify (e.g. `version.txt`). This means one has to manually add the files that one wants to add or modify, and we need to check that GitHub's `checkout` action does not "helpfully" run a `git commit -a` at the end of the workflow (which it may well do), since that would record all the never-checked-out files as deleted.
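A sketch of how such a flow could look, using a throwaway stand-in for the tests repository (`version.txt` and the `records/` path are illustrative). The key points are rebuilding the index after the `--no-checkout` clone, materializing only the files we touch, and committing with explicit paths rather than `git commit -a`:

```shell
set -e
cd "$(mktemp -d)"

# stand-in for the real tests repository
git init -q origin-repo && cd origin-repo
git config user.email ci@example.com
git config user.name CI
echo 1 > version.txt
mkdir records && echo old > records/run1.log
git add . && git commit -qm "initial data"
cd ..

git clone -q --no-checkout "file://$PWD/origin-repo" work
cd work
git config user.email ci@example.com
git config user.name CI

# after --no-checkout the index starts out empty; rebuild it from HEAD
# so an eventual commit does not silently drop every tracked file
git reset -q --mixed HEAD

# materialize only the files we actually modify
git checkout HEAD -- version.txt
echo 2 > version.txt
echo new > run2.log              # a newly added result file

git add version.txt run2.log     # explicit paths -- never `git commit -a`
git commit -qm "update version, add new record"

# the never-checked-out records/run1.log is still in the new commit
git cat-file -e HEAD:records/run1.log && echo "records intact"
```

With the mixed `git reset` in place, the untouched record files survive in the new commit even though they were never present in the working tree; a `git commit -a` at this point would instead stage them all as deletions, which is exactly the failure mode to guard against in the workflow.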