go-acme / lego

Let's Encrypt/ACME client and library written in Go
https://go-acme.github.io/lego/
MIT License

Prepare release v4.13.1 #1960

Closed by ldez 1 year ago

ldez commented 1 year ago

During the release of v4.12.2, a "no space left on device" error appeared for the first time (https://github.com/go-acme/lego/actions/runs/5313815049). The problem happens during the build of the binaries: Go creates a lot of small artifacts during a build.

I was thinking that it's not really related to disk space but to inodes, so I decreased the number of concurrent builds to 1, and this fixed the problem for v4.12.3 (https://github.com/go-acme/lego/actions/runs/5316467651).

Now the problem has happened again with v4.13.0; I still think it's related to inodes and the build artifacts (https://github.com/go-acme/lego/actions/runs/5609596730/jobs/10263519288).
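
For reference, "no space left on device" (ENOSPC) is reported both when a filesystem runs out of blocks and when it runs out of inodes, so the two are easy to conflate. A quick sketch for telling them apart on the runner (standard coreutils, nothing project-specific):

$ df -h /    # block usage: Size / Used / Avail
$ df -i /    # inode usage: Inodes / IUsed / IFree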

I added a new step to the release process: free-disk-space-ubuntu.

Before the step:

$ df -h
Filesystem      Size  Used Avail Use% Mounted on
/dev/root        84G   61G   23G  73% /
tmpfs           3.4G  172K  3.4G   1% /dev/shm
tmpfs           1.4G  1.1M  1.4G   1% /run
tmpfs           5.0M     0  5.0M   0% /run/lock
/dev/sdb15      105M  6.1M   99M   6% /boot/efi
/dev/sda1        14G  4.1G  9.0G  31% /mnt
tmpfs           694M   12K  694M   1% /run/user/1001

After the step:

$ df -h
Filesystem      Size  Used Avail Use% Mounted on
/dev/root        84G   38G   46G  46% /
tmpfs           3.4G  172K  3.4G   1% /dev/shm
tmpfs           1.4G  1.1M  1.4G   1% /run
tmpfs           5.0M     0  5.0M   0% /run/lock
/dev/sdb15      105M  6.1M   99M   6% /boot/efi
/dev/sda1        14G  4.1G  9.0G  31% /mnt
tmpfs           694M   12K  694M   1% /run/user/1001
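
The exact content of the free-disk-space-ubuntu step isn't reproduced here, but steps like this typically just delete pre-installed toolchains the build doesn't need; a sketch of that idea, assuming the usual layout of GitHub-hosted Ubuntu images (the paths may change over time):

$ sudo rm -rf /usr/share/dotnet          # .NET SDKs
$ sudo rm -rf /usr/local/lib/android     # Android SDK/NDK
$ sudo rm -rf /opt/ghc                   # Haskell toolchain
$ sudo docker image prune --all --force  # pre-pulled Docker images
$ df -h /                                # verify the reclaimed space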

I tried the release flow (with --snapshot --skip-publish) on my fork, and it seems to work: https://github.com/ldez/lego/actions/runs/5610504773/jobs/10265562181
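
For anyone reproducing this locally, such a dry run boils down to something like the following; --snapshot and --skip-publish are the flags mentioned above, while the --parallelism value is only an assumption about how the number of concurrent builds was reduced:

$ goreleaser release --snapshot --skip-publish --parallelism 1   # build everything, publish nothing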

Note: every attempt takes about 50 min and there is no real debug access for GitHub Actions, so this is a painful process to debug.

dmke commented 1 year ago

I was thinking that it's not really related to disk space but to inodes

Could be both: Lego has a huge dependency tree, and Go build artifacts are not that small...

Does actions/setup-go also prepare a build cache? If so, we could try to disable it and run go clean -cache after building a binary (not sure if goreleaser supports a post-build command). This will definitely increase the build time, though.
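
A sketch of the manual equivalent of that idea, assuming the default build-cache location that go env reports (where exactly this could be hooked into the release pipeline is the open question):

$ go env GOCACHE    # where the build cache lives on the runner
$ go clean -cache   # drop the build cache between builds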

Note: every attempt takes about 50 min and there is no real debug access for GitHub Actions, so this is a painful process to debug.

To debug such issues, I've had some success running actions locally (or in a VM) using act.
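
As a rough sketch (the job and workflow names here are assumptions and need to match the actual release workflow under .github/workflows):

$ act -l                                            # list the jobs act can see
$ act -j release -W .github/workflows/release.yml   # run the release job locally in Docker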

ldez commented 1 year ago

Could be both: Lego has a huge dependency tree, and Go build artifacts are not that small...

The problem happens after the build of several binaries, so it's not directly related to the dependencies (mod cache) but to the build cache.

The build cache grows with LOC, and therefore with the dependencies at some point. The build cache is huge, but as I said, reducing the build parallelism fixed the problem, so it was more related to the number of small files than to their size.
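
To make the "many small files" point concrete, a rough way to compare the cache's size with its file count, again assuming the path reported by go env GOCACHE:

$ du -sh "$(go env GOCACHE)"                 # total size of the build cache
$ find "$(go env GOCACHE)" -type f | wc -l   # number of files (each one consumes an inode)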

Does actions/setup-go also prepare a build cache?

I cleaned all the caches before the run.

If so, we could try to disable it and run go clean -cache after building a binary (not sure if goreleaser supports a post-build command). This will definitely increase the build time, though.

I tried to find an option that would let me run go clean -cache, but there is none (at least in the free version of goreleaser).