llvm / llvm-project

The LLVM Project is a collection of modular and reusable compiler and toolchain technologies.
http://llvm.org

Release 13.0.1 missing binaries for linux/x86_64 #53892

Open froody opened 2 years ago

froody commented 2 years ago

On the releases page for 13.0.0 there is clang+llvm-13.0.0-x86_64-linux-gnu-ubuntu-20.04.tar.xz, but for 13.0.1 there's no corresponding binary. Is it possible to generate the same variants for 13.0.1 as 13.0.0?

str4d commented 2 years ago

Likewise, there are no binaries for Ubuntu 16.04, which were provided for 13.0.0. I recall that LLVM had previously planned to drop Ubuntu 16.04 support in LLVM 11, but those binaries are also required for Debian 9 (which doesn't reach End-of-Life until end of June 2022) due to sharing a glibc version, and subsequent LLVM releases including 13.0.0 have provided them.

I am confused as to why support for pre-18.04 Ubuntu releases would be dropped in a patch release; was this intentional? If not, could we have those variants generated for 13.0.1?

If it was intentional, it would be helpful to document LLVM's policy for dropping platform support. There is a Support Policy document, but it only covers what parts of the LLVM codebase are supported, not the platforms that supported releases are built for.

str4d commented 2 years ago

A previous example of release platform inconsistency:

At the time, I assumed this meant that the 12.0.1 patches were not relevant to x86_64-apple-darwin, and I had missed the fact that Ubuntu 20.04 was missing (as I only depend on the lowest-supported Ubuntu target).

tstellar commented 2 years ago

The reason there is platform inconsistency is because the binaries are provided by volunteers. We don't have any official builds that we produce consistently. For anyone interested in trying to fix this, I recommend writing a GitHub workflow that we can use to produce official binaries.

str4d commented 2 years ago

The reason there is platform inconsistency is because the binaries are provided by volunteers. We don't have any official builds that we produce consistently.

That is good information to know, thanks.

For anyone interested in trying to fix this, I recommend writing a GitHub workflow that we can use to produce official binaries.

Sure, now that I'm aware of the current process, I'd be happy to contribute an automated one (at least for the platforms corresponding to standard GitHub builders).

tstellar commented 2 years ago

Sure, now that I'm aware of the current process, I'd be happy to contribute an automated one (at least for the platforms corresponding to standard GitHub builders).

You can build inside containers on the GitHub builders, so you aren't limited to just the distributions supported by the builders (currently Ubuntu, Windows, and macOS).
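For concreteness, a container job looks roughly like this (a minimal sketch; the image and steps are illustrative, not what the release builds actually use):

```yaml
# Minimal sketch (illustrative only): a job whose steps run inside a
# distribution container on a GitHub-hosted runner, so the toolchain and
# glibc come from the image rather than from the runner's own OS.
name: container-build-sketch
on: workflow_dispatch
jobs:
  build-in-container:
    runs-on: ubuntu-latest
    container:
      image: ubuntu:20.04
    steps:
      - uses: actions/checkout@v4
      - name: Install build dependencies inside the container
        run: |
          apt-get update
          apt-get install -y build-essential cmake python3
      - name: Placeholder for the real release build
        run: cmake --version
```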

str4d commented 2 years ago

What build process should I follow to reproduce the distributions that are currently produced? I've found https://llvm.org/docs/BuildingADistribution.html, which references clang/cmake/caches/DistributionExample.cmake, but that currently causes an error locally, and I haven't found anything else describing how a "standard" distribution is configured.

tstellar commented 2 years ago

For building the release binaries, all you need to do is run the test-release.sh script. More details here: https://llvm.org/docs/ReleaseProcess.html#test-release-sh
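A minimal sketch of running it from a workflow might look like the following (the flags shown are from a reading of the script and the docs, and may not match the current version):

```yaml
# Sketch only: check out the monorepo and run test-release.sh. Flag names
# are from a reading of the script/docs and may differ in practice.
name: test-release-sketch
on: workflow_dispatch
jobs:
  test-release:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - name: Run test-release.sh
        run: |
          mkdir build && cd build
          # -release/-final pick the tagged release; -triple only labels the
          # resulting tarball; -j sets build parallelism.
          ../llvm/utils/release/test-release.sh \
            -release 13.0.1 -final \
            -triple x86_64-linux-gnu-ubuntu-20.04 \
            -j "$(nproc)"
```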

str4d commented 1 year ago

I've finally had time to finish working on this, and have a branch up that modifies the release tasks workflow to build binaries (currently only building on Ubuntu 18.04 and 20.04 for testing purposes): https://github.com/str4d/llvm-project/blob/release-workflow/.github/workflows/release-tasks.yml

The issue I'm running into is that the builds take longer than six hours, which exceeds the maximum job runtime for GitHub-hosted runners.

llvmbot commented 1 year ago

@llvm/issue-subscribers-backend-x86

tstellar commented 1 year ago

@str4d Thanks for working on this. A few suggestions:

str4d commented 1 year ago
  • I think it would be better to factor this workflow out into its own file

Heh, that's how I originally had it set up when I started working on it in April, but when I picked it back up I saw the new release tasks script that was added in July and figured that was where it should go. I can easily split it back out (though I'll need to duplicate the validation step that I currently share), and for now I'll change it to only run manually (rather than trying to have the release tasks workflow trigger it, though I do know how to do this if desired).

  • We have access to more powerful runners that you can use.

Nice! I'll update the branch to also have setups for all the runner types that GitHub Actions offers (on which I can exercise the workflow up to the start of the multi-hour build phase). Then we can see about migrating it to the set of runners that you use.
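For concreteness, the rough shape I have in mind is a manually triggered workflow fanned out over the hosted runner types (a sketch only; the input name and runner list are illustrative, not the actual branch contents):

```yaml
# Rough shape only: a manually triggered workflow fanned out over the
# GitHub-hosted runner types.
name: release-binaries-sketch
on:
  workflow_dispatch:
    inputs:
      release-version:
        description: 'Release tag to build (e.g. 15.0.6)'
        required: true
jobs:
  build:
    strategy:
      fail-fast: false
      matrix:
        os: [ubuntu-20.04, ubuntu-22.04, macos-latest, windows-latest]
    runs-on: ${{ matrix.os }}
    steps:
      - uses: actions/checkout@v4
      - name: Build release binaries
        run: echo "would build ${{ inputs.release-version }} on ${{ matrix.os }}"
```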

str4d commented 1 year ago

Okay, I now have a separate workflow that runs manually:

Currently it only builds binaries for targets matching GitHub-hosted runners. I did look into containers, but I could not find recent FreeBSD Docker images that I felt were trustworthy, and I don't know enough about the LLVM release process to know how cross-compiling might work.

tstellar commented 1 year ago

@str4d I enabled the more powerful runners and also hacked the workflow so it would trigger on a pull request: https://github.com/llvm/llvm-project-release-prs/pull/208

Unfortunately, it looks like the more powerful runners may no longer be free, so we will have to wait to enable this until we get a budget or verify that we aren't going to be charged. I will work on this.

tstellar commented 1 year ago

@str4d Sorry for the delay. We have the more powerful builders enabled now, and I ran a test run with your patch: https://github.com/llvm/llvm-project-release-prs/actions/runs/3833935156/jobs/6631222305

Unfortunately some of the tests fail. I think we need to add an option to test-release.sh that will package the binaries even if the tests fail. We also may want to use the --ninja flag since I think that will give us faster builds. If you want to update your branch with these fixes, I can test it again.

str4d commented 1 year ago

I think we need to add an option to test-release.sh that will package the binaries even if the tests fail.

I don't know enough about the test-release.sh script to do this. If someone else can suggest the necessary changes I can include them.

We also may want to use the --ninja flag since I think that will give us faster builds.

I've added the -use-ninja flag to test-release.sh, and CI now installs Ninja on the configured builders. I also rebased onto the llvmorg-15.0.6 tag.

https://github.com/str4d/llvm-project/commit/69b47d855fad658e0e8760a2c07f5464b22d39cf
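For reference, installing Ninja on each hosted runner type only needs a package-manager step per OS; roughly like this sketch (the actual build steps are elided):

```yaml
# Sketch: per-OS Ninja installation ahead of running test-release.sh with
# -use-ninja. The matrix and steps are illustrative.
name: ninja-install-sketch
on: workflow_dispatch
jobs:
  build:
    strategy:
      matrix:
        os: [ubuntu-22.04, macos-latest, windows-latest]
    runs-on: ${{ matrix.os }}
    steps:
      - name: Install Ninja (Linux)
        if: runner.os == 'Linux'
        run: sudo apt-get update && sudo apt-get install -y ninja-build
      - name: Install Ninja (macOS)
        if: runner.os == 'macOS'
        run: brew install ninja
      - name: Install Ninja (Windows)
        if: runner.os == 'Windows'
        run: choco install -y ninja
```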

tstellar commented 1 year ago

I think we need to add an option to test-release.sh that will package the binaries even if the tests fail.

I don't know enough about the test-release.sh script to do this. If someone else can suggest the necessary changes I can include them.

Ok, I can try to come up with a patch.

str4d commented 1 year ago

For reference, I pulled out the unique targets that have published binary releases for 13.0.0, 14.0.0, and/or 15.0.0:

jacob-carlborg-apoex commented 1 year ago

@str4d FYI I've created a GitHub action [1] that allows running commands on platforms not supported by GitHub (they run in a VM). It currently supports FreeBSD, OpenBSD and NetBSD.

Linux on ARM and PPC can easily be run using Docker and QEMU. Here's an example from one of my pipelines [2].

[1] https://github.com/marketplace/actions/cross-platform-action [2] https://github.com/jacob-carlborg/lime/blob/f4d9c8c4265b61b2844b93a01f38b2d09433830e/.github/workflows/ci.yml#L163-L179
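To make the Docker + QEMU approach concrete, a minimal sketch (the image and command are placeholders; a real build would mount the sources and run the release script, and would be much slower than a native build):

```yaml
# Sketch: run an aarch64 container on an x86_64 hosted runner via QEMU
# user-mode emulation.
name: qemu-arm64-sketch
on: workflow_dispatch
jobs:
  arm64-smoke-test:
    runs-on: ubuntu-latest
    steps:
      - name: Register QEMU binfmt handlers for foreign architectures
        uses: docker/setup-qemu-action@v3
      - name: Run a command inside an arm64 container
        run: docker run --rm --platform linux/arm64 ubuntu:22.04 uname -m  # prints "aarch64"
```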

tstellar commented 1 year ago

@str4d Would you be able to submit the patch on Phabricator?

tstellar commented 1 year ago

@str4d A few more suggestions:

str4d commented 1 year ago

@str4d Would you be able to submit the patch on Phabricator?

I'll see if I can figure this out.

tstellar commented 1 year ago

@str4d I went ahead and submitted the patch here: https://reviews.llvm.org/D143535

str4d commented 4 months ago

I was blocked from upgrading to LLVM 16+ in my use case because volunteers stopped building the Intel macOS binaries, and that target was removed from my automated workflow when it was upstreamed (due to the timeouts mentioned above). Now that we've dropped support for Intel macOS in our use case, I'm back looking at this again.

It looks like @tstellar chose Ubuntu 22.04 as the target to build for when upstreaming my workflow, and removed the Ubuntu 18.04 and 20.04 variants. Unfortunately, this produces binaries that require a newer glibc than is available on several other Debian-based releases that are still within their support windows:

(EDIT: I now see that whoever builds the Linux binaries manually built the LLVM 18 binary on Ubuntu 18.04, so at least for LLVM 18 we currently have broad glibc compatibility.)

| Platform | glibc | End of Support | Notes |
|---|---|---|---|
| Ubuntu 18.04 | 2.27 | April 2023 | Binaries available for LLVM 15 and below, as well as LLVM 18.1.4 |
| Debian Buster | 2.28 | June 2024 | |
| Ubuntu 20.04 | 2.31 | April 2025 | |
| Debian Bullseye | 2.31 | June 2026 | |
| Ubuntu 22.04 | 2.35 | April 2027 | Binaries available for LLVM 16 and 17 |
| Debian Bookworm | 2.36 | June 2028 | |

Debian Buster reaches EoS in two months, so I think we can ignore that. Hence, if the automated release workflow built with Ubuntu 20.04 instead of Ubuntu 22.04, we would have maximal glibc coverage.
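For what it's worth, the glibc floor of a produced binary can be checked directly; a rough sketch (the binary path is illustrative):

```yaml
# Sketch: report the highest GLIBC_* symbol version a built binary requires,
# i.e. the oldest glibc it can run against. The binary path is illustrative.
name: glibc-floor-check-sketch
on: workflow_dispatch
jobs:
  check-glibc:
    runs-on: ubuntu-22.04
    steps:
      - name: Check glibc requirement of a built clang binary
        run: |
          objdump -T ./clang+llvm-*/bin/clang \
            | grep -o 'GLIBC_[0-9.]*' | sort -uV | tail -n 1
```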

@tstellar it looks like PRs are now done directly in GitHub? If that's the case, I'll open a PR with this proposed change.

str4d commented 4 months ago

Additionally, now that GitHub is deploying ARM macOS runners, it would be a good idea to look into adding that as an automated target (replacing the current volunteer-uploaded builds, reducing burden on them). The ARM macOS runners are noticeably faster than their Intel macOS counterparts AFAICT, so timeouts should hopefully not be an issue.
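A minimal sketch of targeting the Apple Silicon runner (assuming the macos-14 label, which is arm64; macos-13 remains x86_64):

```yaml
# Sketch: select the Apple Silicon hosted runner by label and confirm the
# architecture. The build steps themselves are elided.
name: macos-arm64-sketch
on: workflow_dispatch
jobs:
  build-macos-arm64:
    runs-on: macos-14
    steps:
      - uses: actions/checkout@v4
      - name: Confirm architecture
        run: uname -m   # prints "arm64" on Apple Silicon runners
```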

jacob-carlborg commented 4 months ago

@str4d For Linux, how about building fully statically linked binaries? Then you don't need to worry about which distribution or version. Build in a container running Alpine, which uses Musl libc.
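Purely as a sketch of what I mean (the package list and CMake flags are assumptions on my part, not a vetted LLVM release configuration, and I don't know whether this links cleanly):

```yaml
# Sketch: build inside an Alpine (musl) container so the result has no glibc
# dependency. Package names and CMake flags are illustrative assumptions.
name: musl-static-sketch
on: workflow_dispatch
jobs:
  build-musl:
    runs-on: ubuntu-latest
    container:
      image: alpine:3.19
    steps:
      - name: Install build dependencies
        run: apk add --no-cache build-base cmake python3 git linux-headers
      - name: Check out the sources
        run: git clone --depth 1 --branch llvmorg-15.0.6 https://github.com/llvm/llvm-project .
      - name: Configure and build clang, requesting static executables
        run: |
          cmake -S llvm -B build \
            -DCMAKE_BUILD_TYPE=Release \
            -DLLVM_ENABLE_PROJECTS=clang \
            -DCMAKE_EXE_LINKER_FLAGS="-static"
          cmake --build build -- -j"$(nproc)"
```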

jacob-carlborg commented 4 months ago

BTW, the Linux runners have been updated and are faster now as well.

str4d commented 4 months ago

@str4d For Linux, how about building fully statically linked binaries? Then you don't need to worry about which distribution or version. Build in a container running Alpine, which uses Musl libc.

I personally don't mind that, but I'm not an LLVM developer, so a) I don't know whether there are other considerations here in terms of how LLVM compilers behave when built against musl rather than glibc, and b) I have no idea how to actually set up LLVM to be compiled that way (i.e. whether building in an Alpine container is sufficient, or whether there are additional LLVM build-system considerations). In my own experience with musl in end-user binaries (for applications, not compilers), I've generally had to provide two sets of binaries for different users, and I could imagine the same applying here (but again, I'm not an LLVM developer, so I don't know).

In any case, the automated release workflow I wrote is intentionally pluggable, so it should be relatively straightforward to define another runner that does a Musl build.

jacob-carlborg commented 4 months ago

I've given this some more thought, and my idea would only work for the binaries and not for the precompiled libraries.

adzenith commented 4 months ago

@str4d I went ahead and submitted the patch here: https://reviews.llvm.org/D143535

Looks like this diff was merged and the yml file is in the repo. Is there another step that's required here to get this to fire when new releases are cut? I saw comments about needing paid runners in some PRs that touched that file. Is funding the limiting factor here? (Can we contribute somewhere?) The releases also each still include a comment about how the binaries are built by volunteers, so I'm unsure how this all works currently.

str4d commented 4 months ago

@adzenith the release binaries workflow is already firing on releases: https://github.com/llvm/llvm-project/actions/workflows/release-binaries.yml

However, it appears to be failing tests, which I think is leading to the final binary tarball not being assembled, and therefore the subsequent upload fails due to a missing file. I do not know enough about the LLVM test infrastructure to know how to address these, but I've pulled (what I believe to be) the test results from the most recent run (which was manually triggered, and seems to have more failures than the prior automated run):

2024-04-05T05:59:31.9924169Z Failed Tests (25):
2024-04-05T05:59:31.9931865Z   BOLT :: perf2bolt/perf_test.test
2024-04-05T05:59:31.9932405Z   Flang :: Lower/io-implied-do-fixes.f90
2024-04-05T05:59:31.9932863Z   Flang :: Lower/io-statement-2.f90
2024-04-05T05:59:31.9933496Z   Flang :: Lower/vector-subscript-io.f90
2024-04-05T05:59:31.9934127Z   lldb-api :: commands/expression/context-object/TestContextObject.py
2024-04-05T05:59:31.9935128Z   lldb-api :: commands/expression/import-std-module/deque-dbg-info-content/TestDbgInfoContentDequeFromStdModule.py
2024-04-05T05:59:31.9936347Z   lldb-api :: commands/expression/import-std-module/list-dbg-info-content/TestDbgInfoContentListFromStdModule.py
2024-04-05T05:59:31.9937579Z   lldb-api :: commands/expression/import-std-module/vector-dbg-info-content/TestDbgInfoContentVectorFromStdModule.py
2024-04-05T05:59:31.9938709Z   lldb-api :: functionalities/asan/TestMemoryHistory.py
2024-04-05T05:59:31.9939462Z   lldb-api :: functionalities/data-formatter/data-formatter-advanced/TestDataFormatterAdv.py
2024-04-05T05:59:31.9940391Z   lldb-api :: functionalities/data-formatter/data-formatter-categories/TestDataFormatterCategories.py
2024-04-05T05:59:31.9941277Z   lldb-api :: functionalities/data-formatter/data-formatter-cpp/TestDataFormatterCpp.py
2024-04-05T05:59:31.9942200Z   lldb-api :: functionalities/data-formatter/data-formatter-python-synth/TestDataFormatterPythonSynth.py
2024-04-05T05:59:31.9943175Z   lldb-api :: functionalities/data-formatter/data-formatter-smart-array/TestDataFormatterSmartArray.py
2024-04-05T05:59:31.9944205Z   lldb-api :: functionalities/data-formatter/data-formatter-stl/libcxx/iterator/TestDataFormatterLibccIterator.py
2024-04-05T05:59:31.9945217Z   lldb-api :: functionalities/data-formatter/data-formatter-stl/libcxx/map/TestDataFormatterLibccMap.py
2024-04-05T05:59:31.9946301Z   lldb-api :: functionalities/data-formatter/data-formatter-stl/libcxx/unordered_map/TestDataFormatterLibccUnorderedMap.py
2024-04-05T05:59:31.9947305Z   lldb-api :: functionalities/data-formatter/data-formatter-synth/TestDataFormatterSynth.py
2024-04-05T05:59:31.9948048Z   lldb-api :: functionalities/gdb_remote_client/TestGDBRemoteClient.py
2024-04-05T05:59:31.9948782Z   lldb-api :: functionalities/process_save_core_minidump/TestProcessSaveCoreMinidump.py
2024-04-05T05:59:31.9949460Z   lldb-api :: lang/c/global_variables/TestGlobalVariables.py
2024-04-05T05:59:31.9950006Z   lldb-api :: lang/cpp/class_static/TestStaticVariables.py
2024-04-05T05:59:31.9950676Z   lldb-api :: python_api/formatters/TestFormattersSBAPI.py
2024-04-05T05:59:31.9951152Z   lldb-api :: python_api/value/TestValueAPI.py
2024-04-05T05:59:31.9951821Z   lldb-shell :: SymbolFile/DWARF/x86/debug-types-dwo-cross-reference.cpp
2024-04-05T05:59:31.9952196Z 
2024-04-05T05:59:31.9952200Z 
2024-04-05T05:59:31.9952296Z Testing Time: 1720.15s
2024-04-05T05:59:31.9952473Z 
2024-04-05T05:59:31.9952581Z Total Discovered Tests: 126112
2024-04-05T05:59:31.9952881Z   Skipped          :     77 (0.06%)
2024-04-05T05:59:31.9953189Z   Unsupported      :   4844 (3.84%)
2024-04-05T05:59:31.9953502Z   Passed           : 120830 (95.81%)
2024-04-05T05:59:31.9953818Z   Expectedly Failed:    335 (0.27%)
2024-04-05T05:59:31.9954128Z   Unresolved       :      1 (0.00%)
2024-04-05T05:59:31.9954428Z   Failed           :     25 (0.02%)
2024-04-05T05:59:32.6323217Z FAILED: CMakeFiles/check-all /home/runner/work/llvm-project/llvm-project/final/build/tools/clang/stage2-instrumented-bins/tools/clang/stage2-bins/CMakeFiles/check-all