Read-only mirror of Chaotic-AUR's main repository (https://gitlab.com/chaotic-aur/pkgbuilds). Issues and bug reports welcome! 📑

Chaotic-AUR PKGBUILDs

This is the right place to submit package requests, report bugs, or flag outdated Chaotic-AUR packages 📜

⚠️ We switched to our all-new infra 4.0 on October 18, which means we now operate on GitLab CI! While we still accept issues and bug reports on our previous packages repo on GitHub, all pipelines, updates, and insight into general goings-on now happen on the GitLab side. The old repo will be kept around as a push-only mirror.

[Chaotic-AUR banner: some packages we have already built]

Every folder in this repository will be built by our build system. For information about currently queued packages and build logs, check out our build status page! 🕵️‍♀️

A complete list of packages with their current versions is additionally available here.

Modified packages

While we would prefer to build AUR packages without modification, doing so is often not practical or possible.

To address such issues:

Special packages

Banished and rejected packages 📑

This is a list of packages that we will reject for good reasons:

Banned due to licensing issues 🛑

Build system details

Our previous build tool, the so-called toolbox, was initially created by @pedrohlc to deal with one issue: having a lot of packages to compile while not having many maintainers to take care of them all. Additionally, Chaotic-AUR has a quite heterogeneous set of builders: servers, personal devices, and one HPC, all of which need to be integrated somehow. The toolbox had a nice approach to this, keeping things as KISS as possible and using Git to distribute package builds between builders. These would then grab builds according to their activated routines. While this worked fairly well, it had a few problems that we tried to get rid of in the new version. A few key ideas about this new setup:

How it works

The new system consists of three integral parts:

Compared to Infra 3.0, this means we have the following key differences:

The following sections contain the information needed to understand how it all works together; the full build system API documentation can be found here.

Workflows and information

Adding packages

Adding packages is as easy as creating a new folder named after the $pkgbase of the package and putting the PKGBUILD and all other required files in it. Adding an AUR package is therefore as simple as cloning its repo and removing the .git folder. CI relies on .SRCINFO files to parse most information; it is therefore important to have them in place and up to date in the case of self-managed packages. Finally, add a .CI folder containing the basic config (CI_PKGBUILD_SOURCE is required for external packages; self-managed PKGBUILDs don't need it), commit the changes, and push them back to the main branch. Please follow the conventional commits convention while doing so (cz-cli can help with that!). This means commit messages like the ones sketched below.
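
A hedged sketch of the whole flow, assuming a hypothetical AUR package called libfoo-git (the package name, its URL, and the CI_PKGBUILD_SOURCE value are placeholders, not prescriptions from this repository):

```sh
# Clone the AUR repository and strip its git history
git clone https://aur.archlinux.org/libfoo-git.git
rm -rf libfoo-git/.git

# Add the basic CI config; CI_PKGBUILD_SOURCE is only required for
# external packages (the value here is an assumed placeholder)
mkdir -p libfoo-git/.CI
echo 'CI_PKGBUILD_SOURCE=aur' > libfoo-git/.CI/config

# Commit using the conventional commits format and push to main
git add libfoo-git
git commit -m "feat(libfoo-git): add package"
git push origin main
```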

Following this convention not only helps keep a uniform commit history, it also allows automatic changelog generation.

Removing packages

This can be done by removing the folder containing the package's PKGBUILD. A cleanup job will then automatically remove any obsolete packages via the on-commit pipeline run. This also covers any split packages a package might produce. Renaming a folder likewise counts as removing the package.
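
A hedged example, again using the hypothetical libfoo-git folder from above:

```sh
# Drop the package folder and let the cleanup job handle the repo side
git rm -r libfoo-git
git commit -m "chore(libfoo-git): drop package"
git push origin main
```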

On-commit pipeline

Whenever pushing a new commit, the CI pipeline will carry out the following actions:

On-schedule pipelines

Hourly

Every hour, the on-schedule pipeline will carry out a few tasks:

Daily

A daily pipeline schedule has been added for specific packages which generate their pkgver dynamically. To make use of it, set CI_ON_TRIGGER=daily inside the .CI/config file of the package.
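
For instance, a package's .CI/config could contain nothing more than this single line (a minimal sketch; other variables are only needed if the package requires them):

```sh
# .CI/config: rebuild this package once a day
CI_ON_TRIGGER=daily
```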

Manual scheduling

Scheduling packages without git commits

Packages can be added to the schedule manually by going to the pipeline runs page, selecting "Run pipeline", and adding PACKAGES as a variable with the package names as its value. The pipeline will then pick up the packages and schedule them. PACKAGES can also be set to all to schedule all packages. When scheduling one or more specific packages, the value needs to follow the format pkgname1:pkgname2:pkgname3.
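
The same thing can presumably also be triggered from the command line via GitLab's pipeline trigger API; this is a hedged sketch only (project ID, trigger token, and package names are placeholders, and the web UI described above remains the documented route):

```sh
# Trigger a pipeline on main that schedules two specific packages
curl -X POST \
  -F "token=$TRIGGER_TOKEN" \
  -F "ref=main" \
  -F "variables[PACKAGES]=libfoo-git:libbar" \
  "https://gitlab.com/api/v4/projects/<project-id>/trigger/pipeline"
```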

Running scheduled pipelines on-demand

This can be done by going to the pipeline runs page and selecting "Run pipeline" (the play symbol). A link to the pipeline page will be provided, where the pipeline logs can be obtained.

Adding interfere

Put the required interfere file in the .CI folder of a PKGBUILD folder:
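
As one hedged illustration, assuming the PKGBUILD.append interfere type carried over from the previous toolbox still applies (check the build system documentation for the full list of supported interfere files):

```sh
# .CI/PKGBUILD.append: appended verbatim to the PKGBUILD before building.
# Illustrative content only: add a missing make dependency.
makedepends+=(git)
```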

Bumping pkgrel

This is now carried out by adding the required variable CI_PACKAGE_BUMP to .CI/config. See below for more information.
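
A minimal sketch; the value shown is a placeholder, since the exact format CI_PACKAGE_BUMP expects is described in the build system documentation rather than here:

```sh
# .CI/config: request a pkgrel bump for this package
CI_PACKAGE_BUMP=<target>   # placeholder value, see the build system docs
```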

Dependency trees

The CI builds dependency trees automatically. They are passed to the Chaotic manager as a CI artifact and read whenever a schedule command is being executed. No manual intervention is needed.

.CI/config

The .CI/config file inside each package directory contains additional flags to control the pipelines and build process.
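
A hedged example combining the variables mentioned elsewhere in this document (only the variables a given package actually needs should be set; the values are illustrative):

```sh
# .CI/config
CI_PKGBUILD_SOURCE=aur   # external packages only: where the upstream PKGBUILD lives
CI_ON_TRIGGER=daily      # rebuild daily (for dynamically generated pkgver)
```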

Known state variables

State is kept in the .state worktree. It can be viewed by browsing the state branch of a PKGBUILD repository. Each package has its own file named after the package name. The following variables are known to be stored:
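
For a quick look without switching branches, a package's state file can presumably be inspected directly from git (hypothetical package name, and assuming the files live at the root of the state branch):

```sh
git fetch origin state
git show origin/state:libfoo-git   # print the recorded state variables for libfoo-git
```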

Managing AUR packages

AUR packages can also be managed via this repository in an automated way using .CI/config. This means that after each scheduled and on-commit pipeline, the AUR repository will be updated to reflect the changes made to the PKGBUILD folder's files. Files not relevant to AUR maintenance (e.g. .CI folders) will be omitted. The commit message reflects the fact that the commit was created by a CI pipeline and contains links to the source repository's commit history and to the pipeline run which triggered the update commit.

Updating the CI's scripts

This is done automatically via the CI pipeline. Once changes have been detected on the template repository, all files will be updated to the current version.

Issues and pipeline failures

Last on-commit pipeline failed

This can happen for a few reasons, for example an invalid package name having been provided. It causes the scheduled tag not to be updated, in which case the on-schedule pipeline will not be able to run. The last on-commit pipeline needs to be fixed before the on-schedule pipeline can run again. Build failures, however, are not counted here, as the scheduled tag is already updated as soon as the scheduling parameters have been generated. Force-pushing a fixed-up commit is actively encouraged in such a case: pushing another commit on top would cause the CI to evaluate the previous commits it missed, notice the same issue again, and bail out instead of silently continuing. This has been a design decision to prevent failures from being overlooked.
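
In practice that fix usually amounts to standard git history rewriting (nothing project-specific is assumed here):

```sh
# Repair the offending commit in place, then force-push it
git commit --amend
git push --force-with-lease origin main
```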

Resetting the build queue

There might be rare cases in which a reset of the build queue is needed. This can be done by shutting down the central Redis instance, removing its dump, and restarting its service. This will, however, also wipe any logs stored inside Redis.
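
A hedged sketch of what that could look like on a systemd-based host; the service name and dump path are assumptions that depend on the actual deployment:

```sh
systemctl stop redis            # stop the central Redis instance
rm /var/lib/redis/dump.rdb      # remove the persisted dump (this also wipes stored logs)
systemctl start redis           # bring the service back up with an empty queue
```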

Live-updating logs

Logs are live-updating and can be viewed in real-time via the web server. In case GitLab is used and PACKAGE_REPOS_NOTIFIERS is set, an external CI stage will be created for every package scheduled during the CI run, linking to the log.

Prometheus metrics

Prometheus metrics are available at the /metrics endpoint of the web server. Currently, we collect the default prom-client metrics, statistics about the total event count for each build status (failed, successful, already built, timed out), and metrics about overall build times. These can be scraped by a Prometheus instance and then visualized using Grafana.
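
For a quick sanity check, the endpoint can be queried directly (the hostname below is a placeholder for the actual web server address):

```sh
# Fetch the raw metrics in the Prometheus text exposition format
curl -s https://builds.example.org/metrics | head -n 20
```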

Development setup

This repository features a Nix flake, which can be used to automatically set up the required tooling, such as pre-commit hooks and checks, via direnv. This includes checking PKGBUILDs via shellcheck and shfmt. Nix (the package manager) and direnv are required; after installing them, the environment can be entered by running direnv allow.
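
A short sketch of the expected workflow inside a clone of this repository, assuming nix and direnv are already installed and hooked into the shell:

```sh
direnv allow      # enter the flake's dev shell and set up the pre-commit hooks
nix flake check   # run the flake's checks (e.g. shellcheck and shfmt) manually
```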