tarantool/mkrepo


Create RPM and DEB repositories in S3

mkrepo is a repository generator with pluggable backends that lets you maintain an RPM or DEB repository on various storage backends, such as a local filesystem or S3, and periodically regenerate the metadata.

Use it in tandem with your favourite CI system to build a better pipeline: mkrepo helps you get rid of ad-hoc cron jobs.

As a bonus, mkrepo supports on-premises S3-compatible servers such as MinIO.
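
Pointing mkrepo at a self-hosted server is just a matter of overriding the endpoint. A minimal sketch, assuming a hypothetical MinIO instance at minio.example.com:

```bash
# Regenerate the repo metadata on a self-hosted MinIO server
# (the endpoint URL is a placeholder).
./mkrepo.py --s3-endpoint https://minio.example.com:9000 s3://builds/rpmrepo
```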

Works on Linux and OS X. Should also work on BSD and Windows, but I haven't checked.

Quickstart

Create an S3 bucket named e.g. builds and upload a sample package package.rpm to s3://builds/rpmrepo/Packages. Then run:

./mkrepo.py s3://builds/rpmrepo

After that, you will find the generated metadata in s3://builds/rpmrepo/repodata.
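
Clients can then consume the repository like any other. As a sketch, assuming the bucket is public (see --s3-public-read below) and served from the standard AWS endpoint, a yum configuration could look like this:

```ini
# /etc/yum.repos.d/builds.repo -- the baseurl is a placeholder
[builds]
name=Builds from s3://builds/rpmrepo
baseurl=https://builds.s3.amazonaws.com/rpmrepo
enabled=1
gpgcheck=0
```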

Run tests

To run the tests, use the following command:

make test

Dependencies

Python libraries:

Command-line reference

mkrepo parses your ~/.aws/config and reads the secret key and region settings from it, so you may omit them on the command line if you have an AWS config.
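
A minimal config that mkrepo can pick up looks like the standard AWS CLI one; the values below are placeholders:

```ini
# ~/.aws/config
[default]
region = us-east-1
aws_access_key_id = AKIAEXAMPLEKEYID
aws_secret_access_key = examplesecretaccesskey
```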

  mkrepo.py [-h] 
            [--temp-dir TEMP_DIR]
            [--s3-access-key-id S3_ACCESS_KEY_ID]
            [--s3-secret-access-key S3_SECRET_ACCESS_KEY]
            [--s3-endpoint S3_ENDPOINT]
            [--s3-region S3_REGION]
            [--s3-public-read]
            [--sign]
            [--force]
            path [path ...]
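
For example, a CI job that maintains both repositories and passes the credentials explicitly might look like this sketch (bucket names and region are illustrative):

```bash
./mkrepo.py \
    --s3-access-key-id "$S3_ACCESS_KEY_ID" \
    --s3-secret-access-key "$S3_SECRET_ACCESS_KEY" \
    --s3-region us-east-1 \
    --s3-public-read \
    s3://builds/rpmrepo s3://builds/debrepo
```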

Environment variables reference

Tips for working with GPG keys

* Create a new key:

  ```bash
  gpg --full-generate-key
  ```

* To view all your keys, you can use:

  ```bash
  gpg --list-secret-keys --keyid-format LONG
  ```

* Scripts can use something like this to get the key ID:

  ```bash
  export GPG_SIGN_KEY="$(gpg --list-secret-keys --with-colons | grep ^sec: | cut -d: -f5)"
  ```

* Export the key in ASCII armored format:

  ```bash
  gpg --armor --export-secret-keys MYKEYID > mykeys.asc
  ```

* Import the key:

  ```bash
  cat mykeys.asc | gpg --batch --import
  ```
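
With a key imported, a signed run could then look like the following sketch; GPG_SIGN_KEY is picked up as in the snippet above, and the bucket path is illustrative:

```bash
# Select the first secret key and sign the repository metadata.
export GPG_SIGN_KEY="$(gpg --list-secret-keys --with-colons | grep ^sec: | cut -d: -f5)"
./mkrepo.py --sign s3://builds/debrepo
```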

How it works

mkrepo searches the supplied path for either a Packages or a pool subdirectory. If it finds Packages, it assumes an RPM repo. If it finds pool, it assumes a DEB repo.

Then it parses the existing metadata files (if any) and compares the timestamps recorded there with the timestamps of all package files in the repo. Any package whose timestamp differs or that is missing from the metadata is parsed and added to the metadata.
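
The core of that comparison can be pictured as the minimal sketch below, with plain dicts standing in for mkrepo's real storage and metadata objects (all names here are hypothetical, not mkrepo's API):

```python
def packages_to_reparse(repo_mtimes, metadata_mtimes):
    """Return paths of packages that are new or whose timestamp differs
    from the one recorded in the existing metadata."""
    return [
        path
        for path, mtime in repo_mtimes.items()
        if metadata_mtimes.get(path) != mtime
    ]

# Example: a.rpm is unchanged, b.rpm was rebuilt, c.rpm is brand new.
repo = {"a.rpm": 1700000000, "b.rpm": 1700000500, "c.rpm": 1700001000}
meta = {"a.rpm": 1700000000, "b.rpm": 1700000100}
print(packages_to_reparse(repo, meta))  # ['b.rpm', 'c.rpm']
```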

The new metadata is then uploaded to S3, replacing the previous version.

Credits

Thanks to Cyril Rohr and Ken Robertson, authors of the following awesome tools:

* rpm-s3 (https://github.com/crohr/rpm-s3)
* deb-s3 (https://github.com/krobertson/deb-s3)

Unfortunately, we needed a solution that is completely decoupled from the CI pipeline, and the tools mentioned above only support a package push mode, where you have to use the tool itself to push packages to S3 instead of using native S3 clients.