scheme-containers / monorepo

Generators, website, issues
https://containers.scheme.org
MIT License

Triggering a Docker Hub build for each upstream commit #8

Open · lassik opened this issue 4 years ago

lassik commented 4 years ago

(Subtask of #1; discussion continued from https://github.com/scheme-containers/loko/issues/2)

@weinholt:

Btw, build triggers on Docker Hub can be used to script automatic triggering of these jobs.

@lassik:

Not sure which exact function you're referring to. I enabled auto-builds for all "schemers" containers whenever a commit is pushed to master. (Or else it was enabled by default - can't quite remember.)

@weinholt:

Like a Tron movie, Docker Hub is full of half-developed features. If you click Configure automated builds, there's a Build triggers section. It lets you generate a URL that can be used to trigger a build through curl -X POST. I'm not sure where this is documented.
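Concretely, such a trigger call boils down to a single POST. The URL below is only a placeholder; Docker Hub generates the real per-repository trigger URL in the Build triggers section:

```sh
# <TRIGGER_URL> stands for the per-repository trigger URL that
# Docker Hub generates under "Build triggers" (placeholder here).
curl -X POST <TRIGGER_URL>
```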

@lassik:

Ah, you mean watching for commits to the implementations' source repos (e.g. https://gitlab.com/weinholt/loko.git) and triggering a Docker Hub build for each of those commits?

For the head containers, I've thus far built them manually (via Manage Repository -> Builds -> Trigger button) whenever I've noticed on the GitHub front page that an implementation has new commits. It takes about a minute per day. Of course, this is not ideal in the long term and an automated solution would be better.

weinholt commented 4 years ago

Well, something can be built to do it automatically. Maybe not for every commit, but something.

You can also see the webhook that Docker Hub uses behind the scenes here: https://github.com/scheme-containers/loko/settings/hooks

lassik commented 4 years ago

The Scheme API backend (set to appear at https://github.com/schemedoc/borg "real soon now") will have a generic cron-like scheduler that watches for changes to arbitrary URLs. The latest version of each such URL is propagated into a processor graph, where each processor is an arbitrary side-effect-free Scheme procedure that gets its input from another graph node and passes its output on to any node(s) listening to it.
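As a rough sketch of how such a watcher could look (all names hypothetical, with http-get standing in for whatever HTTP client the backend ends up using; none of this is committed code):

```scheme
;; Hypothetical sketch: the cron-like scheduler calls the returned
;; thunk on a daily/weekly/etc. basis; http-get is a stand-in for
;; the HTTP client. New content is pushed into the graph only when
;; it differs from the last value seen.
(define (make-url-watcher url on-change)
  (let ((last-seen #f))
    (lambda ()
      (let ((current (http-get url)))
        (unless (equal? current last-seen)
          (set! last-seen current)
          (on-change current))))))

;; e.g. feed changes to the Loko repo into the processor graph:
;; (make-url-watcher "https://gitlab.com/weinholt/loko.git" feed-into-graph!)
```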

lassik commented 4 years ago

It would be ideal if the API backend could listen to all implementations' repos; then anyone who wants to react to implementation commits could simply listen to one endpoint in the API for broadcasts of all implementations (or only a chosen subset).

I'm not sure how good GraphQL is at subscriptions, but not all of the API has to be GraphQL.

lassik commented 4 years ago

The current idea is that the Scheme API would be backed by a simple key-value store. Each graph node can listen to particular keys and be triggered whenever one or more of those keys gets a new value. In turn, it specifies one or more output keys for which it produces new values. As far as I can tell, this simple system can do everything needed, but I'm not 100% sure.
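A rough sketch of how that key-triggered dispatch could look, with store-ref and store-set! standing in for the as-yet-unchosen key-value store (all names hypothetical):

```scheme
;; Hypothetical sketch of the dispatch. A node names the keys it
;; listens to and the keys it writes; its pure procedure returns
;; one value per output key.
(define-record-type <node>
  (make-node in-keys out-keys proc)
  node?
  (in-keys node-in-keys)
  (out-keys node-out-keys)
  (proc node-proc))

;; Run when `key` receives a new value: every node listening to that
;; key recomputes its outputs from the current values of its inputs.
(define (key-updated! nodes store key)
  (for-each
   (lambda (n)
     (when (member key (node-in-keys n))
       (let ((outputs (apply (node-proc n)
                             (map (lambda (k) (store-ref store k))
                                  (node-in-keys n)))))
         (for-each (lambda (k v) (store-set! store k v))
                   (node-out-keys n)
                   outputs))))
   nodes))
```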

A URL listener would be a special graph node that gets its input from a URL (on a daily, weekly, etc. basis) instead of getting it from another key in the store.

The store could just be a directory of files (where the key is the filename), or a SQLite table, etc. It shouldn't matter all that much, since we're unlikely to have more than 100 MiB of data to start with.
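Under the directory-of-files option, the store stand-ins from the sketch above could be as simple as:

```scheme
;; Hypothetical file-backed store: the key is the filename and the
;; value is a single S-expression in that file.
(define (store-path dir key)
  (string-append dir "/" key))

(define (store-ref dir key)
  (call-with-input-file (store-path dir key) read))

;; Overwrite semantics and atomicity are glossed over here; a real
;; store would likely write a temporary file and rename it into place.
(define (store-set! dir key value)
  (call-with-output-file (store-path dir key)
    (lambda (port) (write value port))))
```

In the full system, store-set! would additionally fire the dispatch for the written key, so that downstream nodes get re-run.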

lassik commented 4 years ago

One could layer a Functional Reactive Programming framework on top of the graph, with map and filter and such, but that's probably overengineering.