Repeatr: Reproducible, hermetic Computation. Provision containers from Content-Addressable snapshots; run using familiar containers (e.g. runc); store outputs in Content-Addressable form too! JSON API; connect your own pipelines! (Or, use github.com/polydawn/stellar for pipelines!)
After creating a mux, one can get labeled writers to it, and readers that replay any chosen combination of labels. Every reader starts from the beginning. This implementation is backed by files, so it's safe for arbitrarily large amounts of data. Readers can safely run concurrently with writers; readers return as much data as is available, and when they run out they block until the relevant writers emit close signals (even if a reader catches up with the current end of file while a relevant writer is still active).
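For a rough sense of the shape described above, a hypothetical interface might look like the following. The names `Mux`, `Appender`, and `Reader` are illustrative only, not the actual exports of this branch:

```go
package streamer

import "io"

// Mux collects several labeled streams into one file-backed log.
// This is a sketch of the behavior described in the PR text, not a
// confirmed API.
type Mux interface {
	// Appender returns a writer whose bytes are tagged with the given
	// label; closing it records a close signal for that label.
	Appender(label string) io.WriteCloser

	// Reader replays, from the beginning, the interleaved contents of
	// the chosen labels in their original order.  It blocks at the
	// current end of data and only returns EOF once every requested
	// label has been closed.
	Reader(labels ...string) io.Reader
}
```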
The intended/example use case for this is collecting stdout and stderr from a process, saving both of them, and being able to replay either alone or both together, always in the same order they originally appeared.
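A sketch of that use case, reusing the hypothetical names from the interface above (`captureProcess`, `some-task`, and the `"stdout"`/`"stderr"` labels are illustrative):

```go
package streamer

import (
	"io"
	"os"
	"os/exec"
)

// captureProcess wires a child process's stdout and stderr into the
// mux under distinct labels, then replays them.  Names follow the
// hypothetical Mux sketch above, not the branch's actual API.
func captureProcess(mux Mux) error {
	stdout := mux.Appender("stdout")
	stderr := mux.Appender("stderr")

	cmd := exec.Command("some-task") // placeholder command
	cmd.Stdout = stdout
	cmd.Stderr = stderr
	runErr := cmd.Run()

	// Close signals tell readers that no further data is coming for
	// these labels; without them, readers would block forever.
	stdout.Close()
	stderr.Close()
	if runErr != nil {
		return runErr
	}

	// Replay stderr alone, then both streams interleaved in their
	// original order; each reader starts from the beginning.
	if _, err := io.Copy(os.Stdout, mux.Reader("stderr")); err != nil {
		return err
	}
	_, err := io.Copy(os.Stdout, mux.Reader("stdout", "stderr"))
	return err
}
```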
There are a couple of TODOs scattered around; we can decide how important those are to fix before merging. There are numerous sharp edges to this code (for one example, asking for stream labels that never existed means you'll block forever waiting for them to be "closed"), but if used as directed it should be functional.
Based on out-of-band conversation, I'm going to merge this; I propose that you continue adding commits on this branch and we merge again when outstanding points are satisfied.
Add a stream logging and mux/demuxing system.