Open mdempsky opened 4 years ago
The thing that got me thinking of this is that the Coq proof assistant project offers a CI service where projects that commit to certain maintenance responsibilities will receive automated CI testing and Coq developer support to deal with upstream changes: https://github.com/coq/coq/blob/master/dev/ci/README-users.md
Of course, circumstances for Go are somewhat different. In Go we have to deal with flaky tests, and we also have stronger backwards compatibility guarantees (I think). But I think the basic principle of proactively watching for downstream breakage is still applicable to Go.
Thanks for filing this @mdempsky.
/cc @cagedmantis @toothrot @andybons
I don't remember whether @dr2chase 's daily `bent` run includes tests.
cc @dr2chase
My daily run does not include tests, but it is capable of running them, and I did run them for a recent CL.
`bent` doesn't do a good job (yet) of quantifying test results -- yet another possible intern project, I suppose, though it is a small one (and `bent` is internally somewhat cruddy).
Basic idea is that we'd automatically build and test a large, representative sampling of open source Go projects, to help catch regressions during the development cycle that aren't caught by tests in the Go repo itself.
Some thoughts:
If running tests raises resource or security concerns, even just verifying that projects still build would be useful data. E.g., cmd/cgo changes frequently break downstream builds.
I think ~nightly testing would be adequate, but even weekly would probably be useful.
/cc @dmitshur