Currently the Docker image build triggers when a new GitHub tag is added, but the Docker build installs the PyPI packages, so if the tag is applied before the package is pushed to PyPI, the Docker image will actually contain the previous release. This is easy to work around by just pushing to PyPI first, but it would be nice not to have to worry about this race condition happening.
I think the basic options are either to force Docker to use the tagged version, or to automate the whole process similar to what libcloudforensics does (https://github.com/google/cloud-forensics-utils/blob/master/.github/workflows/pypi_push.yml). One thing to keep in mind if we go down the route of making Docker use the tagged version is that we currently also have other builds that aren't triggered by a tag being added (e.g. `-test`, which builds on new pushes to the master branch, or `-release-test`, which builds on new pushes to the release branch).
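For the first option, a minimal sketch of what "force Docker to use the tagged version" could look like: derive the version from the tag ref that GitHub Actions provides and pass it into the build as a build arg. The `PACKAGE_VERSION` arg and `example-package` name are assumptions for illustration, not the project's actual Dockerfile interface.

```shell
# Sketch: pin the Docker build to the release that was just tagged.
# Assumes the Dockerfile declares an ARG PACKAGE_VERSION and that
# "example-package" stands in for the real PyPI package name.
GITHUB_REF="refs/tags/v1.2.3"        # set by GitHub Actions in a real run
VERSION="${GITHUB_REF#refs/tags/v}"  # strip the "refs/tags/v" prefix
echo "$VERSION"                      # -> 1.2.3

# docker build --build-arg PACKAGE_VERSION="$VERSION" -t example-image .
# ...and in the Dockerfile:
#   ARG PACKAGE_VERSION
#   RUN pip install "example-package==${PACKAGE_VERSION}"
```

Note that pinning doesn't remove the race entirely: if the PyPI push still hasn't happened, the build fails loudly (pip can't find the pinned version) instead of silently producing an image with the previous release, which is arguably the better failure mode. The untagged `-test`/`-release-test` builds would simply not pass the build arg and keep installing the latest version.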