readthedocs / readthedocs-docker-images

Docker image definitions used by Read the Docs

Automate compile version & upload to S3 #179

Closed humitos closed 2 years ago

humitos commented 2 years ago

Initial idea coming from https://github.com/readthedocs/readthedocs-ops/issues/1155 showing how we can maybe automate this process.

humitos commented 2 years ago

I was able to run the first test on this --let's see how it goes. It should end up with all the artifacts uploaded to -dev S3 buckets

[Screenshot: Screenshot_2022-04-13_16-03-10]

humitos commented 2 years ago

It seems I'm pretty close to having something working already. Some of the jobs succeed, and I can see the artifacts uploaded into the -dev S3 buckets 💪🏼

We can polish the script a little more, but we need to keep in mind that it has to remain compatible with local development, since we upload the artifacts to MinIO when working locally.
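The local/production split could be handled with something like the following sketch. This is not the actual script; `MINIO_ENDPOINT` and the bucket name are illustrative assumptions:

```shell
#!/bin/sh
# Sketch: the same upload command targets AWS S3 in CI and MinIO in local
# development by switching the endpoint URL. MINIO_ENDPOINT and BUCKET are
# illustrative names, not the real configuration.
BUCKET="${BUCKET:-readthedocs-build-tools-dev}"

if [ -n "${MINIO_ENDPOINT:-}" ]; then
    # Local development: point awscli at the local MinIO server.
    ENDPOINT_ARGS="--endpoint-url ${MINIO_ENDPOINT}"
else
    # CI: talk to AWS S3 directly.
    ENDPOINT_ARGS=""
fi

# Print the command instead of running it, so the sketch is self-contained.
echo "aws s3 cp ${ENDPOINT_ARGS} python-3.10.0.tar.gz s3://${BUCKET}/"
# To actually upload:
# aws s3 cp ${ENDPOINT_ARGS} python-3.10.0.tar.gz "s3://${BUCKET}/"
```

The `--endpoint-url` flag is a standard awscli option, so the script body stays identical in both environments and only the environment variables change.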

humitos commented 2 years ago

This could leave production in a weird state, for example when some checks pass and others fail. In that case, we would end up updating only some of the versions in production.

Note that this already happened while testing this PR.

humitos commented 2 years ago

Another thing I haven't solved yet is how to make the docker pull command use the full name of the Ubuntu image (including the date), e.g. ubuntu-22.04-2022.03.15, but then name the .tar.gz without the date when uploading it to S3, e.g. ubuntu-20.04-python-3.10.0.tar.gz, which is how it currently works in production (see https://github.com/readthedocs/readthedocs.org/blob/ccdad233cda5bcd3ac3acd935536c9e8cfc2e440/readthedocs/doc_builder/python_environments.py#L73-L76)
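One possible shape for this (pull by the dated tag, upload under the date-less name) is to strip the trailing date suffix with shell parameter expansion. All the variable names below are illustrative, not the real script:

```shell
#!/bin/sh
# Sketch: derive the date-less tarball name from the full, dated image tag.
IMAGE="ubuntu-22.04-2022.03.15"   # full image name, including the date
TOOL="python"
VERSION="3.10.0"

# Strip the trailing -YYYY.MM.DD suffix to recover the base OS name.
OS="${IMAGE%-[0-9][0-9][0-9][0-9].[0-9][0-9].[0-9][0-9]}"
TARBALL="${OS}-${TOOL}-${VERSION}.tar.gz"

echo "${TARBALL}"   # ubuntu-22.04-python-3.10.0.tar.gz
# Then, roughly:
# docker pull "readthedocs/build:${IMAGE}"
# ... build the tool inside the container and pack it ...
# aws s3 cp "${TARBALL}" "s3://<dev-bucket>/${TARBALL}"
```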

humitos commented 2 years ago

I decided against using matrix together with parameters because it spins up one job per OS and tool/version, which means downloading the Docker images each time.

Now, I'm using an approach similar to the one described in https://github.com/readthedocs/readthedocs-ops/issues/1155#issuecomment-1082615972.

This makes better use of resources and should be much faster. The downside: it's bashy 😄
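The single-job shape roughly looks like this: each OS image is pulled once, and all tool/version builds run inside it, instead of one CI job per OS × tool/version combination. `OSES`, `TOOLS`, and the image tag are illustrative placeholders:

```shell
#!/bin/sh
# Sketch: one pull per OS, all tool builds looped inside a single job.
OSES="ubuntu-20.04 ubuntu-22.04"
TOOLS="python-3.10.0 nodejs-16.14.0"

for os in $OSES; do
    # docker pull "readthedocs/build:${os}-<date>"   # one pull per OS
    for tool in $TOOLS; do
        echo "building ${tool} on ${os}"
        # ... run the build inside the container and upload the .tar.gz ...
    done
done
```

With a matrix, the same 2×2 set above would be four separate jobs, each paying the image-download cost.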

humitos commented 2 years ago

@agjohnson done! We should merge https://github.com/readthedocs/readthedocs.org/pull/9098 first and then this branch. After that, all the .tar.gz will be uploaded to S3 -dev buckets. Then we can update the environment variables with -prod buckets and re-trigger the CircleCI job for the final test.

agjohnson commented 2 years ago

Failure is just on docs. Imma merge this.

agjohnson commented 2 years ago

Bah, wrong PR somehow. Oops.

agjohnson commented 2 years ago

I cancelled the running build (which had pulled down the script from main on readthedocs.org before the corresponding PR was merged), and restarted the workflow:

https://app.circleci.com/pipelines/github/readthedocs/readthedocs-docker-images/238/workflows/36835669-d367-44bd-817e-29bd4f8ed93d/jobs/424

Looks good!