Closed: iameskild closed this 2 weeks ago
I agree with pinning versions, mainly because leaving dependencies unpinned tends to cause trouble with old releases as well, and migrating then requires hotfixes or extra custom changes. The only point I raised, which I noticed @iameskild also commented on in the RFD above, is that we will need to think carefully about how we will keep these pins updated regularly.
As part of this RFD, I propose we review, test, and upgrade our dependencies once per quarter as part of our release process.
This also seems reasonable, though depending on the changes a new version introduces once unpinned, stabilizing the branch while testing/releasing could make us slip the release schedule (depending on the affected resources).
So instead, after the last release of a quarter, the first task should be unpinning and testing stability; that gives us the whole quarter to gradually update the pins if needed. Every release at the start of a quarter would then ship with the new pins already stabilized, without affecting the release process.
Thanks a lot for the feedback @viniciusdc :)
I agree with you that we should start dependency testing as early as possible for the subsequent release. This would give us time to adjust should we need to upgrade one of the dependencies.
Since this has been open for a while I am moving this RFD towards decision/voting. We are using a consent decision-making process for RFD approval therefore:
cc @nebari-dev/maintainers
RFD updated today based on discussion and notes from Tuesday's Nebari community call, see notes.
I'm cleaning up the RFD's and I'm going to close this for now due to no activity, but feel free to re-open if needed!
Managing Nebari dependencies
Summary
Pin all the things
Let me start by stating that Nebari is not your typical Python package. For Python packages that are intended to be installed alongside other packages, pinning all of your dependencies will likely cause version conflicts and result in failed environment builds.
Nebari, on the other hand, is a package responsible for managing your infrastructure, and the last thing you want is for the packages Nebari relies on to introduce breaking changes. This has happened twice this week alone (the week of 2023-01-16; see issue 1622 and issue 1623).
As part of this RFD, I propose pinning all packages that Nebari requires or uses. This includes the following:
Pin the Python dependencies in pyproject.toml, making sure the package can be built on conda-forge
Pin the dependencies used by the Docker images (in the nebari-docker-images repo)
Set the pinned dependencies used by Terraform in constants.py
In Nebari, the Python code is primarily used to pass input variables to the Terraform scripts. As such, I propose that any pinned versions - pinned Terraform providers, image/tag combinations, etc. - used by Terraform be set in the Python code and then passed to Terraform.
As an example, I recently did this with the version of Traefik we use:
https://github.com/nebari-dev/nebari/blob/bd777e6448b5e2d6339bc3d9ef35672163ae1945/nebari/constants.py#L4
Which is then used as input for this Terraform variable:
https://github.com/nebari-dev/nebari/blob/bd777e6448b5e2d6339bc3d9ef35672163ae1945/nebari/template/stages/04-kubernetes-ingress/variables.tf#L19-L25
which is then consumed by the ingress module:
https://github.com/nebari-dev/nebari/blob/bd777e6448b5e2d6339bc3d9ef35672163ae1945/nebari/template/stages/04-kubernetes-ingress/modules/kubernetes/ingress/main.tf#L215
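The pattern above can be sketched in a few lines of Python. Note that the constant name, pinned version, and stage function below are illustrative assumptions, not Nebari's actual code; they only show the "pin once in constants.py, pass to Terraform as an input variable" idea.

```python
# constants.py -- single source of truth for the pinned version
# (hypothetical constant name and version, for illustration only)
DEFAULT_TRAEFIK_VERSION = "2.9.1"

def ingress_stage_input_vars(namespace: str) -> dict:
    """Build the input variables passed to the 04-kubernetes-ingress stage."""
    return {
        "namespace": namespace,
        # The pin travels from Python into Terraform, so the Terraform
        # scripts never hard-code the image tag themselves.
        "traefik-image-tag": DEFAULT_TRAEFIK_VERSION,
    }

print(ingress_stage_input_vars("dev"))
```

Bumping the pin then becomes a one-line change in constants.py, picked up by every Terraform stage that consumes the variable.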
Regularly review and upgrade dependencies
Once packages start getting pinned, it's important to regularly review and upgrade these dependencies in order to keep up to date with upstream changes. We have already discussed the importance of testing these dependencies, and I believe we should continue that work (see issue 1339).
As part of this RFD, I propose we review, test, and upgrade our dependencies once per quarter as part of our release process. Although we may not need to update each dependency for every release, we might want to consider updating dependencies in a staggered fashion, for example updating the Python dependencies in pyproject.toml and ensuring that the package is buildable on conda-forge in one release, and other pins in the next. We don't necessarily need to make the update process this rigid, but the idea is to update a few things at a time and ensure that nothing breaks. And if things do break, fix them promptly to avoid situations where we are forced to make last-minute fixes.
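The quarterly review could be assisted by a small script that flags which pins lag behind upstream. A minimal sketch follows; in practice the "latest" versions would come from PyPI or conda-forge, but here they are supplied as a plain dict so the example is self-contained, and all package names and versions are hypothetical.

```python
def parse(version: str) -> tuple:
    """Turn a version string like '25.3.0' into (25, 3, 0) for comparison."""
    return tuple(int(part) for part in version.split("."))

def outdated_pins(pins: dict, latest: dict) -> list:
    """Return the names whose pinned version lags the latest known version."""
    return sorted(
        name
        for name, pinned in pins.items()
        if name in latest and parse(pinned) < parse(latest[name])
    )

# Hypothetical pins and upstream versions, for illustration only
pins = {"kubernetes": "25.3.0", "ruamel.yaml": "0.17.21"}
latest = {"kubernetes": "26.1.0", "ruamel.yaml": "0.17.21"}
print(outdated_pins(pins, latest))  # -> ['kubernetes']
```

Running something like this at the start of each quarter would surface the pins that need attention, leaving the rest of the quarter to upgrade and stabilize them gradually.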
User benefit
In my opinion, there are a few benefits to this approach:
Design Proposal
The design proposal is fairly straightforward: simply move the pinned version of the Terraform provider or image tag into constants.py. This would likely require an additional input variable (as demonstrated by the Traefik example above).
User impact
We can be sure that when we perform our release testing and cut a release, that version will be stable from then on. This is currently NOT the case.
What to do with the other Nebari repos?
This RFD is mostly concerned with the main nebari package and doesn't really cover how we should handle the other Nebari repos, such as nebari-jupyterhub-theme and nebari-docker-images. I think these are less of a concern for us: the nebari-jupyterhub-theme is included in the Nebari JupyterHub Docker image, and once the images are built they don't change, so there is little chance that users will be negatively affected by dependency updates. The only exception would be users pulling the image tag main, which is updated with every new merge into nebari-docker-images; this does not follow best practices and we will continue to advise against it.
Unresolved questions
~~I still need to test if this is possible for the pinned version of a particular Terraform provider used, such as: https://github.com/nebari-dev/nebari/blob/bd777e6448b5e2d6339bc3d9ef35672163ae1945/nebari/template/stages/04-kubernetes-ingress/versions.tf#L1-L13~~
Provider versions are pinned in the required_providers block (usually in the versions.tf file); a tool like tfupdate could help keep these up to date.