natestemen closed this issue 1 year ago
My opinion:

- +1 for slowly loosening packages in `requirements.txt`, starting with NumPy. We only have 3 core packages, so we should be able to keep the situation under control.
- Not sure about loosening the quantum packages in `dev_requirements.txt`. They make CI reproducible and, at the same time, they have no effect on end users who install mitiq via `pip install mitiq`.
## Current status of dependency management
As it stands, we restrict our mandatory requirements to a "compatible release" using `~=`: https://github.com/unitaryfund/mitiq/blob/a2964908db14df6e8838ee00a20d7414fedc9d13/requirements.txt#L1-L3

For the first line, the `~=` syntax means `>=1.24.1` and `==1.24.*`. Because we use dependabot to regularly open PRs to update our requirements files, these versions are often the latest available, and as a result we are effectively pinning exact versions, since no versions more recent than what we specify exist yet.
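To make the `~=` semantics concrete, here is a minimal sketch (a hypothetical helper, not part of mitiq, pip, or the packaging library) that expands a compatible-release specifier into the equivalent pair of clauses:

```python
# Sketch of PEP 440 compatible-release semantics: "pkg~=X.Y.Z" is
# equivalent to ">=X.Y.Z" combined with "==X.Y.*".
def expand_compatible_release(spec: str) -> str:
    """Rewrite 'pkg~=X.Y.Z' as the equivalent '>=X.Y.Z, ==X.Y.*' pair."""
    name, version = spec.split("~=")
    parts = version.split(".")
    # Wildcard the final release component: 1.24.1 -> 1.24.*
    prefix = ".".join(parts[:-1])
    return f"{name}>={version}, {name}=={prefix}.*"

print(expand_compatible_release("numpy~=1.24.1"))
# -> numpy>=1.24.1, numpy==1.24.*
```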
## Tradeoffs
Strict(er) pinning is good for reproducibility. It ensures that when a user installs mitiq, they get the dependency versions we use when testing mitiq during development. This should reduce the number of spurious bug reports, as well as the otherwise-impossible-to-pin-down problems that can arise when someone hits an issue with a combination of dependency versions we have never tested. On the other hand, strict pinning can make it harder for users to adopt mitiq if they need a specific version (or range of versions) of a particular dependency. Requiring strict versions of our dependencies means we force external developers to be strict with theirs as well.
Loose(r) pinning mostly benefits users who want to get up and running with mitiq as quickly as possible. Allowing a range of versions means less time spent resolving dependency conflicts (we all know how fun those can be). The biggest downside is that it potentially opens the door to more complicated, system-dependent bugs. In theory this should not happen if the allowed version range is perfectly maintained, but it can be very hard to know exactly what the valid bounds are for a specific dependency. E.g., if we depend on A and B, and B depends on A>1.2.2, we should take that into account in our own range for A. A dependency resolver does this automatically, but with loose pins it would become partially our burden.
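The A/B example above can be sketched as follows (package names and versions are assumed purely for illustration): with loose pins, the effective lower bound on A is the tightest of all declared lower bounds, which a resolver would normally compute for us.

```python
# Hypothetical scenario: we declare A>=1.0.0, while our dependency B
# requires A>1.2.2. The effective constraint on A is the tightest
# (maximum) lower bound. For simplicity this sketch ignores the
# strict-vs-inclusive distinction and represents versions as tuples.
def effective_lower_bound(bounds):
    """Return the tightest of several (major, minor, patch) lower bounds."""
    return max(bounds)

our_bound_on_a = (1, 0, 0)  # from our requirements.txt
b_bound_on_a = (1, 2, 2)    # transitively imposed by B
print(effective_lower_bound([our_bound_on_a, b_bound_on_a]))  # -> (1, 2, 2)
```

A resolver performs this intersection across the whole dependency graph; maintaining loose ranges by hand means re-deriving it whenever a transitive dependency tightens its own requirements.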
## Other projects

## Recommendation
In the short term, I recommend we figure out the broadest valid requirements for numpy and swap them into the `requirements.txt` file. If that goes well, we should follow suit with our other core dependencies. Ideally this would also include the third-party quantum dependencies that are not mandatory requirements, listed here: https://github.com/unitaryfund/mitiq/blob/a2964908db14df6e8838ee00a20d7414fedc9d13/dev_requirements.txt#L2-L6

cc @andreamari @Misty-W since we talked about this last week.