Open mtelka opened 1 year ago
Out of curiosity, is pinning to an older version causing any hardship for you? I'm just trying to understand the end-user impact of pinning versions in `pyproject.toml`.
I'm packaging `zstandard` (and many other Python projects) for OpenIndiana. The packaging runs in the current environment (not in a virtualenv) because the final packages will be installed into the current environment, not a virtual one. We also run tests in the current env. Pinning `build-system.requires` to non-latest versions causes the build to fail, so we need to patch this pinning out.

Bleh.
Feel free to submit a PR that removes the version pinning from `pyproject.toml`. It's probably more trouble than it is worth, since it is becoming apparent that the modern Python packaging tooling wasn't really designed with determinism as a first-class feature.
@mtelka Comment from @rgommers on discuss.python.org:

> Yes, this is a key problem with `pyproject.toml`. It is actually not generic; the constraints you write in it are specific to how you build wheels for PyPI - they do not apply to other distros. This is not really documented anywhere, and widely misunderstood by distro packagers.
@jeanas, I think PEP 518 is generic enough and is not tied to PyPI only, so I think @rgommers is wrong in his comment. This is maybe one of the reasons why `numpy` was (or maybe still is, I haven't checked in the past few months) very hard to package due to its strict pinning everywhere.
I'm definitely not wrong there. Pins you see may be "doesn't work at all, hence we must pin", or "could leave this out on distros, but too unsafe to do for an sdist uploaded to PyPI", or "pin is necessary for wheel builds for some reason". The `numpy==` pins that I think you are referring to are the last case - distros really don't need those, but for packages uploaded to PyPI they must be present to avoid ABI compatibility issues.
Regarding this issue, I don't have all the context, but it seems similar. `wheel==` is probably overly strict even on PyPI, but putting an upper bound on `wheel` may be healthy to avoid future "build PyPI sdist from source as end user" issues - I've certainly done that in the past. As a distro maintainer, though, you can quite safely leave `wheel` unpinned - if it ever breaks, you just deal with it then. Package authors who upload to PyPI don't have that luxury, since metadata there is immutable.
> The `numpy==` pins that I think you are referring to are the last case - distros really don't need those, but for packages uploaded to PyPI they must be present to avoid ABI compatibility issues.
No, I'm not referring to other Python projects that use `numpy` (thus `numpy==`), but to `numpy` itself. In the past they pinned setuptools to an old version, and there were some other pinning-related issues in `numpy` itself, IIRC. Anyway, `numpy` is not the main topic here.
Similar pinning of build deps is rare; I think `zstandard` is the only packaged Python project in OpenIndiana (we have hundreds of Python projects packaged) where I had to patch `pyproject.toml` to unpin build deps.

I'm not arguing about runtime pinning here. That's okay (even if often overkill), and I understand why you want strict runtime pinning.
> Regarding this issue, I don't have all the context, but it seems similar. `wheel==` is probably overly strict even on PyPI, but putting an upper bound on `wheel` may be healthy to avoid future "build PyPI sdist from source as end user" issues - I've certainly done that in the past.
Likely I lack some knowledge here, but I do not understand why an end user would want to build an sdist for anything serious. The sdist is created by the project and uploaded to PyPI for download. The end user should just download the sdist (and create a wheel from it, if they need or want to).

Overly strict pinning (to rule out versions that are not released yet) is preparation for an incompatibility that you anticipate (or rather: expect to happen soon), but such an incompatibility may never materialize.
> Likely I lack some knowledge here, but I do not understand why an end user would want to build an sdist for anything serious. The sdist is created by the project and uploaded to PyPI for download. The end user should just download the sdist (and create a wheel from it, if they need or want to).

That's not really how it works. When you type `pip install pkgname`, `pip` will try to find a wheel, but if it doesn't find one, it starts building from source without asking the user. So if you have a million users, and 1-2% are on platforms for which no wheels are provided, you get 10,000-20,000 users who build from source in very much uncontrolled environments. Now if `wheel` makes a backwards-incompatible change somewhere (either a bug or intentional), your package doesn't build from source anymore. So that's going to affect all those users. And you can't fix that as a package author - all you can do is cut a completely new release.
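(An aside, not something suggested in the thread: an end user who wants to rule out that silent source build can make pip fail loudly instead.)

```console
# Refuse the sdist fallback: error out if no compatible wheel exists,
# instead of building from source in an uncontrolled local environment.
$ python -m pip install zstandard --only-binary=:all:
```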
> In the past they pinned setuptools to an old version.

What I wrote above is exactly why that pin was there. `setuptools` breaks backwards compat all the time, and in really major and hard-to-fix ways. `numpy` simply did not build with the latest `setuptools` across all the platforms it needed to. I'm very glad we were able to drop it.
> > Likely I lack some knowledge here, but I do not understand why an end user would want to build an sdist for anything serious. The sdist is created by the project and uploaded to PyPI for download. The end user should just download the sdist (and create a wheel from it, if they need or want to).
>
> That's not really how it works. When you type `pip install pkgname`, `pip` will try to find a wheel, but if it doesn't find one, it starts building from source without asking the user. So if you have a million users, and 1-2% are on platforms for which no wheels are provided, you get 10,000-20,000 users who build from source in very much uncontrolled environments.

I think you wrote the same as me, just using different words :-).
> Now if `wheel` makes a backwards-incompatible change somewhere (either a bug or intentional), your package doesn't build from source anymore. So that's going to affect all those users. And you can't fix that as a package author - all you can do is cut a completely new release.

Sure, yes. But you would have a similar (or possibly even more serious) problem when you pin extensively - for example, when your pinned dependency has a critical issue (e.g. a security vulnerability).
In any case, this is your project and I do not want to push you to do anything you do not want to do with it. I just reported something I think is a problem (even if a minor one), and it is up to you how you'll deal with it. I can happily live with my patch for `pyproject.toml`.
In general, pinning because of "guaranteed 100% reproducibility" is out of place in pyproject.toml -- which was not designed for it at all -- modulo "we are doing it anyway because it's not clear how to get cibuildwheel to build PyPI wheels with the ABI we want unless we pin it in pyproject.toml", which is understandable albeit still not technically correct.
The numpy pin on setuptools was, in fact, correct -- because it defined an actual restriction that really existed. If numpy knows that it cannot and will not build with newer setuptools, there is nothing to discuss, because the version bound correctly communicates to end users (including both `pip install` from source and Linux distros) what will actually work and what they, as distro builders, are expected and mandated to use -- no ifs, ands, or buts.
Pinning the version of wheel is not like this. The expectation is that new versions of wheel fix bugs, not add them, and it has no track record of making backwards-incompatible changes that require new sdist releases of downstream projects using wheel (either to adapt to the `wheel` API change, or to permanently opt out of ever upgrading `wheel` again).
At worst, wheel releases a buggy version -- everyone reports bugs in it, wheel fixes the bug and releases yet another version, and then balance and harmony are restored to the world and everything works again -- still without pinning.
The end result of including pinning in pyproject.toml is that downstream distributors are educated to believe that upstream cannot be trusted. Instead, they build with:

- `python -m build --skip-dependency-check`
- `gpep517`, which does no checking anyway

This does NOT mean that pinning is ignored. It means that the entire `build-system.requires` table is ignored. Here, too, balance and harmony are restored to the world. Effectively, we return to the pre-PEP 517 state of affairs, where builders were expected to manually handcraft a suitable build environment, and if they failed, they got to keep both pieces. All that changed is that now, instead of running `python setup.py bdist_wheel`, you have a program whose job is to run `setuptools.build_meta`, whose job is to run... `[sys.executable, 'setup.py', 'bdist_wheel']`. :D
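For concreteness, a distro-style build of this kind with pypa/build might look like the following sketch; `--no-isolation` is what makes `build` use the ambient environment rather than a fresh venv:

```console
# Build against whatever is already installed, ignoring the versions
# (and pins) declared in build-system.requires entirely.
$ python -m build --wheel --no-isolation --skip-dependency-check
```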
Sadly, the current pinning approach is broken -- as all such attempts in pyproject.toml are inevitably doomed to be. The cffi dependency is pinned, but the environment is not actually reproducible, because instead of freezing the environment with `pip freeze` and recreating that environment prior to building with a PEP 517 build frontend, only the top-level, explicitly specified packages were given versions. This misses indirect dependencies: in this case, cffi depends on pycparser, but any version of pycparser might be installed in CI when building python-zstandard.
The correct solution here is to pass `--constraint constraints-file.txt` (same format as a frozen requirements.txt) to `pip install` / `pip wheel`, since if I recall correctly it will also constrain the versions of all packages installed into the isolated build environment. There's no direct option to `python -m build` for this, but `$PIP_CONSTRAINT` may work here, since it should be using pip under the hood anyway.
@eli-schwartz the `--constraint` CLI arg is not passed to the ephemeral virtualenv provisioning in `pip install` or other pip commands. However, `PIP_CONSTRAINT` does work, as it leaks into the underlying `pip install` invocation that runs within that ephemeral virtualenv. Similarly, this works with pypa/build.

I've been using this approach for over two years now: https://github.com/pypa/build/issues/292#issuecomment-999823527.
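A minimal sketch of that approach (file name and pins are illustrative, not this project's actual CI setup). The point from the comment above is that a frozen constraints file catches transitive build deps like pycparser, not just the names listed in `build-system.requires`:

```console
# build-constraints.txt (pip-freeze-style) might contain, for example:
#   setuptools==68.2.2
#   cffi==1.16.0
#   pycparser==2.21   <- cffi's transitive dependency, pinned too
# The env var leaks into the ephemeral build env that pypa/build creates:
$ PIP_CONSTRAINT=build-constraints.txt python -m build
```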
@indygreg I'm reading your last blog post now and see that you mentioned pip-tools. At the time of writing, there was a PR open implementing precisely what you wanted. I merged it 2 days after your post :) It took 13 months of reviews, and the docs are here: https://pip-tools.rtfd.io/#maximizing-reproducibility. Note that it's not yet released, though — it's only available in the main branch of pip-tools. But TL;DR — it produces a constraints file derived from what's in `pyproject.toml`'s `[build-system].requires`, combined with what the selected build backend additionally requires through PEP 517 hooks.

Once released, you'll be able to generate that lockfile more easily. But you can also create one by hand today.
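If I read the linked docs right, the invocation looks roughly like this (a sketch; the output file name is arbitrary, and the flag needs a pip-tools build that includes the merged PR):

```console
# Resolve build-system.requires plus whatever the build backend
# additionally requests via PEP 517 hooks into one pinned file:
$ pip-compile --all-build-deps --output-file=constraints.txt pyproject.toml
```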
I also have hints regarding other parts of your article, but this issue is probably not the right place to discuss that :)
@eli-schwartz re:cibuildwheel -- see https://github.com/pypa/cibuildwheel/issues/1666 / https://github.com/pypa/cibuildwheel/issues/1683 / https://github.com/pypa/cibuildwheel/pull/1675 for some of the related improvement efforts.
Also see my ramblings regarding pinning differing env contexts: https://github.com/jazzband/pip-tools/issues/1326#issuecomment-1834517252
@mtelka @indygreg on a related note, `wheel` shouldn't even be listed among the direct build dependencies. `setuptools` brings it in automatically when building wheels is requested (and doesn't pull in the dependency when the frontend is only going to build an sdist!). See https://github.com/pypa/setuptools/pull/3056.
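Following that advice, the trimmed table might look something like this (a sketch; the cffi bound is a placeholder, not zstandard's actual requirement):

```toml
[build-system]
# wheel omitted: setuptools injects it by itself when a frontend
# requests a wheel build (see pypa/setuptools#3056)
requires = ["setuptools", "cffi>=1.0"]
build-backend = "setuptools.build_meta"
```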
@webknjaz thanks for clarifying! I knew that somewhere in this design space, constraints files were the right answer... but since I'm more accustomed to the distro tooling, where we don't do pinning, I don't regularly do this and clearly misremembered it a bit.
Also: thank you for (helping to) maintain pip-tools and bring a bit of sanity to the python packaging ecosystem. :) It is an absolutely fantastic project.
Commit f90e2bbcb800047cc971864162998a99d4c2673f changed the `pyproject.toml` versions to be constraints rather than specific versions. Does this address this issue?
@indygreg `wheel` shouldn't be listed there, as `setuptools` depends on it when needed 🤷‍♂️

In fact, it moved into `setuptools` this week during the PyCon US 2024 sprints, so its direct use will be decreasing over time.
> Commit f90e2bb changed the `pyproject.toml` versions to be constraints rather than specific versions. Does this address this issue?
Partially. The issue was not really about whether it is a constraint vs. a specific version. The real issue was that the specific versions included "upper bound" constraints (as well as lower bound constraints); this adds a challenge for downstream redistributors who have updated their setuptools version.

Several upper bound constraints were removed in that commit, but one remains:

```toml
# 69.0.0 breaks handling of --config-settings=--build-option, which our CI
# relies on. So constrained to an older version until we figure out a
# workaround. See comment at
# https://github.com/pypa/pip/issues/11859#issuecomment-2132287974.
"setuptools<69.0.0",
```

Redistributors may or may not be using `--build-option`, but if they are, they are dealing with it at scale across many packages, not specifically with zstandard.
> @indygreg `wheel` shouldn't be listed there, as `setuptools` depends on it when needed 🤷‍♂️

Granted, if the goal is recursively pinning all packages that should be installed into an isolated build venv, then listing wheel would indeed be intentional...
> > @indygreg `wheel` shouldn't be listed there, as `setuptools` depends on it when needed 🤷‍♂️
>
> Granted, if the goal is recursively pinning all packages that should be installed into an isolated build venv, then listing wheel would indeed be intentional...

@eli-schwartz using the feature of pip-tools that I mentioned earlier, it's still possible to retrieve such additional/transitive deps for a PEP 517 build env.
I'll remove `wheel` as an explicit dependency since it is transitive.

As for `setuptools`, I'm really not sure what to do.
I want to use `pip install --config-settings=` since that is the official way forward. I ported CI to use it last year, and even wrote a long blog post about the experience, which turned into some long threads on discuss.python.org.

But days after that ordeal, setuptools broke `--config-settings=--build-option`, at least in the way this project's CI uses it. See also my comment at https://github.com/pypa/pip/issues/11859#issuecomment-2132287974.

I'm not sure how to even implement dynamic build backend features any more in a PEP 517 / `pyproject.toml` world. I already spent hours shaving this yak last year, and quite frankly I don't want to have to page it all in again. Maybe I should just use the deprecated `pip install --global-option` and hope setuptools gets its house in order before pip removes that feature?
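For reference, the two invocation styles being weighed look like this; `--some-feature` is a hypothetical stand-in, not the actual option this project's CI passes:

```console
# The PEP 517-era spelling that setuptools 69.0.0 broke for this CI:
$ python -m pip install . --config-settings="--build-option=--some-feature"
# The deprecated spelling mentioned as a possible fallback:
$ python -m pip install . --global-option="--some-feature"
```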
I think preferably you'd tell cibuildwheel to use `ci/requirements.${{matrix.py}}.txt` instead of `build-system.requires` to specify which version of setuptools your CI relies on.
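One way to wire that up, sketched under the assumption that the pins live in a single constraints file (cibuildwheel forwards `environment` values into each build, so the `PIP_CONSTRAINT` trick from earlier in this thread applies there too):

```toml
# pyproject.toml - illustrative, not zstandard's actual configuration
[tool.cibuildwheel]
environment = { PIP_CONSTRAINT = "ci/constraints.txt" }
```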
zstandard 0.22.0 pins `wheel` to `0.41.2`, but `wheel 0.41.3` is already out. Building with `wheel==0.41.3` works well. Please update the dependencies. Thank you.