nedbat opened 3 years ago
The JSON API includes the `requires_python` metadata for each package. If that were included, more packages would be marked as ready for 3.10 (for example). As one data point, pip will happily install Jinja2 into 3.10, but Jinja2 is marked as not ready for 3.10 because it doesn't mention the 3.10 classifier. It has `requires_python: ">=3.6"` and so could be marked as ready.
Thanks for the issue! Yes, this should be possible, and I would merge a PR that adds this.
What about packages which have an open-ended `requires_python: ">=3.6"`, saying 3.6+, but don't yet actually support 3.10 (or it's unknown)?
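(For concreteness, a minimal sketch of what that `requires_python` check could look like against the PyPI JSON API; this is not the site's actual code, and it assumes the third-party `requests` and `packaging` libraries. The package name is just an example.)

```python
# Minimal sketch: would a package's requires_python admit a given Python
# version? Assumes the third-party `requests` and `packaging` libraries.
import requests
from packaging.specifiers import SpecifierSet

def requires_python_admits(package, python_version="3.10"):
    info = requests.get(f"https://pypi.org/pypi/{package}/json").json()["info"]
    spec = info.get("requires_python")
    if not spec:
        return None  # nothing declared, so this signal tells us nothing
    return python_version in SpecifierSet(spec)

# An open-ended ">=3.6" admits 3.10 whether or not 3.10 was ever tested.
print(requires_python_admits("jinja2"))
```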
Here's my reasoning for using only `python_requires` in Pallets. With classifiers, I have to add a new line and make a new, potentially empty release just to indicate that I support the new Python. With `python_requires`, I make a new release only to fix compatibility if needed, which is rare. I would also have to do potentially multiple releases if I want to indicate that all the previous 2.Y-1, etc. releases also supported the latest Python.
It seems that other projects have come to the same conclusion, that keeping the classifiers up to date is noisy/busy work that doesn't really benefit actively maintained/compatible projects.
We ran our tests against 3.10.0rc months ago and addressed any test failures then. I don't think we actually had to make any new releases, just fix some tests.
It's sort of weird though, because technically `python_requires >= 3.6` doesn't indicate support the same way a classifier would. If a project becomes unmaintained and does use something that gets deprecated/breaks after a few releases, it would still appear to "support" the latest version. But at that point you get into trying to determine if a project is still maintained, run its `tox` config against the given Python version, etc.
I do see at the top of the site that it says "don't *explicitly* support Python 3.10" (emphasis added). To indicate the potential inaccuracy of checking `python_requires`, maybe the site should have more than a green/not-green color.
green with /// shading: implicit support detected via `python_requires`. The color key at the top could mention that packages with this color will probably have support if they were actively maintained around the time of release.
Are there any projects in the top 360 that are known to be unmaintained and break with 3.10? Perhaps a list of `(name, version)` pairs could be maintained that will override detection and mark them as red/unsupported on any version page >= the given one. The policy could be to not accept a PR to the list until X months after release, to allow projects time to update. If a project does become maintained again, a PR can remove it from the list.
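(A rough sketch of what such an override list could look like; the structure and names are hypothetical, and isodate is the example that comes up later in this thread.)

```python
# Hypothetical override list: unmaintained packages mapped to the first
# Python version they are known to break on. Any version page >= that
# version would show them as red/unsupported, regardless of requires_python.
KNOWN_BROKEN = {
    "isodate": "3.10",
}

def override_status(package, python_version):
    broken_since = KNOWN_BROKEN.get(package)
    if broken_since is None:
        return None  # no override; fall back to normal detection
    key = lambda v: tuple(int(p) for p in v.split("."))
    return "unsupported" if key(python_version) >= key(broken_since) else None
```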
green with /// shading: implicit support detected via `python_requires`. The color key at the top could mention that packages with this color will probably have support if they were actively maintained around the time of release.
This makes sense to me.
Are there any projects in the top 360 that are known to be unmaintained and break with 3.10?
Seems unlikely, but it is hard to gauge whether a project is maintained or not.
It seems that other projects have come to the same conclusion, that keeping the classifiers up to date is noisy/busy work that doesn't really benefit actively maintained/compatible projects.
312 of the top 360 (87%) have a 2.x or 3.x classifier.
Are there any projects in the top 360 that are known to be unmaintained and break with 3.10?
Yes, isodate is one. There are others (~I think~ in the top 360) which still test using nose (nose is unmaintained and doesn't work with 3.10), so it's still unknown if those projects support 3.10.
To get an idea of how many packages don't yet support Python 3.10, I tried installing them all with pip. (Ubuntu pt1, pt2; macOS pt1, pt2; Windows pt1, pt2.)
Looking at the 319 packages not explicitly declaring support for Python 3.10 (on Sunday), these failed to install:
Ubuntu:
macOS:
Windows:
In total, 42 unique packages.
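(Roughly, that experiment amounts to something like the following sketch; the interpreter path and package list here are placeholders.)

```python
# Sketch of the install smoke test: try `pip install` for each package under
# the target interpreter and collect the failures. Placeholder inputs only.
import subprocess

PACKAGES = ["pandas", "scipy", "tensorflow"]  # really the 319 without a classifier

failed = []
for name in PACKAGES:
    result = subprocess.run(
        ["python3.10", "-m", "pip", "install", "--no-input", name],
        capture_output=True,
    )
    if result.returncode != 0:
        failed.append(name)

print(len(failed), "failed to install:", ", ".join(failed))
```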
Of these 42: 27 have `requires_python` which, if we use the proposed metric, means we'd say they are ready for Python 3.10 when they aren't. Only three explicitly prohibit 3.10 via `requires_python`:
Trusting an open-ended `python_requires` would declare that these 27 packages are ready for Python 3.10 when they don't even install for Python 3.10.
I can imagine someone wanting to work with pandas, SciPy or TensorFlow checking this site, seeing it green for 3.10, and then being confused about why it won't even install.
And successful installation doesn't necessarily mean successfully running on Python 3.10. I expect more packages outside this list have runtime incompatibilities as well, such as isodate.
There are other metrics and heuristics that could be added. For example, check whether a wheel exists for the given Python version, and treat that the same as a classifier. Skimming the list, it looks like most that don't install probably have other install requirements, such as compilation toolchains. I'd also treat having no platform wheels and only a generic wheel as satisfying that metric.
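(As a sketch of that wheel heuristic against the PyPI JSON API; the tag matching here is deliberately simplified and ignores abi3, which comes up later in this thread.)

```python
# Simplified sketch: does the latest release ship a wheel plausibly usable on
# CPython 3.10? Counts a cp310-tagged wheel or a generic py3-none-any wheel;
# ignores abi3 and the other subtleties of real tag matching.
import requests

def has_wheel_for(package, cp_tag="cp310"):
    files = requests.get(f"https://pypi.org/pypi/{package}/json").json()["urls"]
    for f in files:
        if f["packagetype"] != "bdist_wheel":
            continue
        if cp_tag in f["filename"] or f["filename"].endswith("-py3-none-any.whl"):
            return True
    return False
```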
I'd like to hear more ideas for solutions to the issue. I strongly prefer not having to update classifiers on my projects, for the reasons stated above. So they will forever be listed as "not Python 3.X ready" under the current metric, which is unfortunate.
FYI, the @pygame_org Twitter account is using this list based on classifiers to argue to the PSF that it should not promote .0 releases (or maybe even .1 releases) of Python: https://twitter.com/pygame_org/status/1584872593597042688. (And they blocked me for disagreeing that this release is not just like the last ones.)
My suggestion: mark any packages that do not have minor-version classifiers in a different color. Also check for wheels, and anything shipping 3.11 binary wheels gets a green color too (I think this is easier with the JSON API? Haven't checked). This still misses some pure-Python libraries, but the colors + check for wheels would really help.
Some examples that are still not 3.10 compatible according to the current logic: setuptools, six, importlib-metadata, zipp, click, flask, beautifulsoup4, and more.
FYI, you can't use `Requires-Python` metadata for this, because it was not intended[^1] to be capped, and causes issues if it is. It was only designed to allow authors to drop support for old releases, not limit support for new releases. If you don't support Python 3.12, you can't add `<3.12` to `Requires-Python`; it just causes back-solves to older uncapped releases. Numba, which never supports future Python versions, had to remove this as a method of capping.
[^1]: At least it was never implemented.
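(To make the back-solving concrete: pip filters out releases whose Requires-Python excludes the running interpreter, then picks the newest release that remains. A toy model of that selection; the release history below is made up.)

```python
# Toy model of pip's Requires-Python filtering: a capped newest release is
# skipped entirely on a too-new Python, silently selecting an older uncapped
# (and equally untested) release instead. Release data below is invented.
from packaging.specifiers import SpecifierSet
from packaging.version import Version

releases = {
    "0.56": ">=3.7",        # older release, no cap
    "0.57": ">=3.7,<3.12",  # newest release, capped
}

def pick(python_version):
    ok = [Version(v) for v, spec in releases.items()
          if python_version in SpecifierSet(spec)]
    return max(ok) if ok else None

print(pick("3.11"))  # 0.57 -- cap satisfied
print(pick("3.12"))  # 0.56 -- back-solves to the old uncapped release
```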
Here's my concrete suggestion, three colors:

- Green + checkmark for N classifier or N binary wheel.
- Red + X for neither of these but N-1 classifier or N-1 binary wheel present[^2].
- White + question mark otherwise.

Didn't check to see if any of those have ABI3 wheels; those are harder to place.
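(A minimal sketch of that three-way classification, assuming the classifier list and wheel interpreter tags have already been fetched; the inputs here are illustrative.)

```python
# Sketch of the proposed three-color logic for a target version N, given a
# package's trove classifiers and the interpreter tags from its wheel names.
def color(classifiers, wheel_tags, n="3.11", n_minus_1="3.10"):
    def supported(version):
        return (f"Programming Language :: Python :: {version}" in classifiers
                or "cp" + version.replace(".", "") in wheel_tags)
    if supported(n):
        return "green"  # checkmark: explicit N support
    if supported(n_minus_1):
        return "red"    # X: N-1 present, but silent about N
    return "white"      # question mark: not enough signal

print(color({"Programming Language :: Python :: 3.10"}, {"cp310"}))  # red
```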
14 packages[^1] would be added by including wheels in addition to classifiers for 3.11. Only 25 (total) packages provide 3.10 wheels but not 3.11, compared to 36 that provide 3.11 wheels.
This doesn't turn `packaging`, `setuptools`, etc. green, but it at least fixes 14 of them and adds two layers of color.
I can implement that if it sounds reasonable.
[^1]: pyyaml, pandas, cffi, aiohttp, lxml, greenlet, scipy, cython, kiwisolver, ruamel-yaml-clib, zope-interface, pyzmq, torch, shapely

[^2]: This still often means a package just forgot to update in the case of classifiers. It's much more of a real "red X" if there's no wheel. So maybe this could be red / light red?
I made this chart that shows the number of packages that support a particular Python version based on binary wheels' filenames (limited to 3.9-3.11; I wanted a visual aid to help think about whether to upgrade a project I'm involved with to 3.9 or 3.10). The discussion here was helpful for this, thank you.
Caveats & notes: The underlying script makes use of the `abi3` tag when it's present. I didn't know about this tag before reading this thread, so how the ABI tag is used may be incorrect or insufficient. The underlying data is from something like two weeks ago. The chart doesn't auto-update at all.
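(For what it's worth, the `packaging` library can parse wheel filenames into tags directly, which also surfaces `abi3`; a small sketch, where the filename is just an example.)

```python
# Sketch: pull interpreter/ABI tags out of a wheel filename with `packaging`.
# A cpXY-abi3 wheel is usable on CPython >= X.Y, including later releases.
from packaging.utils import parse_wheel_filename

name, version, build, tags = parse_wheel_filename(
    "cryptography-41.0.3-cp37-abi3-manylinux_2_28_x86_64.whl"
)
for tag in tags:
    print(tag.interpreter, tag.abi, tag.platform)
# -> cp37 abi3 manylinux_2_28_x86_64: covers 3.7+ despite no cp310/cp311 tag
```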
Red + X for neither of these but N-1 classifier or N-1 binary wheel present
For the chart, I adjusted this condition to "neither of these but a classifier for a different minor version present or a binary wheel for an earlier minor version present". Although the numbers are very few, some packages, like backports.zoneinfo, legitimately do not support the last few Python versions, and if we look at only the previous version (N-1), they get classified as 'maybe' / 'white + question mark' rather than 'no' / 'red + X'.
In case it's useful, I wrote a script that has a few different tricks for determining whether a package supports a Python: https://github.com/hauntsaninja/python_readiness
It can currently determine whether something is explicitly supported, viable, unsupported, or unknown. See the exact set of support levels here: https://github.com/hauntsaninja/python_readiness/blob/main/python_readiness.py#L166-L173. It mainly uses classifiers and wheel tags. There are a few interesting wrinkles that you can see in the comments (e.g. I do use Requires-Python, but in a very limited way).
I've been using variants of this script to help drive upgrades at work since Python 3.11.
I've tried the script I've been using to process my own files on the top 360 packages from PyPI; here are the stats:
```
Totals:
  Wheel: 36
  Classifier: 44
  NoWheel: 21
  NoClassifier: 207
  Unknown: 52

Max classifiers:
  3.12: 129
  3.11: 38
  3.10: 19
  3.9: 10
  3.8: 5
  3.7: 2
  3.6: 3
  3.4: 1
```
`NoClassifier` means there's at least one versioned classifier, but no 3.13 one (and I print a histogram of the max classifier present above). `NoWheel` means there are wheels for 3.12, but not 3.13. `Unknown` means there are no versioned classifiers and no 3.12 wheels. I'm not considering Python 2 classifiers.
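(For reference, a sketch of how the max-classifier bucket can be computed from a package's classifier list; the input format is the standard trove classifier strings.)

```python
# Sketch: highest minor-version CPython classifier a package declares.
# Python 2 classifiers are ignored, matching the stats above.
import re

def max_classifier(classifiers):
    found = []
    for c in classifiers:
        m = re.fullmatch(r"Programming Language :: Python :: (3\.\d+)", c)
        if m:
            found.append(tuple(int(p) for p in m.group(1).split(".")))
    return max(found, default=None)

print(max_classifier([
    "Programming Language :: Python :: 3.8",
    "Programming Language :: Python :: 3.12",
]))  # (3, 12)
```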
Script is at https://github.com/henryiii/pystats/blob/9575988e260db3bbc3f170d5621eec7d0d255045/pyready/pymodel.py
For fun, I went ahead and did the top 8000. It makes a 1.2 GB JSON file in the middle.
```
Totals:
  Wheel: 171
  Classifier: 733
  NoWheel: 364
  NoClassifier: 4724
  Unknown: 2008

Max classifiers:
  3.12: 1950
  3.11: 952
  3.10: 454
  3.9: 314
  3.8: 308
  3.7: 256
  3.6: 255
  3.5: 110
  3.4: 73
  3.3: 41
  3.2: 8
  3.1: 3
```
IMO, the only "bright red" packages in the top 360 right now are these:
```
wrapt                       Needs wheels!  >=3.6
pyarrow                     Needs wheels!  >=3.8
SQLAlchemy                  Needs wheels!  >=3.7
frozenlist                  Needs wheels!  >=3.8
psycopg2-binary             Needs wheels!  >=3.7
snowflake-connector-python  Needs wheels!  >=3.8
google-crc32c               Needs wheels!  >=3.9
ruamel.yaml.clib            Needs wheels!  >=3.6
dulwich                     Needs wheels!  >=3.7
pyrsistent                  Needs wheels!  >=3.8
pendulum                    Needs wheels!  >=3.8
debugpy                     Needs wheels!  >=3.8
torch                       Needs wheels!  >=3.8.0
tokenizers                  Needs wheels!  >=3.7
mysql-connector-python      Needs wheels!  >=3.8
pyodbc                      Needs wheels!  >=3.8
setproctitle                Needs wheels!  >=3.7
scikit-image                Needs wheels!  >=3.9
mypy                        Needs wheels!  >=3.8
lz4                         Needs wheels!  >=3.8
sentencepiece               Needs wheels!
```
Those don't have wheels, so they're actually pretty likely not to work. Though I think mypy does work (it's just slower), and IIRC lz4 does as long as it can build (compiler present and all that). And wrapt should work, but just be slower. So even that's not a guarantee.
Capping requires-python doesn't actually work correctly, so it's rare to see any caps except `<4` (mostly Poetry projects); the only one in the top 360 is apache-airflow, so that probably won't work either. Packages that tried to add a cap, like numpy and numba, had to remove it again.