di / pyreadiness

Python support graph for specific Python versions for the most popular Python packages
https://pyreadiness.org
Apache License 2.0
41 stars 8 forks

Can pyreadiness look at more than the classifiers? #18

Open nedbat opened 3 years ago

nedbat commented 3 years ago

The JSON API includes the requires_python metadata for each package. If that were taken into account, more packages would be marked as ready for 3.10 (for example).

As one data point, pip will happily install Jinja2 into 3.10, but Jinja2 is marked as not ready for 3.10 because it doesn't mention the 3.10 classifier. It has requires_python: ">=3.6" and so could be marked as ready.
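
As a rough sketch of the idea (my own illustration, not pyreadiness code; it assumes the requests and packaging libraries are available), the check could look like this:

```python
import requests
from packaging.specifiers import SpecifierSet

def allows_python(package, python_version):
    """True/False if requires_python answers the question, None if it's unset."""
    info = requests.get(f"https://pypi.org/pypi/{package}/json").json()["info"]
    spec = info.get("requires_python")
    if not spec:
        return None  # no requires_python metadata at all
    return SpecifierSet(spec).contains(python_version)

# Jinja2 declared requires_python ">=3.6" at the time, so this returns True.
print(allows_python("Jinja2", "3.10"))
```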

di commented 3 years ago

Thanks for the issue! Yes, this should be possible, and I would merge a PR that adds this.

hugovk commented 3 years ago

What about packages which have an open-ended requires_python: ">=3.6" saying 3.6+ but don't yet actually support 3.10 (or it's unknown)?

davidism commented 3 years ago

Here's my reasoning for using only python_requires in Pallets. With classifiers, I have to add a new line and make a new, potentially empty release just to indicate that I support the new Python. With python_requires, I make a new release only to fix compatibility if needed, which is rare. I would also potentially have to make multiple releases if I wanted to indicate that the previous 2.Y-1, etc. releases also support the latest Python.
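
For context, here is roughly how the two approaches are declared in a setuptools project (an illustrative snippet, not Pallets' actual configuration):

```python
from setuptools import setup

setup(
    name="example",
    # Classifiers must gain a new line (and usually a new release) for every
    # Python version the project wants to advertise support for.
    classifiers=[
        "Programming Language :: Python :: 3.8",
        "Programming Language :: Python :: 3.9",
        "Programming Language :: Python :: 3.10",  # new entry needed each year
    ],
    # python_requires only states a floor; no new release is needed when a
    # new Python comes out, unless something actually breaks.
    python_requires=">=3.6",
)
```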

It seems that other projects have come to the same conclusion, that keeping the classifiers up to date is noisy/busy work that doesn't really benefit actively maintained/compatible projects.

We ran our tests against 3.10.0rc months ago and addressed any test failures then. I don't think we actually had to make any new releases, just fix some tests.

It's sort of weird though, because technically, python_requires >= 3.6 doesn't indicate support the same way a classifier would. If a project becomes unmaintained and does use something that gets deprecated/breaks after a few releases, it would still appear to "support" the latest version. But at that point you get into trying to determine if a project is still maintained, run its tox config against the given Python version, etc.

davidism commented 3 years ago

I do see at the top of the site that it says "don't *explicitly* support Python 3.10" (emphasis added). To indicate the potential inaccuracy of checking python_requires, maybe the site should use more than a binary green/not-green color scheme.

davidism commented 3 years ago

Are there any projects in the top 360 that are known to be unmaintained and break with 3.10? Perhaps a list of (name, version) pairs could be maintained that would override detection and mark them as red/unsupported on any version page >= the given one. The policy could be not to accept a PR to the list until X months after a release, to allow projects time to update. If a project becomes maintained again, a PR can remove it from the list.
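
A sketch of what such an override list might look like (the names and structure here are hypothetical, not an existing pyreadiness feature):

```python
# Hypothetical override list: packages known to be unmaintained and broken,
# mapped to the first Python version they are known to break on. Any version
# page >= that version would show them as red/unsupported regardless of
# classifiers or python_requires.
KNOWN_BROKEN = {
    "some-unmaintained-package": (3, 10),  # placeholder entry
}

def override_status(package, python_version):
    """Return "unsupported" if an override applies, otherwise None."""
    broken_since = KNOWN_BROKEN.get(package)
    if broken_since is not None and python_version >= broken_since:
        return "unsupported"
    return None  # fall back to normal detection

print(override_status("some-unmaintained-package", (3, 11)))  # "unsupported"
```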

di commented 3 years ago

green with /// shading: implicit support detected via python_requires. The color key at the top could mention that packages with this color will probably have support if they were actively maintained around the time of release.

This makes sense to me.

Are there any projects in the top 360 that are known to be unmaintained and break with 3.10?

Seems unlikely, but it is hard to gauge whether a project is maintained or not.

hugovk commented 3 years ago

It seems that other projects have come to the same conclusion, that keeping the classifiers up to date is noisy/busy work that doesn't really benefit actively maintained/compatible projects.

312 of the top 360 have a 2.x or 3.x classifier, 87%.

Are there any projects in the top 360 that are known to be unmaintained and break with 3.10?

Yes, isodate is one. There are others (~~I think~~ in the top 360) which still test using nose (nose is unmaintained and doesn't work with 3.10), so it's still unknown whether those projects support 3.10.

hugovk commented 3 years ago

To get an idea of how many packages don't yet support Python 3.10, I tried installing them all with pip. (Ubuntu pt1, pt2; macOS pt1, pt2; Windows pt1, pt2.)

Looking at the 319 packages not explicitly declaring support for Python 3.10 (as of Sunday), the following failed to install.

Ubuntu:

  1. pywavelets
  2. h5py
  3. pandas-gbq
  4. tensorflow
  5. tensorflow-addons
  6. pyarrow
  7. tensorflow-data-validation
  8. tensorflow-serving-api
  9. tfx-bsl
  10. datalab
  11. scikit-image
  12. tensorflow-model-analysis
  13. tensorflow-transform
  14. backports-zoneinfo
  15. azureml-dataprep
  16. numba
  17. azureml-core
  18. llvmlite
  19. torch

macOS:

  1. pandas
  2. pyarrow
  3. matplotlib
  4. scikit-learn
  5. tensorflow-serving-api
  6. mlflow
  7. lightgbm
  8. seaborn
  9. pywavelets
  10. scikit-image
  11. gensim
  12. tfx-bsl
  13. pandas-gbq
  14. tensorflow-transform
  15. tensorflow
  16. tensorflow-addons
  17. tensorflow-data-validation
  18. tensorflow-model-analysis
  19. datalab
  20. h5py
  21. xgboost
  22. backports-zoneinfo
  23. statsmodels
  24. azureml-dataprep
  25. numba
  26. azureml-core
  27. llvmlite
  28. torch

Windows:

  1. ipywidgets
  2. jupyter
  3. h5py
  4. gensim
  5. lxml
  6. nbclient
  7. notebook
  8. pandas
  9. psycopg2
  10. psycopg2-binary
  11. pandas-gbq
  12. pyarrow
  13. pywavelets
  14. scipy
  15. scikit-image
  16. lightgbm
  17. tensorflow
  18. tensorflow-addons
  19. seaborn
  20. tensorflow-data-validation
  21. tensorflow-serving-api
  22. datalab
  23. terminado
  24. tfx-bsl
  25. tensorflow-transform
  26. widgetsnbextension
  27. snowflake-connector-python
  28. tensorflow-model-analysis
  29. xgboost
  30. scikit-learn
  31. azure-identity
  32. msal-extensions
  33. mlflow
  34. nbconvert
  35. backports-zoneinfo
  36. statsmodels
  37. azureml-dataprep
  38. numba
  39. azureml-core
  40. llvmlite
  41. torch

In total, 42 unique packages.
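
For reference, a minimal sketch of how a bulk install check like this could be scripted (not the actual method used above; the package list and approach are purely illustrative):

```python
# Try installing each package into a throwaway directory using the interpreter
# under test (e.g. a Python 3.10 install) and collect the failures.
import subprocess
import sys
import tempfile

PACKAGES = ["h5py", "numba", "torch"]  # placeholder subset of the top 360

failed = []
for name in PACKAGES:
    with tempfile.TemporaryDirectory() as target:
        result = subprocess.run(
            [sys.executable, "-m", "pip", "install", "--target", target, name],
            capture_output=True,
        )
    if result.returncode != 0:
        failed.append(name)

print(f"{len(failed)} packages failed to install: {failed}")
```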

Of these 42, 27 have a requires_python that allows 3.10, so under the proposed metric we would say they are ready for Python 3.10 when they aren't. Only three explicitly prohibit 3.10 via requires_python:

  1. azure-identity
  2. azureml-core >=3.6,<3.9
  3. azureml-dataprep
  4. backports.zoneinfo >=3.6
  5. datalab
  6. gensim >=3.6
  7. h5py >=3.7
  8. ipywidgets
  9. jupyter None
  10. lightgbm
  11. llvmlite >=3.7,<3.10
  12. lxml >=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*, != 3.4.*
  13. matplotlib >=3.7
  14. mlflow >=3.6
  15. msal-extensions
  16. nbclient >=3.6.1
  17. nbconvert >=3.7
  18. notebook >=3.6
  19. numba >=3.7,<3.10
  20. pandas >=3.7.1
  21. pandas-gbq >=3.7
  22. psycopg2 >=3.6
  23. psycopg2-binary >=3.6
  24. pyarrow >=3.6
  25. PyWavelets >=3.5
  26. scikit-image >=3.7
  27. scikit-learn >=3.7
  28. scipy >=3.7,<3.10
  29. seaborn >=3.6
  30. snowflake-connector-python >=3.6
  31. statsmodels >=3.7
  32. tensorflow
  33. tensorflow-addons
  34. tensorflow-data-validation >=3.6,<4
  35. tensorflow-model-analysis >=3.6,<4
  36. tensorflow-serving-api
  37. tensorflow-transform >=3.6,<4
  38. terminado >=3.6
  39. tfx-bsl >=3.6,<4
  40. torch >=3.6.2
  41. widgetsnbextension
  42. xgboost >=3.6

Trusting an open-ended python_requires would declare these 27 packages ready for Python 3.10 when they don't even install on Python 3.10.

I can imagine someone wanting to work with pandas, SciPy, or TensorFlow checking this site, seeing it green for 3.10, and then being confused about why it won't even install.

And successful installation doesn't necessarily mean successfully running on Python 3.10. I expect more packages outside this list have runtime incompatibilities as well, such as isodate.

davidism commented 3 years ago

There are other metrics and heuristics that could be added. For example, check whether a wheel exists for the given Python version, and treat that the same as a classifier. Skimming the list, it looks like most of the packages that don't install probably have other install requirements, such as compilation toolchains. I'd also treat having no platform wheels and only a generic wheel as satisfying that metric.
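
A rough sketch of that wheel-based check, assuming the PyPI JSON API's "urls" list for the latest release and the packaging library (a hypothetical helper, not existing pyreadiness code; it ignores abi3 wheels, which come up later in this thread):

```python
import requests
from packaging.utils import parse_wheel_filename

def has_wheel_for(package, cp_tag="cp310"):
    """True if the latest release ships a generic wheel or a wheel built for cp_tag."""
    files = requests.get(f"https://pypi.org/pypi/{package}/json").json()["urls"]
    wheel_names = [f["filename"] for f in files if f["filename"].endswith(".whl")]
    for filename in wheel_names:
        _, _, _, tags = parse_wheel_filename(filename)
        for tag in tags:
            # A pure-Python wheel, or a binary wheel built for the target
            # interpreter, both count as "a wheel is available".
            if tag.platform == "any" or tag.interpreter == cp_tag:
                return True
    return False
```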

I'd like to hear more ideas for solutions to the issue. I strongly prefer not having to update classifiers on my projects, for the reasons stated above. So they will forever be listed as "not Python 3.X ready" under the current metric, which is unfortunate.

henryiii commented 2 years ago

FYI, the @pygame_org twitter account is using this list, based on classifiers, to argue to the PSF that it should not promote .0 releases (or maybe even .1 releases) of Python: https://twitter.com/pygame_org/status/1584872593597042688. (And they blocked me for disagreeing that this release is not just like the last ones.)

My suggestion: mark any packages that do not have minor-version classifiers in a different color. Also check for wheels, and anything shipping 3.11 binary wheels gets a green color too (I think this is easier with the JSON API? Haven't checked). This still misses some Python libraries, but the colors plus the wheel check would really help.

Some examples that are still not 3.10 compatible according to the current logic: setuptools, six, importlib-metadata, zipp, click, flask, beautifulsoup4, and more.

FYI, you can't use Requires-Python metadata for this, because it was not intended[^1] to be capped, and capping it causes issues. It was only designed to allow authors to drop support for old releases, not to limit support for new releases. If you don't support Python 3.12, you can't add <3.12 to Requires-Python; it just causes pip to back-solve to older, uncapped releases. Numba, which never supports future Python versions, had to remove this as a method of capping.

[^1]: At least it was never implemented.

henryiii commented 2 years ago

Here's my concrete suggestion: three colors. Green + checkmark for an N classifier or an N binary wheel. Red + X for neither of those, but an N-1 classifier or N-1 binary wheel present[^2]. White + question mark otherwise. I didn't check whether any of those have abi3 wheels; those are harder to place.
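
In code, that decision table might look something like this sketch (illustrative only, not existing pyreadiness logic):

```python
def readiness_color(has_n_classifier, has_n_wheel, has_prev_classifier, has_prev_wheel):
    """Map classifier/wheel presence for Python N and N-1 to a display color."""
    if has_n_classifier or has_n_wheel:
        return "green"   # checkmark: explicit support signal for Python N
    if has_prev_classifier or has_prev_wheel:
        return "red"     # X: declared N-1 support, but nothing for N yet
    return "white"       # question mark: no versioned signal either way
```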

14 packages[^1] would be added by including wheels in addition to classifiers for 3.11. Only 25 (total) packages provide 3.10 wheels but not 3.11, compared to 36 that provide 3.11 wheels.

This doesn't turn packaging, setuptools, etc. green, but it at least fixes 14 of them and adds two layers of color.

I can implement that if it sounds reasonable.

[^1]: pyyaml, pandas, cffi, aiohttp, lxml, greenlet, scipy, cython, kiwisolver, ruamel-yaml-clib, zope-interface, pyzmq, torch, shapely

[^2]: This still often means a package just forgot to update in the case of classifiers. It's much more of a real "red X" if there's no wheel. So maybe this could be red / light red?

ento commented 1 year ago

I made this chart showing the number of packages that support a particular Python version based on binary wheel filenames (limited to 3.9–3.11; I wanted a visual aid to help decide whether to upgrade a project I'm involved with to 3.9 or 3.10). The discussion here was helpful for this, thank you.

Caveats & notes: the underlying script makes use of the abi3 tag when it's present. I didn't know about this tag before reading this thread, so my use of the ABI tag may be incorrect or insufficient. The underlying data is from roughly two weeks ago, and the chart doesn't auto-update.

Red + X for neither of these but N-1 classifier or N-1 binary wheel present

For the chart, I adjusted this condition to "neither of these but a classifier for a different minor version present or a binary wheel for an earlier minor version present". Although the numbers are very few, some packages, like backports.zoneinfo, legitimately do not support the last few Python versions, and if we look at only the previous version (N-1), they get classified as 'maybe' / 'white + question mark' rather than 'no' / 'red + x'.
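
For anyone curious how the abi3 tag factors in, here is a rough illustration of checking whether a stable-ABI wheel filename covers a given CPython 3.x version (my own sketch, not ento's script; the filename in the example is hypothetical):

```python
from packaging.utils import parse_wheel_filename

def abi3_covers(filename, minor):
    """True if an abi3 (stable-ABI) wheel covers CPython 3.<minor>."""
    _, _, _, tags = parse_wheel_filename(filename)
    for tag in tags:
        if tag.abi == "abi3" and tag.interpreter.startswith("cp3"):
            built_minor = int(tag.interpreter[3:])  # "cp38" -> 8, "cp310" -> 10
            # A stable-ABI wheel built for an older or equal CPython covers
            # all later 3.x versions (e.g. cp38-abi3 covers 3.8, 3.9, 3.10, ...).
            return built_minor <= minor
    return False

# Hypothetical filename, just to show the shape of the check:
print(abi3_covers("example_pkg-1.0-cp38-abi3-manylinux_2_17_x86_64.whl", 11))  # True
```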

hauntsaninja commented 1 month ago

In case it's useful, I wrote a script that has a few different tricks for determining whether a package supports a Python: https://github.com/hauntsaninja/python_readiness

I can currently determine whether something is explicitly supported, viable, unsupported, or unknown; see the exact set of support levels here: https://github.com/hauntsaninja/python_readiness/blob/main/python_readiness.py#L166-L173. It mainly uses classifiers and wheel tags. There are a few interesting wrinkles that you can see in the comments (e.g. I do use Requires-Python, but in a very limited way).

I've been using variants of this script to help drive upgrades at work since Python 3.11.

henryiii commented 1 month ago

I tried running the script I've been using to process my own files on the top 360 packages from PyPI; here are the stats:

Totals:
  Wheel: 36
  Classifier: 44
  NoWheel: 21
  NoClassifier: 207
  Unknown: 52

Max classifiers:
  3.12: 129
  3.11: 38
  3.10: 19
  3.9: 10
  3.8: 5
  3.7: 2
  3.6: 3
  3.4: 1

NoClassifier means there's at least one versioned classifier but no 3.13 one (the "Max classifiers" histogram above shows the maximum classifier present). NoWheel means there are wheels for 3.12 but not 3.13. Unknown means there are no versioned classifiers and no 3.12 wheels. I'm not considering Python 2 classifiers.

Script is at https://github.com/henryiii/pystats/blob/9575988e260db3bbc3f170d5621eec7d0d255045/pyready/pymodel.py

Full package list with info: ``` boto3 Yes >=3.8 urllib3 Yes >=3.8 botocore Yes >=3.8 requests No (3.12) >=3.8 setuptools >=3.8 certifi No (3.12) >=3.6 idna Yes >=3.6 charset-normalizer Yes Wheels >=3.7.0 typing-extensions Yes >=3.8 python-dateutil No (3.12) !=3.0.*,!=3.1.*,!=3.2.*,>=2.7 s3transfer Yes >=3.8 packaging Yes >=3.8 grpcio-status Yes >=3.8 aiobotocore Yes >=3.8 six >=2.7, !=3.0.*, !=3.1.*, !=3.2.* PyYAML Yes Wheels >=3.8 numpy Yes Wheels >=3.10 s3fs No (3.11) >=3.8 fsspec No (3.12) >=3.8 cryptography No (3.12) >=3.7 importlib-metadata >=3.8 pip No (3.12) >=3.8 cffi Yes Wheels >=3.8 pandas Yes Wheels >=3.9 zipp >=3.8 pydantic Yes >=3.8 google-api-core No (3.12) >=3.7 pycparser No (3.12) >=3.8 wheel No (3.12) >=3.8 jmespath No (3.11) >=3.7 attrs Yes >=3.7 protobuf No (3.12) >=3.8 rsa No (3.10) >=3.6,<4 click >=3.7 pyasn1 Yes >=3.8 awscli No (3.12) >=3.8 platformdirs Yes >=3.8 pytz Yes None colorama No (3.10) !=3.0.*,!=3.1.*,!=3.2.*,!=3.3.*,!=3.4.*,!=3.5.*,!=3.6.*,>=2.7 Jinja2 >=3.7 MarkupSafe Yes Wheels >=3.9 PyJWT No (3.12) >=3.8 googleapis-common-protos No (3.12) >=3.7 tomli >=3.8 filelock Yes >=3.8 pydantic-core Yes Wheels >=3.8 cachetools No (3.12) >=3.7 wrapt Needs wheels! >=3.6 google-auth No (3.12) >=3.7 virtualenv Yes >=3.7 pluggy No (3.11) >=3.8 pytest No (3.12) >=3.8 docutils No (3.11) >=3.9 pyarrow Needs wheels! >=3.8 pyparsing Yes >=3.6.8 pyasn1-modules Yes >=3.8 requests-oauthlib No (3.12) >=3.4 aiohttp Yes Wheels >=3.8 SQLAlchemy Needs wheels! >=3.7 oauthlib No (3.10) >=3.6 iniconfig No (3.11) >=3.7 annotated-types No (3.12) >=3.8 jsonschema Yes >=3.8 exceptiongroup >=3.7 yarl Yes Wheels >=3.8 scipy Yes Wheels >=3.10 multidict Yes Wheels >=3.8 tzdata >=2 soupsieve Yes >=3.8 greenlet Yes Wheels >=3.7 isodate Yes >=3.7 Pygments No (3.12) >=3.8 beautifulsoup4 >=3.6.0 psutil !=3.0.*,!=3.1.*,!=3.2.*,!=3.3.*,!=3.4.*,!=3.5.*,>=2.7 pillow Yes Wheels >=3.8 frozenlist Needs wheels! >=3.8 decorator No (3.10) >=3.5 pyOpenSSL No (3.12) >=3.7 aiosignal No (3.11) >=3.7 tomlkit No (3.12) >=3.8 requests-toolbelt No (3.11) >=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.* async-timeout No (3.11) >=3.7 tqdm No (3.12) >=3.7 distlib Yes None openpyxl No (3.11) >=3.8 more-itertools No (3.12) >=3.8 grpcio Yes Wheels >=3.8 et-xmlfile No (3.9) >=3.6 h11 No (3.10) >=3.7 Deprecated No (3.12) >=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.* sniffio >=3.7 lxml Yes Wheels >=3.6 anyio Yes >=3.9 PyNaCl No (3.10) >=3.6 proto-plus No (3.12) >=3.7 Werkzeug >=3.8 asn1crypto No (3.10) google-cloud-storage No (3.12) >=3.7 azure-core No (3.12) >=3.8 coverage Yes Wheels >=3.9 websocket-client No (3.12) >=3.8 msgpack Yes Wheels >=3.8 mypy-extensions No (3.11) >=3.5 rich Yes >=3.8.0 pexpect importlib-resources >=3.8 sortedcontainers No (3.7) ptyprocess chardet No (3.11) >=3.7 grpcio-tools Yes Wheels >=3.8 cloudpickle No (3.12) >=3.8 tenacity No (3.12) >=3.8 dill Yes >=3.8 aiohappyeyeballs Yes >=3.8 httpx No (3.12) >=3.8 rpds-py Yes Wheels >=3.8 poetry-core No (3.12) >=3.8,<4.0 referencing No (3.12) >=3.8 Flask >=3.8 matplotlib Yes Wheels >=3.9 google-cloud-core No (3.12) >=3.7 python-dotenv No (3.12) >=3.8 msal No (3.12) >=3.7 httpcore No (3.12) >=3.8 jsonschema-specifications Yes >=3.9 psycopg2-binary Needs wheels! >=3.7 markdown-it-py No (3.11) >=3.8 keyring >=3.8 bcrypt No (3.12) >=3.7 google-resumable-media No (3.12) >=3.7 poetry-plugin-export No (3.12) <4.0,>=3.8 mdurl No (3.10) >=3.7 scikit-learn Yes Wheels >=3.9 pkginfo Yes >=3.8 pathspec No (3.12) >=3.8 snowflake-connector-python Needs wheels! 
>=3.8 paramiko No (3.11) >=3.6 tabulate No (3.10) >=3.7 GitPython No (3.12) >=3.7 networkx No (3.12) >=3.10 regex Yes Wheels >=3.8 kiwisolver Yes Wheels >=3.8 jaraco.classes >=3.8 smmap No (3.12) >=3.7 gitdb No (3.12) >=3.7 jeepney >=3.7 SecretStorage No (3.10) >=3.6 wcwidth No (3.12) build Yes >=3.8 backoff No (3.10) >=3.7,<4.0 shellingham No (3.12) >=3.7 ruamel.yaml No (3.12) >=3.7 typedload Yes >=3.8 xmltodict Yes >=3.6 cycler No (3.12) >=3.8 portalocker No (3.12) >=3.8 itsdangerous >=3.8 py No (3.10) >=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*, !=3.4.* google-crc32c Needs wheels! >=3.9 RapidFuzz Yes Wheels >=3.9 threadpoolctl No (3.12) >=3.8 pyproject-hooks >=3.7 pyzmq Yes Wheels >=3.7 awswrangler No (3.12) <4.0,>=3.8 google-cloud-bigquery No (3.12) >=3.7 sqlparse No (3.12) >=3.8 fastjsonschema No (3.12) None py4j No (3.10) azure-storage-blob No (3.12) >=3.8 msal-extensions No (3.12) >=3.7 pytest-cov No (3.12) >=3.8 trove-classifiers None termcolor Yes >=3.9 mccabe No (3.10) >=3.6 joblib No (3.12) >=3.8 google-api-python-client No (3.12) >=3.7 google-auth-oauthlib No (3.12) >=3.6 fastapi No (3.12) >=3.8 pycodestyle >=3.8 azure-identity No (3.12) >=3.8 fonttools Yes Wheels >=3.8 ruamel.yaml.clib Needs wheels! >=3.6 CacheControl No (3.12) >=3.7 marshmallow No (3.12) >=3.8 alembic No (3.12) >=3.8 tzlocal No (3.12) >=3.8 docker No (3.12) >=3.8 PyMySQL No (3.12) >=3.7 distro No (3.12) >=3.6 prompt-toolkit No (3.12) >=3.7.0 Cython Yes Wheels !=3.0.*,!=3.1.*,!=3.2.*,!=3.3.*,>=2.7 starlette Yes >=3.8 redis No (3.12) >=3.8 uritemplate No (3.10) >=3.6 poetry No (3.12) <4.0,>=3.8 httplib2 No (3.11) >=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.* isort No (3.12) >=3.8.0 ply google-auth-httplib2 No (3.12) defusedxml No (3.9) >=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*, !=3.4.* blinker >=3.8 dnspython Yes >=3.9 uvicorn No (3.12) >=3.8 dulwich Needs wheels! >=3.7 crashtest No (3.11) >=3.7,<4.0 pyrsistent Needs wheels! >=3.8 toml No (3.9) >=2.6, !=3.0.*, !=3.1.*, !=3.2.* cleo No (3.11) >=3.7,<4.0 scramp No (3.11) >=3.8 nest-asyncio No (3.12) >=3.5 gunicorn No (3.12) >=3.7 Markdown No (3.12) >=3.8 babel No (3.12) >=3.8 installer >=3.7 msrest No (3.10) >=3.6 huggingface-hub No (3.11) >=3.8.0 opentelemetry-api No (3.12) >=3.8 azure-common No (3.9) grpc-google-iam-v1 No (3.12) >=3.7 ipython >=3.10 traitlets >=3.8 black Yes Wheels >=3.9 pycryptodomex Yes !=3.0.*,!=3.1.*,!=3.2.*,!=3.3.*,!=3.4.*,!=3.5.*,>=2.7 types-requests >=3.8 pycryptodome Yes !=3.0.*,!=3.1.*,!=3.2.*,!=3.3.*,!=3.4.*,!=3.5.*,>=2.7 future No (3.12) >=2.6, !=3.0.*, !=3.1.*, !=3.2.* setuptools-scm No (3.12) >=3.8 mock No (3.11) >=3.6 contourpy Yes Wheels >=3.9 sentry-sdk Yes >=3.6 pyflakes >=3.8 pendulum Needs wheels! 
>=3.8 requests-aws4auth No (3.12) >=3.7 tornado No (3.11) >=3.8 prometheus-client No (3.12) >=3.8 multiprocess Yes Wheels >=3.8 PyGithub No (3.12) >=3.8 webencodings No (3.6) typing-inspect No (3.11) jedi No (3.12) >=3.6 parso No (3.9) >=3.6 jsonpointer No (3.12) >=3.7 flake8 >=3.8.1 kubernetes No (3.11) >=3.6 Mako No (3.12) >=3.8 openai No (3.12) >=3.7.1 matplotlib-inline No (3.12) >=3.8 loguru No (3.12) >=3.5 argcomplete No (3.12) >=3.8 transformers No (3.10) >=3.8.0 datadog No (3.9) !=3.0.*,!=3.1.*,!=3.2.*,!=3.3.*,!=3.4.*,!=3.5.*,!=3.6.*,>=2.7 redshift-connector No (3.11) >=3.6 pytest-runner >=3.7 retry No (3.4) bs4 None pg8000 No (3.11) >=3.8 sagemaker No (3.11) >=3.8 opentelemetry-sdk No (3.12) >=3.8 asgiref No (3.12) >=3.8 pymongo Yes Wheels >=3.8 jsonpath-ng No (3.12) python-json-logger No (3.11) >=3.6 opentelemetry-semantic-conventions No (3.12) >=3.8 imageio No (3.11) >=3.8 typer No (3.12) >=3.7 aioitertools >=3.8 pyspark No (3.11) >=3.8 zope.interface Yes Wheels >=3.8 executing Yes >=3.8 gym-notices websockets Yes Wheels >=3.8 pkgutil_resolve_name >=3.6 debugpy Needs wheels! >=3.8 apache-airflow No (3.12) <3.13,~=3.8 smart-open No (3.11) <4.0,>=3.7 asttokens No (3.12) shapely Yes Wheels >=3.7 pytzdata No (3.8) >=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.* humanfriendly No (3.9) >=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*, !=3.4.* snowflake-sqlalchemy No (3.12) >=3.8 arrow No (3.12) >=3.8 elasticsearch No (3.12) >=3.8 torch Needs wheels! >=3.8.0 stack-data No (3.12) oscrypto No (3.10) PySocks No (3.6) >=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.* pure-eval Yes None typeguard No (3.12) >=3.8 google-cloud-secret-manager No (3.12) >=3.7 requests-file None google-cloud-pubsub No (3.12) >=3.7 tokenizers Needs wheels! >=3.7 Sphinx Yes >=3.10 jsonpatch No (3.9) >=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*, !=3.4.*, !=3.5.*, !=3.6.* tb-nightly No (3.11) >=3.9 mysql-connector-python Needs wheels! >=3.8 adal No (3.6) pylint Yes >=3.9.0 sympy No (3.11) >=3.8 jupyter-core >=3.8 orjson Yes Wheels >=3.8 google-pasta No (3.8) toolz No (3.12) >=3.8 jupyter-client >=3.8 ipykernel >=3.8 astroid Yes >=3.9.0 nbconvert >=3.8 types-python-dateutil >=3.8 pytest-mock No (3.12) >=3.8 xlrd No (3.9) >=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*, !=3.4.*, !=3.5.* opensearch-py No (3.12) <4,>=3.8 aiofiles Yes >=3.8 appdirs No (3.8) pbr No (3.11) >=2.6 nodeenv No (3.10) !=3.0.*,!=3.1.*,!=3.2.*,!=3.3.*,!=3.4.*,!=3.5.*,!=3.6.*,>=2.7 pyodbc Needs wheels! >=3.8 jupyterlab No (3.12) >=3.8 mpmath No (3.9) jupyter-server >=3.8 setproctitle Needs wheels! >=3.7 progressbar2 No (3.12) >=3.8 scikit-image Needs wheels! >=3.9 nbformat No (3.12) >=3.8 XlsxWriter No (3.12) >=3.6 tox Yes >=3.8 aenum No (3.11) xgboost No (3.11) >=3.8 bleach No (3.12) >=3.8 comm >=3.8 identify >=3.8 jaraco.functools >=3.8 mypy Needs wheels! >=3.8 schema No (3.11) None db-dtypes No (3.12) >=3.7 rfc3339-validator No (3.8) >=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*, !=3.4.* mistune No (3.11) >=3.7 slack-sdk No (3.12) >=3.6 tinycss2 No (3.12) >=3.8 pre-commit >=3.9 python-utils >3.9.0 nltk No (3.12) >=3.8 google-cloud-appengine-logging No (3.12) >=3.7 notebook No (3.12) >=3.8 asynctest No (3.7) >=3.5 click-man google-cloud-aiplatform No (3.12) >=3.8 jaraco.context >=3.8 sshtunnel No (3.8) cattrs No (3.12) >=3.8 altair No (3.12) >=3.8 absl-py No (3.12) >=3.7 cfgv >=3.8 lz4 Needs wheels! >=3.8 watchdog Yes Wheels >=3.9 jupyterlab-server No (3.12) >=3.8 nbclient No (3.11) >=3.8.0 semver No (3.12) >=3.7 responses No (3.12) >=3.8 sentencepiece Needs wheels! 
tensorboard No (3.11) >=3.9 ```

For fun, I went ahead and did the top 8000. It produces a 1.2 GB JSON file as an intermediate step.

Totals:
  Wheel: 171
  Classifier: 733
  NoWheel: 364
  NoClassifier: 4724
  Unknown: 2008

Max classifiers:
  3.12: 1950
  3.11: 952
  3.10: 454
  3.9: 314
  3.8: 308
  3.7: 256
  3.6: 255
  3.5: 110
  3.4: 73
  3.3: 41
  3.2: 8
  3.1: 3

henryiii commented 1 month ago

IMO, the only "bright red" packages in the top 360 right now are these:

wrapt Needs wheels! >=3.6
pyarrow Needs wheels! >=3.8
SQLAlchemy Needs wheels! >=3.7
frozenlist Needs wheels! >=3.8
psycopg2-binary Needs wheels! >=3.7
snowflake-connector-python Needs wheels! >=3.8
google-crc32c Needs wheels! >=3.9
ruamel.yaml.clib Needs wheels! >=3.6
dulwich Needs wheels! >=3.7
pyrsistent Needs wheels! >=3.8
pendulum Needs wheels! >=3.8
debugpy Needs wheels! >=3.8
torch Needs wheels! >=3.8.0
tokenizers Needs wheels! >=3.7
mysql-connector-python Needs wheels! >=3.8
pyodbc Needs wheels! >=3.8
setproctitle Needs wheels! >=3.7
scikit-image Needs wheels! >=3.9
mypy Needs wheels! >=3.8
lz4 Needs wheels! >=3.8
sentencepiece Needs wheels!

Those don't have wheels, so they're actually pretty likely not to work. Though I think mypy does work (it's just slower), and IIRC lz4 does as long as it can build (compiler present and all that). And wrapt should work, just more slowly. So even that is not a guarantee.

Capping requires-python doesn't actually work correctly, so it's rare to see any caps except <4 (mostly Poetry projects). The only one in the top 360 is apache-airflow, so that probably won't work either. Packages that tried to add a cap, like numpy and numba, had to remove it again.