sin-ack opened this issue 2 months ago
As a workaround, one can exclude that specific requirement from the py_library/py_binary's deps, and rules_python will not attempt to install it.
Given that colorama should only be included on Windows and you are hitting this error on non-Windows, I suspect that you are either trying to cross-compile, or you are trying to use colorama without including it in your lock file; including it there should change the conditional inclusion of the wheel from Windows-only to all platforms.
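For illustration (the pin and version here are my own example, not taken from the reporter's lockfile), the two lockfile shapes being contrasted look like this in requirements.txt syntax:

```text
# Windows-only: on non-Windows hosts pip skips the wheel entirely,
# which is what leads to the crash described below.
colorama==0.4.6; platform_system == "Windows"

# All platforms: the wheel is always resolved, so rules_python can
# always find a downloaded .whl to install.
colorama==0.4.6
```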
We use a similar lockfile at work without any issues on rules_python 0.31, so I am perplexed as to why this is not working in your case.
We switched to a platform-agnostic lockfile (meaning you can use the same lockfile on many platforms; it has all the needed info and all the markers).
For example, it contains lines such as:
cupy-cuda12x==13.0.0; sys_platform == "linux" and platform_machine == "x86_64"
cupy-cuda12x==13.1.0; sys_platform == "linux" and platform_machine == "aarch64"
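As a sketch of what those markers mean, here they are evaluated with the `packaging` library (which the wheel installer in the traceback below also uses); the explicit environment dict is my own illustration, since `evaluate()` normally derives it from the running interpreter:

```python
from packaging.markers import Marker

marker_x86 = Marker('sys_platform == "linux" and platform_machine == "x86_64"')
marker_arm = Marker('sys_platform == "linux" and platform_machine == "aarch64"')

# evaluate() merges the given keys over the defaults derived from the
# current interpreter, so on a linux/x86_64 host only one pin applies.
env = {"sys_platform": "linux", "platform_machine": "x86_64"}
print(marker_x86.evaluate(env))  # True
print(marker_arm.evaluate(env))  # False
```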
We are running this on linux x86_64 and run into:
Error in fail: repo.execute: whl_library.ResolveRequirement(rules_python~~pip~pypi_research_311_cupy_cuda12x, cupy-cuda12x==13.1.0; sys_platform == "linux" and platform_machine == "aarch64"): end: failure:
....
....
....
===== stdout start =====
Ignoring cupy-cuda12x: markers 'sys_platform == "linux" and platform_machine == "aarch64"' don't match your environment
===== stdout end =====
===== stderr start =====
Traceback (most recent call last):
File "<frozen runpy>", line 198, in _run_module_as_main
File "<frozen runpy>", line 88, in _run_code
File "/root/.cache/bazel/_bazel_root/00e0182df830644af7af00c92693c660/external/rules_python~/python/pip_install/tools/wheel_installer/wheel_installer.py", line 205, in <module>
main()
File "/root/.cache/bazel/_bazel_root/00e0182df830644af7af00c92693c660/external/rules_python~/python/pip_install/tools/wheel_installer/wheel_installer.py", line 198, in main
whl = Path(next(iter(glob.glob("*.whl"))))
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
StopIteration
I think one way to fix it would be to add code along the lines of:
import packaging.requirements

result_lines = []
for line in lines:
    r = packaging.requirements.Requirement(line)
    # If the line has a marker and it's not for our platform of interest, skip it.
    if r.marker and not r.marker.evaluate():
        continue
    result_lines.append(line + "\n")
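Wrapped into a function with the environment made explicit (the helper name and the sample lines are mine, for illustration), the proposed filter behaves like this:

```python
from packaging.requirements import Requirement

def filter_for_platform(lines, environment=None):
    """Keep only requirement lines whose marker matches `environment`."""
    kept = []
    for line in lines:
        req = Requirement(line)
        # A missing marker means "applies everywhere"; otherwise keep the
        # line only when the marker matches the target platform.
        if req.marker is None or req.marker.evaluate(environment):
            kept.append(line)
    return kept

lines = [
    'cupy-cuda12x==13.0.0; sys_platform == "linux" and platform_machine == "x86_64"',
    'cupy-cuda12x==13.1.0; sys_platform == "linux" and platform_machine == "aarch64"',
]
env = {"sys_platform": "linux", "platform_machine": "x86_64"}
print(filter_for_platform(lines, env))  # only the 13.0.0 line survives
```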
After a little bit of thinking, I believe the problem is that supporting cross-platform requirements files would detract us from supporting other up-and-coming or existing cross-platform formats (e.g. pdm, poetry, uv).
I proposed a change in #1885 to unblock people who need different versions per (py_version, os, arch) tuple, and it aligns well with the existing codebase.
Technically, there is no parser for the requirements.txt file; the code that @vors is looking at deals with requirements within the METADATA files.
@aignas I just tested commit a6cb620f0c0c34082040b0560d4dd2e11e39715e against my project, but it doesn't fix the issue. The problem I have is that I can't really modify the requirements.txt file to drop this requirement, because it is automatically generated by rules_python_poetry from the poetry.lock. What would you recommend in this case?
If I understand correctly, poetry's lock file has a dependency that is only installed on a particular platform. In order to have a single version of the dependency, you would have to modify the poetry lock file to contain only a single version.
Let me know if that works for you.
Sorry, I'm a bit late to the issue.
@aignas I'm not sure I quite understand what you meant by
After a little bit of thinking I think the problem is that supporting cross-platform requirements would be something that detracts us from supporting other up and coming or existing cross-platform formats (e.g. pdm, poetry, uv).
Best I can figure, you're saying that supporting cross-platform requirements would make it more difficult to support the non-requirements.txt lockfiles from other package managers like pdm, poetry, uv, etc. But all of those support writing out requirements.txt files, and they all regard writing a single, cross-platform file as a very useful feature. pdm supports writing out a cross-platform file or a file for the current platform only. poetry supports writing a single cross-platform lockfile. uv supports writing out a lockfile for any platform; it does not yet support cross-platform output, but plans to. All of these tools export directly from their lockfiles to requirements.txt. So I must be missing something, because using these tools to export to a single cross-platform requirements.txt seems like it would be a great workflow. Adding support for a different requirements.txt per (os, arch) combo seems like a super useful feature that makes uv adoption easier, but I don't think it needs to come at the expense of being able to use poetry.
I'll also add that
In order to have a single version of the dependency you would have to modify the poetry lock file to only contain a single version.
is not a thing anyone should be doing, since it's a generated file and not a manually maintained one. With the correct edits, you're right that it should work. However, manually editing a poetry lockfile requires recomputing the hash on the file by hand and is no longer reproducible; a new run of poetry lock will overwrite those changes.
What I meant was that poetry, pdm, and other tools usually lock the dependencies, and each platform has certain constraints on the packages it includes. I do not mean modifying the lock files manually, but rather modifying the input files to the lock files.
The problem with requirements.txt is that, up until now, rules_python implicitly supported only a single version of each package in a given requirements file, because none of the Starlark tooling supports marker evaluation.
In your pyproject file you could modify your project constraints to ensure that the resulting pdm or poetry lock file contains a single version of each package instead of conditionally including different versions on different platforms. That would mean when those tools export a requirements file, it would not have conditional dependencies and rules_python would be able to read it without a problem.
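A sketch of what that pyproject.toml change might look like for the cupy example from above; the multiple-constraint syntax is poetry's, but the pins and the choice of which version to keep are illustrative only:

```toml
[tool.poetry.dependencies]
# Before: two conditional pins, which export as marker-gated lines
# that rules_python cannot currently pick between.
# cupy-cuda12x = [
#     { version = "13.0.0", markers = "platform_machine == 'x86_64'" },
#     { version = "13.1.0", markers = "platform_machine == 'aarch64'" },
# ]

# After: a single unconditional pin, so the exported requirements.txt
# contains one marker-free line for the package.
cupy-cuda12x = "13.0.0"
```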
uv outputting requirements files that are specific to one platform imposes the same constraint that pip-compile was imposing on users: in general, the requirements file is only compatible with the platform it was generated on/for.
I am wondering if supporting poetry, pdm, hatch lock files is something that we should do as they have more information in them, or if we should just support the environment markers in the requirement files.
Thank you for elaborating and providing context! Sorry for the misunderstanding.
I would think both supporting environment markers and the lockfiles would be very useful. Though if I had to choose it would be environment markers because there's a PEP for it, which makes it highly standardized and widely used/supported in third-party tools (and a single effort would provide better support for almost all third-party package managers).
We do support PEP 508 in the context of METADATA parsing to ensure that the right dependencies are pulled in, but we don't support it in the context of requirements.txt files because we don't have access to the Python interpreter at that stage. We could do it, but it is more work, and I was focusing on landing #1837 so that we can have a way to refer to packages for target platforms that are different from the host. I am happy to review PRs from anyone willing to implement support for requirement markers when parsing requirements.txt, so I'll leave the issue open.
Regarding the "standardized and widely used" part of your answer, the real world of requirements.txt is a little more nuanced. At the moment rules_python requires extras annotations in requirements.txt files; poetry supports them, pdm added support only very recently, and uv and pip-compile require the user to pass extra flags to not strip them. Having the extras annotations in the files is redundant for most package managers, which manage a single venv, but for rules_python, where each py_binary and py_test target creates a separate environment, the extras annotations are required.
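To make the extras point concrete (the package and extra here are my own example), an extras annotation is the bracketed part of a requirement line, and `packaging` exposes it when parsing:

```python
from packaging.requirements import Requirement

# A line as rules_python expects it, with the extras annotation kept.
req = Requirement("requests[socks]==2.31.0")
print(req.name)    # requests
print(req.extras)  # {'socks'}
# Exporters such as uv or pip-compile may strip "[socks]" unless told
# otherwise, which loses the information rules_python needs.
```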
I am writing all of this down not necessarily to disagree with you, but to add more context to the discussion of supporting any requirements.txt file out there: supporting output coming from uv and pip-compile is enough work already, and doing the same for other tools could be even more. If someone would like to pick that up and make the parser/infrastructure more compatible with other tools, feel free to add me as a reviewer. :)
🐞 bug report

Affected Rule

The issue is caused by the pip.parse extension.

Is this a regression?

No.

Description

If the given requirement has a platform restriction like platform_system == "Windows" and pip refuses to download the requirements, then Bazel will crash.

🔬 Minimal Reproduction

https://github.com/sin-ack/rules_python-repro

🔥 Exception or Error

🌍 Your Environment

Operating System:

Output of bazel version:

Rules_python version:

Anything else relevant?

I'm using rules_python_poetry, which generates that verbose requirement line.