Closed: Eric-Arellano closed this 1 year ago
cc @jsirois @stuhood @benjyw: we've known this is an issue for a long time. Thoughts on the proposal?
It's really a retro-posal. V1 worked this way.
Wouldn't it be better to just get rid of the distinction between `version` and `extra_requirements` and allow this to point to a target that can aggregate several `python_requirement_library` targets? And in the future it could point to any binary target that we build from source...
> Wouldn't it be better to just get rid of the distinction between `version` and `extra_requirements`
The distinction is intentional, to make it much easier to change the defaults. You can easily change the version while not touching the default extra requirements, or vice versa. Updating the default for a list option is clunky: you have to completely override it unless you're only adding or removing elements.
Yes, but that problem can be eliminated under this proposal. We can have `[python_tool].requirements` point to a target that in turn points to any number of targets. There is no need for a list option.
Note that v2 JVM tools have been using this hybrid approach: you can use either a "coordinate string" or an address to a `jvm_artifact` target. It's working out well. I think this feature would be nice for Python.
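For reference, the JVM side looks roughly like this: a `jvm_artifact` target in a BUILD file that a tool option can then reference by address (a sketch; the target and field names come from the JVM backend, but the exact option wiring may differ):

```python
# BUILD: a jvm_artifact target describing a Maven coordinate.
# A JVM tool option can point at this target by its address
# (e.g. 3rdparty/jvm:google-java-format) instead of a raw
# "group:artifact:version" coordinate string.
jvm_artifact(
    name="google-java-format",
    group="com.google.googlejavaformat",
    artifact="google-java-format",
    version="1.13.0",
)
```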
This just bit me for `mypy` and `typing-extensions` :rage2:
A user migrating from Poetry pointed out that they would like to be able to leverage the flake8 plugins they already have defined with Poetry. Right now, they must duplicate the requirement strings. This issue would solve that problem.
Along similar lines:
For the code in `[pylint].source_plugins`, I would like to use the resolve from `[pylint].lockfile`. I tried setting `resolve="pylint"` in the BUILD file under the `source_plugins` directory, but that says that pylint is not a valid resolve name.
My goal is to separate the pylint plugins' requirements so they aren't in the python-default resolve any more.
I guess this proposal is very similar but goes in the opposite direction: reference a requirement from the python-default resolve in the tool requirements.
So, I guess my question about this proposal is, how does using a requirement interact with multiple resolves? Would the tool still have a separate lockfile+resolve? Or would it start using the python-default resolve? Or would I need to create an explicit resolve for the tool and put the requirements in there?
> For the code in `[pylint].source_plugins`
I had pointed @cognifloyd at this ticket thinking that it was necessary for source plugins, until I remembered https://github.com/pantsbuild/pants/issues/14320. While the design has some awkward edges, I think source plugins work well enough already. They do not need this ticket at all.
Another example/use case I posted in Slack:
Getting this on `./pants_from_sources dependencies helloworld/translator/translator_test.py` in example-python:
```
00:57:24.50 [WARN] Pants cannot infer owners for the following imports in the target helloworld/translator/translator_test.py:tests:
  * pytest (line: 4)
If you do not expect an import to be inferrable, add `# pants: no-infer-dep` to the import line. Otherwise, see https://www.pantsbuild.org/v2.15/docs/troubleshooting#import-errors-and-missing-dependencies for common problems.
```
This goes away if I add pytest to my requirements.txt (`//:reqs#pytest`).
What if we go the other direction: add a target, similar to `pants_requirements`, that pulls tool requirements from pants.toml. This way you can say: "Add all the requirements (`[tool].version` and `[tool].extra_requirements`) to resolve foobar" (defaulting to the python-default resolve). Maybe call it `pants_tool_requirements` or `python_tool_requirements`. That way pants.toml is still the source of truth.
```python
python_tool_requirements(
    tool="pytest",
    resolve="my-resolve",
)
```
Or, we add a BUILD symbol that allows referencing metadata in pants.toml.
```python
python_requirement(
    name="pytest",
    requirements=[
        pants_toml["pytest"]["version"],
    ] + pants_toml["pytest"]["extra_requirements"],
    resolve="my-resolve",
)
```
Spinning further in that direction... oh, I misread pants.toml as pyproject.toml. I'll post anyway, as I think it's an interesting idea (and the pattern could be applied fairly generically too, I believe):
```python
# Generates targets as appropriate from a pyproject.toml Poetry config
import_poetry_project(
    source="pyproject.toml",
    dependencies=True,
    devDependencies=False,
    distribution=True,
    ...
)
```
> `python_tool_requirements`

I think I like that. Although, we often don't want everything from the resolve; you may only directly import some of the requirements from the tool.
> `import_poetry_project`

We have `poetry_requirements()` already for that.
Well, that would be just the `dependencies=True` part of my suggestion ;)
I'm not saying this is something we necessarily want to do, but in case there are multiple aspects of some setup that you want to create targets for, having an "umbrella" target to manage it may make sense to reduce boilerplate and the explicit know-how of all the various types involved.
+1 for this. A use case we have is that we use dependabot/renovate to manage dependency upgrades, and it's really only compatible with a pyproject.toml (Poetry style) or a requirements.txt file. It would be nice to be able to define the versions of flake8/pytest/black and corresponding plugins in our Poetry file instead of pants.toml.
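For instance, with Poetry the tool versions could live alongside the other dev dependencies in one Renovate-friendly file (a hypothetical pyproject.toml fragment; the group name and version ranges are illustrative):

```toml
# pyproject.toml: tool versions pinned where dependabot/renovate can see them
[tool.poetry.group.dev.dependencies]
flake8 = "^5.0"
flake8-bugbear = "^22.9"
pytest = "^7.1"
black = "^22.6"
```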
I'm +1 on pants.toml referring to other, standard places (rather than the reverse)
After raising an issue on the slack channel @thejcannon asked me to add my (potential) use case here:
I’m trying to set up debugging (for the test and run goals) in VSCode. I’m on an M1 Mac with Python 3.9. When I try to run `./pants test --debug-adapter [path to test file]`, I get the following error:
```
15:06:23.64 [INFO] Completed: Building pytest_runner.pex
15:06:23.64 [ERROR] 1 Exception encountered:

ProcessExecutionFailure: Process 'Building pytest_runner.pex' failed with exit code 1.
stdout:

stderr:
A distribution for debugpy could not be resolved for ~/.pyenv/versions/3.9.15/bin/python3.9.
Found 1 distribution for debugpy that do not apply:
1.) The wheel tags for debugpy 1.6.0 are cp38-cp38-macosx_10_15_x86_64 which do not match the supported tags of ~/.pyenv/versions/3.9.15/bin/python3.9:
cp39-cp39-macosx_13_0_arm64
... 409 more ...
```
It seems that pants isn't building debugpy and there isn't a wheel available. This issue might get around that problem.
Another reason to love this: https://www.pantsbuild.org/docs/reference-flake8#extra_files is superfluous :smile:
This is addressed in a different way by https://github.com/pantsbuild/pants/pull/18481, which allows a tool to be installed from a user resolve. In fact, the `version`/`extra_requirements` options are now deprecated, and we want there to be just one kind of lockfile, one that can be reused for tools as well as user code.
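If I understand that PR correctly, the new setup looks roughly like this in pants.toml (a sketch; the lockfile path is illustrative, and the exact option names should be checked against the current Pants docs):

```toml
# pants.toml
[python.resolves]
# A user resolve whose lockfile also pins pytest.
pytest = "3rdparty/python/pytest.lock"

[pytest]
# Install the pytest tool from that user resolve instead of a
# built-in version/extra_requirements pair.
install_from_resolve = "pytest"
```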
A weird thing with our tool support is that you can have the same requirement both to run a tool and as a library imported by your code. Classic example: pytest. Right now, you have to be careful to set the pytest version in both your requirements.txt and `[pytest].version`, violating DRY. Instead, this proposal would update `PythonToolBase` to allow either a raw requirement string or a target address for a `python_requirement_library`, so you could do this:
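A hypothetical sketch of the proposed usage (the target address and the idea of putting it in the `version` option are illustrative, not from the original):

```toml
# pants.toml
[pytest]
# Today: a raw requirement string.
version = "pytest>=6.2,<6.3"

[black]
# Proposed: a target address pointing at a python_requirement_library,
# so the version is defined once, next to the rest of the requirements.
version = "3rdparty/python:black"
```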