One of the issues I'm hoping to resolve with the above is that, if we follow the setup described here: https://github.com/jazzband/pip-tools#requirements-from-pyprojecttoml, then `pip install .` would install from `requirements.in` instead of the `requirements.txt` generated by `pip-compile`.
Currently, pip-tools can't compile only extra dependencies. This would require a new option, which I'd call `--only-extra`, for example:

```console
# includes test deps only
pip-compile --only-extra=test -c requirements.txt -o test-requirements.txt
```

Note the `-c requirements.txt`, so that common sub-dependencies are in sync with `requirements.txt`.
Feel free to submit a PR; I'd gladly review and merge.
Thank you for the quick reply.
I hope to work on that pull request.
What are your thoughts on specifying inputs to `pip-compile` in `pyproject.toml`?

Currently, I cannot find a way to specify inputs to `pip-compile` inside `pyproject.toml` to generate a `requirements.txt` whilst also pointing `pip`/`setuptools` to the resulting `requirements.txt` file. The docs I linked to before have `pip` installing against `requirements.in` instead of the resulting `requirements.txt`.
That is what I was hoping to avoid with the lines:

```toml
[project]
...
dynamic = ["dependencies", "optional-dependencies"]

[tool.setuptools.dynamic]
dependencies = { file = ["./requirements/requirements.txt"] }
optional-dependencies.dev = { file = ["./requirements/requirements-dev.txt"] }
optional-dependencies.test = { file = ["./requirements/requirements-test.txt"] }
```
But then, `pip-compile` has no input to generate the `requirements.txt` files. Should there be some way to inline `requirements.in` under `[tool.pip-compile]`?
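For example, something like this purely hypothetical sketch (pip-tools has no such keys today; they're only an illustration of the idea):

```toml
# Hypothetical: inline the contents of requirements.in directly.
[tool.pip-compile]
requirements = ["alembic", "fastapi", "sqlalchemy"]
output-file = "requirements/requirements.txt"
```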
As a workaround, you can statically declare dependencies in the `pyproject.toml` file, and use `pip install -e . -c requirements.txt` to apply constraints. See the example below.
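A minimal sketch of that workaround, assuming a pip-tools version that can read `pyproject.toml` directly (as in the README section linked above); the project name and dependencies here are placeholders:

```toml
[project]
name = "myproject"
version = "0.1.0"
dependencies = ["fastapi", "sqlalchemy"]  # declared statically, unpinned
```

```console
$ pip-compile -o requirements.txt pyproject.toml
$ pip install -e . -c requirements.txt
```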
Thanks for the example. My only bugbear is the extra arguments required for `pip install`.

Part of the motivation is to have `pyproject.toml` be a single source of truth whilst allowing `pip install myproject` to "just work" without passing any additional command-line args. In order for that to happen, it seems there would need to be a way of decoupling the actual dependencies in `pyproject.toml` from the inputs to `pip-compile` via `[tool.pip-tools]`.
I was suggesting that one way of implementing this could be to have:

```toml
[tool.pip-tools]
dependencies = [...]

[tool.pip-tools.optional-dependencies]
dev = [...]
```

as overrides to:

```toml
[project]
dependencies = [...]

[project.optional-dependencies]
dev = [...]
```
I'd love to hear what you and the others think of this particular aspect. Everything would be configured in a single file, and `pip install` would just work.
I'm sorry, I think I am missing some understanding of all your goals here, but thought I'd offer another workaround/flow for my best understanding of the ask:
`pyproject.toml`:

```toml
[build-system]
requires = ["flit_core >=3.2,<4"]
build-backend = "flit_core.buildapi"

[project]
name = "myproject"
authors = [{name = "Andy", email = "andy@example.com"}]
license = {file = "LICENSE"}
classifiers = ["License :: OSI Approved :: MIT License"]
dynamic = ["version", "description"]
dependencies = ["alembic", "fastapi", "pydantic[email]", "sqlalchemy"]

[project.urls]
Home = "https://github.com/andydecleyre/myproject"

[project.optional-dependencies]
dev = ["black", "flit", "nox", "tomli"]
test = ["pytest", "pytest-playwright"]

[tool.pip-tools]
upgrade = true
header = false
annotation-style = "line"
strip-extras = true
allow-unsafe = true
```
`noxfile.py`:

```python
"""Tasks using Python environments."""
from pathlib import Path

import nox
import tomli

nox.options.default_venv_backend = 'venv'
nox.options.reuse_existing_virtualenvs = True


@nox.session(python='3.10')
def lock(session):
    """Generate updated lock files from pyproject.toml."""
    metadata = tomli.loads(Path('pyproject.toml').read_text())
    # Temporary .in files assembled from pyproject.toml's dependency tables:
    tempfiles = {
        Path('requirements.in'): '\n'.join(metadata['project']['dependencies']),
        Path('requirements-test.in'): '\n'.join(
            metadata['project']['optional-dependencies']['test']
            + ['-c requirements.txt']
        ),
        Path('requirements-dev.in'): '\n'.join(
            metadata['project']['optional-dependencies']['dev']
            + ['-r requirements.txt', '-r requirements-test.txt']
        ),
    }
    session.install('-U', 'pip-tools', 'pip')
    Path('requirements').mkdir(exist_ok=True)
    with session.chdir('requirements'):
        for in_file, content in tempfiles.items():
            in_file.write_text(content)
            session.run('pip-compile', '--config', '../pyproject.toml', str(in_file))
        for in_file in tempfiles:
            in_file.unlink()
```
Generate lock files:

```console
$ pip install -e '.[dev]'
$ nox -s lock
```
Thanks for the flit example.
I'm probably not doing a good job of explaining, so I'll take a step back and try again.
1. As part of the specification, `pyproject.toml` has a prescribed way of specifying project dependencies via `dependencies` and `optional-dependencies`. But we also have the option of specifying these as `dynamic` values, so that their values may be provided by the output of another tool, in this case `pip-compile`.

2. `pip-compile` should be generating their `dynamic` values for other build tools to use. The issue is that there is no way to specify the input to `pip-compile` inside a `pyproject.toml` file. For comparison, here is how this is possible in Poetry. You could achieve this outside `pyproject.toml` by doing something like:

   ```console
   pip-compile -o requirements.txt requirements.in
   ```

   and then the build tools would be pointed to the output:

   ```toml
   [project]
   ...
   dynamic = ["dependencies", "optional-dependencies"]

   [tool.setuptools.dynamic]
   dependencies = { file = ["requirements.txt"] }
   ```
3. The goal is that there shouldn't be any need for additional scripts or config files (e.g. `requirements.in`). Everything should be configured in `pyproject.toml`. There also shouldn't be any need for additional command-line flags: `pip install` and `pip install -e` should just work based off the `pyproject.toml`. One way I could imagine this being achieved is to essentially allow the dependencies to be specified under a `[tool.pip-tools]` heading:

   ```toml
   [project]
   ...
   dynamic = ["dependencies", "optional-dependencies"]

   [tool.pip-compile]  # or [tool.pip-tools]
   dependencies = { file = ["requirements.in"] }
   # dependencies = [...]  # or inlined directly

   [tool.setuptools.dynamic]
   dependencies = { file = ["requirements.txt"] }  # instead we want the output of pip-compile
   ```
Usually folks don't want the lockfile to have the same content as the project's declared dependencies, because it would be way too restrictive for general installation.
Definitely true for libraries, but for applications I would want the lockfile to be used for installation. But that's something I can control within the scope of a project and its README.md, so I guess this issue can be closed 🙂
@aaronsgithub it's not related to the scope of this project; it's because there's no standard for what you're asking. Specifying library deps is standardized. For apps, you need to specify the environment deps, which is essentially a collection of coordinated pinned packages installed into that environment. It just so happens to correspond to the app deps. Though, such sets of deps might be slightly different: test/dev deps would often include the app ones, while the linter deps might not need all the app deps. The test deps might be different per environment. Also, the build deps are separate from the runtime ones.

Each of those could benefit from a constraints/lock file. I know that some people want a unified lock file, while others understand that different environments might have conflicting constraints, and unrelated environments may negatively impact the lockfiles of the target.
This is all non-standardized, but I wouldn't say that the app constraints are unsupported. It's just that there's no standard for describing them within `pyproject.toml` (and honestly, I think that dumping so many semantically different tool and env configs into a single file has a ton of disadvantages).
I prefer simpler in+txt file pairs that have dedicated semantic meaning. Of course, they should be in a dedicated subfolder, like `requirements/`, so they don't pollute project roots.
That said, it might be possible to agree on having a pip-tools-specific section for the purpose of locking environments, but it'd have to be well thought out first. Ideally, such things should go through a standardization process so that there's some interoperability possible across different tools.
> I prefer simpler in+txt file pairs that have dedicated semantic meaning. Of course, they should be in a dedicated subfolder, like `requirements/`, so they don't pollute project roots.
I actually like this too, and this is my current setup.
However, I would say the motivation for seeking the ability to do everything within `pyproject.toml` stems from the learning curve Python packaging and project configuration has for beginners, and in certain cases it simplifies things if you can point collaborators to a single file.
> That said, it might be possible to agree on having a pip-tools-specific section for the purpose of locking environments, but it'd have to be well thought out first.
Yep, definitely requires more thought than I've provided 😅.
What I would say is that it would be nice to have a way of declaratively specifying the existing CLI functionality under `[tool.pip-tools]`, with the possibility of inlining `SRC_FILES` as an array.
> What I would say is that it would be nice to have a way of declaratively specifying the existing CLI functionality under `[tool.pip-tools]`, with the possibility of inlining `SRC_FILES` as an array.
You might be able to achieve something very close to what you're after with an additional tool, taskipy.
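For example, a sketch (the task names and compile commands here are illustrative, not from taskipy's docs): taskipy runs commands declared under `[tool.taskipy.tasks]` in `pyproject.toml`, invoked via `task <name>`:

```toml
[tool.taskipy.tasks]
lock = "pip-compile -o requirements/requirements.txt pyproject.toml"
lock-test = "pip-compile --extra test -c requirements/requirements.txt -o requirements/requirements-test.txt pyproject.toml"
```

Then `task lock` regenerates the lockfile without any standalone scripts, keeping the configuration in `pyproject.toml`.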
**What's the problem this feature will solve?**
Perhaps this is already possible with the existing functionality of pip-tools and I need to go to timeout. Or if there are workarounds which would allow me to get close to the situation described below, I would be grateful if someone could share details with me.
I want to use `pyproject.toml` as the sole configuration file in a Python project, and use `pip-compile` to generate all the necessary pinned requirements files, with unpinned dependencies specified in `pyproject.toml` instead of `requirements.in`. For argument's sake, let's say the requirements files we need are:

- `requirements.txt`: all requirements needed to run the application
- `requirements-test.txt`: contains only dependencies needed for integration tests
- `requirements-dev.txt`: a superset of both the above files, with additional development dependencies

Then `pip install` should be able to infer from the `pyproject.toml` file that it has to use:

- `requirements.txt` if `pip install .`
- `requirements-dev.txt` if `pip install .[dev]`
- `requirements-test.txt` if `pip install .[test]`
Here is an example `pyproject.toml` with hypothetical `tool.pip-compile` sections, and here is how the requirements files could be generated using `pip-compile`; a sketch of both follows below. The hypothetical `--deps` flag would generate a `requirements/requirements-test.txt` file using only the dependencies listed in `tool.pip-compile.optional-dependencies`.
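A sketch of the idea (both the `[tool.pip-compile]` dependency tables and the `--deps` flag are purely hypothetical; pip-tools supports neither today):

```toml
[project]
name = "myproject"
dynamic = ["dependencies", "optional-dependencies"]

# Hypothetical input sections for pip-compile:
[tool.pip-compile]
dependencies = ["alembic", "fastapi", "sqlalchemy"]

[tool.pip-compile.optional-dependencies]
test = ["pytest"]
dev = ["black", "nox"]
```

```console
$ pip-compile -o requirements/requirements.txt pyproject.toml
$ pip-compile --deps test -o requirements/requirements-test.txt pyproject.toml
$ pip-compile --deps dev -o requirements/requirements-dev.txt pyproject.toml
```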
Can anything equivalent be achieved with existing functionality?
**Describe the solution you'd like**

The following to be possible:

- inputs to `pip-compile` are specified under sections such as `tool.pip-compile.optional-dependencies.test`
- `pip install` knows to install via the requirements files generated by `pip-compile`
- `pyproject.toml` is the only configuration file in the project and acts as the single source of truth

**Alternative Solutions**

This can be achieved with separate configuration files, but the goal is to have `pyproject.toml` be a single source of truth and satisfy all needs :)

**Additional context**
That's all folks.