Open taion opened 7 years ago
I should list the alternatives here that I see:

- Add the dependency to `setup.py`, then add it separately to `Pipfile`, so that it shows up both as a transitive dependency and in my locked dev environment
- Generate a separate abstract requirements file (`requirements.in`) from `Pipfile` – but we already have `setup.py`, `setup.cfg`, `Pipfile`, and `Pipfile.lock` – adding yet another one seems even worse
- Use `pyproject.toml`, make a TOML parser a build-system-level requirement, and pull from `Pipfile` directly – but I don't think `pyproject.toml` is ubiquitous enough to make this safe, and the `Pipfile` format isn't exactly what you want in `setup.py` anyway

Also, while in principle `Pipfile`s can include dependencies that can't be represented nicely in this manner (e.g. a GitHub repo or something), in practice for libraries this is unlikely to be a major concern.
Hopefully this is the right place to add another request: a bit of documentation explaining in clear terms what should be done when converting some code into a library. I'm far from a Python expert, and I used `Pipfile` and `pipenv` to manage dependencies for an app. People requested I turn the app into a library, and it's been a struggle for me to figure out The Right Way to manage dependencies.
I also think that `Pipfile.lock` should contain the original versions of the packages – not just so they can be read from `setup.py`, but also to make it possible to quickly check whether the lockfile is out of date with respect to `Pipfile`. In other words, to quickly find out whether a user has added something to `Pipfile` and has not updated `Pipfile.lock` just yet.

I can elaborate on the use case more if needed. This is the same mechanism `yarn` uses to check whether `yarn.lock` is up to date (and we take advantage of it in vagga).
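The freshness check described above can be sketched in a few lines. This is a minimal illustration only: it assumes, hypothetically, that the lockfile records a plain SHA-256 of the Pipfile's raw bytes under `_meta.hash.sha256` (real tools such as pipenv hash a normalized representation of the Pipfile, not the file bytes), so treat it as a sketch of the idea rather than a drop-in check:

```python
import hashlib
import json


def pipfile_hash(pipfile_path):
    """Hash the raw bytes of a Pipfile (assumption: real tools hash a
    normalized representation, not the file bytes)."""
    with open(pipfile_path, "rb") as f:
        return hashlib.sha256(f.read()).hexdigest()


def lockfile_is_current(pipfile_path, lockfile_path):
    """Compare the hash recorded in the lockfile against the current Pipfile.

    Returns False when the Pipfile has changed since the lock was written,
    or when the lockfile carries no recorded hash at all.
    """
    with open(lockfile_path) as f:
        lock = json.load(f)
    recorded = lock.get("_meta", {}).get("hash", {}).get("sha256")
    return recorded == pipfile_hash(pipfile_path)
```

A CI job could call `lockfile_is_current("Pipfile", "Pipfile.lock")` and fail the build on `False`, which is essentially what `yarn --frozen-lockfile` style checks do.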
Any updates on this issue? It should be pretty trivial to do, and the maintenance cost should be minimal. Should I make a PR?
This feels to me like it is adding another quirk to Python packaging. If we do this, `setup.py` will depend on `Pipfile.lock`, which is very weird: abstract dependencies in `setup.py` will now be generated from a file that is supposed to contain concrete dependencies (`Pipfile.lock`).

As the use case is for libraries, doesn't this also mean that libraries now have to publish their `Pipfile.lock` so that `setup.py` works? But that's against the general principle that libraries are not supposed to publish their lockfile, as they need to be able to work with many different versions of their dependencies.
To me, a much better way to handle this would be to extend the `Pipfile` with enough information to automatically generate a `setup.py` from it. But maybe I am not understanding the distinction between `setup.py` and `Pipfile` well enough? Is there a case where you'd want to have both a `Pipfile` and a `setup.py` with different information? In all the cases I can imagine, these two files would contain the same information (with `setup.py` having additional metadata, and `Pipfile` specifying additional deps for other environments): the abstract dependencies that my application/library depends on to run ((1) and (2) in the classification above).
> This feels to me like it is adding another quirk to python packaging. If we do this, `setup.py` will depend on `Pipfile.lock` which is very weird: now abstract dependencies in `setup.py` will be generated from a file supposed to contain concrete dependencies (`Pipfile.lock`).
Sorry, but one of us is misunderstanding this issue. My understanding is that `Pipfile.lock` would contain a copy of the requirements from either `Pipfile` or `setup.py`, so that it's easy to find out whether `Pipfile` (or `setup.py`) was changed without the lockfile being updated yet.

Am I misunderstanding something?
Summary: It would be nice if `Pipfile.lock` contained the abstract dependencies in a way that could be used in `install_requires` in `setup.py`, as this would allow better tooling in the near term for library developers.

This is intended as an actionable follow-up to https://github.com/pypa/pipfile/issues/27, https://github.com/kennethreitz/pipenv/issues/209.
To summarize many earlier discussions: across applications and libraries, we can taxonomize three different kinds of abstract dependencies ((1)–(3)), plus two kinds of concrete dependencies ((4) and (5)).
Currently, (1) and (3) are handled via `Pipfile` (or `requirements.in`). (2) is handled by `setup.py`. (4) and (5) are handled by `Pipfile.lock` (or `requirements.txt`).

In practice, however, (1) and (2) above are often managed in very similar ways. Also, the requirements around (5) are nearly identical for both libraries and applications.
This mostly works, but there is proximate room for improvement in tooling support. Specifically, my user workflow for adding a dependency per (2) to a library closely resembles that for adding a dependency per (1) to an application. I want to add the dependency to the abstract specification, then lock down a version for my development environment per (5) to have a reproducible build environment (so I can e.g. separate out test failures from dep version bumps from those relating to my own code). While this is easy for an application with `Pipfile`, it's not really possible for a library using `setup.py`, since `setup.py` is arbitrary Python rather than structured data.

In an ideal world, this would be a non-issue if we had a `setuptools` entry in `pyproject.toml`, but we don't have that right now. This would also be less of an issue if we could just include `packages` from `Pipfile` in `setup.py`, but barring near-universal adoption of `pyproject.toml`, there's no real way to access a TOML parser in `setup.py`.

That leaves a last option, which is realizable – as `Pipfile.lock` is just JSON, if the abstract dependencies from `Pipfile` are made available in `Pipfile.lock`, then it would be straightforward to just forward those dependencies through from `setup.py` with an appropriately configured package manifest.

This would mitigate the flaws in the tooling available for Python library development right now. Once better solutions become available in the future, since this is something that would be handled within packages, it would be straightforward to remove this section from `Pipfile.lock` and move people to newer tooling instead.
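To make the "forward through from `setup.py`" step concrete: because `Pipfile.lock` is plain JSON, the stdlib suffices. The sketch below assumes a hypothetical `"_requires"` section in `Pipfile.lock` holding the abstract dependencies copied from `Pipfile` – the very section this issue proposes; it does not exist today, and the key name is made up for illustration:

```python
import json


def abstract_requires(lockfile="Pipfile.lock"):
    """Read the hypothetical "_requires" section of Pipfile.lock and turn it
    into setuptools requirement strings, e.g.
    {"requests": ">=2.0", "flask": "*"} -> ["requests>=2.0", "flask"]."""
    with open(lockfile) as f:
        lock = json.load(f)
    # "_requires" is the proposed (not yet existing) section of abstract deps
    abstract = lock["_requires"]
    return [name if spec == "*" else name + spec for name, spec in abstract.items()]


# A setup.py could then forward these through, provided the package manifest
# ships Pipfile.lock with the sdist:
#
#     setup(..., install_requires=abstract_requires())
```

No TOML parser is needed at build time, which is the whole appeal over reading `Pipfile` directly.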