Closed flying-sheep closed 1 year ago
@annarosenthal, that’s great to hear. But it still says “pyproject.toml has no dependencies or is too large to display” for us. See iterative/dvc for example.
Also, does this mean that we can now use dependency-review-action with pyproject.toml files?
@annarosenthal Awesome! Are there docs?
Thanks for flagging. We looked into it, and this is caused by a discrepancy in the assumptions we made for different/older PEP standards. Apologies for that. I've filed a bug internally and will update this issue once the fix ships.
What kind of docs are you looking for, in particular? We have this general doc; please let me know what specific questions and/or topics aren't covered there that you'd still like addressed.
Hello! From my perspective, I see one source of confusion in the Supported package ecosystems section of the About the dependency graph document. That section has a list of supported package managers, and for Python there are two: pip and Python Poetry. Today, the pyproject.toml file is only listed as a supported format for Python Poetry, and that is the source of my confusion: should pyproject.toml now also be listed as a supported format for pip? And should it be a recommended format for either pip and/or Python Poetry?
I think the idea behind the docs is simply: “we recommend using a lockfile”. I’m not sure why that is: pinning versions is not a good idea for Python libraries, and for libraries I’d rather run tests with the newest version of everything instead of some older locked version, because that’s what people get when they install my library.
requirements.txt is a convention, not a standard. Some people use it for abstract dependencies (reading it in setup.py and writing its contents to the package’s metadata), others use it as a lockfile, e.g. via
pip-compile --generate-hashes pyproject.toml --output-file=requirements.txt
But as long as Dependabot recommends lockfiles, and Python doesn’t have a standard lockfile format, it makes sense to say “we recommend using poetry.lock / requirements.txt AS LOCKFILES”, while of course the recommended format for Python package metadata remains PEP 621.
Maybe the whole segment should be reworded to be more clear.
Our recommendations:
- in addition to abstract dependencies, also use a lockfile
- avoid code-as-configuration
We support the following formats:
| Language | Abstract dependencies | Lockfile | Code-as-configuration |
| --- | --- | --- | --- |
| Python | pyproject.toml, Pipfile | requirements.txt[‡], poetry.lock, Pipfile.lock | setup.py |
| Node | package.json | package-lock.json, yarn.lock | … |

[‡] requirements.txt can be used as a lockfile via pip-tools. It is sometimes also used for abstract requirements, but using pyproject.toml is preferable.
We have this general doc
That helps, thanks!
👋 Hi folks -- with the latest changelog you should be seeing the expected dependencies from your pyproject.toml files in your dependency graph insights. Do let me know if you experience any issues.
Thank you for your feedback on the docs! I'll share it with the docs team.
I may be in a small minority doing this, but we only use pyproject.toml and poetry.lock to generate a requirements.txt that has pinned SHAs (to satisfy dependency pinning for OpenSSF Scorecards). When a dependency gets bumped, I’m seeing Dependabot PRs that update the poetry.lock, but to get the requirements.txt (which is what’s actually used for pulling in dependencies) into sync I then have to manually run:
poetry export --format requirements.txt --output requirements.txt
Yes, I guess I could change my workflow to use Poetry for the install (though I think there’s potentially a bootstrapping problem there). But it would also be nice if Dependabot could just keep poetry.lock and requirements.txt in sync for me.
Examples:
- A Dependabot PR that modifies poetry.lock but leaves requirements.txt untouched
- The PR for the manually generated changes to requirements.txt (feels like this could easily be done by Dependabot too)
- The workflow that runs python3 -m pip install --require-hashes -r tools/requirements.txt
Does this mean we can remove the stub setup.py from projects using Poetry?
Will this also work for PDM? 😊
There are exactly two standard ways of specifying dependencies, neither of which Dependabot supports.

1. PEP 517’s prepare_metadata_for_build_wheel(). The legacy way to specify Python dependencies is requirements.txt. The standard way is having a PEP 517 build system configured. That build system can optionally define a function called prepare_metadata_for_build_wheel(). If that function exists, the dependencies can be obtained (from the resulting metadata) without building a wheel; otherwise build_wheel() needs to be called.
2. PEP 631, a new standardized way to specify dependencies in a Python project (now merged into PEP 621). It’s supported by flit and a bunch of other new build backends. So if you want to continue your (incomplete) approach of just parsing files and not calling into Python, this format needs to be supported.