Closed: zehauser closed this issue 3 years ago.
How to handle this? Maybe this approach can help in some cases:
I have this problem with coderedcms: the currently published version requires Django < 3.1, but I want 3.1, and coderedcms is not a critical package for me right now.
I see that on GitHub they now allow Django < 3.2, so `poetry add git+https://github.com/coderedcorp/coderedcms.git` looks like a good way forward; unfortunately, poetry fails with a `CalledProcessError`.
I have cloned the package from GitHub into `myproject/changed_packages/coderedcms`. It already allows 3.2 now, but if 3.1 were still there I could change it to 3.2 in my local copy. Installing with `poetry add changed_packages/coderedcms/` then succeeds. (Note: with a relative path; with an absolute path poetry fails again, with some `ValueError`.)
I believe this is a reasonably correct way to install, because we manually bring all problematic packages into a state where the solver can find a non-conflicting solution.
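The resulting dependency entry would look something like this (a sketch using Poetry's path-dependency syntax; the package name and path mirror the example above):

```toml
[tool.poetry.dependencies]
# Install from the locally patched checkout (note: relative path)
coderedcms = { path = "changed_packages/coderedcms/" }
```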
The attitude of people like @finswimmer and @sdispater is the reason why we can't have nice things in the Python ecosystem. Poetry had a chance to become an actual standard in the messy world of package management after Pipenv failed to become one? Sorry guys, again: not this time, because the core maintainers are hilariously stubborn and arrogant enough to claim that it's "good for the Python ecosystem, as it forces more and more package developers to properly specify their needs".
Yeah, sure, it's absolutely good that you cannot use Poetry for reasons you have no influence over. Everything is fine, isn't it? And at the end of the day, it turns out that after many years the best option is still to use `pip` and `pip-tools`. Ironic. And simply sad.
@jaklan Well you may be out of luck then: https://github.com/pypa/pip/issues/8076.
`pip` is now more strict as well, due to the new resolver they implemented, and you will stumble upon the same issues you are currently facing with Poetry.
And I can assure you, we don't refuse this request just to annoy people. There are a lot of technical challenges in implementing something like this and exposing it to end users.
That being said, my proposition still stands:

> That being said, if someone finds a good solution to this problem and implements it, I'd gladly review it.

but, interestingly, none of the people who complained or downvoted our arguments have stepped up to find a technical solution to this problem.
@sdispater it is unclear what you mean by "technical solution". "Allow incorrectly-specified transitive dependencies to be overridden" is a "technical solution" in my eyes. Your objection to that seemed to be "someone might misuse it", so does that mean you won't accept anyone implementing this option in a PR? I can't imagine a technical solution to misspecified dependencies that cannot be misused. Or by "technical solution" do you mean it would be hard to actually implement this in poetry because of technical complications in its core?
> but, interestingly, none of the people that complained or downvoted our arguments have stepped up to find a technical solution to this problem.
Maybe I read a different thread, but I can see a bunch of ideas from @wolever, @hangtwenty, @drunkwcodes, @kylebebak, @seansfkelley and other guys, so talking about "none of the people" is pretty unfair.
And about the resolvers: with `poetry` I can't even install `mlflow` and `alembic` together, because:
```
$ poetry add mlflow alembic
Using version ^1.14.1 for mlflow
Using version ^1.5.7 for alembic

Updating dependencies
Resolving dependencies... (1.7s)

SolverProblemError

Because mlflow (1.14.1) depends on alembic (<=1.4.1)
 and no versions of mlflow match >1.14.1,<2.0.0, mlflow (>=1.14.1,<2.0.0) requires alembic (<=1.4.1).
So, because someproject depends on both mlflow (^1.14.1) and alembic (^1.5.7), version solving failed.
```
when `pyproject.toml` has nothing more than:

```toml
[tool.poetry.dependencies]
python = "~3.8"
```

So it's even more ridiculous, because `someproject` doesn't specify any version of `alembic`; the only concrete dependency is from `mlflow`, but `poetry` still tries to install `alembic ^1.5.7` for whatever reason.
Both the old and new `pip` have no problem resolving that:

```
$ pip install mlflow alembic
Successfully installed ... alembic-1.4.1 ... mlflow-1.14.1 ...
```
> This is the job of each package's maintainers: to ensure their dependencies are correct and loose enough to not create conflicts.
Well, they don't. So how can we use your tool given that constraint? I have to revert to pip every time a lib I use has overly narrow dependencies, and it is extremely annoying.
> @jaklan Well you may be out of luck then: pypa/pip#8076.
Yeah, that's why I froze pip at an old but usable version. The state of dependency management in Python is just sad.
To some on this thread — please be cognizant that this is a place to discuss the tradeoffs of the decision in this project.
Claiming people are "hilariously stubborn and arrogant" or "dependency management in python is just sad" is not relevant. I'd encourage you to consider whether you're positively contributing to this discussion.
> the only concrete dependency is from `mlflow`, but `poetry` still tries to install `alembic ^1.5.7` for whatever reason.
Have you raised with `mlflow` that pinning their dependencies makes it difficult to use their package in broader environments? What was their response?
Stepping back, I can see two possible worlds: a "Legible" one, in which packages have accurate and reliable dependency specs that tools can trust, and an "Illegible" one, in which they don't.

To the extent it's feasible, the Legible world is better. But moving from Illegible to Legible is a painful change: libraries are going to have to start paying attention to dependency specs, and until they do, it'll be painful for users. So it's going to require people like us raising issues in the libraries which don't have accurate and reliable dependency specs.
The nature of open source is that the work is optional. But to the extent people haven't done that work, complaining about the results will not be constructive.
> To some on this thread — please be cognizant that this is a place to discuss the tradeoffs of the decision in this project. Claiming people are "hilariously stubborn and arrogant" or "dependency management in python is just sad" is not relevant. I'd encourage you to consider whether you're positively contributing to this discussion.
There were many relevant answers, but they were ignored and didn't change the attitude at all. Claiming it's "good for the python ecosystem, as it forces more and more package developers to properly specify their needs", just because there's no plan (or will) to implement this, is simply unserious.
> Have you raised with `mlflow` that pinning their dependencies makes it difficult to use their package in broader environments? What was their response?
That's actually not relevant. In the example, `mlflow` is the only package which specifies any concrete version of the dependency `alembic`, so why does `poetry` still fail to install that package and look for the newest version instead?
> So it's going to require people like us raising issues in the libraries which don't have accurate and reliable dependency specs.
The challenge is, we all have deadlines and timelines, and often raising an issue with a project won't let us bring in that external project within a reasonable time span. At that point, our options are either (temporarily) forking every project, or having the ability to override from our project's side.
FWIW, I run a patched version of Poetry that allows overrides.
tl;dr: The maintainers want to see a PR. Contributors want to know which solutions are most interesting to the maintainers, since a PR is expensive to write. There are a few proposed solutions that could be turned into a PR. The ball is in the maintainers' court to specify which solutions sound acceptable, unless a contributor wants to risk a difficult PR being shot down for philosophical reasons.
> To the extent it's feasible, the Legible world is better.
Yes. In the common case, in my experience, this is true. But arguments against this feature are head-in-the-sand arguments. Poorly specified dependencies exist. Needlessly tight dependency bounds exist. Unmaintained libraries exist. They will always exist, even in this "Legible" world; just look at any language with a more mature packaging ecosystem.
The reason those of us advocating for this feature are frustrated is that the "final answer" given ignores those realities to elevate a principle about the broader Python ecosystem that is impossible to achieve, at the cost of our being able to use Poetry to just do our jobs and write applications:
> If we ever want to have an ecosystem similar to what other languages already have, we have to draw the line somewhere and enforce everyone to contribute to the common goal. Poetry helps with that by making it easier to build and manage Python projects.
Emphasis mine, as this perfectly illustrates the head-in-the-sandness of this argument. To quote myself:
> Right now, my hands are tied, because the latest published version of this package is dragging in a 3.6-only sub-dependency. That means I have to either roll my project back to 3.6, fork the library I'm waiting for just to package the code it already has (!) on master, or hope that I don't need to make many changes to my project while I wait for the next release of this dependency at some indeterminate future point.
In this scenario, a tool ostensibly to "[make] it easier to build and manage Python projects" is making it literally impossible to manage my Python project without going way out of my way, because it holds needlessly strong opinions about the broader Python ecosystem. Which means, of course, that there's an unstated fourth option here: don't use Poetry. Not a great option, since I came here on purpose when Pipenv failed to make it easier to build and manage my Python projects.
Lastly,
> interestingly, none of the people that complained or downvoted our arguments have stepped up to find a technical solution to this problem.
This is false and disrespectful to those of us trying to make a clear, supported argument for why this feature should exist and how it should work. There are three-ish solutions existing up-thread, at least one offer to open a PR, and a request from the maintainers to chime in on which solutions they are interested in.
The majority of comments here are in support, and many of them are providing use-cases, hammering out basic semantics, or responding to objections. To my knowledge no PRs have been opened because, as the maintainers have pointed out, they are not easy. I think what we need is for the maintainers to engage directly with the proposals, ideally blessing one of the three-ish solutions as "worth a try", rather than simply dismissing the feature as too hard or not a culture fit.
@seansfkelley fork it, and people willing to implement something useful can take the lead. We can name it `prose`.
> FWIW, I run a patched version of Poetry that allows overrides.
This sounds great @dmitrig01, is it a dangerous hack, or something that could become a PR for this feature?
Yarn supports the ability for me as the end user of a library to override the version when I know better. Poetry should as well.
Oftentimes people don't bother testing their library with every version of other libraries; they just try whatever happens to be installed by default and pin that version.
This sometimes leads to a case where their pinned version has an issue, but I can't fix it, because Poetry doesn't let me. As a practical example, the `firedantic` library after version `0.2.0` added support for the async API in `google-cloud-firestore>=2.0.0`, but it seems `google-cloud-firestore>=2.1.0` introduced some bug which breaks emulator support on my machine.
There is no significant API difference between `google-cloud-firestore` versions `2.0.2` and `2.1.0`, so I could just as well use `2.0.2` with `firedantic` and my code, but Poetry throws a tantrum if I try to do that by specifying `google-cloud-firestore = "2.0.2"` in `pyproject.toml`, and there seems to be no way to mark this as a non-issue.
Now instead of having my code just work when I know the fix, I need to wait for a PR review on the `firedantic` repository to be completed, OR for Google to fix the Firestore client, whichever comes first. Neither should be strictly necessary, though I would of course at least open issues on both repositories once I had my code working.
A similar approach to Yarn's would work well here: add a new `[tool.poetry.resolutions]` section where you just say `google-cloud-firestore = "2.0.2"`, and it could even support only strict pinning. Then if the tool notices that binding, it should downgrade compatibility errors to warnings and install it, leaving it up to you to figure out what's wrong if something blows up.
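A sketch of what that could look like (note: `[tool.poetry.resolutions]` is hypothetical and not implemented by Poetry; it mirrors Yarn's `resolutions` field):

```toml
# HYPOTHETICAL: this section does not exist in Poetry today.
[tool.poetry.resolutions]
# Force this exact version even though a dependency declares >=2.1.0,
# downgrading the resulting compatibility error to a warning.
google-cloud-firestore = "2.0.2"
```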
I'm not able to install older versions of PyTorch and Torchvision as I convert an existing codebase from a gaggle of scripts and a `requirements.txt` to a nicely organized library managed with Poetry:

```
Because torchvision (0.5.0+cu100) depends on torch (1.4.0)
 and find-similar-dataprep-inference depends on torch (1.4.0+cu100), torchvision is forbidden.
So, because find-similar-dataprep-inference depends on torchvision (0.5.0+cu100), version solving failed.
```
This `pyproject.toml` is the minimal one that seems to produce the error:

```toml
[tool.poetry]
name = "torchvision-not-working"
version = "0.1.0"
authors = ["Your Name <you@example.com>"]

[tool.poetry.dependencies]
python = "^3.7"
torchvision = "0.5.0+cu100"
torch = "1.4.0+cu100"

[build-system]
requires = ["poetry-core>=1.0.0"]
build-backend = "poetry.core.masonry.api"

[[tool.poetry.source]]
name = "pytorch"
url = "https://eternalphane.github.io/pytorch-pypi/"
```
If I remove or comment out the `torch` dependency, it installs, but Poetry installs `torch 1.4.0+cu92`. This makes Torchvision not work:

```
RuntimeError: Detected that PyTorch and torchvision were compiled with different CUDA versions.
PyTorch has CUDA Version=9.2 and torchvision has CUDA Version=10.0.
Please reinstall the torchvision that matches your PyTorch install.
```
My observations are these:
torchvision is inadequately specific about its dependency requirement
```
$ poetry show torchvision --tree
torchvision 0.5.0+cu100 image and video datasets and models for torch deep learning
├── numpy *
├── pillow >=4.1.1
└── torch 1.4.0 <----- this should be 1.4.0+cu100
    ├── future *
    └── numpy *
```
`torchvision 0.5.0+cu100` should depend on `1.4.0+cu100`, not just bare `1.4.0`. Without that specifier, the resolver is left to its own devices, and it chooses incorrectly because of sort order.
`pip`'s resolver seems to allow the user to specify, so something like

```
pip install \
  --extra-index-url https://eternalphane.github.io/pytorch-pypi \
  torchvision==0.6.0+cu101 \
  torch==1.5.0+cu101
```

works (n.b. the version difference: I tried with a few different versions of torchvision+pytorch, and this one was the most recent in my shell history). If I remove the `torch` spec, `pip` will install the `cu92` version.
Both `pip` and Poetry sort the local identifier, the part after the `+`, and choose `cu92` because it's the "latest": it sorts last, after `cu100`, `cu101`, etc. PEP 440 local identifiers don't prescribe how to handle sub-versioning like this, but it appears the PyTorch team chose an inconvenient scheme. `pip` at least lets the user pin the full local version explicitly; Poetry does not respect that here, so managing these dependencies with Poetry appears to be impossible.
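A minimal sketch of why `cu92` sorts "latest" (a simplified version of PEP 440-style local-segment comparison, not the actual `packaging` implementation):

```python
def local_sort_key(local: str):
    """Simplified PEP 440-style sort key for a local version label like 'cu100'."""
    parts = []
    for segment in local.split("."):
        if segment.isdigit():
            # Numeric segments compare as integers and outrank alphanumeric ones.
            parts.append((1, int(segment), ""))
        else:
            # Alphanumeric segments compare as plain strings: '9' > '1',
            # so "cu92" lexically beats "cu100" and "cu101".
            parts.append((0, 0, segment))
    return tuple(parts)

candidates = ["cu100", "cu101", "cu92"]
print(sorted(candidates, key=local_sort_key))  # ['cu100', 'cu101', 'cu92']
```

So a resolver that simply picks the "highest" version ends up on the `cu92` build.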
My workaround for now is not to use Poetry inside the Docker image I'm building, instead using `pip` with an automatically modified `requirements.txt`. I do not declare the `torch` dep in `pyproject.toml`. As part of the build process, I generate `requirements.txt` with

```
poetry export --without-hashes | twiddle_torch_version.sh requirements.txt
```

with the contents of `twiddle_torch_version.sh` being:
```bash
#!/usr/bin/env bash
OUTPUT_FILE="${1}"
TEMP_FILE=".${OUTPUT_FILE}.tmp"

# Buffer stdin (the `poetry export` output) so it can be read twice.
cat - > "${TEMP_FILE}"

# cut is blunter than regex. (Note: GNU grep needs [0-9] rather than \d,
# and the literal '+' must be escaped.)
CUDA_VERSION="$(grep -E 'torchvision==[0-9]+\.[0-9]+\.[0-9]+\+cu' "${TEMP_FILE}" | cut -d '+' -f 2 | cut -d ';' -f 1)"

# Find the torch line and insert the CUDA version specifier.
sed -E -e "s/torch==(.*);/torch==\1+${CUDA_VERSION};/g" < "${TEMP_FILE}" > "${OUTPUT_FILE}"
rm "${TEMP_FILE}"
```
and consume that generated `requirements.txt` inside the Docker image before installing the library I'm managing with Poetry from the wheel file built with `poetry build`.
I encountered #4109 and proposed a solution to it in #4110 along the way.
Among the suggestions above, the simplest to implement seems to be per-package overrides. Suppose we have a `pyproject-overrides.toml` with a list of packages and their overridden dependencies. We could use it as a source of package dependencies (changing `_get_package_info` and similar) and feed it as input to the solver; I emphasize that the solver logic remains completely untouched.
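A sketch of what such a file might contain (the file name comes from the comment above; the schema here is purely illustrative, not an implemented format):

```toml
# HYPOTHETICAL pyproject-overrides.toml -- not a real Poetry feature.
# For each listed package, replace its declared dependency constraints
# before the solver runs; the solver itself is unchanged.
[mlflow]
alembic = ">=1.4.1,<2.0"
```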
@le-chat Perhaps it would be even simpler to do "additional supported versions" instead of "overrides". This approach would be more future-friendly. For example, if a dependency updates we can detect that the additional version is not necessary any more (and can perhaps show a different warning, etc.). Also easier to upgrade to new versions down the road.
We have the problem of declaring that package A works with package B v.1.2.3. The other way around (declaring that it doesn't work with B v.2.3.4) can be handled just fine by adding a specific version restriction for B in the normal list of dependencies.
In any case, yes, the solver logic should remain untouched.
TLDR: Let's do "add" and not "override".
@frnhr Maybe... But sometimes we need to restrict versions. One example: pycocotools depends on cython but does not declare it. Adding cython as a normal dependency is not enough (it's already present): pycocotools may be built before cython if there is no per-package dependency in poetry.lock, and this fails.
@sdispater I agree and appreciate that this is a very difficult solver constraint problem that is not immediately obvious to everyone, in fact, probably only to people who have worked with solvers before and encountered their annoyances. Perhaps, at some point, a blog post explainer by someone would be amazing.
It is very tricky to come up with a way to solve this for the general case, and I can't think of any way that wouldn't require making all dependency constraints looser, which is a very bad idea. A big part of the problem is the way `pip` has evolved over the years, plus design deficiencies in the Python module system, which was designed long before the explosion in the number of packages. A solution to this might be more feasible with PEP 582-style packages, if that PEP is approved.
Could we perhaps add a flag to `poetry export`, like `--unsafe-dump` or something, that just spits out all the dependencies as a `requirements.txt`? That way people can fall back to plain `pip install` until the maintainer fixes the deps. It might also be possible to create a plugin that uses the `gh` CLI to automatically open an issue on the project for the version-constraint conflict encountered.

That said, forking projects is not too bad as a temporary workaround. You can even use something like the fork-sync GitHub Action to keep the fork up to date with the upstream repo.
Btw, I get the frustration that stems from this, especially since `poetry` is otherwise an amazing tool. I can't even count the number of times it has saved me from hard-to-track bugs that stem from mismatched versions installed by `pip` and from reduced reproducibility, but every now and then I have had to drop it for a project because of this very problem. It's worse when the conflict is introduced by a new dependency you add six months into a project.

Lastly, I'd ask everyone to take a little time to read about solvers and the dependency-resolution problem. Then maybe we can come up with some reasonable workarounds that won't break a lot of shit unexpectedly.
As a user, I'd love to have yarn-style Selective Dependency Resolutions (the `resolutions` clause).

Ref: https://classic.yarnpkg.com/lang/en/docs/selective-version-resolutions/
There are two reasons for this:
1️⃣ There are cases where my program depends on A, but also on B, which depends on A. If package B is no longer maintained, its constraints prevent me from using a newer version of package A.
2️⃣ Sometimes I want to test whether I could safely use a newer version of some package.
You could argue that I may as well `pip install` the new version in the virtualenv, and that's fair. I'd counter that using something like `resolutions` is 1) more precise and 2) allows me to keep using my existing CI/CD pipeline; in short, it's less of a hack.
I wanted to share another use case for this feature, related to validating changes to dependencies, using VCS, before they are released. My libraries have 100% test coverage, however I still want to make sure the features I'm adding work in my application before I publish them. This involves running the application code against the VCS dependencies on my developer machine, on my teammates' machines, and in CI.
Here is a subset of my application's dependency tree:
I'm currently iterating on changes to Library B (lace/lacecore#97) which depend on some additional changes in Library A (lace/polliwog#261). In my unmerged feature branch for Library B, the dependency on A installs using a VCS dependency, pulling the branch from GitHub. This works fine using Poetry.
In my application I've declared Library A and Library B as VCS dependencies. However, the solver complains about Library C, which I am not changing:

```
Because tri-again (1.0.0) depends on polliwog (>=1.0.0,<2)
 and app depends on polliwog (branch slice-mapping), tri-again is forbidden.
So, because app depends on tri-again (1.0.0), version solving failed.
```
I think the workaround requires manual installs so it's not as friendly. Would be nice if I could tell Poetry that I made this branch of library A, and I know it works with all my dependencies.
In theory publishing a branch of Library C could be a workaround, but the actual dependency tree has several more packages which also depend on the lower-level Library A and it's not feasible to create branches for all of them.
If the maintainers are open to receiving a patch, can you reopen this issue? I just had to skim through this whole thread to figure out why it was closed, and it seems like it shouldn't be.
Additional note: not only has `yarn` addressed the issue with `resolutions`, but `npm` has as well, with `overrides`:

https://docs.npmjs.com/cli/v8/configuring-npm/package-json#overrides
The issue is definitely not specific to Python, and if it could be resolved in the JavaScript ecosystem, it should be resolvable in the Python one as well. But the last comment from the Poetry maintainers in this thread was written in 2019, even though the issue is still quite active...
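For reference, npm's `overrides` lives in `package.json` and looks like this (package names and versions are illustrative):

```json
{
  "overrides": {
    "left-pad": "1.3.0",
    "request": {
      "tough-cookie": "4.1.3"
    }
  }
}
```

The nested form scopes an override to one parent package, rather than applying it tree-wide.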
@jaklan Indeed. I currently have the issue with a lib that has a dependency with a CVE; not being able to fix the issue is a real blocker on our side.
Is there a better way to pin sub-dependencies in the `pyproject.toml` file (or any other way) without manually editing the lock file, or exporting to `requirements.txt`, fixing conflicts, installing the desired version, and adding it back?!
Why is this closed? Will it be reopened and fixed, or is Poetry going to ignore the problem and take a purist approach that is basically passing the buck? If there is no plan and timeline for fixing it, I will stop using poetry and let my organization know that I was wrong to use it for the past year. I was advocating for it, but when I faced a problem that required overriding a dependency resolution (details irrelevant since there is already so many examples above) I found this. The tone and lack of will to fix a real problem is deterring me from using poetry, unless someone forks it and starts listening.
Hi!
I'd like to remind everyone that Poetry is first and foremost a free, open-source, community project. That means that while we hope that the project is useful to everyone, it is going to reflect the needs of the regular contributors the most, as they are contributing code to meet those needs.
There has been discussion in this thread over multiple versions of Poetry (including major versions), and much of it is now obsolete due to an evolving design and implementation. While this particular use case/issue is considered by the main contributors to be tertiary, it is evident from the number of comments that not everyone feels this way.
The largest reason for the lack of progress on this front over several years is the absence of a fully fleshed-out, viable design. That being said, an implementation can be costly, especially if there is no consensus that it is mergeable upstream.
For those who are interested, and willing to write code to this end, I would suggest joining the Discord server and starting a thread in the #contributing channel in order to workshop a design with the core team. Once we have a rough design document, we can open an issue that provides guidance for an implementation to anyone who is interested in contributing toward it.
Be aware that any implementation would have to be very invasive, and would have to be very carefully evaluated. There is no guarantee that a consensus on a clean and maintainable implementation can be reached -- but the first step to doing so would be a high-signal-to-noise and productive conversation; I do not believe this issue is the correct forum for that at this time.
Stumbled into the problem that poetry also takes into account sub-dependencies that are in a `dev` group. Not sure where that is useful, but as far as I understand it, dependencies such as `mypy` do not really matter, right? The upstream author maybe just wants to make sure they always test with the same tool, but for my package I don't care about that.
In my case `nbconvert` requires `bleach = *`, which has a `package.extras` entry in my `poetry.lock` file that looks like:

```toml
[package.extras]
css = ["tinycss2 (>=1.1.0,<1.2)"]
dev = ["Sphinx (==4.3.2)", "black (==22.3.0)", "build (==0.8.0)", "flake8 (==4.0.1)", "hashin (==0.17.0)", "mypy (==0.961)", "pip-tools (==6.6.2)", "pytest (==7.1.2)", "tox (==3.25.0)", "twine (==4.0.1)", "wheel (==0.37.1)"]
```

which poetry derives from the following `setup.py` block in bleach:
```python
EXTRAS_REQUIRE = {
    "css": [
        "tinycss2>=1.1.0,<1.2",
    ],
    "dev": [
        "black==22.3.0; implementation_name == 'cpython'",
        "build==0.8.0",
        "flake8==4.0.1",
        "hashin==0.17.0",
        "mypy==0.961; implementation_name=='cpython'",
        "pip-tools==6.6.2",
        "pytest==7.1.2",
        "Sphinx==4.3.2",
        "tox==3.25.0",
        "twine==4.0.1",
        "wheel==0.37.1",
    ],
}
```
In my `dev` group I need `mypy>=0.981`, which fails to install. Is that because:

- the `bleach` developers have too strict dependencies?
- the `bleach` developers should not have a group called `dev`?
- poetry should not consider the `package.extras.dev` blocks?

Where do you see the problem? Or am I misunderstanding something?
Using the deprecated `[tool.poetry.dev-dependencies]` block works, but using the new `poetry add "mypy>=0.981"`, which adds it to `[tool.poetry.group.dev.dependencies]`, leads to this kind of naming conflict...
There's a couple of things going on here:

- You appear to be mixing the new `groups.dev` syntax with the old `dev-dependencies` feature.
- The `dev` name is triggering some latent collision; I suggest you reach out in Discussions or Discord to determine whether you're hitting a bug, or just not using the tooling quite right.

For reference, here are `bleach`'s requirements:
```
Requires-Dist: six (>=1.9.0)
Requires-Dist: webencodings
Provides-Extra: css
Requires-Dist: tinycss2 (<1.2,>=1.1.0) ; extra == 'css'
Provides-Extra: dev
Requires-Dist: build (==0.8.0) ; extra == 'dev'
Requires-Dist: flake8 (==4.0.1) ; extra == 'dev'
Requires-Dist: hashin (==0.17.0) ; extra == 'dev'
Requires-Dist: pip-tools (==6.6.2) ; extra == 'dev'
Requires-Dist: pytest (==7.1.2) ; extra == 'dev'
Requires-Dist: Sphinx (==4.3.2) ; extra == 'dev'
Requires-Dist: tox (==3.25.0) ; extra == 'dev'
Requires-Dist: twine (==4.0.1) ; extra == 'dev'
Requires-Dist: wheel (==0.37.1) ; extra == 'dev'
Requires-Dist: black (==22.3.0) ; (implementation_name == "cpython") and extra == 'dev'
Requires-Dist: mypy (==0.961) ; (implementation_name == "cpython") and extra == 'dev'
```
@q-wertz, I suggest that any subsequent discussion occur somewhere else, as you are either encountering usage or documentation issues, or a weird and nasty bug. In any case, this is an illustration of this proposed feature often being a footgun; your instinct to try and solve it with this extremely sharp-edged and brittle (proposed) tool is exactly why no core maintainer has been interested in exploring this feature themselves; it screams persistent headache for end users and Poetry developers alike.
Thank you for the extensive explanation and help.
You are probably right with your assumption that there might be an issue on my side. I have now refactored the `pyproject.toml` file and removed the mix of old and new syntax, and it seems to work. Not sure where exactly the issue was…
Gave me (minor) headaches for a few days :sweat_smile:
Sorry for the noise; I am updating the issue labels/status to be more accurate.
If anyone gets here, here's how you solve it: fork the offending repo (even ten levels upstream), relax the constraints as needed, then use the forked repo instead, submitting a PR upstream if appropriate.
@earonesty Yes, which makes poetry unsuitable for large dependency trees and long-term projects.
I just discovered that if you have a sub-package in your project with another `pyproject.toml` and you run `poetry add some-package` from there, it will happily upgrade or downgrade already-installed packages to satisfy `some-package`; but if you run it from the project root, it will conflict.
Is there any resolution to this issue? This is causing issues with `pydantic`.
As a point of reference, `pdm` now supports both overrides and exclusions in dependency resolution. Really hoping for poetry to introduce something similar.

https://pdm-project.org/latest/usage/config/#override-the-resolved-package-versions
https://pdm-project.org/latest/usage/config/#exclude-specific-packages-and-their-dependencies-from-the-lock-file
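Per the pdm documentation linked above, the configuration looks roughly like this (package names and versions are illustrative):

```toml
[tool.pdm.resolution]
# Keep these packages (and their dependencies) out of the lock file entirely.
excludes = ["requests"]

[tool.pdm.resolution.overrides]
# Resolve to this version no matter what constraints other packages declare.
asgiref = "3.2.10"
```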
Not only `pdm`, but also `uv` by Astral (the creators of `ruff`) supports dependency overrides:

https://github.com/astral-sh/uv?tab=readme-ov-file#dependency-overrides
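With uv, the equivalent setting is `override-dependencies`, shown here in `pyproject.toml` (the package pin is illustrative):

```toml
[tool.uv]
# Constraints listed here replace whatever the dependency tree declares,
# rather than being intersected with it.
override-dependencies = ["werkzeug==2.3.0"]
```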
This issue has been automatically locked since there has not been any recent activity after it was closed. Please open a new issue for related bugs.
(However, it is related to https://github.com/sdispater/poetry/issues/436).
Issue
In the dark, old world of Python packaging, sub-dependencies are handled very poorly. If I recall correctly, `pip` will happily install a sub-dependency despite conflicting versions being specified by two direct dependencies... in fact, I think which version it ends up installing depends on the order in `requirements.txt`. Yuck! Only very recently has it even started issuing a warning for cases like this.

In contrast, `poetry` does this right. It computes the entire dependency tree and will complain if there are conflicts anywhere in the tree.

But... many packages out there do not specify their dependencies properly. And even when they do, there's always the possibility that their specified dependencies are a tighter range than they strictly need to be.

Is there a way to tell Poetry to force a specific version (or version range) of a dependency in cases like this, or in other words, to ignore a dependency specification of another dependency somewhere in the tree? If not, should there be?