Closed: cboylan closed this issue 3 years ago.
As per https://github.com/pradyunsg/pip/issues/1, part of this has landed in pip 10, but we ran into an issue yesterday that let a conflict go through unnoticed.
Cryptography and requests were listed in our requirements.
- cryptography==2.2.2 depends on idna>=2.1
- requests==2.18.4 depends on idna>=2.5,<2.7
Now idna 2.7 was released yesterday, which caused our wheel packaging to result in an incompatible set without warnings.
"pip install" now warns with "requests 2.18.4 has requirement idna<2.7,>=2.5, but you'll have idna 2.7 which is incompatible."
But running "pip wheel" doesn't do any such thing.
I was looking at writing a test case for this, but I'm not sure what the correct expectation should be. The core of the issue can be approximated by the test below; my expectation would be that the >=2.1 specifier gets reduced to the range specified by requests:
# Note: this uses pip's internal APIs as of pip 10 (pip._internal),
# so it is inherently tied to that pip version.
from pip._internal.req import InstallRequirement, RequirementSet


def test_specifier_reduction():
    ge21 = InstallRequirement.from_line("idna>=2.1")
    ge25l27 = InstallRequirement.from_line("idna>=2.5,<2.7")
    req_set = RequirementSet()
    req_set.add_requirement(ge21, parent_req_name="cryptography")
    req_set.add_requirement(ge25l27, parent_req_name="requests")
    # Expect the two specifiers to be intersected:
    assert req_set.requirements['idna'].specifier == '>=2.5,<2.7'
I don't know if this is a pip issue or something to do with one of the libraries it depends on. At a bare minimum, the same warning that `pip install` produces should happen. Ideally the process (both in install and in wheel) should terminate so these compatibility issues can be detected as part of automation pipelines.
Our workaround was to explicitly pin idna==2.6; but in an ideal world pip would auto-reduce the set to a compatible version without requiring an explicit pin.
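The reduction described above can be sketched with the `packaging` library (which pip vendors). This is only an illustration of the specifier arithmetic, not pip's actual resolution code; the version numbers are the ones from this report:

```python
# Combine the two idna specifiers from the report and check which
# releases satisfy both.
from packaging.specifiers import SpecifierSet

cryptography_wants = SpecifierSet(">=2.1")   # from cryptography==2.2.2
requests_wants = SpecifierSet(">=2.5,<2.7")  # from requests==2.18.4

combined = cryptography_wants & requests_wants

print("2.6" in combined)  # True  -- the version pinned as a workaround
print("2.7" in combined)  # False -- the release that broke the build
```

The `&` operator produces the intersection of the two specifier sets, which is exactly the reduction the test above expects pip to perform.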
"pip wheel" doesn't do any such thing.
Expanding the warnings to `pip wheel` (and also `pip download`) sounds like a reasonable enhancement/feature request. I've gone ahead and filed it as #5497.
Ideally the process (both in install and in wheel) should terminate so these compatibilities issues can be detected as part of automation pipelines.
For situations where you want to ensure that the dependencies are consistent, there's `pip check`, which exits with an exit code of 1 if the dependencies aren't consistent. It uses the same underlying logic as `pip install` for generating these warnings/errors.
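For an automation pipeline like the one described in the report, `pip check`'s exit code can serve as the gate. A minimal sketch (the CI-failure step is left as a comment, so this snippet only reports):

```python
# Run `pip check` and inspect its exit code:
# 0 means the installed set is consistent, 1 means conflicts were found.
import subprocess
import sys

proc = subprocess.run(
    [sys.executable, "-m", "pip", "check"],
    capture_output=True, text=True,
)
print(proc.stdout.strip() or "No broken requirements found.")
if proc.returncode != 0:
    # In a CI pipeline you would fail the build here, e.g. sys.exit(1).
    print("dependency conflicts detected", file=sys.stderr)
```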
Further, in the next release, `pip install`'s warnings will be limited to checking only the graphs of packages directly affected by the installation run. `pip check` will continue to check all the packages in the graph.
pip would auto-reduce the set to this compatible number without requiring specification.
That's exactly what this issue is for tracking. :)
`pip wheel` and `pip download` can put multiple versions of a package in the target directory (unlike `pip install`). For many uses of `pip wheel` and `pip download`, that makes the warnings inaccurate (or at best, misleading). The use case above seems to be a very specific workflow (`pip wheel -r requirements.txt` into an empty directory, which is then expected to be usable as a consistent install set). That's not how the majority of uses of `pip wheel` that I have seen work.
(Seems like a collection issue, so posting here before opening a separate issue.)
Have you considered the case where extras "pins" a version of a package already in requirements?
setup(
    name='foo',
    install_requires=['bar'],
    extras_require={'pinbar': ['bar==1.2.3']},  # note: a list, not a bare string
)
And `pip install foo` installs the latest `bar`, but `pip install foo[pinbar]` installs `bar==1.2.3`. (pip only learns about the pin after downloading `foo` and running its setup; the pin is not listed in `install_requires`.)

Can I please test my understanding? If I do the following at the command line:
$ pip install --upgrade a b c
# Do I get the same results as doing:
$ pip install --upgrade a
$ pip install --upgrade b
$ pip install --upgrade c
Is there any advantage (besides brevity) of doing one over the other?
Why not try to use this? https://github.com/openSUSE/libsolv/issues/284#issuecomment-428970139
@josb there's discussion about SAT solvers above in this thread, https://github.com/pypa/pip/issues/988#issuecomment-30903134
Thanks, @msarahan. I mentioned it because it seemed like a solved problem (heh). I wasn't aware of the need for a Python-based solution.
Python packages do not know what dependencies are defined in setup.py until the package is downloaded, extracted, and the setup.py -- potentially containing conditionals -- is executed on that particular platform: there's not enough information to determine the dependency graph ahead of time, AFAIU.
Is there a non-iterative python package dependency resolver? How is that possible without declarative metadata?
Wheels and sdists contain declarative listings of dependencies (PKG-INFO → Requires-Dist). Am I missing something?
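For wheels and installed distributions, that declarative metadata can indeed be read without executing any setup.py. A small sketch using the standard library (Python 3.8+); `pip` is used here simply as a distribution known to be installed, so substitute any name:

```python
# Read a distribution's declarative metadata (PKG-INFO / METADATA)
# without running setup.py.
from importlib.metadata import metadata

meta = metadata("pip")  # any installed distribution name works here
print(meta["Name"], meta["Version"])
for dep in meta.get_all("Requires-Dist") or []:  # may be None for no deps
    print(dep)
```

The caveat in the preceding comment still applies to sdists that compute dependencies in setup.py: their metadata is only reliable after a build step.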
Since there has been a … significant amount of discussion here, and it's a bit hard using the github UI to unroll it all, could someone familiar with this bug update the description to reflect the current state? i.e. whatever happened with the GSoC work that was supposed to solve this once and for all, are there future plans, what the consensus is?
The current situation is a bit complex, and a lot has happened. Tools like pipenv and poetry (supported by PEP 518) sort of fill the gap, but also add a lot of new ways to specify dependencies (Pipfile, pyproject...). Tools have come a long way, but the ecosystem is sort of a mess right now.
whatever happened with the GSoC work that was supposed to solve this once and for all, are there future plans, what the consensus is?
There is more work to be done, following up on the code cleanups done during GSoC. The main blocker for me has been finding the time to do the follow up work. There is also some collaborative work that's being done with folks from pipenv regarding code reuse on this front; though that's still in a discussion state from pip's end.
could someone familiar with this bug update the description to reflect the current state?
I'll do this sometime in the upcoming week. :)
From "Distlib vs Packaging (Was: disable building wheel for a package)" (2018-09) https://mail.python.org/archives/list/distutils-sig@python.org/message/IQVZVVWX2BLEP6D4WQMKNXZHBF2NZINU/ :
- https://github.com/sarugaku/requirementslib -- abstraction layer for parsing and converting various requirements formats (pipfile/requirements.txt/command line/InstallRequirement) and moving between all of them
- https://github.com/sarugaku/resolvelib -- directed acyclic graph library for handling dependency resolution (not yet being used in pipenv)
- https://github.com/sarugaku/passa -- dependency resolver/installer/pipfile manager (bulk of the logic we have been talking about is in here right now) -- I think we will probably split this back out into multiple other smaller libraries or something based on the discussion
"pipenv and pip" (2018-08) https://mail.python.org/archives/list/distutils-sig@python.org/thread/2QECNWSHNEW7UBB24M2K5BISYJY7GMZF/#2QECNWSHNEW7UBB24M2K5BISYJY7GMZF may also be relevant
FWIW, this still routinely creates issues even with the latest version of `pip`: in CI we end up with `attrs` 17.3.0 installed alongside `pytest` 3.10.1, even though `pytest` has required `attrs >= 17.4.0` since version 3.5.0.
This creates issues while installing `elasticsearch` and `requests` together.
Output:
$ pip3 install elasticsearch==7.0.0 requests==2.21.0
Collecting elasticsearch==7.0.0
Using cached https://files.pythonhosted.org/packages/a8/27/d3a9ecd9f8f972d99da98672d4766b9f62ef64c323c40bb5e2557e538ea3/elasticsearch-7.0.0-py2.py3-none-any.whl
Collecting requests==2.21.0
Using cached https://files.pythonhosted.org/packages/7d/e3/20f3d364d6c8e5d2353c72a67778eb189176f08e873c9900e10c0287b84b/requests-2.21.0-py2.py3-none-any.whl
Collecting urllib3>=1.21.1 (from elasticsearch==7.0.0)
Using cached https://files.pythonhosted.org/packages/39/ec/d93dfc69617a028915df914339ef66936ea976ef24fa62940fd86ba0326e/urllib3-1.25.2-py2.py3-none-any.whl
Collecting certifi>=2017.4.17 (from requests==2.21.0)
Using cached https://files.pythonhosted.org/packages/60/75/f692a584e85b7eaba0e03827b3d51f45f571c2e793dd731e598828d380aa/certifi-2019.3.9-py2.py3-none-any.whl
Collecting chardet<3.1.0,>=3.0.2 (from requests==2.21.0)
Using cached https://files.pythonhosted.org/packages/bc/a9/01ffebfb562e4274b6487b4bb1ddec7ca55ec7510b22e4c51f14098443b8/chardet-3.0.4-py2.py3-none-any.whl
Collecting idna<2.9,>=2.5 (from requests==2.21.0)
Using cached https://files.pythonhosted.org/packages/14/2c/cd551d81dbe15200be1cf41cd03869a46fe7226e7450af7a6545bfc474c9/idna-2.8-py2.py3-none-any.whl
requests 2.21.0 has requirement urllib3<1.25,>=1.21.1, but you'll have urllib3 1.25.2 which is incompatible.
Installing collected packages: urllib3, elasticsearch, certifi, chardet, idna, requests
Successfully installed certifi-2019.3.9 chardet-3.0.4 elasticsearch-7.0.0 idna-2.8 requests-2.21.0 urllib3-1.25.2
The output above has this warning:
requests 2.21.0 has requirement urllib3<1.25,>=1.21.1, but you'll have urllib3 1.25.2 which is incompatible.
The installed version of `urllib3` (1.25.2) violates the requirement specifier for `urllib3` declared by `requests` 2.21.0. The required dependencies are:
- `elasticsearch==7.0.0` requires `urllib3>=1.21.1` (source)
- `requests==2.21.0` requires `urllib3>=1.21.1,<1.25` (source)

Both dependency specifiers could have been satisfied by installing urllib3 1.24.3. But pip3 instead installed urllib3 1.25.2, which violates the second specifier.
See this Stack Overflow thread for related discussion: https://stackoverflow.com/q/56096643/1175080.
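What a resolver could do here can be sketched with the `packaging` library: intersect the two `urllib3` requirements and pick the highest release satisfying both. The candidate list below is a hand-picked subset of urllib3 releases, just for illustration:

```python
from packaging.requirements import Requirement
from packaging.version import Version

reqs = [
    Requirement("urllib3>=1.21.1"),        # from elasticsearch==7.0.0
    Requirement("urllib3>=1.21.1,<1.25"),  # from requests==2.21.0
]
# A hand-picked subset of urllib3 releases:
candidates = ["1.21.1", "1.23", "1.24.3", "1.25.2"]

satisfying = [v for v in candidates if all(r.specifier.contains(v) for r in reqs)]
best = max(satisfying, key=Version)
print(best)  # 1.24.3
```

This is exactly the outcome described above: 1.24.3 satisfies both specifiers, while pip's version selection (which only considered the first requirement it saw) picked 1.25.2.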
Continuous integration on this repo is already super complicated with AppVeyor, Azure Pipelines, pypa-bot, and Travis CI examining every pull request. However, it might be of interest to gather up a series of interactions that are known to fail (jobs like https://github.com/pypa/pip/issues/988#issuecomment-466770940) and then run them in `allow_failures` mode in a CI (CircleCI, GitHub Actions, or in a separate repo) as a way to document such problematic interactions and to encourage the maintainers of both packages to find mutually agreeable remedies.
I've posted an update in the issue description.
Is there any sort of timeline for a solution to this?
Is there any sort of timeline for a solution to this?
Not really. As of right now, my time is volunteered so it's fairly erratic and so, it's difficult to estimate how long this might take.
This work is basically blocked on someone (me or someone else) making the time and doing the work here (figuring out the UX, implementation, incorporating user feedback, dealing with holes in metadata and a bunch more stuff).
Work is happening though -- there's the discussion related to the feedback management and related things, at #6536.
I've updated the issue description again, this time with a link to a blog post containing more details on what's up and what's next: https://pradyunsg.me/blog/2019/06/23/pip-update/
Update:
Hi pip peeps,
I just wanted to let you know about my project rez, a package manager popular in the VFX industry that has a full dependency solver: https://github.com/nerdvegas/rez.
Rez is not a python-specific package manager, however python is used extensively in VFX, and so there is good python integration, including a "rez-pip" tool which converts from pip to rez packages (eg "rez-pip --install Flask" will install Flask and all dependencies, and will convert them all into rez packages).
We can generally resolve environments containing hundreds of packages within a few seconds. The solver is deterministic - if a given request is unchanged, and there are no new packages released since the last solve, then you're guaranteed to get the same solve again.
A developer has just recently started work on porting part of the solver to C++ to potentially speed it up a fair bit (I'm predicting approx 5x speed improvement).
Note that rez's versioning schema is not quite the same as pip, although they overlap a lot.
See here for more details: https://github.com/nerdvegas/rez/wiki/Basic-Concepts#dependency-resolving
See here for a technical description of how the solver works: https://github.com/nerdvegas/rez/blob/master/src/rez/SOLVER.md
Please let me know if this is of interest, and do reach out if you have any questions. I realise that you're probably already on a development path for your new resolver, but it may be useful to look at something that already exists and is used primarily on python packages, and is battle tested (rez has been around for 8+ years now).
Thanks Allan
I meant to add that you can also get a dot graph of the resolve process, as shown here. This is a simple dependency graph resulting from a call to rez-env python-3 Flask
.
What is the current status on this, if I may ask?
I was encountering an issue with resolving dependencies as well and was going to open an issue about it, until I saw that this one already existed. I however cannot find in here whether the specific issue I encountered has already been reported at some point, so I will put it here for good measure as it may help solve the problem:
Let's say I have a package A, which has requirements B>=1.0 and C==1.0. Package C v1.0 has requirement B>=1.2. Package B is currently already installed with version 1.1.
Using the information above, `pip install A` should result in C v1.0 being installed, B being updated to at least v1.2, and A being installed. This however does not happen: `B>=1.0` is resolved before `C==1.0 -> B>=1.2`, and because B is already installed with a satisfying version, `B>=1.2` is never looked at.
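The skip-if-satisfied behaviour described above can be sketched with a toy model. Everything below (the index, the versions, the resolver) is invented for illustration and is not pip's actual code; it only mimics the "first requirement for a name wins" characteristic:

```python
# Toy model of the behaviour described above: once a name has been
# resolved (or found already satisfied), later, stricter requirements
# for that name are silently ignored.
from packaging.requirements import Requirement
from packaging.version import Version

index = {  # name -> version -> dependency strings (invented)
    "A": {"1.0": ["B>=1.0", "C==1.0"]},
    "B": {"1.1": [], "1.2": []},
    "C": {"1.0": ["B>=1.2"]},
}
installed = {"B": "1.1"}

def naive_resolve(root):
    picked, seen = dict(installed), set()
    queue = [Requirement(root)]
    while queue:
        req = queue.pop(0)
        if req.name in seen:
            continue  # later, stricter requirements are never enforced
        seen.add(req.name)
        if req.name in installed and req.specifier.contains(installed[req.name]):
            continue  # satisfied by what's on disk; keep it as-is
        version = max(
            (v for v in index[req.name] if req.specifier.contains(v)), key=Version
        )
        picked[req.name] = version
        queue.extend(Requirement(d) for d in index[req.name][version])
    return picked

print(naive_resolve("A"))  # B stays at 1.1, violating C's B>=1.2
```

Running this leaves B at 1.1 even though C 1.0 requires `B>=1.2`, matching the observed behaviour.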
I know that pip v10.0+ will warn you about having incompatible dependencies installed. However, I have always found it weird that it does not actually resolve them if it knows that they are currently incompatible.
Also, I know that you can use `pip check` to see if there are currently any incompatible dependencies. However, it does not have an option to automatically resolve any incompatibilities it finds, so the only way to fix them is manually.
This is not a problem when I am installing package A myself, but if it is being installed on (for example) Travis CI, then this becomes much harder to do.
Finally, I am not sure if this has ever been mentioned or looked at, but I know that Conda does resolve the requirements of all dependencies. I frequently see it upgrade or downgrade dependencies that are not direct requirements of the package I am trying to install, but rather requirements of the package's requirements.
Hi, guys, maybe you can try a dependency diagnosis website http://www.watchman-pypi.com/dependency_analysis, which can help diagnose dependency conflict problems and automatically visualize your dependency tree and those of your downstream projects.
Best, Neolith
There's a Twitter thread, listing a lot of details related to this issue (and big news): https://twitter.com/di_codes/status/1193980331004743680
Related context for that thread, is provided in a past comment in this issue: https://github.com/pypa/pip/issues/988#issuecomment-552674491
@pradyunsg Woah! Nice!
Since no one has mentioned it yet: poetry recently released v1.0.0, which (as many probably already know) is built around a stable, pure-Python, MIT-licensed, exhaustive dependency resolver. Might be worthwhile checking out in the development process :)
https://github.com/python-poetry/poetry/blob/1.0.0/poetry/mixology/version_solver.py#L31-L38
Interesting project. Thanks for posting!
In the example ( https://github.com/python-poetry/poetry/tree/1.0.0#dependency-resolution ), what would have happened if the first version of pbr that was selected (0.11.1) could not have been resolved because oslo could not be resolved at all with that version and poetry would have needed to fall back to another version of pbr?
In other words, suppose the "preferred" version of pbr is such that it will much later be determined that the dependencies pbr needs cannot be satisfied. Does poetry backtrack to the point at which it made the choice of that version of pbr and then choose another version?
Does poetry backtrack to the point at which it made the choice of that version of pbr and then choose another version?
Yep. poetry is using a Python port of the PubGrub algorithm (ported by the author of poetry) -- https://medium.com/@nex3/pubgrub-2fb6470504f.
The current discussion for choosing the dependency resolution algorithms/tooling is happening at #7406. Poetry's author has pitched in himself, in that issue as well as #6536.
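For readers curious what "backtrack to the choice point" means concretely, here is a minimal, invented sketch of choice-point backtracking (PubGrub itself is considerably smarter, adding conflict-driven clause learning). The pbr/oslo index below is made up to echo the question above: the newest pbr needs an oslo release that does not exist, so the solver must back up and retry:

```python
from packaging.requirements import Requirement
from packaging.version import Version

# Invented index: pbr 2.0 needs oslo>=2, but only oslo 1.0 exists.
index = {
    "pbr": {"2.0": ["oslo>=2"], "1.0": ["oslo>=1"]},
    "oslo": {"1.0": []},
}

def resolve(reqs, picked=None):
    picked = picked or {}
    if not reqs:
        return picked
    req, rest = reqs[0], reqs[1:]
    if req.name in picked:  # already decided: just check consistency
        return resolve(rest, picked) if req.specifier.contains(picked[req.name]) else None
    for v in sorted(index[req.name], key=Version, reverse=True):
        if not req.specifier.contains(v):
            continue
        deps = [Requirement(d) for d in index[req.name][v]]
        solution = resolve(rest + deps, {**picked, req.name: v})
        if solution is not None:
            return solution
        # dead end: fall through and try the next-lower version (backtracking)
    return None

print(resolve([Requirement("pbr")]))  # picks pbr 1.0 after 2.0 fails
```

Here pbr 2.0 is tried first, fails because no oslo satisfies `oslo>=2`, and the solver backtracks to the pbr choice point and picks 1.0, which resolves cleanly.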
Thanks to @sdispater, who extracted mixology from poetry, I felt compelled to give it a shot for pip, and ended up building a dependency resolver that uses pip's CLI and a slightly modified version of mixology, which doesn't need to install any of the dependencies to reach a solution.
Please give it a go and report any issues you find :)
https://github.com/ddelange/pipgrip
P.S. it also handles cyclic dependencies:
$ pipgrip --tree -v keras==2.2.2
WARNING: Cyclic dependency found: keras depends on keras-applications and vice versa.
WARNING: Cyclic dependency found: keras depends on keras-preprocessing and vice versa.
keras==2.2.2 (2.2.2)
├── h5py (2.10.0)
│ ├── numpy>=1.7 (1.18.1)
│ └── six (1.14.0)
├── keras-applications==1.0.4 (1.0.4)
│ ├── h5py (2.10.0)
│ │ ├── numpy>=1.7 (1.18.1)
│ │ └── six (1.14.0)
│ ├── keras>=2.1.6 (2.2.2, cyclic)
│ └── numpy>=1.9.1 (1.18.1)
├── keras-preprocessing==1.0.2 (1.0.2)
│ ├── keras>=2.1.6 (2.2.2, cyclic)
│ ├── numpy>=1.9.1 (1.18.1)
│ ├── scipy>=0.14 (1.4.1)
│ │ └── numpy>=1.13.3 (1.18.1)
│ └── six>=1.9.0 (1.14.0)
├── numpy>=1.9.1 (1.18.1)
├── pyyaml (5.3)
├── scipy>=0.14 (1.4.1)
│ └── numpy>=1.13.3 (1.18.1)
└── six>=1.9.0 (1.14.0)
There's a Twitter thread, listing a lot of details related to this issue (and big news): https://twitter.com/di_codes/status/1193980331004743680
Related context for that thread, is provided in a past comment in this issue: #988 (comment)
Right. To expand on that: the PSF was able to get some funding from Mozilla Open Source Support and the Chan Zuckerberg Initiative to hire contractors to work on the pip resolver and related user experience issues. You can see our roadmap (which I need to polish up) and blog and forum and mailing list posts and notes from recent meetings to keep apprised. I'll be posting something about this soon to distutils-sig and the Packaging forum on Python's Discourse instance.
We aim to have pip's resolver feature prepared for release in pip 20.2 in July. (Per the quarterly release cadence for pip, unforeseen difficulties may delay it till 20.3 in the next quarter.)
The alpha or beta release of pip with its new dependency resolver should be out in May.
I just posted a PSF blog post which discusses what is going to change in the pip resolver, when, and how you can help (including some low-effort things you can do right now, such as running `pip check`).
I didn't mention this in the blog post because ordinary Python users shouldn't try it, but: As of right now, people who install pip from GitHub master will have the ability to run `pip install --unstable-feature=resolver` and test the new resolver code. And less than half of the test suite fails! Expect errors and missing features, but it’s there! [Celebratory trumpet honk here.]
Hope all of you, and all the people you are close to, are healthy and staying that way.
I just tried the new resolver to see if it worked better on a case that does not work with pip's default one, and the first thing it did when resolving the dependencies of the top-level requirement was try to fetch and run `setup.py egg_info` for `setuptools-0.7.2.tar.gz`... That's on Python 3.6; not good...
Anyway, here is a yaml test for my initial issue:
base:
  available:
    - A 1.0.0; depends B >= 1.0.0, C >= 1.0.0
    - A 2.0.0; depends B >= 2.0.0, C >= 1.0.0
    - B 1.0.0; depends C >= 1.0.0
    - B 2.0.0; depends C >= 2.0.0
    - C 1.0.0
    - C 2.0.0

cases:
-
  request:
    - install: C==1.0.0
    - install: B==1.0.0
    - install: A==1.0.0
    - install: A==2.0.0
  transaction:
    - install:
        - C 1.0.0
    - install:
        - B 1.0.0
    - install:
        - A 1.0.0
    - install:
        - A 2.0.0
        - B 2.0.0
        - C 2.0.0
The first 3 steps just set up the initial installed set. With pip's default resolver, the last install request results in only A-2.0.0 and B-2.0.0 being installed; C is erroneously kept at 1.0.0.
After patching the YAML test code to use the new resolver:
--- i/tests/functional/test_yaml.py
+++ w/tests/functional/test_yaml.py
@@ -97,7 +97,7 @@ def handle_install_request(script, requirement):
"Need install requirement to be a string only"
)
result = script.pip(
- "install",
+ "install", "--unstable-feature=resolver",
"--no-index", "--find-links", path_to_url(script.scratch_path),
requirement, "--verbose",
allow_stderr_error=True,
The results are no better, with the second install step updating C to 2.0.0, which means the default upgrade strategy (`only-if-needed`) is not honored.
Thanks @benoit-pierre for the detailed report. I think your two issues are as follows:
We'll definitely be looking at these items in the near future, and having specific test cases for them is really useful, so thanks for the report here!
the first thing it did for resolving the dependencies of the top level requirement is try to fetch and run setup.py egg_info for setuptools-0.7.2.tar.gz
@benoit-pierre Can you clarify what you did to get this? I've been digging into the new code, and as far as I can see, we should always try the latest version first. It's quite possible that there's something going wrong here (the code is still very incomplete) but it would help if I could reproduce this problem.
Using Python 3.6, starting with this state:
appdirs==1.4.3
certifi==2018.1.18
dbus-python==1.2.4
docutils==0.14
hidapi==0.7.99.post21
pip @ https://github.com/pypa/pip/archive/6086f71cde81454fda588c20868a241561809836.zip
plover @ https://github.com/openstenoproject/plover/releases/download/weekly-v4.0.0.dev8%2B66.g685bd33/plover-4.0.0.dev8.66.g685bd33-py3-none-any.whl
plover-plugins-manager==0.5.11
plover-treal==1.0.1
Pygments==2.2.0
PyQt5==5.9.2
pyserial==3.4
python-xlib==0.23
setuptools==41.2.0
sip==4.19.8
six==1.10.0
wcwidth==0.1.7
wheel==0.34.2
Then using either `pip install 'plover_plugins_manager==0.5.14'` or `pip install --unstable-feature=resolver 'plover_plugins_manager==0.5.14'` to reproduce the 2 bugs.
What is the best way for interested parties to submit test cases so that we can ensure that the complex scenarios are well understood and that the final software handles them all properly?
Using Python 3.6, starting with this state:
Thanks for that. Unfortunately, I haven't been able to set up that initial state (I run Windows, and trying it in a Linux Docker container gets build errors that I don't know how to address). But I'll keep the issue in mind and review as we continue the work.
Are the build errors related to dbus-python or maybe hidapi? I think those are the only 2 that would need building.
I can reproduce with only this set of packages:
appdirs==1.4.3
certifi==2018.1.18
docutils==0.14
pip @ https://github.com/pypa/pip/archive/6086f71cde81454fda588c20868a241561809836.zip
plover @ https://github.com/openstenoproject/plover/releases/download/weekly-v4.0.0.dev8%2B66.g685bd33/plover-4.0.0.dev8.66.g685bd33-py3-none-any.whl
plover-plugins-manager==0.5.11
Pygments==2.2.0
PyQt5==5.9.2
pyserial==3.4
python-xlib==0.23
setuptools==41.2.0
sip==4.19.8
six==1.10.0
wcwidth==0.1.7
wheel==0.34.2
@pradyunsg : YAML tests don't have a way to specify an initial state, do they?
Excellent, thanks. I can now reproduce the issue, I'll do some investigation.
@cclauss @benoit-pierre Thanks for asking and reporting these scenarios!
We'd love to get our users to help us understand complex dependency resolution related scenarios, which we want to hear about so we can consider them! The best mechanism would be to file issues on https://github.com/pradyunsg/zazo/issues. We've been aggregating most of the user reports there, as well as noting some of the edge cases that would be useful for testing the resolver.
YAML tests don't have a way to specify an initial state, do they?
@benoit-pierre No, not right now. We will be improving that in the near future though, to have a way to represent/recreate initial state.
In general -- thanks everyone for giving us your thoughts and edge cases! Please do file them as issues on zazo. Or, if you have other concerns, please feel free to file new issues here on pip.
Lots of people are subscribed to this issue, and we want them to notice when we make an announcement here. We do not want to take the unusual step of locking this issue to collaborators, but we also want to try really hard to avoid notification floods. So please be mindful of that if you need to leave a comment on this thread. Thanks!
@benoit-pierre I raised #7966 for the "old setuptools being built" issue. If you want to track progress, we'll work on it over there.
We have now (per #7951) published a beta release of pip, pip 20.1b1, which includes an alpha (unstable) version of the new resolver. You can upgrade to it with `python -m pip install -U --pre pip`. Please see this Discourse thread for the announcement.
@benoit-pierre I was able to verify your example. I'm working on improving the tests, and adding new test scenarios. I've added your example to the tests in this commit 17103e8d1a35bf6662587104addd65688b9394c6
Thanks for the opportunity to test this functionality,
We have now (per #7951) published a beta release of pip, pip 20.1b1, which includes an alpha (unstable) version of the new resolver. You can upgrade to it with `python -m pip install -U --pre pip`.

Now that 20.1.1 is out, and until there's a 20.2 beta that includes the resolver, one must instead use `python -m pip install pip==20.1b1`, as the above command just installs the newer 20.1.1 release.
Edited: Strike-through erroneous advice but leave message for posterity.
@rpanderson it's possible to use the alpha resolver in pip 20.1.1, by passing `--unstable-feature=resolver`, just like the pip 20.1 betas.
pip's dependency resolution algorithm is not a complete resolver. The current resolution logic has the following characteristics:
NOTE: In cases where the first found dependency is not sufficient, specifying the constraints for the dependency on the top level can be used to make it work.
(2019-06-23)
This is being worked on by @pradyunsg, in continuation of his GSoC 2017 project. A substantial amount of code cleanup has been done, and is ongoing, to make it tractable to replace the current resolver in a reasonable manner. This work enabled pip >= 10 to warn when it is going to make an installation that breaks the dependency graph. (The installations are not aborted in such scenarios, for backwards compatibility.)
(2019-11-29)
A status update regarding this is available here.
(2022-12-16)
See the closing note for details.