jamescw19 closed this issue 1 year ago.
After some more investigation, this issue occurs without the `--upgrade` flag if there is no existing output file. It only "works" if `--upgrade` is not given and the output file is present (and therefore pip-compile takes the conditional branch).
This actually affects my workaround above, as I'd have to run `pip-compile` twice after deleting the previous requirements files: first to use the latest versions, second to recognise the `--upgrade-package` versions.
Looks like this behavior was changed in https://github.com/jazzband/pip-tools/pull/1031. There's relevant discussion in that PR, but I can't tell whether this exact case was intentionally changed. @AndydeCleyre might be able to share more.
I did not realize they were intended to potentially be used together, as I didn't see sense in the sentiment "keep/install an old version of package x AND upgrade all packages to their latest" -- I figured the upgrade-all bit would trump the other bit. I was wrong.
I'll try to review this whole thing.
Until then, does it help to put your restricted versions into the input file?
That makes sense. The `requirements.in` file approach would probably work, yeah. This might just be a documentation issue then, if it's not the intended behaviour, because the docs have an example showing the two flags used together.
I can see why the sentiment of upgrading all and pinning a specific package would seem strange. In my specific case, I'm actually trying to force the requirements file to contain a specific, CPU-only flavour of PyTorch with `torch==1.10.0+cpu`, but upgrade all the other "standard" packages. At the moment, I end up with `torch==1.10.0` in my `requirements.txt`. So it's a bit of a non-standard use-case.
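For reference, the input-file route for that case could look something like this; the `--find-links` index URL here is an assumption about where the `+cpu` wheels are hosted, not something taken from this thread:

```
# requirements.in -- hypothetical sketch
--find-links https://download.pytorch.org/whl/cpu/torch_stable.html
torch==1.10.0+cpu
# everything else left unpinned so --upgrade can float it
```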
OK, reviewing the discussion at #759, I think a key note is right at the start. @atugushev opened with:
> Pip-compile should raise an error while trying to upgrade a package which doesn't exist in a requirements.in.
Through the discussion, the "penalty" was softened and we did not end up raising any errors in this case.
But the expectation illuminated there is that packages passed to `--upgrade-package` are either present in the input file or considered irrelevant. In other words, any package whose version you want to control should be in the input file. That's my understanding, anyway.
With that in mind:
`requirements.in`:

```
django
sqlparse  # for django
```

```
$ pip-compile --upgrade-package 'sqlparse<=0.4.0'
asgiref==3.5.0
    # via django
django==4.0.2
    # via -r requirements.in
sqlparse==0.4.0
    # via
    #   -r requirements.in
    #   django
```

```
$ pip-compile --upgrade --upgrade-package 'sqlparse<=0.4.0'
asgiref==3.5.0
    # via django
django==4.0.2
    # via -r requirements.in
sqlparse==0.4.0
    # via
    #   -r requirements.in
    #   django
```

```
$ rm requirements.txt
$ pip-compile --upgrade --upgrade-package 'sqlparse<=0.4.0'
asgiref==3.5.0
    # via django
django==4.0.2
    # via -r requirements.in
sqlparse==0.4.0
    # via
    #   -r requirements.in
    #   django
```
So

> Pip-compile should raise an error while trying to upgrade a package which doesn't exist in a requirements.in.

is that a reasonable and good expectation?
For what it's worth, I relied on the behavior requested here (upgrading specific packages not declared in reqs.in) for a while and didn't realize it was no longer available! Certainly a question is whether the complexity is worthwhile, but I found it really valuable to be able to easily upgrade a specific sub-dependency that had recently had a security fix release. `pip-compile` ensured I still had a set of compatible packages without forcing me to upgrade everything.
Of course, that use case can be worked around by temporarily adding the sub-dependency to reqs.in and then removing it, but it would be nice to have that automated. I didn't want the particular package in reqs.in permanently - if the relevant direct dependency no longer needed it, I would want it to disappear from the output.
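That temporary-add-then-remove dance could be scripted. A minimal sketch of the file handling (the `pip-compile` invocation is left as a placeholder comment, and the helper name is made up for illustration):

```python
from pathlib import Path

def with_temporary_requirement(reqs_in: Path, extra_line: str) -> None:
    """Temporarily append a pin to the input file, then restore the file.

    Sketch of the manual workaround only: the actual compile step is a
    placeholder comment, and this helper name is hypothetical.
    """
    original = reqs_in.read_text()
    try:
        reqs_in.write_text(original.rstrip("\n") + "\n" + extra_line + "\n")
        # ... run `pip-compile --upgrade` here while the pin is present ...
    finally:
        reqs_in.write_text(original)  # the pin never lingers in reqs.in
```

The `finally` block guarantees the pin disappears from the input file even if the compile step fails, which matches the "I didn't want the particular package in reqs.in permanently" requirement.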
I'm not saying things should be one way or another here, but want to mention a relevant alternative:
`security.txt`:

```
sqlparse<=0.4.0  # See https://...
```

`requirements.in`:

```
django
-c security.txt
```
Yep, definitely an option as well!
So is there a good reason not to treat `--upgrade-package` items identically to a `-c constraints.txt` file containing those items?
Oh right yes: because of the simple case of upgrading to the max. Never mind.
I made #1578 to trial different handling of `--upgrade-package`. I constructed a temporary file from `upgrade_packages` and parsed that in `-c constraints.txt` style, in addition to our existing handling of `upgrade_packages`.
The result is good. If it's simple enough to do without creating a temp file, I'm interested in that.
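A minimal sketch of that temp-file idea (names are illustrative, not the actual #1578 code): write the `-P` items out in the shape of a constraints file, so the parsing path that already understands `-c` files can consume them:

```python
import tempfile
from pathlib import Path

def upgrade_packages_as_constraints(upgrade_packages):
    """Write -P/--upgrade-package items into a temp file shaped like a
    user-supplied constraints file. Illustrative only; pip-tools' real
    implementation in #1578 differs in detail."""
    tmp = tempfile.NamedTemporaryFile(
        mode="w", suffix="-constraints.txt", delete=False
    )
    with tmp:
        for spec in upgrade_packages:
            tmp.write(spec + "\n")  # one specifier per line, as -c expects
    return Path(tmp.name)
```

The compile step would then feed this path through the existing requirements-file parser, so a `-P 'sqlparse<=0.4.0'` bound constrains resolution rather than merely seeding it.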
It also presents the possibility of including the `--upgrade-package` switch as a source in the annotations.
> is that a reasonable and good expectation?

We probably should make that clearer in the docs.
I was going to say that this is definitely a reasonable solution, given your previous discussion about the issue -- happy to use `requirements.in` to define any constraints. But what you've done in #1578 looks like a great solution to the problem, merging CLI `--upgrade-package` items with any existing constraints. If you're happy with the additional logic required, then this would absolutely solve my issue.
Thanks very much for investigating this!
I've added some tests to validate your changes, and opened a PR into your fork. Looks like your changes solve the problem, from the test output.
Yeah, while I prefer to put these kinds of needs into files rather than use `-P`, I can't think of a good reason not to treat these like `-c constraints.txt` entries in addition to what we're doing.
The behavior would be more predictable and, I now think, more correct. #1578 is up for review.
Great, thanks for this! I agree, I think this becomes much more predictable.
So does this new behaviour account for the case where there is no `requirements.in` file, but requirements come from a `setup.[py|cfg]` file instead? It seems like it should from your changes; I'm just clarifying that I understand what the changes permit.
@jammie19 If I understand the question, then yes.
Here's the behavior on master, using the pip-tools repo itself as the `setup.py` package to test:

```
$ pip-compile setup.py -U 2>&1 | grep tomli
tomli==2.0.1
$ pip-compile setup.py -U -P 'tomli<2.0.1' 2>&1 | grep tomli
tomli==2.0.1
```
And here's the result with #1578:

```
$ pip-compile setup.py -U 2>&1 | grep tomli
tomli==2.0.1
$ pip-compile setup.py -U -P 'tomli<2.0.1' 2>&1 | grep tomli
tomli==2.0.0
```
EDIT: I didn't note this before, but `tomli` is not a direct dep; it's a dep of the direct dep `pep517`.
Thanks for clarifying, that looks great!
I hope I'm not misreading this issue, but I believe I am experiencing a negative effect of the same (or similar) behavior described above. I'd really appreciate if you could confirm/correct my expectations in this situation. I'm also happy to open a separate issue if this isn't the right place for it.
I have:

```
# test.in
azure-storage-blob
spacy~=3.2.0
```

I run the command:

```
pip-compile --upgrade -P "azure-core~=1.22.0" test.in
```

and get:

```
Could not find a version that matches typing-extensions<4.0.0.0,>=3.6.4,>=3.7.4,>=3.7.4.1,>=3.7.4.3,>=4.0.1 (from spacy==3.2.3->-r test.in (line 2))
Tried: 3.6.2, 3.6.2, 3.6.2.1, 3.6.2.1, 3.6.5, 3.6.5, 3.6.6, 3.6.6, 3.7.2, 3.7.2, 3.7.4, 3.7.4, 3.7.4.1, 3.7.4.1, 3.7.4.2, 3.7.4.2, 3.7.4.3, 3.7.4.3, 3.10.0.0, 3.10.0.0, 3.10.0.1, 3.10.0.1, 3.10.0.2, 3.10.0.2, 4.0.0, 4.0.0, 4.0.1, 4.0.1, 4.1.0, 4.1.0, 4.1.1, 4.1.1
There are incompatible versions in the resolved dependencies:
typing-extensions>=4.0.1 (from azure-core==1.23.0->azure-storage-blob==12.9.0->-r test.in (line 1))
typing-extensions>=3.6.4 (from catalogue==2.0.6->spacy==3.2.3->-r test.in (line 2))
typing-extensions>=3.7.4.3 (from pydantic==1.8.2->spacy==3.2.3->-r test.in (line 2))
typing-extensions<4.0.0.0,>=3.7.4 (from spacy==3.2.3->-r test.in (line 2))
typing-extensions<4.0.0.0,>=3.7.4.1 (from thinc==8.0.13->spacy==3.2.3->-r test.in (line 2))
```
First, I'm surprised that `pip-compile` doesn't identify lower versions of `azure-core` and `azure-storage-blob` (given that there's no version restriction on them) and check their dependencies against the other constraints. `azure-storage-blob` doesn't have a restriction on `typing-extensions`, so it won't check the validity of a lower version of `azure-storage-blob` even if one of its dependencies (`azure-core`) has a conflicting requirement on `typing-extensions`? Is that expected?

Finally, I expected the `-P "azure-core~=1.22.0"` to correctly constrain the `azure-core` version, which would allow `pip-compile` to find a version of `typing-extensions` that satisfies all of the constraints.
@tboddyspargo The final point here -- constraining `azure-core` and expecting to find the highest versions of all dependencies with the `-U` flag combined with `-P "azure-core~=1.22.0"` -- is the issue highlighted in this thread. #1578 should fix this behaviour, though.

The main issue you're facing, the inability to resolve `typing-extensions`, is due to the new `pip` resolver. It should be fixed by #1539, from what I understand of that PR.
@jammie19 - thank you very much for clarifying for me! I'm so grateful for everyone's time and expertise!
The documentation states that `--upgrade` and `--upgrade-package` can be used together; however, the `--upgrade-package` flag is ignored when `--upgrade` is supplied.

Environment Versions

```
python 3.9.0
pip 21.3.1
pip-compile 6.4.0
```
Steps to replicate

Using different args to the documentation's example, as `requests` is not in the `django` requirements, and `requests<=3.0.0` does not affect the "latest" release.

Expected result

Actual result

`sqlparse` is not restricted to `<=0.4.0`. If I omit the `--upgrade` flag, only supplying `--upgrade-package 'sqlparse<=0.4.0'`, I get the expected result (but my other packages are not upgraded to later versions). This also doesn't work if the argument is `sqlparse==0.4.0`.
My current workaround is just to delete the requirements file(s) before running `pip-compile`, and to omit the `--upgrade` flag so the existing files aren't used as a cache. I suspect either that the `--upgrade-package` constraint logic is inside the `if not upgrade...` conditional, or that `--upgrade-package` items only extend, rather than constrain, the complete set of packages.

This feature should be permitted, or the documentation changed to avoid stating that this is possible.
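The extend-versus-constrain suspicion above can be modelled in a few lines (a toy resolver for illustration, not pip-tools code; the function names are made up):

```python
def parse_spec(spec):
    """Split a -P item like 'sqlparse<=0.4.0' into (name, op, version).
    Toy parser handling only <= and ==; real tools parse full PEP 508."""
    for op in ("<=", "=="):
        if op in spec:
            name, _, ver = spec.partition(op)
            return name, op, tuple(int(p) for p in ver.split("."))
    return spec, None, None

def resolve(available, spec, constrain):
    """available maps package name -> list of version tuples.

    constrain=False models the reported 'extend' behaviour: the -P bound
    is ignored and the newest version wins. constrain=True filters the
    candidates first -- the behaviour this issue asks for."""
    name, op, bound = parse_spec(spec)
    candidates = available[name]
    if constrain and op is not None:
        ok = {"<=": lambda v: v <= bound, "==": lambda v: v == bound}[op]
        candidates = [v for v in candidates if ok(v)]
    return max(candidates)

versions = {"sqlparse": [(0, 3, 1), (0, 4, 0), (0, 4, 2)]}
```

With `constrain=False`, `resolve(versions, "sqlparse<=0.4.0", False)` picks 0.4.2, mirroring the reported bug; with `constrain=True` it picks 0.4.0.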