Open zanieb opened 1 month ago
Thinking out loud, it seems like the most likely scenario this would happen in is if the compile originally ran and the package was not yanked, and then it was run at a later time and the package was yanked.
Given this is the case where the yanked package is not pinned in the requirements, I would personally expect "1." to happen, with maybe a warning. It was presumably yanked for a reason and I'm not pinning it, so I would trust the package authors' reason for yanking.
If the user wants to pin their package to a yanked version, I think it makes sense to require them to do it in the input requirements.
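For context, this matches how installers already treat yanks under PEP 592: a yanked release is skipped during normal resolution but can still be selected when the user pins it exactly. A hypothetical input file (package name made up for illustration) would look like:

```
# requirements.in — deliberately pinning a release that was yanked upstream.
# An exact `==` pin is what opts the user in to a yanked version.
somepkg==1.2.3
```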
Yeah that's my feeling as well. I would find it weird that `--upgrade` downgrades a package but 🤷‍♀️ yanks are rare.
Does `--upgrade` normally guarantee the versions pinned in the old output file will be lower bounds of the new output file?
For example, if I had a direct dependency on `a`, and when I ran pip compile for the first time it created a pinning `b==2.0`, but then when I ran upgrade the next day the newer version of `a` now requires `b==1.0`, would `a` be upgraded and `b` downgraded, or would `a` now never be able to upgrade?
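To make the scenario concrete, here is a sketch of the two compile runs (package names and versions are hypothetical, and the second output shows the two possible outcomes being asked about, not confirmed behavior):

```
# requirements.in — only the direct dependency, no pins
a

# requirements.txt after the first compile (a 1.0 depends on b==2.0)
a==1.0
b==2.0

# Next day, a 2.0 is released and depends on b==1.0.
# Does `pip compile --upgrade` produce:
a==2.0
b==1.0    # b downgraded alongside a's upgrade
# ...or is a held at 1.0 because b==2.0 from the old output
# would otherwise have to go down?
```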
I'm not that familiar with the behavior of `pip compile --upgrade`, but I know `conda upgrade --all` will downgrade packages.
Honestly I'm not sure! I would be surprised if it provided that guarantee though, I presume we just no longer prefer the pinned versions and perform a resolution based on the initial bounds.
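That presumption ("no longer prefer the pinned versions") can be sketched as a toy candidate-selection rule. This is an illustrative model only, not uv's or pip-tools' actual resolver logic, and `select_candidate` is a made-up helper:

```python
def select_candidate(versions, pinned=None, upgrade=False):
    """Toy model of output-file pinning during a compile.

    Without --upgrade, the version already pinned in the old output file
    is preferred if it still satisfies the input bounds. With --upgrade,
    the old pin is simply ignored and resolution falls back to the input
    bounds, so the result may be higher OR lower than the old pin.
    """
    if not upgrade and pinned in versions:
        return pinned
    # Resolution from the input bounds alone: pick the newest allowed version.
    return max(versions)

# Without --upgrade the old pin (1, 0) is kept; with --upgrade it is not.
print(select_candidate([(1, 0), (2, 0)], pinned=(1, 0)))                # (1, 0)
print(select_candidate([(1, 0), (2, 0)], pinned=(1, 0), upgrade=True))  # (2, 0)
```

Under this model there is no lower-bound guarantee: if the newest allowed set of versions forces `b` below its old pin, `b` gets downgraded.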
I'd definitely want a warning; I've been using a broken release. Option 1 seems to be the most helpful behavior to me.
The next step is to verify our existing behavior, if anyone is interested.
Prompted at https://github.com/astral-sh/uv/issues/3602#issuecomment-2115761269
When a yanked package is pinned in a lockfile and `pip compile --upgrade` is used and there is no new version of the package in the input range, should we

Note the yanked package is not pinned in the input requirements.