pypa / pipenv

Python Development Workflow for Humans.
https://pipenv.pypa.io
MIT License
24.83k stars 1.87k forks

Updating only one locked dependency #966

Closed k4nar closed 2 years ago

k4nar commented 6 years ago

Sometimes I'm doing a PR and I want to update a specific dependency but I don't want to deal with updates of all my dependencies (aiohttp, flake8, etc…). If any breaking change was introduced in those dependencies, I want to deal with it in another PR.

As far as I know, the only way to do that would be to pin all the dependencies that I don't want to update in the Pipfile. But I find that this defeats the purpose of Pipenv in the first place :) .

So my feature request would be to be able to do something like:

$ pipenv lock --only my-awesome-dep

That would generate a Pipfile.lock with updates for only my-awesome-dep and its dependencies.

I can probably make a PR for that, but I would like to get some feedback first.

uranusjr commented 6 years ago

(Just realised I have been commenting to the wrong issue. Sorry for the mess up, please ignore me) 😞

l0b0 commented 6 years ago

An actual use case that I've struggled with for some hours now: I want to measure test coverage in a Django 2.0 project. Even pipenv install --keep-outdated --selective-upgrade --dev coverage insists on updating the non-dev Django package to version 2.1, which because of breakage elsewhere I absolutely cannot use yet. There really must be a way to change the set of installed packages without upgrading completely unrelated packages to possibly breaking versions. The possibility of breakage in the latest version will always exist.

I'll try @rfleschenberg's workaround, but I don't know whether having a presumably incorrect _meta hash property will break anything.

alecbz commented 6 years ago

@l0b0 If your application genuinely cannot handle a particular version of Django, I think it makes sense to state that restriction in your Pipfile?

wichert commented 6 years ago

@AlecBenzer That sounds like something for setup.py to me.

alecbz commented 6 years ago

@wichert That might make sense too (I'm actually not totally clear on in what circumstances you'd want to have both a setup.py and a Pipfile), but if you have a line in your Pipfile that says:

Django = "*"

you're telling pipenv that you want it to install any version of Django. If what you really want it to do is install 2.0 or lower, tell it that instead:

Django = "<=2.0.0"

While in this particular case pipenv is upgrading Django for no real reason, it could be that somewhere down the line you try to install a package that requires Django >2.0.0, at which point pipenv will happily install it if you haven't told it that you need <=2.0.0.
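The mechanics @alecbz describes can be sketched with a toy specifier checker. This is a deliberate simplification for illustration only: real tools delegate specifier matching to the `packaging` library's `SpecifierSet`, and `parse`/`satisfies` below are hypothetical helpers, not pipenv APIs.

```python
# Toy sketch: why "<=2.0.0" in the Pipfile protects you where "*" does not.
# Versions are simplified to dotted integers compared as tuples.

def parse(version):
    """Turn '2.0.0' into (2, 0, 0) for tuple comparison."""
    return tuple(int(part) for part in version.split("."))

def satisfies(version, spec):
    """Check a version against a single specifier like '<=2.0.0' or '*'."""
    if spec == "*":
        return True
    # Check two-character operators before their one-character prefixes.
    for op in ("<=", ">=", "==", "<", ">"):
        if spec.startswith(op):
            v, bound = parse(version), parse(spec[len(op):])
            return {"<=": v <= bound, ">=": v >= bound, "==": v == bound,
                    "<": v < bound, ">": v > bound}[op]
    raise ValueError(f"unsupported specifier: {spec}")

# With Django = "*", the resolver is free to pick 2.1:
assert satisfies("2.1", "*")
# With Django = "<=2.0.0", 2.1 is rejected and 2.0 is still acceptable:
assert not satisfies("2.1", "<=2.0.0")
assert satisfies("2.0", "<=2.0.0")
```

The point of the sketch: the Pipfile specifier is the only constraint the resolver is obliged to honor, so "any version is fine" and "I happen to want the old one kept" are different statements.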

brettdh commented 6 years ago

If what you really want it to do is install 2.0 or lower, tell it that instead

@AlecBenzer on reflection, it now occurs to me that this is what npm/yarn do by default when you install a package; they find the latest major.minor version and specify ^major.minor.0 in package.json, which prevents unexpected major version upgrades, even when an upgrade-to-latest is explicitly requested. I wonder if Pipenv should do the same - but that would be a separate issue.

Of course, their lock file also prevents accidental upgrades of even minor and patch versions, which is what's being requested here.
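The npm/yarn caret behavior brettdh mentions can be sketched as follows. Assumptions: `caret_allows` is a hypothetical helper, not an npm or pipenv API, and the sketch covers only the common case of major version >= 1 (npm treats `^0.y.z` ranges more restrictively).

```python
# Sketch of npm-style caret semantics for major >= 1:
# ^X.Y.Z allows >=X.Y.Z but <(X+1).0.0, so minor/patch upgrades pass
# and major upgrades are always blocked.

def caret_allows(pinned, candidate):
    major, minor, patch = (int(p) for p in pinned.split("."))
    c_major, c_minor, c_patch = (int(p) for p in candidate.split("."))
    if c_major != major:
        return False                       # a major bump is never allowed
    return (c_minor, c_patch) >= (minor, patch)

assert caret_allows("2.0.0", "2.1.3")      # minor upgrade: allowed
assert not caret_allows("2.0.0", "3.0.0")  # major upgrade: blocked
assert caret_allows("2.0.0", "2.0.0")      # same version: allowed
```

This is the default guard rail `npm install` writes into package.json; a Pipfile equivalent would be a `~=` or explicit `>=X.Y,<X+1` specifier chosen at install time.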

brettdh commented 6 years ago

I think it's been discussed above and elsewhere, but there is a tension/tradeoff in the design space between npm/yarn and pipenv. Any package manager ostensibly balances the same goals, with some relative priority: keeping dependencies up to date, and keeping installs reproducible.

The trouble with pinning an exact version in the Pipfile is that it's then harder to upgrade packages; this is why pip-tools exists (though it's for requirements.txt).

tilgovi commented 6 years ago

The --keep-outdated flag does not seem to be working as intended, as was stated when the issue was re-opened. Whether that behavior should or should not be the default and how it aligns with other package managers is not really the central issue here. Let's fix the thing first.

alecbz commented 6 years ago

@brettdh

on reflection, it now occurs to me that this is what npm/yarn do by default when you install a package; they find the latest major.minor version and specify ^major.minor.0 in package.json, which prevents unexpected major version upgrades, even when an upgrade-to-latest is explicitly requested. I wonder if Pipenv should do the same - but that would be a separate issue.

Yeah that's along the lines of what I was trying to suggest in https://github.com/pypa/pipenv/issues/966#issuecomment-408420493

benkuhn commented 6 years ago

Really excited to hear this is being worked on!

In the meantime, does anyone have a suggested workaround that's less laborious and error-prone than running pipenv lock and hand-reverting the resulting lockfile changes that I don't want to apply?

wichert commented 6 years ago

@benkuhn Not that I know of; I do the same lock & revert dance all the time.

benkuhn commented 6 years ago

Ah ok, you can at least sometimes avoid hand-reverting:

  1. pipenv lock
  2. git commit -m "FIXME: revert"
  3. pipenv install packagename
  4. git commit -m 'Add dependency on packagename'
  5. git rebase -i
  6. Drop the FIXME: revert commit

Unfortunately it's still possible to create an inconsistent Pipfile.lock if your Pipfile.lock starts out containing a version of a package that's too old to satisfy the requirements of packagename, but perhaps pipenv will complain about this if it happens?
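The "lock and revert" dance above can also be done programmatically: generate a fresh Pipfile.lock, then carry over only the entries for the packages you actually meant to change. The sketch below is a hypothetical helper, not a pipenv feature; the `"default"`/`"develop"` section names match the Pipfile.lock format. Two caveats from this thread apply: the resulting `_meta` content hash will no longer match the Pipfile, and, as @benkuhn notes, ignoring the updated package's own transitive requirements can leave the lockfile inconsistent.

```python
# Hedged sketch: merge only the chosen packages from a freshly generated
# lockfile (new_lock) into the previous one (old_lock).
import json

def selective_update(old_lock, new_lock, packages, section="default"):
    """Return old_lock with only `packages` taken from new_lock."""
    merged = json.loads(json.dumps(old_lock))  # cheap deep copy
    for name in packages:
        if name in new_lock.get(section, {}):
            merged[section][name] = new_lock[section][name]
        else:
            merged[section].pop(name, None)    # package removed upstream
    return merged

old = {"default": {"requests": {"version": "==2.18.0"},
                   "django": {"version": "==2.0"}}}
new = {"default": {"requests": {"version": "==2.19.1"},
                   "django": {"version": "==2.1"}}}

merged = selective_update(old, new, ["requests"])
assert merged["default"]["requests"]["version"] == "==2.19.1"
assert merged["default"]["django"]["version"] == "==2.0"  # left untouched
```

In other words, this automates the revert, but it does not validate the result; a resolver still has to confirm that the held-back pins satisfy the updated package's requirements.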

jhrmnn commented 6 years ago

--keep-outdated seems to systematically keep outdated only the explicit dependencies that are specified (unpinned) in Pipfile, while all the implicit dependencies are updated.

max-arnold commented 6 years ago

Am I correct that it is not possible to update/install a single dependency using pipenv==2018.7.1 without updating other dependencies? I tried different combinations of --selective-upgrade and --keep-outdated with no success.

Editing Pipfile.lock manually is no fun...

mrsarm commented 6 years ago

Same as @max-arnold: it's my first day using the tool in an existing project, and I have to say I'm really disappointed. Before I started using it I checked the doc site and the video demo, and it looked impressive, but now this: in a real project, working with pip or pipenv is almost the same, and I don't see the point. Like many others have said in this thread: if I have a lock file, why are you updating my other dependencies when there is no need to?

Of course, if an update is mandatory, it's OK to update all the necessary dependencies, but only those, not everything that happens to be outdated.

Also, it's not clear what the --selective-upgrade and --keep-outdated options are actually for; there is another issue highlighting this (#1554), and nobody has been able to explain what these options do. Incredible.

But my biggest disappointment is that this package was recommended by the official Python documentation itself; those recommendations should be made more carefully. I know this can become a great project, it has a lot of potential, but simple things like this (and we are not talking about a bug or a minor feature) make the project not ready for production environments. Yet because it was recommended by the Python docs, everybody is trying to use it instead of looking for other tools that may work better, or just sticking with pip, which doesn't solve these issues either but at least is minimalist and included in almost every environment (it adds no extra dependencies).

uranusjr commented 6 years ago

@mrsarm Thank you for your opinion. Sorry things don’t work for you. I don’t understand where the disappointment comes from, however. Nobody is forcing Pipenv on anyone; if it doesn’t work for you, don’t use it. That is how recommendations work.

Your rant also has nothing in particular to do with this issue. I understand it takes a little self-control not to dump trash on people when things don't go your way, but please show some respect and refrain from doing so.

mrsarm commented 6 years ago

@uranusjr it's not trash, it's an opinion, and sometimes it's not optional, as in my case: somebody chose pipenv for a project where I have just started working, and I have to deal with this.

But things got worse just now, and what I am going to say is not an opinion, it's a fact.

After giving up on the dependency I originally wanted to add (it's a dev dependency, so to avoid dealing with this issue I created a second environment with pip and the old requirements-dev.txt approach just for it), I needed to add another dependency.

The new dependency is PyYAML, say the latest version. If you install it with pip in any new environment, you will see that the library pulls in no dependencies at all; only PyYAML is installed. It's that simple with pip. But when I added the dependency with pipenv (the project I inherited is managed with pipenv), the same issue happened: even though PyYAML has no dependencies and no older version of it was installed in the project, pipenv updated all my other dependencies in the lock file and the virtual environment. I don't want to update the other dependencies; I just want to add a single new module that has no dependencies of its own.

So the conclusion (and again, this is an opinion, unlike the fact that pipenv broke all my dependencies) is that pipenv, instead of helping me manage dependencies, turns dependency management into hell.

dfee commented 6 years ago

I've followed this thread for months, and I think any real project will ultimately stumble upon this issue, because the behavior is unexpected, counter-intuitive, and yes: dangerous.

About a month ago I tried out a more-comprehensive alternative to pipenv, poetry; it solved the problems I needed to solve: 1) managing one set of dependencies (setup.py, setup.cfg, pip, and pipfile -> pyproject.toml) 2) future oriented, backwards-compatible (again pyproject.toml) 3) fairly un-opinionated (no i'm really not asking to install redis) 4) and the solution to the classic Pipenv problem: "Also, you have to explicitly tell it [pipenv] to not update the locked packages when you install new ones. This should be the default." [1] [2]

I weighed sharing these thoughts on the pipenv issue, but as @uranusjr said, "no one is forcing Pipenv on anyone", and I'm not forcing Poetry. I like it, it works well, and it solves my problems, but I'm just sharing an alternative, more-comprehensive solution to the problem I was having. Just take all that as my 2¢.

p.s. I think the concern about Pipenv being the "official" solution is due to its first-class integrations. What you, @uranusjr, might see as a simple recommendation, the industry at large is taking as the "blessed approach going forward". Frankly, that recommendation carries more authority in the community than certain PEPs that have been around for more than a year.

techalchemy commented 6 years ago

Nobody is forcing you to participate in our issue tracker; if you don’t have a productive comment please find a forum that is not for triaging issues.

For users who are interested in trying the alternative resolver @uranusjr and myself have been implementing for several weeks now, please try out https://github.com/sarugaku/passa which will generate compatible lockfiles. Poetry does a lot of different things, but it also has limitations and issues itself, and we have a design philosophy disagreement about scope.

This is a project we manage in our spare time; if you want to see something fixed or you have a better approach, we are happy to accept contributions. If you are here to simply tell us we ruined your day and your project, I will ask you only once to see yourself out.

We have not forgotten or ignored this issue; we have a full implementation of a fix in the resolver linked above. Have patience, be courteous, or find somewhere else to talk. To those who have been waiting patiently for a fix, please do try the resolver mentioned above; we are eager to see if it meets your needs. It implements proper backtracking and resolution and should handle this upgrade strategy correctly.

In the shorter term I think we can get a band aid for this into pipenv if we don’t wind up cutting over first.

techalchemy commented 6 years ago

@dfee I am not really sure that blurring lines between applications and libraries is the correct answer to dependency management, so I don’t see poetry’s approach as an advantage. I wasn’t involved in whatever your issue was with the recommendation engine, but we took that out some time ago...

sdispater commented 6 years ago

@techalchemy

I am not really sure that blurring lines between applications and libraries is the correct answer to dependency management, so I don’t see poetry’s approach as an advantage.

Why though? I never understood this idea that you should manage the dependencies of a library and an application differently. The only difference between the two is the lock file, which an application needs to ensure a reproducible environment. Other than that it's the same thing. This is the standard in most other languages; Python seems to be the exception here for some reason, and that is bad from a user-experience standpoint since it makes things more complex than they should be.

it also has limitations and issues itself

Which ones? I am really curious about the issues or limitations you encountered while using Poetry.

mrsarm commented 6 years ago

My apologies for being so rude. Reading my comments now, I realize that although the information I provided and some of my opinions are still valid (IMHO), the way I wrote what I wanted to say wasn't appropriate.

I understand that the issue tracker is mostly a place to discuss bugs and improvements, and whether this is a bug or an error by design is not clear in the thread, but again, my apologies.

I think there are two strong topics here:

If this is a bug in one of the --keep-outdated / --selective-upgrade parameters, I still think that not making whichever parameter prevents the unnecessary updates the default behavior is a really bad idea.

To compare with a similar scenario: imagine that you execute apt-get install vim to install just the vim tool on your system (plus vim's necessary dependencies or updates, if applicable), but apt also updates every other package on your system: python, the QT system, the Linux kernel... and so on. It's not that apt shouldn't let us update other packages, but there is a separate command for that: apt-get upgrade, while apt-get install PACKAGE installs/updates only PACKAGE and its dependencies.

techalchemy commented 6 years ago

@sdispater the distinction is at the heart of every disagreement we've ever had and it's incredibly subtle but I'd point you at https://caremad.io/posts/2013/07/setup-vs-requirement/ or a good article for the elixir use case: http://blog.plataformatec.com.br/2016/07/understanding-deps-and-applications-in-your-mixfile/

pyproject.toml isn't really supported for library definition metadata -- and not at all by any version of pip that doesn't implement peps 517 and 518 (both of which are still having implementation details worked out) as an authoritative library declaration file. setup.cfg exists for that purpose (the actual successor to setup.py ) and IMHO both of those should really be supported. A library is published and intended for consumption with abstract dependencies so that they can play nice in the sandbox with others; applications are usually large, complex beasts with sometimes hundreds of direct dependencies. So one of our main divergences is that when we design and build our tooling, we take this into account as well.

@mrsarm For your first question, the update behavior was intentional (it was discussed extensively at the time, /cc @ncoghlan, and related to OWASP security concerns). On the second point, the behavior is currently not properly implemented, which is why the issue is still open and why we are rewriting the backing resolver behind pipenv, as I mentioned above; the old one simply didn't support this behavior. --selective-upgrade is supposed to selectively upgrade only things that are dependencies of the new package, while --keep-outdated would hold back anything that already satisfied the dependencies required by a new package. Slightly different, but I am fairly sure neither works correctly right now.

mmerickel commented 6 years ago

pyproject.toml isn't really supported for library definition metadata -- and not at all by any version of pip that doesn't implement peps 517 and 518 (both of which are still having implementation details worked out) as an authoritative library declaration file. setup.cfg exists for that purpose (the actual successor to setup.py ) and IMHO both of those should really be supported.

Well this is certainly off topic but it's an important discussion so I can't help myself.

There is actually no standard around setup.cfg right now other than the conventions established by distutils and setuptools. pyproject.toml is absolutely for library metadata as the successor to setup.py or the community would have placed build requirements in setup.cfg instead.

pyproject.toml describes how to build a project (PEP 518), and part of building is describing metadata. I'm NOT saying that pyproject.toml needs a standard location for this metadata, but PEP 518 uses this file to install a build tool and from there it's very reasonable to expect that the build tool will use declarative configuration from somewhere else in the file to determine how to build the project.

Anyway, going back to pipenv vs poetry - there seems to be some idea floating around that applications don't need certain features that libraries get, like entry points, and this is just incorrect. It should be straightforward for an application to be a python package.

The only true difference between an application and a library in my experience with python and with other ecosystems is whether you're using a lockfile or not. Of course there's a third case where you really just want a requirements.txt or Pipfile and no actual code and that seems to be all that pipenv has focused on so far (pipenv install -e . falls into this category as pipenv is still afraid to try and support the package metadata). Unfortunately, while the design of pipenv is cleaner with this approach, it's also way less useful for most applications because PEP 518 decided to punt on how to install projects into editable mode so in order to continue using pipenv we will be stuck on setuptools quite a while longer as you cannot use pyproject.toml to switch away from setuptools and still use pipenv install -e ..

techalchemy commented 6 years ago

There is actually no standard around setup.cfg right now other than the conventions established by distutils and setuptools. pyproject.toml is absolutely for library metadata as the successor to setup.py or the community would have placed build requirements in setup.cfg instead.

Distutils is part of the standard library and setuptools is installed with pip now, so saying that there is no standard is a bit silly. Not to mention it uses the standard outlined in pep 345 for metadata, among others, and can also be used to specify build requirements.

the community would have placed build requirements in setup.cfg instead.

Do you mean the pep authors? You can ask them why they made their decision, they outline it all in the pep.

pyproject.toml describes how to build a project (PEP 518), and part of building is describing metadata. I'm NOT saying that pyproject.toml needs a standard location for this metadata, but PEP 518 uses this file to install a build tool and from there it's very reasonable to expect that the build tool will use declarative configuration from somewhere else in the file to determine how to build the project.

This came up on the mailing list recently -- nothing anywhere has declared a standard around pyproject.toml other than that it will be used to declare build system requirements. Anything else is an assumption; you can call that "library definition metadata", but it isn't. Try only defining a build system with no additional information about your project (i.e. no pep-345 compliant metadata) and upload it to pypi and let me know how that goes.

Anyway, going back to pipenv vs poetry - there seems to be some idea floating around that applications don't need certain features that libraries get, like entry points, and this is just incorrect. It should be straightforward for an application to be a python package.

Who is saying that applications don't require entry points? Pipenv has an entire construct to handle this.

so in order to continue using pipenv we will be stuck on setuptools quite a while longer as you cannot use pyproject.toml to switch away from setuptools and still use pipenv install -e .

Not following here... we are not going to leave pip vendored at version 10 forever, I've literally been describing our new resolver, and the actual installer just falls back to pip directly... how does this prevent people from using editable installs?

digitalresistor commented 6 years ago

This came up on the mailing list recently -- nothing anywhere has declared a standard around pyproject.toml

That's correct, it is not a "standard", yet in that same thread it was recognised that by calling it pyproject.toml they effectively invited people to use this file for other project-related settings/config.

So by the same logic you invoked here:

Distutils is part of the standard library and setuptools is installed with pip now, so saying that there is no standard is a bit silly.

pyproject.toml is a standard, and the community has adopted it as the standard location to place information related to the build system, and other parts of a Python project.

digitalresistor commented 6 years ago

Not following here... we are not going to leave pip vendored at version 10 forever, I've literally been describing our new resolver, and the actual installer just falls back to pip directly... how does this prevent people from using editable installs?

PEP 517 punted on editable installs... which means there is no standard way to install a project in editable mode if you are not using setuptools (which has a concept known as develop mode that installs the project in editable mode).

mmerickel commented 6 years ago

Distutils is part of the standard library and setuptools is installed with pip now, so saying that there is no standard is a bit silly. Not to mention it uses the standard outlined in pep 345 for metadata, among others, and can also be used to specify build requirements.

Yes, the build system is expected to output the PKG-INFO file described in PEP 345. This is a transfer format that goes in an sdist or wheel and is generated from a setup.py/setup.cfg, it is not a replacement as such for the user-facing metadata. PEP 518's usage of pyproject.toml is about supporting alternatives to distutils/setuptools as a build system, no one is trying to replace the sdist/wheel formats right now. Those replacement build systems need a place to put their metadata and fortunately PEP 517 reserved the tool. namespace for these systems to do so. It's not an assumption - both flit and poetry have adopted this namespace for "library definition metadata".

Try only defining a build system with no additional information about your project (i.e. no pep-345 compliant metadata) and upload it to pypi and let me know how that goes.

How constructive.

Who is saying that applications don't require entry points? Pipenv has an entire construct to handle this.

Where is this construct? I cannot even find the word "entry" on any page of the pipenv documentation at https://pipenv.readthedocs.io/en/latest/ so "an entire construct" sounds pretty far fetched? If you mean editable installs then we have reached the point I was making above - with pipenv deciding to couple itself to pipenv install -e . as the only way to hook into and develop an application as a package, for the foreseeable future pipenv's support here is coupled to setuptools. I think the entire controversy boils down to this point really and people (certainly me) are frustrated that we can now define libraries that don't use setuptools but can't develop on them with pipenv. To be perfectly clear this isn't strictly pipenv's fault (PEP 518 decided to punt on editable installs), but its refusal to acknowledge the issue has been frustrating in the discourse as poetry provides an alternative that does handle this issue in a way that's compliant with the pyproject.toml format. Pipenv keeps saying that poetry makes bad decisions but does not actually attempt to provide a path forward.

techalchemy commented 6 years ago

https://pipenv.readthedocs.io/en/latest/advanced/#custom-script-shortcuts

Please read the documentation.

techalchemy commented 6 years ago

@bertjwregeer:

pyproject.toml is a standard, and the community has adopted it as the standard location to place information related to the build system, and other parts of a Python project.

Great, and we are happy to accommodate sdists and wheels built using this system; until there is a standard for editable installs we will continue to use pip to build sdists and wheels and handle dependency resolution that way. Please read my responses in full. The authors and maintainers of pip, of the PEPs in question, and myself and @uranusjr are pretty well versed in the differences between editable installs and the implications of building them under the constraints of PEP 517 and 518. So far all I'm seeing is that the PEPs in question didn't specifically address how to build them because they leave it up to the tooling, which for some reason everyone takes to mean that pipenv will never be able to use anything but setuptools?

I've said already this is not correct. If you are actually interested in the implementation and having a productive conversation I'm happy to have that. If you are simply here to say that we don't know what we're doing, but not interested in first learning what it is we are doing, this is your only warning. We are volunteers with limited time and I am practicing a 0 tolerance policy for toxic engagements. I do not pretend my work is perfect and I don't pretend that pipenv is perfect. I will be happy to contribute my time and effort to these kinds of discussions; in exchange I ask that they be kept respectful, that they stick to facts, and that those who participate also be willing to learn, listen, and hear me out. If you are here just to soapbox you will have to find another platform; this is an issue tracker. I will moderate it as necessary.

This discussion is wildly off topic. If anyone has something constructive to say about the issue at hand, please feel free to continue that discussion. If anyone has issues or questions about our build system implementations, please open a new issue. If you have issues with our documentation, we accept many pull requests around documentation and we are aware it needs work. Please defer all of that discussion to new issues for those topics. And please note: the same rules will still apply -- this is not a soapbox, it is an issue tracker.

mmerickel commented 6 years ago

https://pipenv.readthedocs.io/en/latest/advanced/#custom-script-shortcuts Please read the documentation.

Entry points are a more general concept than just console scripts and this link is completely erroneous in addressing those concerns. <soapbox>Ban away - you're not the only maintainer of large open source projects on here and none of my comments have been a personal attack on you or the project. People commenting here are doing so because they want to use pipenv and appreciate a lot of what it does. My comment was not the first off topic post on this thread, yet is the only one marked. Your snarky comments indicating that you think I don't know what I'm talking about are embarrassing and toxic.

techalchemy commented 6 years ago

In the project we maintain, we can soapbox. And yes, pip will support all compliant build systems, which you both seem to understand full well will produce consumable metadata; and since pipenv uses pip as the backing tool to drive its installation process, as I described, yes, pipenv will support all compliant tooling. I already said this.

So yeah, please take your toxicity somewhere else. Your attitude is not welcome here. Final warning. Persistent attempts to incite conflict won’t be tolerated.

matteius commented 2 years ago

There is too much to parse here and I would just be echoing the sentiment that:

There's a little thing to take into account here: Changing a single dependency could change the overall set of requirements. Ex: Updating foo from 1.0 to 2.0 could require to update bar to >=2.0 (while it was <2.0 before), and so on.

or that

upgrade a dependency that has cascading requirements obviously it will have consequences for other dependencies

In fact, dependency resolution is pretty solid now: it will update, within the constraints you specify in your Pipfile, to the latest versions allowed where all specifiers and constraints are met, or otherwise it will error out and let you know that you cannot do that. There is also the --skip-lock flag, which may be helpful for some.

Anyway, I am closing this one down for the history books, and if there is anything relevant in this thread that is not addressed or needs a new issue, feel free to open one. But keep in mind that the current behavior is correct with respect to dependency resolution and keeping the constraints of the Pipfile satisfied.
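For readers arriving at this closed issue: later pipenv releases grew dedicated commands for this workflow. The sketch below reflects my understanding of the current CLI and should be verified against `pipenv --help` for your installed version.

```shell
# Re-resolve only `requests` (plus whatever its constraints force) in
# Pipfile.lock, leaving unrelated pins untouched:
pipenv upgrade requests

# Same selective re-lock, but also sync the result into the virtualenv:
pipenv update requests
```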