cookiecutter / cookiecutter-django

Cookiecutter Django is a framework for jumpstarting production-ready Django projects quickly.
https://cookiecutter-django.readthedocs.io
BSD 3-Clause "New" or "Revised" License

Pin nested Python dependencies #1988

Open browniebroke opened 5 years ago

browniebroke commented 5 years ago

Description

It has long been good practice to pin all your dependencies. Increasingly, it's also considered good practice to pin your nested (transitive) dependencies. This is something we currently don't do.

Rationale

Recently, we had a couple of issues where upgrades of nested dependencies broke the generated projects.

Use case(s) / visualization(s)

It would be nice for cookiecutter-django to adopt this best practice, which would also make the template less prone to breakage. The 3 tools trying to solve this problem that I know of are pip-tools, Pipenv and Poetry.

Pipenv & Poetry are a bit new and imply a radically different workflow. They were ruled out in the past as they only provide 2 sets of dependencies (dev & prod: #1621 #1425). Moreover, I'm not sure how maintainable the files they produce would be in the template with all the if/else branches we have.

So my personal favourite is pip-tools. The steps to do that (see the sketch after this list) would be:

  1. Make our current requirements.txt files into requirements.in files
  2. Replace pinned versions with ranges wherever we care (e.g. Django)
  3. Generate the pinned requirements.txt
  4. Add the various if/else to the generated requirements.txt
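
A minimal sketch of steps 1-3, assuming the files live under requirements/ as in the generated project (the exact paths and the Django range are illustrative):

```console
# Step 1: turn the current pinned file into a loose "source" file
$ mv requirements/base.txt requirements/base.in
# Step 2: relax pins to ranges where we care, e.g. django>=2.2,<3.0
# Step 3: compile a fully pinned requirements.txt, nested deps included
$ pip install pip-tools
$ pip-compile requirements/base.in --output-file requirements/base.txt
```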

Then, on Travis, when pyup sends us an update, we would need to check that recompiling the requirements.in files doesn't produce any changes and remains compatible. This is how they do it on Warehouse (a.k.a. the new PyPI). Ideally, we would do that for all combinations of the template (#591 would help).
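
A hedged sketch of that CI check (not the actual Travis config): recompile and fail the build if the committed pins are stale.

```console
$ pip-compile --quiet requirements/base.in --output-file requirements/base.txt
$ git diff --exit-code requirements/   # non-zero exit if any pins changed
```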

PS: Some more reading on this topic.

yunti commented 5 years ago

pip-tools looks like it could work, but I haven't tried it to be sure. Pipenv's API is unsafe by default, so it's currently impossible to update only a single dependency. Poetry's API is safe by default, and it seems to have recently fleshed out its overall API design; it's approaching version 1.0 but isn't quite there yet.

yunti commented 5 years ago

We seem to be using just local and production lists of dependencies (plus a base), which Poetry would support, so that could be an option.

earthboundkid commented 5 years ago

I have had a good experience with pip-tools so far. It's the least disruptive to current practices.

ollema commented 5 years ago

pip-tools is really great!

demestav commented 4 years ago

I believe we should consider adding this, as it brings a lot of value. @browniebroke if you can confirm the workflow (pip-tools), then I could submit a PR.

  1. We create base.in, local.in and production.in source files based on the current requirements files. We pin each requirement based on the current base.txt, local.txt and production.txt.
  2. We run pip-compile to pin dependencies and nested dependencies.
  3. We commit both the .in files and the .txt files (see the sketch below).
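
A sketch of how the layered files could include each other, using pip-tools' `-r` include syntax as one possible pattern (the pytest-django line is just an illustrative local-only dependency):

```console
$ cat requirements/local.in
-r base.in        # local builds on top of base
pytest-django     # hypothetical example of a local-only dependency

$ pip-compile requirements/base.in        # writes requirements/base.txt
$ pip-compile requirements/local.in       # writes requirements/local.txt
$ pip-compile requirements/production.in  # writes requirements/production.txt
```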

Do you know if this is compatible with pyup?

Edit: I just noticed that @browniebroke had a similar list in the first post of this issue. However, I'm leaving this here in the hope of restarting the discussion.

danihodovic commented 4 years ago

Is there no consideration for Poetry? With Poetry, the project wouldn't need to maintain three files (base, local, production): everything is maintained in one file, and the workflow is built into the tooling (poetry add pytest --dev). It's very similar to npm, which I consider superior to pip in dependency resolution.
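
For comparison, a minimal sketch of that single-file workflow (Poetry's pre-1.0 CLI, as current at the time of this comment):

```console
$ poetry add django         # production dependency, recorded in pyproject.toml
$ poetry add pytest --dev   # development dependency, same file
$ poetry lock               # writes the fully pinned poetry.lock
```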

sfdye commented 4 years ago

👋 I am a member of both cookiecutter-django and pip-tools.

While I absolutely think pip-tools is great, there is a new kid in town: pip's new resolver, whose beta version should be released very soon.

With a resolver native to pip, we wouldn't have to maintain an extra external dependency such as pip-tools or Poetry.

demestav commented 4 years ago

@sfdye Interesting! I heard about the grant but did not follow the progress. It is good to hear that it is coming so soon. I agree with you that we should wait and test it out.

pradyunsg commented 4 years ago

> With a resolver native to pip, we wouldn't have to maintain an extra external dependency such as pip-tools or Poetry.

I don't think this is true. Even with the new resolver in pip, this workflow still makes sense. Yes, in the pre-robust-pip-resolver world we're in today, having this step lets you use better dependency resolution logic to better handle conflicting dependencies, BUT that's not the only benefit. IMO it's not even the main one.

Maintaining a set of exact pins that you know an application works with protects deployments from being sensitive to new releases of a dependency. This makes for reproducible/robust deployments (you want that, right?). Folks have developed things like pyup / pip-tools / pipenv / poetry to make these workflows easier -- because there's enough of a benefit to doing this. :)


Anyway, all this is to say: don't skip doing this just because pip's new resolver is better, because the resolver solves only one of the two issues. Notably, even with the new resolver, pip doesn't provide all the functionality necessary for generating these pins out of the box -- that's what pip-tools provides via pip-compile and pip-sync.
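
For reference, a minimal sketch of those two commands with a generic file name:

```console
$ pip-compile requirements.in   # resolve and write exact pins to requirements.txt
$ pip-sync requirements.txt     # install/uninstall until the env matches the pins
```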

Poetry and pipenv add their own formats for describing things, and handle more of the project's development workflows than pip/pip-tools would. Some projects can do that but that's more of a workflow decision, and I'm gonna cop out of that one. :)

RobRoseKnows commented 3 years ago

Sorry to bump an old thread, but I don't think Poetry should be ruled out anymore, as it now supports extra dependencies. We could define production and local dependencies as extras and pin them that way? Additionally, Poetry now has a stable (>1.0) release.
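
A hedged sketch of the extras idea, assuming hypothetical extras named local and production declared under [tool.poetry.extras] in pyproject.toml:

```console
$ poetry install                        # base dependencies only
$ poetry install --extras "production"  # base + production extras
$ poetry install --extras "local"       # base + local extras
```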

caniko commented 3 years ago

Yeah, supporting poetry should be a no-brainer.