browniebroke opened 5 years ago
pip-tools looks like it could work, but I haven't tried it to be sure. Pipenv has an unsafe-by-default API, so it's currently impossible to update only a single dependency. Poetry has a safe-by-default API and seems to have recently fleshed out its overall API design; they are approaching version 1.0 but aren't quite there yet.
We seem to be using just a local and a production list of dependencies (plus a base one), which Poetry would support, so that could be an option.
I have had a good experience with pip-tools so far. It's the least disruptive to current practices.
pip-tools is really great!
I believe we should consider adding this as it brings a lot of value. @browniebroke if you can confirm the workflow (pip-tools), then I could submit a PR.
Do you know if this is compatible with pyup?
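For reference, here is a minimal sketch of the pip-tools workflow I have in mind (the file names are assumptions based on the template's current layout, not final choices):

```shell
# Install pip-tools into the environment
pip install pip-tools

# Declare only top-level dependencies in a .in file, then compile exact pins
# (writes requirements.txt with every nested dependency pinned)
pip-compile requirements.in

# Later, upgrade a single dependency without touching the others
pip-compile --upgrade-package django requirements.in

# Make the current environment match the pinned file exactly
pip-sync requirements.txt
```

The key point for pyup compatibility is that the output is still a plain `requirements.txt`, which pyup already understands.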
Edit: I just noticed that @browniebroke had a similar list in the first post of this issue. However, I am going to leave this here in the hope of restarting the discussion.
Is there no consideration for Poetry? With Poetry the project wouldn't need to maintain three files (base, local, production): everything is maintained in one file and built into the tooling (`poetry add pytest --dev`). It's very similar to npm, which I consider superior to pip in dependency resolution.
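To make that concrete, here is a sketch of what that single file looks like (the package names and versions are illustrative, not a proposal):

```toml
# pyproject.toml -- one file replaces the base/local/production requirements files
[tool.poetry.dependencies]
python = "^3.8"
django = "^3.0"

[tool.poetry.dev-dependencies]
pytest = "^5.4"   # added via: poetry add pytest --dev
```

`poetry lock` then pins all nested dependencies in `poetry.lock`, which plays the same role as a fully-pinned `requirements.txt`.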
👋 I am a member of both `cookiecutter-django` and `pip-tools`.
While I absolutely think `pip-tools` is great, there is a new kid in town: pip's new resolver, whose beta version should be released very soon.
With a resolver native to pip, we don't have to maintain another set of external dependencies such as `pip-tools` or `poetry`.
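For anyone who wants to try it once the beta lands, the new resolver is expected to be opt-in behind a feature flag (the flag name below is from the pip 20.2 beta and may change):

```shell
# Opt in to pip's new dependency resolver (pip >= 20.2)
pip install --use-feature=2020-resolver -r requirements.txt
```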
@sfdye Interesting! I heard about the grant but did not follow the progress. It is good to hear that it is coming so soon. I agree with you that we should wait and test it out.
> With a resolver native to pip, we don't have to maintain another set of external dependencies such as `pip-tools` or `poetry`.
I don't think this is true. Even with the new resolver in pip, this workflow still makes sense. Yes, in the pre-robust-pip-resolver world we're in today, having this step lets you use better dependency resolution logic, to better handle conflicting dependencies BUT that's not the only benefit. IMO it's not even the main one.
Maintaining a set of exact pins that you know an application would work with protects the deployments from being sensitive to new releases of a dependency. This makes for reproducible/robust deployments (you want that, right?). Folks have developed things like pyup / pip-tools / pipenv / poetry for making these workflows easier -- because there's enough of a benefit to doing this. :)
Anyway, all this is to say, don't not-do-this because pip's new resolver is better, because it solves only one of two issues. Notably, even with the new-resolver, pip doesn't provide all the functionality necessary for generating these pins out-of-the-box -- that's what pip-tools provides via pip-compile and pip-sync.
Poetry and pipenv add their own formats for describing things, and handle more of the project's development workflows than pip/pip-tools would. Some projects can do that but that's more of a workflow decision, and I'm gonna cop out of that one. :)
Sorry to bump an old thread, but I don't think Poetry should be ruled out anymore, as it now supports extra dependencies. We could define production and local dependencies as extra deps and pin them that way. Additionally, Poetry now has a stable (>1.0) release.
Yeah, supporting poetry should be a no-brainer.
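For concreteness, the extras-based split suggested above could look something like this (package names and versions are illustrative only):

```toml
# pyproject.toml -- optional dependencies grouped into extras
[tool.poetry.dependencies]
django = "^3.0"
gunicorn = { version = "^20.0", optional = true }
django-debug-toolbar = { version = "^2.2", optional = true }

[tool.poetry.extras]
production = ["gunicorn"]
local = ["django-debug-toolbar"]
```

A generated project would then run `poetry install --extras production` on servers and `poetry install --extras local` for development.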
Description
Pinning all your direct dependencies has been good practice for a while. More and more, it's also becoming good practice to pin your nested dependencies. This is something we currently don't do.
Rationale
Recently, we had a couple of issues where upgrades of nested dependencies broke the generated projects:
- #1949
- #1953
- #1954
Use case(s) / visualization(s)
It would be nice for cookiecutter-django to adopt this best practice, which would also avoid breaking too easily. The 3 tools trying to solve this problem that I know of are pip-tools, Pipenv and Poetry.
Pipenv & Poetry are a bit new and imply a radically different workflow. They were ruled out in the past as they only provide 2 sets of dependencies (dev & prod: #1621 #1425). Moreover, I'm not sure how maintainable the files they produce would be in the template, with all the if/else branches we have.
So my personal favourite is pip-tools. The steps to do that would be:

- turn the `requirements.txt` files into `requirements.in` files
- generate the pinned `requirements.txt` files from them

Then, on Travis, when pyup sends us an update, we would need to check that the `requirements.in` files don't produce any changes and are compatible. This is how they do it on Warehouse (a.k.a. the new PyPI). Ideally, we would do that for all combinations of the template (#591 would help).

PS: Some more reading on this topic.