pypa / pipenv

Python Development Workflow for Humans.
https://pipenv.pypa.io
MIT License

documentation/question: package publishing #1288

Closed cdaringe closed 6 years ago

cdaringe commented 6 years ago

problem

pipenv's opening documentation here is worded such that I expected that I could actually package and ship my project with pipenv.

  1. Can I package/publish my project with pipenv?
  2. If so, how? The docs and issues I've searched don't mention this.

Anyway, thanks! So far it feels like a much better UX than past tooling.

Describe your environment

n/a

Expected result

pipenv publish to kick off a publish cycle

Actual result

docs and/or feature not present

Steps to replicate

n/a

uranusjr commented 6 years ago

Python has traditionally distinguished between library packaging and application packaging. The former uses setup.py (and friends) to configure a package when it is downloaded from PyPI; the latter uses requirements.txt. A project can be both at the same time, but it’s up to you to create configurations for both scenarios and sync them correctly (if you want to). Pipenv as things stand handles application packaging, with Pipfile replacing requirements.txt. It does not handle library packaging and uploading to PyPI (and can’t; Pipfile lacks the metadata needed to create a PyPI package).

This, discussion in #1263, and PEP 518 give me a project idea though. What if there is a pipeline that

  1. Takes information from Pipfile and pyproject.toml
  2. Creates a setup.py on the fly
  3. Packages it (setup.py sdist etc.)
  4. Uploads it to PyPI (using twine)

I’ll experiment a bit when I have time and report back if there’s progress 😎
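
A minimal sketch of what such a pipeline could look like, assuming the toml and twine packages are available; the metadata keys and function names here are hypothetical, not an agreed design:

# Hypothetical pipeline: pull metadata from pyproject.toml, dependencies
# from Pipfile, write a setup.py, build distributions, and upload via twine.
import glob
import subprocess

import toml

SETUP_TEMPLATE = """\
from setuptools import setup, find_packages

setup(
    name={name!r},
    version={version!r},
    packages=find_packages(),
    install_requires={install_requires!r},
)
"""

def build_and_publish():
    pipfile = toml.load("Pipfile")
    # Assumed location for name/version metadata; the real layout is TBD.
    meta = toml.load("pyproject.toml").get("tool", {}).get("pipenv-publish", {})

    # Turn simple Pipfile pins ({"requests": "*"}) into install_requires entries.
    install_requires = [
        name if spec == "*" else name + spec
        for name, spec in pipfile.get("packages", {}).items()
        if isinstance(spec, str)
    ]

    with open("setup.py", "w") as f:
        f.write(SETUP_TEMPLATE.format(
            name=meta.get("name", "example"),
            version=meta.get("version", "0.0.0"),
            install_requires=install_requires,
        ))

    subprocess.check_call(["python", "setup.py", "sdist", "bdist_wheel"])
    subprocess.check_call(["twine", "upload"] + glob.glob("dist/*"))

if __name__ == "__main__":
    build_and_publish()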

DrSensor commented 6 years ago

I think you may want to take a look at pybuilder as a reference. It also produces a setup.py when building.

yunstanford commented 6 years ago

I'd like to see a more general feature around this, like adding custom tasks/commands, not only pipenv publish.

Korijn commented 6 years ago

Sorry for reviving this but I have one more question: does setup.py have a place for development requirements?

Korijn commented 6 years ago

Answering my own question: apparently people add a "dev" extra under extras_require for this.
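
For anyone else landing here, a minimal illustrative setup.py (package name and pins are made up):

from setuptools import setup, find_packages

setup(
    name="example",
    version="0.1.0",
    packages=find_packages(),
    install_requires=["requests>=2.0"],
    # Development-only tools live in an extra, installed with `pip install -e .[dev]`
    extras_require={
        "dev": ["pytest", "flake8", "coverage"],
    },
)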

voronind commented 6 years ago

It would be cool to have a pipenv install hook that updates setup.py.

uranusjr commented 6 years ago

@dimka665 @DrSensor From the documentation https://docs.pipenv.org/advanced/#pipfile-vs-setup-py

voronind commented 6 years ago

It's a crazy idea that we cannot use pipenv for Python library development. I don't think it will stay that way forever.

Although practicality beats purity.

I use pipenv to get the deps for setup() and to build wheels: python setup.py bdist_wheel

uranusjr commented 6 years ago

@dimka665 Um, but, you totally can? Pipenv itself uses Pipenv for development, and it works just fine. It's just that you don't specify library requirements in Pipfile, but in setup.py instead.

gsemet commented 6 years ago

I use pipenv for libraries and still declare the dependencies in Pipfile (but I don't track the lock file), reflected back to setup.py with pbr. It works fine so far, though I still have to automatically generate the requirements.txt before packaging. I have a minor change planned in pbr to have it read Pipfile directly instead of requirements.txt. I know we don't like declaring dependencies in both requirements.txt and Pipfile for libraries, but it seems totally legitimate as long as you do not track the lock files or frozen versions.
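
Roughly, the pbr side of that setup looks like the sketch below (names are illustrative); pbr reads its metadata from setup.cfg and its dependencies from requirements.txt, which is why requirements.txt still has to be generated from the Pipfile before packaging:

# setup.py -- pbr takes over metadata and dependency handling
from setuptools import setup

setup(setup_requires=["pbr"], pbr=True)

# setup.cfg (separate file)
# [metadata]
# name = example
# summary = Example library packaged with pbr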

DrSensor commented 6 years ago

Seems like the pipenv CLI needs a command to add packages to setup.py. Another option is a command to generate a barebones setup.py from the Pipfile.

techalchemy commented 6 years ago

@DrSensor this is not going to happen — it is a bad practice. Pipfiles and setup files are for different purposes. We don’t parse setup files and have no plans of building tooling to do this

uranusjr commented 6 years ago

@DrSensor Unfortunately this is the current consensus from the core developers (not Pipenv, but PyPA, and Python as a whole). As suggested in https://github.com/pypa/pipenv/issues/1851#issuecomment-376357550, you’d need to raise this to a broader audience than Pipenv to change the situation.

cdaringe commented 6 years ago

It's definitely not a bad practice. I have over 130 modules published in a different technology stack using the combined lib/bin strategy, and it's a wonderful optimization that simplifies development. It's even all in JSON, so I can parse it at runtime if desired, which frequently ends up being incredibly useful. I ship libs, executables, or often both out of the same declaration file. I will definitely agree that it's fair to want them separate--that's an opinion. But a bad practice? I think that's subjective, and I also think it's incorrect. Having to learn multiple formats makes Python development that much harder.

cdaringe commented 6 years ago

Edit--sorry, I misread. Parsing setup.py probably isn't worth the effort. But supporting some format that allows shipping software in both modalities is probably worth the investment.

jtratner commented 6 years ago

@cdaringe - I'm just a little confused, why doesn't this work?

  1. put dependencies in setup.py
  2. have a standard Pipfile like this:
[[source]]
url = "https://pypi.python.org/simple"
verify_ssl = true
name = "pypi"

[packages]

"e1839a8" = {path = ".", editable = true}

[dev-packages]

coverage = "*"
tox = "*"
xunitmerge = "*"
pytest = ">=3.3.1"
pytest-cov = ">=2.5.1"
Sphinx = ">=1.2.2"
sphinx_rtd_theme = "*"
flake8 = ">=2.4.1"
mock = ">=1.3.0"

[requires]
python_version = "3.6"

Then pipenv install --dev <package> will add to your dev-package dependencies. Otherwise you specify dependencies in setup.py.

jtratner commented 6 years ago

One thing missing from that is ensuring the lock file is updated when you update setup.py, but this is pretty easy to do in, say, a Makefile, like this:

Pipfile.lock: setup.py
    pipenv lock

cdaringe commented 6 years ago

@jtratner, sure, that looks like it could work. I may try that soon!

To achieve the goal of publishing a library and an executable script, the proposed solution requires 3 tools and a handful of associated files, where neighboring technology stacks would require 1 tool and 1 or 2 files. Every time I come back to Python I feel lost about what the right tool for the job is: easy_install, pip, pipenv, ...make, pip_install, venv/virtualenv, pyenv, etc. They are all sorta kinda related, but all have unique roles. In Node, you need just one thing--npm (debatably nvm too, but installing deps next to your source makes it much less relevant in comparison to Python). npm has warts, but the consolidation down to one simple tool for library authoring, executable authoring, and build scripts has, I feel, been underplayed in that community's explosive growth.

I was hoping that this tool would be the npm of Python, and have the [pyenv|nvm|etc] bit figured out too. Instead, it's a sort of hybrid; in Node speak, it's nvm + 1/3 of the npm client. I gather from the above remarks that the Python community has grown comfortable with what they have--throw more tools at it (no snark intended). This all works, but I think the dev experience would be improved by merging/consolidating all of the functionality.

sersorrel commented 6 years ago

@cdaringe, you may be interested in Poetry - I think that is intended to provide an experience more like other languages' packaging systems.

techalchemy commented 6 years ago

@cdaringe I don't think we are against changing things; I think it's really important to have the discussion about how to use the tooling in question and how to consolidate (and those conversations are actively occurring -- it's just that we are not solely responsible for the decisions... @dstufft or @ncoghlan might be able to say more about where to go with this).

Roughly, I would be interested in a consolidated list of features you think we are missing, so that we can focus on those directly rather than on what proportion of other ecosystems' tooling we have implemented -- since Python is its own beast, not everything Node does belongs here, and vice versa.

ncoghlan commented 6 years ago

Folks may also want to read http://www.curiousefficiency.org/posts/2016/09/python-packaging-ecosystem.html#my-core-software-ecosystem-design-philosophy.

Tightly coupling publishing tools to installation tools is useful for forming an initial tight-knit coherent publishing community, and for exploiting network effects as a commercial platform operator, but it's an approach with a limited lifespan as an ecosystem grows and the diversity of deployment and integration models increases.

For Python, the first phase of that lasted from around 1998->2004 (build with a distutils based setup.py, publish as a tarball if you had no dependencies, or as a Linux distro package if you wanted dependency management), the second phase from around 2004-> 2008 (build with a setuptools based setup.py, install with easy_install), and we're currently still in the third phase (build with a setuptools based setup.py, install with pip).

One key current focus of ecosystem level work is on eliminating the requirement for published projects to include a setup.py file at all. Once that's the case, then it would indeed be reasonable for pipenv to define a PEP 517 build backend that allowed Pipfile-based applications to be published directly to PyPI. It wouldn't involve a setup.py file, though, it would be a pyproject.toml file that looked something like:

[build-system]
requires = ["pipenv"]
build-backend = "pipenv.publish:build" # Actual API name TBD

frob commented 4 years ago

I haven't done any Python package publishing, but isn't setup.py just Python? Why not just parse the Pipfile TOML in the setup.py file? Like I said, I'm new to PyPI, but why not do that?
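
A rough sketch of that idea; it assumes the third-party toml package is importable while setup.py runs, which is exactly the catch discussed below:

# setup.py that reads install_requires straight out of the Pipfile
import toml
from setuptools import setup, find_packages

pipfile = toml.load("Pipfile")
install_requires = [
    name if spec == "*" else name + spec
    for name, spec in pipfile.get("packages", {}).items()
    if isinstance(spec, str)  # skip editable/path entries
]

setup(
    name="example",
    version="0.1.0",
    packages=find_packages(),
    install_requires=install_requires,
)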

uranusjr commented 4 years ago

@frob It’s definitely viable, but Pipenv devs are not interested in including it in the project (personally I don’t think it’s a good fit). IIRC there are projects doing exactly this, but I can’t recall the name off the top of my head.

frob commented 4 years ago

Right, I agree that it doesn't really need to be part of the project, but even the docs on publishing to PyPI recommend reading README.md for the long description. I don't see this as any different.

Korijn commented 4 years ago

For example, one difficulty you will encounter taking your proposed approach is requiring a toml parser just to run setup.py. You'll need to somehow postpone importing the dependency or ensure everyone already has it installed prior to running setup.py.

frob commented 4 years ago

If they are using pipenv will they not also have the toml parser?

uranusjr commented 4 years ago

It is not uncommon for packages to require extra packages to perform setup, and there is a standard way to declare it; see PEP 518. I would say a TOML parser is much nearer to the minor end of the spectrum in terms of build requirements :)
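
For example, a pyproject.toml along these lines declares the parser as a build requirement (assuming a setuptools-based build):

[build-system]
requires = ["setuptools", "wheel", "toml"]
build-backend = "setuptools.build_meta"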

Korijn commented 4 years ago

If they are using pipenv will they not also have the toml parser?

Pipenv and the toml parser are not installed in the virtual environment where the package will be installed. Also see all the issues with setup_requires described in PEP 518 ("Rationale") which @uranusjr just linked.

It is not uncommon for packages to require extra packages to perform setup, and there is a standard way to declare it; see PEP 518. I would say a TOML parser is much nearer to the minor end of the spectrum in terms of build requirements :)

Forgive me, I haven't followed recent developments regarding pyproject.toml. The project boilerplate/workflow we have going on with setup.py and pipenv meets all our developer and production requirements, so we don't have a need to transition at the moment.

ncoghlan commented 4 years ago

The way I view this is that when I'm using pipenv with a packaged Python project, the "application" that pipenv is managing is the project test suite, rather than the library itself.

So while you can declare pytoml as a build dependency, I prefer to go the other way around, and add an editable install of the local source package to Pipfile (this has historically required some workarounds, but I believe it picks up declared dependencies correctly on master now)
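
Concretely, that means running something like:

pipenv install -e .

which leaves an editable entry along these lines in Pipfile (the exact key varies by Pipenv version; older releases generated a hashed name like the "e1839a8" seen earlier in this thread):

[packages]
example = {path = ".", editable = true}

With that in place, the library's own install_requires from setup.py are picked up when Pipenv creates the development environment.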

frob commented 4 years ago

I prefer to go the other way around, and add an editable install of the local source package to Pipfile

Can you expand on this? It isn't quite clear to me what you mean.