pypa / pipfile


[request] in addition to [packages] and [dev-packages] could we support [test-packages] etc. #99

Open ghost opened 7 years ago

ghost commented 7 years ago

I see the following as separate use cases that require different sets of dependencies:

The reason this would be nice is that my dev-packages could include things like Jupyter notebooks, which are useful for development but too heavy for testing. Having a test-packages level would let me run tests on Jenkins for CI/CD using pipenv cleanly.

An alternative might be to allow defining any prefix (dev-, test-) and passing it in somehow.

Not sure if this request fits in with your approach, but I thought I would throw it out there. I've been using pipenv and it's been working great. Thanks!
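A hedged sketch of what such a Pipfile could look like (the [test-packages] section and any flag to select it are hypothetical, not part of the Pipfile spec as of this thread):

```toml
# Hypothetical Pipfile layout: [test-packages] is a proposed
# section, not something pipenv currently understands.
[packages]
requests = "*"

[dev-packages]
jupyter = "*"    # heavy, development-only

[test-packages]
pytest = "*"     # lightweight, what CI actually needs
```

CI could then install only [packages] plus [test-packages], skipping the heavyweight development tools.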

bbarker commented 6 years ago

+1 for allowing the definition of any prefix

stsewd commented 6 years ago

This is also the case for Read the Docs, where there are several requirements files: https://github.com/rtfd/readthedocs.org/tree/master/requirements

adevore commented 6 years ago

TOML's nested tables may be a better fit:

[env-packages.dev]
sqlite3 = "*"

[env-packages.test]
psycopg2 = "*"

Unfortunately, the table style of package specifications clashes with using [packages.<env>]. So unless a prefix besides packages is used, this:

[packages.test]
psycopg2 = "*"

is equivalent to this:

[packages]
test = { psycopg2 = "*" }

adaliszk commented 6 years ago

What would also be nice is Composer-like optional and suggested packages: https://github.com/pypa/pipenv/issues/2036

orsinium commented 6 years ago

We need this feature. Many projects already have more than the two requirements files (main and dev) supported by the current Pipfile implementation. Examples:

It will also be useful for multiple test requirements. Most projects with tox-based tests need different requirements for different test environments. Examples:

yunti commented 6 years ago

Is a possible workaround for now to use the approach mentioned in the docs ("If you'd like to specify that a specific package only be installed on certain systems, you can use PEP 508 specifiers to accomplish this") as a way to identify a testing environment?

OrangeDog commented 6 years ago

@yunti only if your testing environment can be uniquely identified by some combination of the supported markers. And if it can be, it's probably not a great testing environment.
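For reference, the PEP 508 workaround under discussion looks roughly like this in a Pipfile (package names are illustrative only; the markers key holds a standard PEP 508 marker expression):

```toml
[packages]
requests = "*"
# Markers can only test interpreter/platform facts such as
# sys_platform, python_version, os_name, platform_machine.
pywin32 = {version = "*", markers = "sys_platform == 'win32'"}
dataclasses = {version = "*", markers = "python_version < '3.7'"}
```

Since no marker says "this is a test environment", this only works if your test machines differ from production in some marker-visible way.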

kennethreitz commented 6 years ago

@yunti the [develop] section is intended for test dependencies.

adaliszk commented 6 years ago

In my case, only the developer's machine differs from testing, staging and production. I really like the specify-your-environment approach described by @adevore.

You could have environments where you need to install a specific package but nowhere else. It would also be unique among package managers: instead of prescribing common environments, it would allow you to define as many environments as you like.

yunti commented 6 years ago

@OrangeDog ah, good point. I didn't realise those were the only options, so that workaround won't work then. Shame.

@kennethreitz understood, but there are application use cases where that doesn't quite fit, as mentioned in the original post, and also some production-only packages (mail servers, boto, etc.).

OrangeDog commented 6 years ago

Django has another use-case for production-only packages I thought I'd mention. In dev/testing you use the built-in Django dev server. In production you'd want to add gunicorn or waitress or something.

kennethreitz commented 6 years ago

actually, you should practice dev-prod parity as much as possible, and should definitely have gunicorn installed locally, even if you're using the dev server.

lbenezriravin commented 6 years ago

I'd like to add another use case for this feature. When running code that operates on a remote machine, using a library such as rpyc, sometimes third-party libraries must be installed on the remote machine. It's possible to simply add these requirements to the application requirements and install the application on the remote machine, but that is pretty ugly, IMO. I'm forced to install all sorts of unnecessary things, which can be a real cost if they're large C-based extensions like matplotlib.

Also, it's unclear from the Pipfile what's used where, and future maintainers are forced to grep the codebase to get a clear understanding of the dependencies, which undermines one of the reasons I use Pipfiles in the first place. Being able to do something like pipenv install --someflag remote-packages would make this entire process easy and intuitive.

davenquinn commented 5 years ago

A feature like this might be helpful in the scientific programming workflow. Almost every one of my many Python projects depends on a few basic, weighty libraries: pandas, SciPy, NumPy, matplotlib, scikit-learn, etc. It'd be extremely nice to install these globally, reference them in a [system-packages] section, and have pipenv install optionally install them into the parent Python. This would prevent unnecessary system bloat from commonly used infrastructure while keeping the dependency requirements documented in the Pipfile.

Igonato commented 5 years ago

Would love to see a custom [*-packages] prefix supported. The ability to reference sections from one another would also be very useful; with pip you can put -r file.txt in a requirements file. I often use a setup like this:

# base.txt - generic dependencies
foo==3.1.4
bar==1.5.9

# test.txt - testing tools
-r base.txt

pytest==2.6.5
pytest-cov==3.5.8

# development.txt - development tools
-r test.txt

abc==9.7.9
xyz==3.2.3

# production.txt - production-only packages
-r base.txt

supervisor==8.4.6
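The -r chaining above effectively flattens each file into the union of its layers. A minimal sketch of that resolution, with an in-memory flatten helper that is purely illustrative (pip's real implementation differs):

```python
# Illustrative only: flatten "-r <file>" references the way pip
# chains requirements files, using an in-memory dict of file contents.
def flatten(name, files):
    """Recursively expand '-r <file>' lines in files[name]."""
    deps = []
    for line in files[name].strip().splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue  # skip blanks and comments
        if line.startswith("-r "):
            deps.extend(flatten(line[3:].strip(), files))
        else:
            deps.append(line)
    return deps

files = {
    "base.txt": "foo==3.1.4\nbar==1.5.9",
    "test.txt": "-r base.txt\npytest==2.6.5\npytest-cov==3.5.8",
    "development.txt": "-r test.txt\nabc==9.7.9\nxyz==3.2.3",
}

print(flatten("development.txt", files))
# → ['foo==3.1.4', 'bar==1.5.9', 'pytest==2.6.5',
#    'pytest-cov==3.5.8', 'abc==9.7.9', 'xyz==3.2.3']
```

Pipenv has no equivalent of -r between Pipfile sections, which is part of what this issue asks for.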

@davenquinn I think installing anything into a parent Python silently is a bad idea, but overall it seems like a valid use case. [peer-packages] or [optional-packages] would be a better name, IMO, since it would cover a wider range of cases.