python-poetry / poetry

Python packaging and dependency management made easy
https://python-poetry.org
MIT License

Including companion dependency only for specific versions of another dependency #8813

Open gerbenoostra opened 11 months ago

gerbenoostra commented 11 months ago

Feature Request

I have an optional dev dependency (a companion package containing stubs), which I only want to include for certain versions of a main dependency.

For example, when using pyspark~=3.0, I want to use pyspark-stubs~=3.0. However, when using pyspark~=3.3, I don't want to (as that version of pyspark already includes the stubs).

My attempt:

[tool.poetry.dependencies]
pyspark = [
  {version = "~3.0", optional = true, python = "<=3.8"},
  {version = "~3.3", optional = true, python = ">3.8"}
]

[tool.poetry.group.test.dependencies]
pyspark-stubs = {version = "~3.0", python = "<=3.8", optional=true}

This fails to resolve, because pyspark-stubs depends on pyspark ~3.0, while the project depends on both pyspark ~3.0 and pyspark ~3.3.

The problem, of course, is that for pyspark 3.3 there is no compatible pyspark-stubs. In that case I want the resolution to simply not install any pyspark-stubs at all. I tried to enforce this by putting the same python restriction on pyspark-stubs, but that didn't work.

How can I specify such a companion dependency (like stubs) only for specific versions of my main dependency? Or is there a way to group them together in another way?

dimbleby commented 11 months ago

duplicate #8499, both seem unlikely ever to happen

gerbenoostra commented 11 months ago

It does work with transitive dependencies though, I assume.

If I depend on library A at version 1 or version 2 (selected, say, by python version), and version 1 has different dependencies than version 2, then whichever version is resolved brings in its own dependency set.

Maybe I can work around this by releasing two versions of a private (empty) package: one version depending on pyspark 3.0 and pyspark-stubs, and a newer version depending only on pyspark 3.3.
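
For concreteness, a rough sketch of what the two wrapper releases might look like, assuming a hypothetical package name my-pyspark-bundle and illustrative version and python bounds (untested, just mirroring the idea above):

# pyproject.toml of the hypothetical wrapper, release 1.0.0:
# bundles pyspark 3.0 together with its separate stubs package
[tool.poetry]
name = "my-pyspark-bundle"
version = "1.0.0"
description = "Pins pyspark ~3.0 plus pyspark-stubs"
authors = ["Your Name <you@example.com>"]

[tool.poetry.dependencies]
python = ">=3.7,<3.9"
pyspark = "~3.0"
pyspark-stubs = "~3.0"

# pyproject.toml of the hypothetical wrapper, release 2.0.0:
# pyspark 3.3 already ships its own type hints, so no stubs dependency is declared
[tool.poetry]
name = "my-pyspark-bundle"
version = "2.0.0"
description = "Pins pyspark ~3.3 (stubs included upstream)"
authors = ["Your Name <you@example.com>"]

[tool.poetry.dependencies]
python = ">=3.8"
pyspark = "~3.3"

The main project could then pick a wrapper release using the same multiple-constraints syntax and python markers already used for pyspark itself, for example:

# in the main project's pyproject.toml (hypothetical package name as above)
[tool.poetry.group.test.dependencies]
my-pyspark-bundle = [
  {version = "~1.0", python = "<=3.8"},
  {version = "~2.0", python = ">3.8"}
]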

radoering commented 11 months ago

This fails to resolve, because pyspark-stubs depends on pyspark ~3.0, while the project depends on both pyspark ~3.0 and pyspark ~3.3.

If I understand correctly, this example should resolve, because the intersection of the markers of pyspark-stubs and pyspark ~3.3 is empty. However, this example might be a duplicate of #5506.

Maybe I can work around this by releasing two versions of a private (empty) package: one version depending on pyspark 3.0 and pyspark-stubs, and a newer version depending only on pyspark 3.3.

You may run into the same solver issue or it may work. I'd like to hear whether this changes anything. 🙂