Open wolfv opened 3 years ago
I really don't like option 1 because of the added complexity it introduces in the recipes. Option 2 is doable. I wonder if this should be fixed in libsolv or not. How does this affect which version is used? For example, conda prioritizes python 3.9 over 3.8.
Thanks for commenting! :)
Yes, we're working on improvements in libsolv (e.g. for the `pypy37` dependency). Thanks @wolfv
What is the rationale for making a package with a higher lower bound preferable vs. one with a higher upper bound?
For the second possible improvement, this is not limited to solvables that have a build string, is it? It's just that in python's case that build string will only select providers that have a track feature.
> we could have a rule that a variant package must indicate a de-priorization in the first level of dependency via build-string pinning
I think `python` is a special case where there were 2 run exports, `python` and `python_abi`. For everything else, this rule should apply. For python, de-prioritization was indicated in `python` and the variant was selected using `python_abi`. I've fixed that in https://github.com/conda-forge/python_abi-feedstock/pull/11
We're currently trying to fix mamba resolutions when variant packages are involved. It already works reasonably well when (directly) requiring a package that has a track feature applied. However, it stops working well when we have variants that do not directly expose any track_feature information.
For example, let's say we have 5 numpy builds:
The metadata for `numpy=1.20=py37` contains a dependency on `python >=3.7,<3.8`. However, the metadata for `numpy=1.20=pypy37` contains exactly the same python dependency. The dependencies differ in `python_abi`, and the `pypy37` variant package has an additional `pypy3.7` dependency.

However, the de-priorization by track_feature is applied on the `python` package. This makes it hard to pick the preferred solution right away, since we cannot figure this out without doing a full resolution.

In my opinion it would be preferable to either:
- pin the `python` package via build string to the `pypy` variant. If we'd export a stricter dependency (e.g. `python >=3.7,<3.8 *pypy`) for numpy's `pypy` build, we would be able to "inherit" the de-priorization by inspecting the first level of dependencies. This should be a fairly quick process.

If we'd get consensus for this idea, we could have a rule that a variant package must indicate a de-priorization in the first level of dependency via build-string pinning (or on the package itself if the variants depend on differently named packages).

Does conda-forge think that's reasonable? I think for the case of pypy we could do some pretty straight-forward repodata patches to add `*pypy` to all packages that also require `pypy3X`.
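To illustrate the idea, here is a small Python sketch. The repodata entries below are hypothetical (the filenames, build strings, and version bounds are made up for illustration, not real conda-forge metadata); the point is that after the proposed patch, a solver could spot the de-priorized `pypy` variant by scanning only first-level dependencies, without a full resolution:

```python
# Hypothetical repodata entries -- filenames, build strings, and bounds are
# illustrative only, not real conda-forge metadata.
# Today: both numpy builds carry the same first-level python constraint, so
# they are indistinguishable without resolving python_abi / pypy3.7 as well.
repodata_today = {
    "numpy-1.20-py37.tar.bz2": {
        "depends": ["python >=3.7,<3.8", "python_abi 3.7.* *_cp37m"],
    },
    "numpy-1.20-pypy37.tar.bz2": {
        "depends": ["python >=3.7,<3.8", "python_abi 3.7.* *_pypy37_pp73",
                    "pypy3.7 >=7.3"],
    },
}

# After the proposed repodata patch: the pypy build pins python via build
# string, so the de-priorization is visible at the first dependency level.
repodata_patched = {
    "numpy-1.20-py37.tar.bz2": {
        "depends": ["python >=3.7,<3.8", "python_abi 3.7.* *_cp37m"],
    },
    "numpy-1.20-pypy37.tar.bz2": {
        "depends": ["python >=3.7,<3.8 *pypy", "python_abi 3.7.* *_pypy37_pp73",
                    "pypy3.7 >=7.3"],
    },
}

def is_depriorized(record, dep_name="python", marker="pypy"):
    """Inspect only first-level dependencies: a 3-field spec
    ("name version build") on `dep_name` whose build field carries the
    de-priorization marker flags the record as a variant to rank lower."""
    for dep in record["depends"]:
        fields = dep.split()
        if fields[0] == dep_name and len(fields) == 3 and marker in fields[2]:
            return True
    return False

# Rank builds: non-de-priorized variants first (False sorts before True).
ranked = sorted(repodata_patched,
                key=lambda fn: is_depriorized(repodata_patched[fn]))
```

A real solver would use conda's MatchSpec/build-glob semantics rather than a substring test; the sketch only shows why a first-level scan becomes sufficient once the variant is pinned in the dependency itself.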