I think the docs are out of date here. I tried really hard to get that behavior to work, but in the end, special-casing numpy was not tenable. The docs need updating.
A better pattern is to have only one value for numpy in conda_build_config.yaml - the oldest you can manage. We use 1.9 or 1.11, for example. Numpy is forward-compatible, so by building with an old version, we are compatible back to that, and through the current version. See https://github.com/AnacondaRecipes/scikit-learn-feedstock/blob/master/recipe/meta.yaml#L36
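For example, a minimal sketch of what that looks like in `conda_build_config.yaml` (the exact version is whatever oldest numpy you can support):

```yaml
# conda_build_config.yaml - a single numpy entry, so no loop over numpy versions
numpy:
  - 1.11
```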
The forward compatibility is captured in this expression: https://github.com/AnacondaRecipes/scikit-learn-feedstock/blob/master/recipe/meta.yaml#L46
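That line uses conda-build's `pin_compatible` jinja function; the pattern is roughly:

```yaml
requirements:
  host:
    - numpy
  run:
    # lower bound is the version actually built against,
    # upper bound is the next major version
    - {{ pin_compatible('numpy') }}
```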
Conda-build currently computes hashes based on whether it thinks a variable is "used" in a recipe. That's either from explicit usage in templates or build scripts, or implicit usage by matching a variant key name with a dependency in the host or build section. The latter is why you are getting loops over numpy.
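Concretely, if your `conda_build_config.yaml` lists several numpy versions and `numpy` appears bare in the `host` section, the variant key matches the dependency name, the variable counts as used, and you get one build per listed version:

```yaml
# conda_build_config.yaml
numpy:
  - 1.11
  - 1.12
  - 1.13

# meta.yaml fragment - the bare "numpy" below matches the variant key,
# so three builds are queued
requirements:
  host:
    - numpy
```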
You could also hard-code a version constraint for numpy in the host deps, like we do at https://github.com/AnacondaRecipes/scikit-image-feedstock/blob/master/recipe/meta.yaml#L29
The main reason for hard-coding is if a recipe requires a newer numpy version than the other recipes in your collection. scikit-image required 1.11, but we had our conda_build_config.yaml setting for numpy as 1.9. Thus, the hard-coding overrode the variant and you would not get a loop.
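A sketch of that hard-coding, assuming 1.11 is the minimum the recipe needs:

```yaml
requirements:
  host:
    # the explicit constraint overrides the single 1.9 value
    # from conda_build_config.yaml, so no loop is introduced
    - numpy 1.11.*
```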
Thanks for the explanation!
It was not really about numpy specifically; it was more about the de-duplication. I just chose Python plus some other package, and numpy happened to be what I picked.
A better pattern is to have only one value for numpy in conda_build_config.yaml - the oldest you can manage.
Trying to understand the pinning system a bit better: if I want to ensure that the packages I help deploy work on at least numpy 1.9 (for example), but I don't want to restrict them to, say, `>=1.9,<1.10` (because they won't get updated that often, and I don't want them locked to that version of NumPy), would it then make sense to make a global variant of
`{'numpy': '1.9', 'pin_run_as_build': {'numpy': {'min_pin': 'x.x'}}}`?
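In `conda_build_config.yaml` form, I believe that would be something like:

```yaml
numpy:
  - 1.9
pin_run_as_build:
  numpy:
    min_pin: x.x
```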
A follow-up to that: I am also trying to convert all the recipes from the old `build:` and `run:` blocks to `build:`, `host:`, and `run:` in the `requirements` section. I'm pretty sure most of the packages could easily move their `build` section to the `host` section, but then I don't think `pin_run_as_build` will map correctly. Is there a `pin_run_as_host`, or am I misinterpreting the behavior?
Thanks in advance!
take a look again at https://github.com/AnacondaRecipes/scikit-learn-feedstock/blob/master/recipe/meta.yaml#L46
docs at https://conda.io/docs/user-guide/tasks/build-packages/variants.html#pinning-at-the-recipe-level
You could use pin_run_as_build for this purpose, too, but that's a little different: https://conda.io/docs/user-guide/tasks/build-packages/variants.html#pinning-at-the-variant-level
Okay, I will look at this some more and try locally.
Thanks for the help.
Hi there, thank you for your contribution!
This issue has been automatically locked because it has not had recent activity after being closed.
Please open a new issue if needed.
Thanks!
Short Version: The full combinatorial number of builds is queued for un-pinned, non-Python packages, despite the documentation saying that un-pinned, non-Python packages will be de-duplicated.
I manage the build platform for several packages, and we are in the process of converting our build scripts and packages from conda-build 2 to 3. I am trying to understand the build variant system based on the docs on the main conda docs page and to use the new API (`conda_build.api`) in my build environment. As I understand from the general pinning example (item 2), a build script which does not pin non-Python packages should not produce multiple variants for that package, e.g., for the sample meta.yaml here in the directory "simple" (sketched below; package name and version are placeholders):
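```yaml
# simple/meta.yaml - a sketch; package name and version are placeholders
package:
  name: simple
  version: 0.0.1

requirements:
  build:
    - python
    - numpy
```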
and using the following conda-build API call (a sketch; the version lists are illustrative):
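```python
# A sketch of the render call; the version lists are illustrative,
# three values each for python and numpy (3 x 3 = 9 combinations)
from conda_build import api

variants = {
    'python': ['2.7', '3.5', '3.6'],
    'numpy': ['1.11', '1.12', '1.13'],
}
metas = api.render('simple', variants=variants)
```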
Actual Behavior
The length of `metas` is now 9, so there will be 9 meta packages that try to build.

Expected Behavior
I should get 3 metas, which would give 3 builds, because the Python versions are implicitly pinned and the NumPy versions should not be, based on the docs linked above.
However, when I execute this command, I get 9 total builds, one for each combinatorial Python/NumPy pair. The same is true if I change the `build` block to a `host` block instead. I will also get 9 builds if I use the `conda build` command line interface, where the variants.yaml file is the YAML version of the `variants` dictionary above.
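Something like this invocation, assuming the variant file is passed via `-m`/`--variant-config-files`:

```
# variants.yaml holds the same values as the Python variants dictionary above
conda build simple -m variants.yaml
```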
The doc page above says that un-pinned, non-Python packages will be de-duplicated, which does not seem to be true for this example. Can anyone provide insight as to why the API/command line interface does not follow the docs, and, more importantly, how to actually prevent the duplicate builds?
Additional info:

- There is a `build.sh` script in the `simple` directory which just does an echo; nothing is actually installed.
- Adding `'ignore_version': ['numpy']` to the variants dictionary does not change the behavior.
- The same behavior occurs with `pytest` instead of `numpy` (tested just in case there was an exception written in for `numpy`).
- The result is the same with `render(..., finalize=True)` or `render(..., finalize=False)`.
Steps to Reproduce

Render the meta.yaml in the `simple` directory with the variants shown above.
Output of conda info