conda-forge / conda-smithy

The tool for managing conda-forge feedstocks.
https://conda-forge.org/
BSD 3-Clause "New" or "Revised" License

Configuring a Library Variable Group for Azure correctly (PR #1206 follow-up) #1353

Open s-m-e opened 4 years ago

s-m-e commented 4 years ago

This is a question seeking clarification on the contents of the README file, specifically the information added in PR #1206. It says:

> If this is your first build on Azure, make sure to add Library Variable Group containing your BINSTAR_TOKEN for automated anaconda uploads.

I am trying to replicate conda-forge's build pipeline on Azure for custom test builds. I am not sure whether I understand the above correctly, or whether an additional clarification is needed in the README:

1. Is the BINSTAR_TOKEN equivalent to an Anaconda Cloud "API / Authorization Token"? (I obtained an unrestricted API token for my Anaconda organization through Anaconda Cloud's web interface.)
2. Is there any requirement on the name or configuration of the "Library Variable Group"? (I just named it "anaconda_upload" and activated "allow access to all pipelines" for this group.)
3. Do I need to treat / encrypt / encode the token somehow? (I just created a variable named BINSTAR_TOKEN inside the Library Variable Group and gave it the "untreated" API token.)

When build_steps.sh runs on Azure, I can see the following output at the end:

    TEST END: /home/conda/feedstock_root/build_artifacts/linux-64/qgis-3.12.2-py38h0e74f0e_0.tar.bz2
    Renaming work directory,  /home/conda/feedstock_root/build_artifacts/qgis_1593461144392/work  to  /home/conda/feedstock_root/build_artifacts/qgis_1593461144392/work_moved_qgis-3.12.2-py38h0e74f0e_0_linux-64_main_build_loop
    # Automatic uploading is disabled
    # If you want to upload package(s) to anaconda.org later, type:

    anaconda upload /home/conda/feedstock_root/build_artifacts/linux-64/qgis-3.12.2-py38h0e74f0e_0.tar.bz2

    # To have conda build upload to anaconda.org automatically, use
    # $ conda config --set anaconda_upload yes

    anaconda_upload is not set.  Not uploading wheels: []
    ####################################################################################
    Resource usage summary:

    Total time: 1:41:11.1
    CPU usage: sys=0:10:38.1, user=1:47:17.1
    Maximum memory usage observed: 3.8G
    Total disk usage observed (not including envs): 813.9M
    + validate_recipe_outputs qgis-feedstock
    Cloning into '/tmp/tmps2hz2o4q_recipe/feedstock-outputs'...
    validation results:
    {
      "linux-64/qgis-3.12.2-py38h0e74f0e_0.tar.bz2": true
    }
    NOTE: Any outputs marked as False are not allowed for this feedstock.
    + [[ True != \F\a\l\s\e ]]
    + upload_package --validate --feedstock-name=qgis-feedstock /home/conda/feedstock_root /home/conda/recipe_root /home/conda/feedstock_root/.ci_support/linux_.yaml
    Found git SHA 978eb5e2dc42fc03a5b2d7168be96f01bc110ce4 for this build!
    Using BINSTAR_TOKEN for anaconda.org uploads to qgist.
    No numpy version specified in conda_build_config.yaml.  Falling back to default numpy value of 1.11
    Adding in variants from internal_defaults
    Adding in variants from /home/conda/recipe_root/conda_build_config.yaml
    Adding in variants from /home/conda/feedstock_root/.ci_support/linux_.yaml
    Distribution /home/conda/feedstock_root/build_artifacts/linux-64/qgis-3.12.2-py38h0e74f0e_0.tar.bz2 is new for qgist, but no upload is taking place because the BINSTAR_TOKEN/STAGING_BINSTAR_TOKEN is missing or empty.
    + touch /home/conda/feedstock_root/build_artifacts/conda-forge-build-done-linux_
    + test -f /home/vsts/work/1/s/build_artifacts/conda-forge-build-done-linux_

The most interesting part is this statement: "no upload is taking place because the BINSTAR_TOKEN/STAGING_BINSTAR_TOKEN is missing or empty". What am I doing wrong or misunderstanding?
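A minimal sketch of the guard that log message implies (assumed logic based on the variable names in the log, not the actual conda-forge upload code):

```shell
# Assumed guard: skip the upload when neither token variable is set to a
# non-empty value, mirroring the message seen in the Azure log above.
if [ -z "${BINSTAR_TOKEN:-}" ] && [ -z "${STAGING_BINSTAR_TOKEN:-}" ]; then
    echo "no upload is taking place because the BINSTAR_TOKEN/STAGING_BINSTAR_TOKEN is missing or empty"
else
    echo "token found, uploading"
fi
```

In other words, the upload step does not fail when the token is absent; it silently skips, which is why the build otherwise looks green.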

Ping @hmaarrfk: Thanks for the original PR, it helped me a lot.

s-m-e commented 4 years ago

I ended up renaming the variable group itself to BINSTAR_TOKEN and adding the BINSTAR_TOKEN variable directly to the pipeline configuration (in Azure's web interface). The latter did the trick (though I am not sure whether it is supposed to work like this).

hmaarrfk commented 4 years ago

Yeah. I had to manually add tokens to each repo.

Honestly, it isn't so bad when you only manage a few different feedstocks manually

beckermr commented 4 years ago

I am pretty sure the variable group is hard-coded in smithy. Smithy now has a command-line option to rotate or update binstar tokens per repo. See the CLI file for details.

jdblischak commented 2 years ago

I'm running into the same problem and can't find a solution. Here's what I've tried:

When I initially ran conda smithy register-ci, I had my Azure token in ~/.conda-smithy/azure.token and my Anaconda.org token in ~/.conda-smithy/anaconda.token. Thus I also get the line:

    Using BINSTAR_TOKEN for anaconda.org uploads to jdblischak.

I also know that the BINSTAR_TOKEN has sufficient permissions because I was able to upload packages from a CircleCI job using the same token.

But as this issue and the README state, this initial registration is insufficient. So I also manually created a Library variable group for my Azure project "feedstock-builds". At first I named the group "anaconda.org", since the README gave no guidance on the name. It didn't work.

> I am pretty sure the variable group is hard coded in smithy.

Then I looked in the source code and found the line below, so I tried "anaconda-org". Again no luck.

https://github.com/conda-forge/conda-smithy/blob/e65bd1cfef08ca3da41c2b8f07d2be539f123fe7/conda_smithy/azure_ci_utils.py#L207-L209

> I ended up re-naming the variable group itself into BINSTAR_TOKEN and added the BINSTAR_TOKEN variable directly to the pipeline configuration (in Azure's web interface). The latter did the trick (though I am not sure if it is supposed to work like this).

Lastly, I tried naming the variable group BINSTAR_TOKEN to match the variable it contains. Still no luck.

So I am at a loss for what to try next. Do I need to somehow explicitly connect this variable group to my pipelines? I confirmed that it has no restrictions on its use (i.e. it can be used in any pipeline). There are docs that explain how to link a variable group by referencing it in the pipeline YAML file, but if that were required, I'd assume conda-smithy would already include it. Clearly it works for official conda-forge repositories without being explicitly referenced in the pipeline YAML files.
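For reference, the linking mechanism from the Azure docs looks roughly like the following; a YAML pipeline only sees a Library variable group when the pipeline definition references it explicitly. The group name and script path here are hypothetical, and the conda-smithy-generated azure-pipelines.yml contains no such `group:` reference:

```yaml
# Hypothetical azure-pipelines.yml fragment. Without a `variables: - group:`
# entry like this, a Library variable group's variables are not available
# to the pipeline at all.
variables:
- group: anaconda-org        # name of the variable group under Pipelines > Library

jobs:
- job: build
  steps:
  - bash: ./.scripts/run_docker_build.sh
    env:
      BINSTAR_TOKEN: $(BINSTAR_TOKEN)   # secret variables must be mapped explicitly
```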

Any suggestions for what I can try next?

Here are some screenshots:

BINSTAR_TOKEN variable in group variable of same name

No pipeline restrictions

jdblischak commented 2 years ago

> and added the BINSTAR_TOKEN variable directly to the pipeline configuration (in Azure's web interface).

> Yeah. I had to manually add tokens to each repo.

After a tip from my colleague, I now understand these comments. I got confused by all the Azure-specific terminology.

For anyone else trying to upload binaries to a personal channel from their own feedstock repo: ignore the README instructions about the Library Variable Group. That is the right solution for conda-forge itself, which has thousands of pipelines within the feedstock-builds project (variable groups can in theory be shared across pipelines within the same project). But as the instructions currently stand, the CI scripts never access the Library Variable Group; there must be some undocumented step. Instead, add BINSTAR_TOKEN as a secret pipeline variable to each feedstock-specific pipeline.

To be really pedantic, go to https://dev.azure.com/<your-azure-account>/feedstock-builds/_build, click on the pipeline named after your feedstock repo, and then click "Edit" in the top right of the UI:

(screenshot)

Then click "Variables":

(screenshot)

Then add BINSTAR_TOKEN, making sure to tick the box "Keep this value secret":

(screenshot)

The Azure YAML files generated by conda-smithy already map the secret variable into the environment (Azure does not expose secret variables to scripts automatically; they have to be mapped explicitly):

    env:
      BINSTAR_TOKEN: $(BINSTAR_TOKEN)

Note that I also set conda_forge_output_validation to False. I didn't test whether this was required, but it does remove the env vars FEEDSTOCK_TOKEN and STAGING_BINSTAR_TOKEN, so you don't have to worry about the build trying to submit to official conda-forge repos/channels.
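In case it helps anyone, that toggle lives in the feedstock's conda-forge.yml:

```yaml
# conda-forge.yml at the root of the feedstock. Run `conda smithy rerender`
# after changing this so the generated CI files pick up the change.
conda_forge_output_validation: false
```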

beckermr commented 2 years ago

Ahhhh, yes. We have a step in our staged-recipes CI where we do a bunch of uploading and rotating of tokens. It may be that we need to write some docs on those steps.