microsoft / azure-pipelines-tasks

Tasks for Azure Pipelines
https://aka.ms/tfbuild
MIT License

[Question]: Unable to download from Storage Account since using WIF and AzureFileCopy@6 #19875

Closed dpgray94 closed 2 months ago

dpgray94 commented 4 months ago

Task name

AzureFileCopy

Task version

6.239.11

Environment type (Please select at least one environment where you face this issue)

Azure DevOps Server type

dev.azure.com (formerly visualstudio.com)

Azure DevOps Server Version (if applicable)

No response

Operating system

Microsoft Windows Server 2019

Question

Overview of Issue: I am testing AzureFileCopy@6 because I would like to move my service connections in Azure DevOps away from Service Principals and instead make use of Workload Identity.

As part of the upgrade to AzureFileCopy version 6, the task no longer emits a SAS token output. That is fine, and accordingly I have removed the SAS token from the templateLink URI in my deployment. However, when I then run the build pipeline, the task fails with the following error for each file:
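For illustration, removing the SAS token from a templateLink URI amounts to dropping the query string, which is where the SAS parameters (`sv`, `sig`, etc.) live. This is a minimal sketch with a hypothetical storage account and path, not the poster's actual pipeline code:

```python
from urllib.parse import urlsplit, urlunsplit

def strip_sas(uri: str) -> str:
    """Return the URI without its query string, where a SAS token would live."""
    scheme, netloc, path, _query, fragment = urlsplit(uri)
    return urlunsplit((scheme, netloc, path, "", fragment))

# Hypothetical templateLink URI carrying a SAS token:
uri = ("https://mystorage.blob.core.windows.net/templates/main.json"
       "?sv=2022-11-02&sig=abc")
print(strip_sas(uri))
# → https://mystorage.blob.core.windows.net/templates/main.json
```

With RBAC-based access the bare blob URI is all the task should need, provided the caller's identity holds a suitable data-plane role.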

[error]InvalidContentLink: Unable to download deployment content from [redacted]

On investigation it appears to be an authentication issue, so for testing purposes I have given the connection's identity both Contributor and Storage Blob Data Contributor access to the storage account.

I am hoping for some guidance on what (if any) work is required on the storage accounts to support requests from a service connection using Workload Identity.

What I Expect: when the pipeline task reaches the point where it accesses files in my Storage Account, it authenticates and downloads the content without the use of a SAS token.

What Happens: I get an error that displays the full, correct path to the storage account, blob container, and folder, but the task fails to download the content.
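For context, a WIF-based setup like the one described would look roughly like the step below. This is a sketch with placeholder names (`my-wif-service-connection`, `mystorageaccount`, `templates` are hypothetical), assuming the standard AzureFileCopy@6 inputs:

```yaml
# Hypothetical AzureFileCopy@6 step; the service connection is a
# WIF-based ARM connection, and no SAS token output is produced.
- task: AzureFileCopy@6
  inputs:
    SourcePath: '$(Build.ArtifactStagingDirectory)/templates'
    azureSubscription: 'my-wif-service-connection'
    Destination: 'AzureBlob'
    storage: 'mystorageaccount'
    ContainerName: 'templates'
```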

v-bsanthanak commented 4 months ago

@dpgray94 we are not clear on the part "I have made changes that strip out the SAS Token when concatenating a valid URI together.". Can you elaborate on exactly what change you are trying to make? Please also share the complete debug logs.

dpgray94 commented 4 months ago

@v-bsanthanak Hi, apologies for my vagueness; let me expand on that.

I am testing AzureFileCopy@6 because I would like to move my service connections in Azure DevOps away from Service Principals and instead make use of Workload Identity.

As part of the upgrade to AzureFileCopy version 6, the task no longer emits a SAS token output. That is fine, and accordingly I have removed the SAS token from the templateLink URI in my deployment. However, when I then run the build pipeline, the task fails with the error in my original question, which appears to be an authentication issue. For testing purposes I have given the connection Contributor access to the storage account.

I am hoping for some guidance on what (if any) work is required on the storage accounts to support requests from a service connection using Workload Identity.

v-snalawade commented 4 months ago

@dpgray94 - Hi, please go through https://learn.microsoft.com/en-us/azure/devops/pipelines/tasks/reference/azure-file-copy-v6?view=azure-pipelines

The AzureFileCopy@6 task uses Azure RBAC to access blob storage instead of a SAS token. This requires the identity behind the service connection to hold an appropriate RBAC role, e.g. Storage Blob Data Contributor. See "Assign an Azure role for access to blob data".
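The role assignment can be made with the Azure CLI. This is a command sketch with placeholder values (the object ID and scope are hypothetical); it needs a logged-in session against a live subscription to run:

```shell
# Grant the service connection's identity data-plane access to the
# storage account (placeholder object ID and scope).
az role assignment create \
  --role "Storage Blob Data Contributor" \
  --assignee "00000000-0000-0000-0000-000000000000" \
  --scope "/subscriptions/<sub-id>/resourceGroups/<rg>/providers/Microsoft.Storage/storageAccounts/<account>"
```

Note that RBAC role assignments on storage can take a few minutes to propagate, so a retry after assignment is worth trying.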

dpgray94 commented 4 months ago

@v-snalawade Thanks for the reply - I have already assigned RBAC roles to the identity. I have just checked once again, and can confirm this. However, I still get the error:

[error]InvalidContentLink: Unable to download deployment content from ...

v-snalawade commented 4 months ago

@dpgray94 - Please share pipeline debug logs for failed pipeline.

v-schhabra commented 3 months ago

Hi @dpgray94 Could you please share the logs at v-snalawade@microsoft.com or v-schhabra@microsoft.com?

v-schhabra commented 2 months ago

@dpgray94 Closing this issue as we need logs for our investigation.