Azure / azure-storage-python

Microsoft Azure Storage Library for Python
https://azure-storage.readthedocs.io
MIT License

Unable to upload/download blob from SPA when deployed #673

Closed zorzigio closed 4 years ago

zorzigio commented 4 years ago

Which service (blob, file, queue) does this issue concern?

Blob

Which version of the SDK was used? Please provide the output of pip freeze.

azure-storage-blob==12.3.1

What problem was encountered?

Trying to upload/download a blob from an SPA produces the following error:

<Error>
  <Code>AuthenticationFailed</Code>
  <Message>
    Server failed to authenticate the request. Make sure the value of Authorization header is formed correctly including the signature.
    RequestId:3b05eea5-101e-00ba-71c8-4ec75f000000
    Time:2020-06-30T10:23:13.2420635Z
  </Message>
  <AuthenticationErrorDetail>
    Signature did not match. String to sign used was r
    2020-06-30T09:23:13Z
    2020-06-30T15:23:13Z
    /blob/account/container//Natural_Language_Processing (5).pdf

    2019-07-07
    b
  </AuthenticationErrorDetail>
</Error>

The JavaScript code responsible for the upload is:

// imports from the @azure/storage-blob v12 JavaScript SDK
import { BlobServiceClient, AnonymousCredential } from "@azure/storage-blob";

// upload to Azure (resp is the response from the backend SAS endpoint)
const blobName = file.name;
const accountSas = resp.data.SAS;
const account = resp.data.account;
const containerName = resp.data.container;
const anonymousCredential = new AnonymousCredential();
const blobServiceClient = new BlobServiceClient(
    `https://${account}.blob.core.windows.net?${accountSas}`,
    anonymousCredential
);
// Get a client for the target container
const containerClient = blobServiceClient.getContainerClient(
    containerName
);
// Upload the file as a block blob
const content = file;
const blockBlobClient = containerClient.getBlockBlobClient(blobName);
const uploadBlobResponse = await blockBlobClient.upload(
    content,
    Buffer.byteLength(content)
);

while the backend Python code that generates the SAS token is the following:

import os
from datetime import datetime, timedelta

from azure.storage.blob import BlobSasPermissions, generate_blob_sas

# read-only SAS for downloads, write-only SAS for uploads
if content['up_down'] == 'download':
    permission = BlobSasPermissions(read=True)
else:
    permission = BlobSasPermissions(write=True)

account_name = os.getenv("STORAGE_ACCOUNT_NAME")
container_name = metadata.get_container_name()
blob_name = content['filePath']
expiry = datetime.utcnow() + timedelta(hours=5)

options = {
    'account_name': account_name,
    'container_name': container_name,
    'blob_name': blob_name,
    'account_key': os.getenv("STORAGE_ACCESS_KEY"),
    'permission': permission,
    'expiry': expiry
}

SAS = generate_blob_sas(**options)
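For reference, the blob URL that this SAS is meant to sign is the account endpoint plus the container and (URL-encoded) blob name, with the SAS appended as the query string. A small sketch built from the same variables as above (the quoting helper is my addition, not part of the original code):

from urllib.parse import quote

# Hypothetical helper: the fully signed blob URL, assembled from the same
# account_name / container_name / blob_name / SAS values as above.
# quote() encodes the spaces and parentheses in the blob name.
blob_url_with_sas = (
    f"https://{account_name}.blob.core.windows.net/"
    f"{container_name}/{quote(blob_name)}?{SAS}"
)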

The problem appears when the SPA is deployed on Azure Kubernetes, but the same code works fine when run locally.

As a check, I am also returning the STORAGE_ACCESS_KEY in the response that carries the SAS token, and it appears to match the original key.
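A further check that should show whether the SAS itself is bad, or whether something happens to it on the way to the browser, would be to exercise the freshly generated token directly from the backend with the Python SDK. A rough sketch, assuming azure-storage-blob 12.x and a write-permission SAS (I have not run this yet):

from azure.storage.blob import BlobClient

# Build a client from the same pieces that went into generate_blob_sas above,
# passing the SAS token string as the credential.
blob_client = BlobClient(
    account_url=f"https://{account_name}.blob.core.windows.net",
    container_name=container_name,
    blob_name=blob_name,
    credential=SAS,
)

# A write-permission SAS should allow this; if it fails here too, the problem
# is in the SAS generation rather than in the SPA or the JavaScript client.
blob_client.upload_blob(b"sas sanity check", overwrite=True)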

Have you found a mitigation/solution?

No