Azure / doAzureParallel

An R package that allows users to submit parallel workloads in Azure
MIT License

addTask sets the default SAS token expiry too short for long-running jobs #363

Open zerweck opened 5 years ago

zerweck commented 5 years ago

When adding a task, the upload token that gets created for the standard job files, such as 1.txt containing the log of task 1, ignores any longer expiry time the user may have set for their own output files.

In my case I have long-running tasks, so I changed the following line from the documentation example https://github.com/Azure/doAzureParallel/blob/2b8f388dc476a3a51aa71c3ab1165d080ad08fe8/docs/72-persistent-storage.md#L63 to:

```r
# make the token expire 20 days after creation
# (storageClient and outputFolder are set up as in the linked docs example)
writeToken <- storageClient$generateSasToken("w", "c", outputFolder, end = Sys.time() + 60 * 60 * 24 * 20)
```

The SAS token for my output files now has an se parameter in the URI like [...]&se=2019-08-18T11%3A11%3A48Z&[...], while the URI for 1.txt has [...]&se=2019-07-31T11%3A11%3A59Z&[...]. Because the default token for 1.txt expires before my long-running task finishes, the task crashes with a FileUploadAccessDenied error for 1.txt, and my own upload never runs.
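
To make the mismatch easy to see, here is a minimal base-R sketch that pulls the se (expiry) parameter out of a SAS URI and compares the two timestamps. The URIs below are placeholders standing in for the two tokens quoted above, not real tokens.

```r
# Extract the `se` (signed expiry) query parameter from a SAS URI and parse it
# as a UTC timestamp. Uses only base R / utils.
sas_expiry <- function(uri) {
  se <- sub(".*[?&]se=([^&]+).*", "\\1", uri)
  as.POSIXct(utils::URLdecode(se), format = "%Y-%m-%dT%H:%M:%SZ", tz = "UTC")
}

# Placeholder URIs carrying the two expiry values quoted above
output_uri <- "https://myaccount.blob.core.windows.net/output?sp=w&se=2019-08-18T11%3A11%3A48Z&sig=..."
log_uri    <- "https://myaccount.blob.core.windows.net/logs/1.txt?sp=w&se=2019-07-31T11%3A11%3A59Z&sig=..."

sas_expiry(output_uri)                        # 2019-08-18 11:11:48 UTC (my 20-day token)
sas_expiry(log_uri)                           # 2019-07-31 11:11:59 UTC (default token for 1.txt)
sas_expiry(output_uri) - sas_expiry(log_uri)  # the default expires ~18 days earlier
```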

I am not sure whether this default is set here https://github.com/Azure/doAzureParallel/blob/96bfc226628a5f0e58c52d2f0e4f76b76e3bf2e9/R/batch-api.R#L79 or here https://github.com/Azure/doAzureParallel/blob/96bfc226628a5f0e58c52d2f0e4f76b76e3bf2e9/R/doAzureParallel.R#L560-L566.

If you could point me in the right direction, I could try a fix.
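
For reference, the kind of change I have in mind would just make the expiry of the internally generated upload token configurable (or at least long enough for long-running jobs) instead of a short fixed default. This is only a sketch reusing the generateSasToken() call shape from the docs example above; makeJobFileWriteToken and the sasEndTime parameter are names made up for illustration, not the current doAzureParallel API.

```r
# Hypothetical helper: generate the write token for the job's internal files
# (e.g. the 1.txt stdout log) with a caller-controlled expiry instead of the
# short built-in default. `sasEndTime` is an illustrative parameter name only.
makeJobFileWriteToken <- function(storageClient, containerName,
                                  sasEndTime = Sys.time() + 60 * 60 * 24 * 30) {
  storageClient$generateSasToken("w", "c", containerName, end = sasEndTime)
}
```

Alternatively, the internal token could simply reuse the same end value the user passed when generating the token for their output files, so the two expiries can never diverge.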