Closed MarkKharitonov closed 4 years ago
Thanks for the feedback! We are routing this to the appropriate team for follow-up. cc @xgithubtriage.
Storage team, please help to look into this question.
@MarkKharitonov
Thanks for reporting this issue! This is the same issue as https://github.com/Azure/azure-powershell/issues/8531. Are you running this cmdlet in Azure Automation?
To answer your question:
PS C:\> $container = Get-AzStorageContainer $containerName -Context $ctx
PS C:\> $blob = $container.CloudBlobContainer.GetBlockBlobReference($blobName)
PS C:\> $blob.UploadFromFile($localDestFile)
Please, let me repeat the first sentence in this issue:
Related to #8531, only happens locally. Not running any Azure Runbook.
About item (4): could you provide a fully functional sample? The blob can be quite large, so we need to deal with the asynchronous nature of the call, i.e. wait for its completion. Please do not spare the vital details, since this issue is very annoying, to say the least.
Thank you.
"The latest Az.Storage can still repro this issue. The root cause is that the API System.IO.Path.GetFullPath() does not work with UNC paths on your machine. The workaround is to change the PowerShell config so that the API works. Upgrading the OS might also make it work."
I am running Win10 20150 with Az 4.2.0 and Az.Storage 2.1.0 - this is BROKEN. To suggest I alter the PowerShell config of my systems is INSANE.
Would someone on the Az team acknowledge that a previously functional cmdlet is now BROKEN? Please get it FIXED.
@MarkKharitonov @lukeb1961
Thanks for the responses!
As for fixing this in PowerShell: we have discussed this at length before, and there is no easy fix in the PowerShell code. Since the issue previously occurred in Azure Automation runbooks, we fixed the runbook environment to make it work with the API.
Per the earlier discussion, this issue should not appear after upgrading to the Windows 10 Anniversary Update. Would you please run [environment]::OSVersion
in a PowerShell console that can repro the issue, and share the detailed output? Then I can check with the relevant people to see why the API does not work on your OS. (I know you use Win10 20150, but I would like more info.)
For the script, would you please share your upload scenario, such as the blob type (block/page/append) and the average blob size? Then we can see whether there is a more stable way to upload the blob by calling the SDK API directly. If your blobs are really large, you might also try calling the AzCopy command from PowerShell to upload them.
Besides that, please share the detailed requirements for your script. Do you start the upload asynchronously and wait for it to complete with a task? You can share your original script that uses this cmdlet (hiding credentials), and I can see how to switch it to the SDK API.
Here are all the details (mind the prompt, it is from within VS Code debugger):
[DBG]> [environment]::OSVersion
Platform ServicePack Version      VersionString
-------- ----------- -------      -------------
Win32NT              10.0.17763.0 Microsoft Windows NT 10.0.17763.0
[DBG]> $BlobName
fintech-infrastructure\0.0.20176.25730\0\dev2\plan-data-resources.tfplan
[DBG]> dir $SourceFile
Directory: C:\Users\mkharitonov\AppData\Local\Temp\a8440_20200624101820
Mode LastWriteTime Length Name
---- ------------- ------ ----
-a---- 6/24/2020 10:19 AM 12404 plan-data-resources.tfplan
[DBG]> [io.path]::GetFullPath($SourceFile)
C:\Users\mkharitonov\AppData\Local\Temp\a8440_20200624101820\plan-data-resources.tfplan
[DBG]> $PSVersionTable
Name Value
---- -----
PSVersion 5.1.17763.1007
PSEdition Desktop
PSCompatibleVersions {1.0, 2.0, 3.0, 4.0...}
BuildVersion 10.0.17763.1007
CLRVersion 4.0.30319.42000
WSManStackVersion 3.0
PSRemotingProtocolVersion 2.3
SerializationVersion 1.1.0.1
[DBG]> Set-AzStorageBlobContent -File $SourceFile -Container $ContainerName -Context $ctx.Context -Blob $BlobName -Force
Failed to open file C:\Users\mkharitonov\AppData\Local\Temp\a8440_20200624101820\plan-data-resources.tfplan: Illegal characters in path..
At line:1 char:1
+ Set-AzStorageBlobContent -File $SourceFile -Container $ContainerName ...
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo : CloseError: (:) [Set-AzStorageBlobContent], TransferException
+ FullyQualifiedErrorId : TransferException,Microsoft.WindowsAzure.Commands.Storage.Blob.SetAzureBlobContentCommand
[DBG]>
As you can see, [io.path]::GetFullPath() works fine. The files we upload are Terraform plans (small) and Terraform verbose logs (megabytes).
The problem is that the Az.Storage module authors convert the given path to a UNC-style path themselves by utilizing the \\?\ notation, which does not work out of the box:
[DBG]> dir "\\?\c:\temp"
Illegal characters in path.
At line:1 char:1
+ dir "\\?\c:\temp"
+ ~~~~~~~~~~~~~~~~~
+ CategoryInfo : NotSpecified: (:) [Get-ChildItem], ArgumentException
+ FullyQualifiedErrorId : System.ArgumentException,Microsoft.PowerShell.Commands.GetChildItemCommand
[DBG]>
It does work on a machine where the powershell.exe.config was modified:
C:\> dir "\\?\c:\temp"
Directory: \\?\c:
Mode LastWriteTime Length Name
---- ------------- ------ ----
d----- 6/19/2020 5:10 PM temp
C:\>
Now, I understand the rationale behind this conversion: to bypass the maximum path length that Windows imposes on standard paths, which does not apply to \\?\ paths. But come on, you have to be smarter than that and do it ONLY if the given path is actually over the limit, because the current implementation is just broken for ALL paths.
And the advice to modify powershell.exe.config is awful - have you actually tried it yourself? The only way I could do it was to first change the ownership of the file from TrustedInstaller to myself; only then could I add myself to the ACL and grant myself write permissions. And what happens on the next update of PowerShell? Will the installer be able to overwrite the file? In short, this is a miserable workaround that does not scale.
You have to fix your mess. You have to take into account that this code is broken for many people out there. Be professional: apply the \\?\ conversion only if you really need it, as a fallback mechanism when the max-path error has actually occurred. I do not believe I need to give such obvious advice here.
Please, take responsibility and fix the Az.Storage module already.
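For illustration, the conditional conversion suggested above could be sketched as follows (a hypothetical helper, not the actual Az.Storage/DMlib code; the name Get-TransferPath and the 260-character MAX_PATH threshold are assumptions for this sketch):

```powershell
# Hypothetical sketch: add the \\?\ prefix only when the resolved path
# actually exceeds the classic MAX_PATH limit; short paths stay as-is,
# so legacy path handling never sees the \\?\ notation.
function Get-TransferPath {
    param([Parameter(Mandatory)][string]$Path)

    $full = [System.IO.Path]::GetFullPath($Path)
    if ($full.Length -lt 260 -or $full.StartsWith('\\?\')) {
        return $full                              # short path: no prefix needed
    }
    if ($full.StartsWith('\\')) {
        return '\\?\UNC\' + $full.Substring(2)    # long network share path
    }
    return "\\?\$full"                            # long local path
}
```

With a fallback like this, the GetFullPath/UNC problem would only be hit for paths that are genuinely over the limit.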
@MarkKharitonov
Thanks for the details!
The issue is caused by the .NET API System.IO.Path.GetFullPath(); it should support UNC paths starting with .NET Framework 4.6.2 per the linked documentation, but currently it seems not to, which causes this issue. Since you have shared your OS/PowerShell version, I will contact the .NET team to find out what the correct behavior should be and what the correct way to handle the path is.
PowerShell relies on DMlib for the data transfer, and the UNC path code in question lives in DMlib. I have opened an issue against DMlib; after DMlib fixes it, PowerShell can pick up the fix by upgrading to the new DMlib version. I have discussed this with the DMlib team: currently, all file paths, long or short, are converted to UNC-style paths in DMlib because (1) the .NET API is declared to support UNC paths, and (2) separate code paths for long and short file paths would make the code harder to maintain. The DMlib team will evaluate whether and how to fix this in DMlib, depending on the reply from the .NET team.
Per your scenario, the files to upload look small ("small" or "megabytes" per your comments above). Until the issue is fixed, if you don't want to apply the workaround of modifying powershell.exe.config, would you like to upload with the SDK?
# Get a blob object to upload with
$container = Get-AzStorageContainer $containerName -Context $ctx
$blob = $container.CloudBlobContainer.GetBlockBlobReference($blobName)

# Synchronous upload
$blob.UploadFromFile($localSrcFile)

# Or start the upload asynchronously and wait for it to complete
$t = $blob.UploadFromFileAsync($localSrcFile)
$t.Wait()
We will try uploading with the SDK. We work in sprints, so it may take a few weeks for us to get there.
@MarkKharitonov Thanks for the update! Let us know if you need any assistance on upload with SDK.
We have contacted the .NET team, and got their comment: "If PowerShell doesn’t have a problem with the setup logic, the most likely scenario is that something installed has mussed with PowerShell’s config file."
So, to investigate why the issue happens on your machine (it should not happen after the Windows 10 Anniversary Update):
Could you share the content of the powershell.exe.config file (hiding credentials, if any) on a machine that can repro the issue, before you make any change to work around it? In particular, does it contain any setting for "Switch.System.IO.UseLegacyPathHandling"?
We can't repro this issue locally on the same OS. If you can share how you set up such a repro machine from a clean OS, it might help us set up a repro environment and investigate the root cause.
Hi, The machines where the issue reproduces 100% are our developer workstations. On the build agents the issue is intermittent. On the workstations the powershell.exe.config is:
<configuration>
<uri>
<schemeSettings>
<add name="http" genericUriParserOptions="DontUnescapePathDotsAndSlashes"/>
<add name="https" genericUriParserOptions="DontUnescapePathDotsAndSlashes"/>
</schemeSettings>
</uri>
</configuration>
At least on the 4 workstations I saw it. Adding
<runtime>
<AppContextSwitchOverrides value="Switch.System.IO.UseLegacyPathHandling=false" />
</runtime>
removes the problem, but it is not a general solution. The workstations are imaged from a set of "gold" images used by our IT. I do not know what other information I can provide. If you have a command to collect more stats, I can run it and share the results.
I have not gotten around to applying the SDK solution yet.
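For reference, a powershell.exe.config that combines the existing uri section with the added runtime switch would look roughly like this (a sketch assembled from the two fragments above; element order inside configuration does not matter here):

```xml
<configuration>
  <runtime>
    <!-- Opt out of legacy path handling so \\?\ paths work on .NET 4.6.2+ -->
    <AppContextSwitchOverrides value="Switch.System.IO.UseLegacyPathHandling=false" />
  </runtime>
  <uri>
    <schemeSettings>
      <add name="http" genericUriParserOptions="DontUnescapePathDotsAndSlashes"/>
      <add name="https" genericUriParserOptions="DontUnescapePathDotsAndSlashes"/>
    </schemeSettings>
  </uri>
</configuration>
```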
@MarkKharitonov Thanks for the reply! I have shared the above information with the .NET team and will update you if I get any response from them.
I use https://docs.microsoft.com/da-dk/archive/blogs/jeremykuhne/new-net-path-handling-sneak-peek as a workaround.
We ended up with the following code based on your recommendation:
$ctx = Get-StorageContext $ResourceGroupName $AccountName
$container = Get-AzStorageContainer $ContainerName -Context $ctx
$blob = $container.CloudBlobContainer.GetBlockBlobReference($blobName)
$blob.UploadFromFile($SourceFile)
I will report on how it works for us.
@MarkKharitonov Thanks for the update! Let me know if you need our assistance with anything.
We ended up with the following code based on your recommendation:
$ctx = Get-StorageContext $ResourceGroupName $AccountName
$container = Get-AzStorageContainer $ContainerName -Context $ctx
$blob = $container.CloudBlobContainer.GetBlockBlobReference($blobName)
$blob.UploadFromFile($SourceFile)
I will report on how it works for us.
@MarkKharitonov is this working for you without the modifications to powershell.exe.config? Or does it require setting Switch.System.IO.UseLegacyPathHandling=false? I am still dealing with having to reset this for hybrid runbook workers.
@jusnitlive - this code does not require messing with powershell.exe.config
https://github.com/Azure/azure-powershell/issues/8473#issuecomment-609847693
I can confirm that this fixes my issue. Add a string value under HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\.NETFramework\AppContext:
Value name: Switch.System.IO.UseLegacyPathHandling
Value data: false
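For anyone who needs to roll this out to several machines (e.g. hybrid runbook workers), the same registry value can be set from an elevated PowerShell session. This is a sketch of the workaround above, not an official fix, and note that the switch affects every .NET Framework application on the machine:

```powershell
# Apply the UseLegacyPathHandling=false workaround machine-wide.
# Requires an elevated (administrator) PowerShell session.
$key = 'HKLM:\SOFTWARE\Microsoft\.NETFramework\AppContext'
if (-not (Test-Path $key)) {
    New-Item -Path $key -Force | Out-Null   # create the AppContext key if missing
}
New-ItemProperty -Path $key -Name 'Switch.System.IO.UseLegacyPathHandling' -Value 'false' -PropertyType String -Force | Out-Null
```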
@MarkKharitonov Set-AzStorageBlobContent works with no code modifications
We are working closely with the DMlib team to fix this issue as soon as possible. We expect the fix to be out in 1-2 months.
The PowerShell fix PR is merged: https://github.com/Azure/azure-powershell/pull/12882, targeted for release on 9/22. The code change upgrades to the latest DMlib release, which contains the fix.
The fixed build has been released in Az 4.7.0 (Az.Storage 2.6.0).
You can get the release here:
Download the install package from GitHub: https://github.com/Azure/azure-powershell/releases/tag/v4.7.0-September2020
Install the package from PSGallery: https://www.powershellgallery.com/packages/Az/4.7.0
@blueww, I got the same "illegal characters in path" issue while uploading a .json file into a blob. I was using Az.Storage 1.5.0; as suggested above, I tried installing Az.Storage 2.6.0 (I also tried the latest 4.2.0) and imported the module from there while uploading, and it didn't fix the issue.
The issue only got fixed by the registry hack, still on Az.Storage 1.5.0: under HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\.NETFramework\AppContext, value name Switch.System.IO.UseLegacyPathHandling, value data false.
Any idea in which version of Az.Storage Microsoft resolved this issue? It didn't work with Az.Storage 2.6.0.
@akashsawant1992 This issue is fixed in Az.Storage 2.6.0 and later (you can use the latest Az.Storage). Please make sure you have closed and reopened the PowerShell session and that the correct version of Az.Storage is loaded, then test again.
Your workaround also works, since the root cause is the .NET API System.IO.Path.GetFullPath(): it should support UNC paths from .NET Framework 4.6.2 per the linked documentation, but currently it seems not to. Your registry change makes this API work with UNC paths.
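As a quick check for whether a given machine is still affected, one can probe the .NET API directly before upgrading the module (a hedged sketch; the result depends on the installed .NET Framework version and the powershell.exe.config in effect):

```powershell
# Probe whether System.IO.Path.GetFullPath accepts \\?\ extended-length paths.
try {
    [System.IO.Path]::GetFullPath('\\?\C:\Windows') | Out-Null
    Write-Host 'New path handling is active; the cmdlet should work.'
} catch [System.ArgumentException] {
    Write-Host 'Legacy path handling is active; Set-AzStorageBlobContent will likely fail.'
}
```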
Description
Related to #8531, only happens locally. Not running any Azure Runbook.
Steps to reproduce
Environment data
Module versions
Debug output
So, there is this SO question - https://stackoverflow.com/questions/54522744/set-azstorageblobcontent-throws-exception-illegal-characters-in-path. The proposals are:
Now, I was trying to check the latest preview version of the Az.Storage module, but it does not seem possible with PowerShell 5.1:
Please advise.