Thanks for the feedback! We are routing this to the appropriate team for follow-up. cc @sumantmehtams
@tedhardyMSFT, to give you some context: when we moved to the .NET Core model for the Az modules, we had to drop the custom encoding type we had and move to the .NET-compatible System.Text.Encoding (https://docs.microsoft.com/en-us/dotnet/api/system.text.encoding.utf8?view=netframework-4.8).
System.Text.Encoding only exposes a fixed set of encodings such as Unicode, ASCII, UTF7, UTF8, and UTF32, so we tried to keep only those, since String and Byte are not really encoding types. You could argue that if String was kept then Byte should have been too, and I agree that ideally we should have removed String as well. But our intention was to keep only the standard System.Text.Encoding types as part of this PR while moving to Az.DataLakeStore: https://github.com/Azure/azure-powershell/pull/6598/files
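For reference, here is a minimal sketch (not from the PR) of what those fixed types look like from PowerShell; note there is no Byte (or String) member on System.Text.Encoding:

```powershell
# The fixed encodings exposed as static members of System.Text.Encoding.
[System.Text.Encoding]::ASCII
[System.Text.Encoding]::Unicode
[System.Text.Encoding]::UTF7
[System.Text.Encoding]::UTF8
[System.Text.Encoding]::UTF32

# Each member is an Encoding instance that converts text to and from bytes,
# which is why a raw byte[] payload does not fit this model.
[System.Text.Encoding]::UTF8.GetBytes('sample text')   # string -> byte[]
```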
Another point: even though we were exposing this custom Byte type, if you look at the previous code we were basically falling back to UTF8 anyway.
You can remove the -Encoding parameter or just change it to UTF8 and it should work. Does that work for you?
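Something like this is what I mean; the account name, path, and file below are hypothetical placeholders, not values from this issue, and the exact form accepted by -Encoding depends on the module version:

```powershell
# Hypothetical placeholders for illustration only.
$text = Get-Content -Path 'C:\temp\payload.txt' -Raw

# Either omit -Encoding entirely (the previous code fell back to UTF8 anyway)...
New-AzDataLakeStoreItem -Account 'contosoadls' -Path '/data/payload.txt' -Value $text

# ...or ask for UTF8 explicitly.
New-AzDataLakeStoreItem -Account 'contosoadls' -Path '/data/payload.txt' -Value $text -Encoding UTF8
```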
I was wrong, it looks like UTF8 won't work; we need a way to explicitly upload bytes.
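A small illustration of why a text encoding cannot carry arbitrary binary data (the payload and compression step here are made up for the example): round-tripping compressed bytes through a UTF-8 string replaces invalid byte sequences, so the data comes back changed.

```powershell
# Illustrative only: compress a payload to get non-text bytes, then show that
# a UTF-8 string round trip does not preserve them.
$raw = [System.Text.Encoding]::UTF8.GetBytes('some payload to compress')

$ms = [System.IO.MemoryStream]::new()
$gzip = [System.IO.Compression.GZipStream]::new($ms, [System.IO.Compression.CompressionMode]::Compress)
$gzip.Write($raw, 0, $raw.Length)
$gzip.Dispose()
$compressed = $ms.ToArray()

# Decoding and re-encoding replaces invalid UTF-8 sequences, so the bytes differ.
$roundTripped = [System.Text.Encoding]::UTF8.GetBytes([System.Text.Encoding]::UTF8.GetString($compressed))
"compressed: $($compressed.Length) bytes; after UTF-8 round trip: $($roundTripped.Length) bytes"
```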
Description
We are migrating from AzureRm to Az for DataLakeStore operations. In AzureRm, we were able to upload compressed data by passing a byte[] to the -Value parameter. When passing a byte[] to -Value for the New-AzDataLakeStoreItem (or Add-AzDataLakeStoreItemContent) cmdlet, I get an invalid Encoding type error.
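A hedged sketch of the two calls; the account name, file, and path below are hypothetical placeholders, not the actual values from our environment:

```powershell
# Hypothetical placeholders for illustration only.
$bytes = [System.IO.File]::ReadAllBytes('C:\temp\data.json.gz')

# AzureRm: a byte[] could be uploaded as-is via the custom Byte encoding.
New-AzureRmDataLakeStoreItem -Account 'contosoadls' -Path '/data/data.json.gz' -Value $bytes -Encoding Byte

# Az: the equivalent call fails, because Byte is no longer accepted by -Encoding.
New-AzDataLakeStoreItem -Account 'contosoadls' -Path '/data/data.json.gz' -Value $bytes -Encoding Byte
```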
Steps to reproduce
Environment data
Module versions
Debug output
Error output