microsoft / ReportingServicesTools

Reporting Services Powershell Tools
MIT License

Byte Array in Write-RsRestCatalogItem.ps1 sometimes creates OutOfMemory exception for large (500MB) .pbix files #210

Open GitHubJedi opened 5 years ago

GitHubJedi commented 5 years ago

Do you want to request a feature or report a bug?

Bug

What is the current behavior?

When testing a write of a 500 MB .pbix file (including large CSVs) to the Report Server, I receive an OutOfMemoryException:

```
Write-RsRestCatalogItem : System.Exception: Failed to create catalog item: Array dimensions exceeded supported range.
---> System.OutOfMemoryException: Array dimensions exceeded supported range.
   at System.Linq.Buffer`1..ctor(IEnumerable`1 source)
   at System.Linq.Enumerable.ToArray[TSource](IEnumerable`1 source)
   at System.Management.Automation.ParameterBinderBase.BindParameter(CommandParameterInternal parameter, CompiledCommandParameter parameterMetadata, ParameterBindingFlags flags)
   at System.Management.Automation.CmdletParameterBinderController.BindParameter(CommandParameterInternal argument, MergedCompiledCommandParameter parameter, ParameterBindingFlags flags)
   at System.Management.Automation.CmdletParameterBinderController.BindParameter(UInt32 parameterSets, CommandParameterInternal argument, MergedCompiledCommandParameter parameter, ParameterBindingFlags flags)
   at System.Management.Automation.CmdletParameterBinderController.BindParameters(UInt32 parameterSets, Collection`1 arguments)
   at System.Management.Automation.CmdletParameterBinderController.BindCommandLineParametersNoValidation(Collection`1 arguments)
   at System.Management.Automation.CmdletParameterBinderController.BindCommandLineParameters(Collection`1 arguments)
   at System.Management.Automation.CommandProcessor.BindCommandLineParameters()
   at System.Management.Automation.CommandProcessorBase.DoPrepare(IDictionary psDefaultParameterValues)
   at System.Management.Automation.Internal.PipelineProcessor.Start(Boolean incomingStream)
   at System.Management.Automation.Internal.PipelineProcessor.SynchronousExecuteEnumerate(Object input, Hashtable errorResults, Boolean enumerate)
   at System.Management.Automation.PipelineOps.InvokePipeline(Object input, Boolean ignoreInput, CommandParameterInternal[][] pipeElements, CommandBaseAst[] pipeElementAsts, CommandRedirection[][] commandRedirections, FunctionContext funcContext)
   at System.Management.Automation.Interpreter.ActionCallInstruction`6.Run(InterpretedFrame frame)
   at System.Management.Automation.Interpreter.EnterTryCatchFinallyInstruction.Run(InterpretedFrame frame)
   --- End of inner exception stack trace ---
```

With VERBOSE turned on, the failure looks like it happens at the "Upload" step, in the try/catch block at line 250.

It could be one of the JSON conversions here, maybe `[System.Text.Encoding]::UTF8.GetBytes($payloadJson)`:

```powershell
$payloadJson = ConvertTo-Json $payload

if ($Credential -ne $null)
{
    Invoke-WebRequest -Uri $catalogItemsUri -Method Post -WebSession $WebSession -Body ([System.Text.Encoding]::UTF8.GetBytes($payloadJson)) -ContentType "application/json" -Credential $Credential -Verbose:$false | Out-Null
}
else
{
    Invoke-WebRequest -Uri $catalogItemsUri -Method Post -WebSession $WebSession -Body ([System.Text.Encoding]::UTF8.GetBytes($payloadJson)) -ContentType "application/json" -UseDefaultCredentials -Verbose:$false | Out-Null
}
```
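For context on why a 500 MB file blows up here: my understanding (an assumption from reading the script, not verified) is that the file content is Base64-encoded into the JSON payload, so the process holds several inflated copies of the file at once. A rough back-of-the-envelope sketch:

```powershell
# Back-of-the-envelope estimate (assumed sizes, not measured) of the copies
# held in memory when a 500 MB file is uploaded via an in-memory byte-array body:
$fileMB   = 500
$base64MB = [math]::Round($fileMB * 4 / 3)  # Base64 inflates content by ~33%
$jsonMB   = $base64MB * 2                   # .NET strings are UTF-16: 2 bytes/char
$bodyMB   = $base64MB                       # UTF8.GetBytes makes one more full copy
"file: {0} MB, Base64: ~{1} MB, JSON string: ~{2} MB, UTF-8 body: ~{3} MB" -f `
    $fileMB, $base64MB, $jsonMB, $bodyMB
```

That is well over a gigabyte of simultaneous copies for one upload, before PowerShell's own parameter binding (the `Enumerable.ToArray` frame in the trace) makes any further copies.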

If the current behavior is a bug, please provide the steps to reproduce and if possible a minimal demo of the problem.

1) Create a large .PBIX file (500 MB+)
2) Use Write-RsRestCatalogItem.ps1 to write this to the Report Server

What is the expected behavior?

The upload may take a while (15-20 min), but eventually the report is written to the Report Server.

Which versions of Powershell and which OS are affected by this issue? Did this work in previous versions of our scripts?

Windows Server 2012 R2, 32 GB memory, PowerShell 4.0, running through an on-premises TFS agent.

I can get it to work on my local PC (Windows 10, 8 GB memory, PowerShell 5.1.16299.820).

Considering the agent has more RAM than my PC, there must be some limit on the memory available to PowerShell for .NET arrays on these agents. I have tried some custom configuration using the following .NET setting, but it is messy to apply at PowerShell runtime and I'm not sure it is taking effect; it requires hacky workarounds with ConfigurationManager and AppDomain.

https://docs.microsoft.com/en-us/dotnet/framework/configure-apps/file-schema/runtime/gcallowverylargeobjects-element

GitHubJedi commented 5 years ago

I was able to add the following configuration to a powershell.exe.config file in my $PSHome directory (PowerShell 4.0, Windows Server 2012 R2). This got past the error; however, the script then proceeded to seemingly use up all the memory on the server (~30 GB of 32 GB) and never completed. I had to force-kill it.

```xml
<configuration>
  <runtime>
    <gcAllowVeryLargeObjects enabled="true" />
  </runtime>
</configuration>
```
GitHubJedi commented 5 years ago

The solution to this is probably to implement some kind of streaming upload instead of reading the whole file into large byte arrays.
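For example (a hypothetical sketch of the streaming idea, not the module's current API): send the request body through System.Net.Http.HttpClient with a StreamContent so memory use stays roughly flat regardless of file size. The URI variable and content type below are placeholders; the real catalog-items endpoint expects a JSON payload, so in practice the JSON envelope would also need to be written to a stream rather than built as one string.

```powershell
# Hypothetical sketch: stream the file as the HTTP body instead of buffering
# it in a byte array. $catalogItemsUri and the file path are placeholders.
Add-Type -AssemblyName System.Net.Http

$client = New-Object System.Net.Http.HttpClient
$stream = [System.IO.File]::OpenRead('C:\reports\Large.pbix')
try {
    $content = New-Object System.Net.Http.StreamContent($stream)
    $content.Headers.ContentType = 'application/octet-stream'
    # PostAsync reads the stream in chunks as it sends, so the 500 MB file
    # is never fully materialized in process memory.
    $response = $client.PostAsync($catalogItemsUri, $content).Result
    $response.EnsureSuccessStatusCode() | Out-Null
}
finally {
    $stream.Dispose()
    $client.Dispose()
}
```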