Closed: c4rlosmarin closed this issue 5 years ago.
@Tiano2017 Hey Tian, do you know what could be happening here? Does ARM do any type of throttling that would cause this hanging behavior to occur?
Actually, I'm not sure about this one. I thought it was "Get-AzDeployment", but it seems it's "Get-AzureDeployment". Is this owned by CRP?
This is definitely about the Get-AzureDeployment cmdlet, so the classic deployment model.
So from the memory dump, on thread 5, it seems the thread is blocked inside a call to Microsoft.WindowsAzure.Management.ManagementClientExtensions.GetOperationStatus, which ends up waiting synchronously on a task via Microsoft.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccess, right?
OS Thread Id: 0x2edc (5) Child SP IP Call Site
00000022bd4cde10 00007ffc02879a84 [GCFrame: 00000022bd4cde10]
00000022bd4cdf38 00007ffc02879a84 [HelperMethodFrame_1OBJ: 00000022bd4cdf38] System.Threading.Monitor.ObjWait(Boolean, Int32, System.Object)
00000022bd4ce050 00007ffbd46ffb84 System.Threading.ManualResetEventSlim.Wait(Int32, System.Threading.CancellationToken)
00000022bd4ce0e0 00007ffbd46fa2db System.Threading.Tasks.Task.SpinThenBlockingWait(Int32, System.Threading.CancellationToken)
00000022bd4ce150 00007ffbd503e841 System.Threading.Tasks.Task.InternalWait(Int32, System.Threading.CancellationToken)
00000022bd4ce220 00007ffbd46f9969 System.Threading.Tasks.Task.Wait(Int32, System.Threading.CancellationToken)
00000022bd4ce250 00007ffb77d58be2 Microsoft.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccess(System.Threading.Tasks.Task)
00000022bd4ce290 00007ffb77d6d92c Microsoft.WindowsAzure.Management.ManagementClientExtensions.GetOperationStatus(Microsoft.WindowsAzure.Management.IManagementClient, System.String)
00000022bd4ce2f0 00007ffb77d6cf8b Microsoft.WindowsAzure.Commands.Utilities.Common.ServiceManagementBaseCmdlet.GetOperation(System.String)
00000022bd4ce350 00007ffb77d6cf26 Microsoft.WindowsAzure.Commands.Utilities.Common.ServiceManagementBaseCmdlet.GetOperation(Microsoft.Azure.AzureOperationResponse)
00000022bd4ce390 00007ffb77d48548 Microsoft.WindowsAzure.Commands.Utilities.Common.ServiceManagementBaseCmdlet.ExecuteClientActionNewSM[[System.__Canon, mscorlib]](System.Object, System.String, System.Func`1<System.__Canon>, System.Func`3<Microsoft.Azure.OperationStatusResponse,System.__Canon,System.Object>)
00000022bd4ce3e0 00007ffb777637a4 Microsoft.WindowsAzure.Commands.ServiceManagement.HostedServices.GetAzureDeploymentCommand.OnProcessRecord()
00000022bd4ce440 00007ffbc8cec603 System.Management.Automation.CommandProcessor.ProcessRecord()
00000022bd4ce4d0 00007ffbc8c44e4b System.Management.Automation.CommandProcessorBase.DoExecute()
00000022bd4ce510 00007ffbc8ccf3dc System.Management.Automation.Internal.PipelineProcessor.SynchronousExecuteEnumerate(System.Object)
00000022bd4ce5a0 00007ffbc8dd27bf System.Management.Automation.PipelineOps.InvokePipeline(System.Object, Boolean, System.Management.Automation.CommandParameterInternal[][], System.Management.Automation.Language.CommandBaseAst[], System.Management.Automation.CommandRedirection[][], System.Management.Automation.Language.FunctionContext)
00000022bd4ce630 00007ffb7712e8dc DynamicClass.lambda_method(System.Runtime.CompilerServices.Closure, System.Object[], System.Runtime.CompilerServices.StrongBox`1<System.Object>[], System.Management.Automation.Interpreter.InterpretedFrame)
00000022bd4ce6c0 00007ffbc9131af0 System.Management.Automation.Interpreter.EnterTryCatchFinallyInstruction.Run(System.Management.Automation.Interpreter.InterpretedFrame)
00000022bd4ce750 00007ffbc9131af0 System.Management.Automation.Interpreter.EnterTryCatchFinallyInstruction.Run(System.Management.Automation.Interpreter.InterpretedFrame)
00000022bd4ce7e0 00007ffbc8f46961 System.Management.Automation.Interpreter.Interpreter.Run(System.Management.Automation.Interpreter.InterpretedFrame)
00000022bd4ce830 00007ffbc8f82279 System.Management.Automation.Interpreter.LightLambda.RunVoid1[[System.__Canon, mscorlib]](System.__Canon)
00000022bd4ce8a0 00007ffbc8e15b33 System.Management.Automation.DlrScriptCommandProcessor.RunClause(System.Action`1<System.Management.Automation.Language.FunctionContext>, System.Object, System.Object)
00000022bd4ce940 00007ffbc8e15475 System.Management.Automation.DlrScriptCommandProcessor.Complete()
00000022bd4ce9d0 00007ffbc8c450fa System.Management.Automation.CommandProcessorBase.DoComplete()
00000022bd4cea50 00007ffbc8ccf72d System.Management.Automation.Internal.PipelineProcessor.DoCompleteCore(System.Management.Automation.CommandProcessorBase)
00000022bd4ceae0 00007ffbc8ccf3eb System.Management.Automation.Internal.PipelineProcessor.SynchronousExecuteEnumerate(System.Object)
00000022bd4ceb70 00007ffbc8c46ce8 System.Management.Automation.Runspaces.LocalPipeline.InvokeHelper()
00000022bd4cec40 00007ffbc8c4775a System.Management.Automation.Runspaces.LocalPipeline.InvokeThreadProc()
00000022bd4cecb0 00007ffbc8cd6b50 System.Management.Automation.Runspaces.PipelineThread.WorkerProc()
00000022bd4cece0 00007ffbd468cef2 System.Threading.ExecutionContext.RunInternal(System.Threading.ExecutionContext, System.Threading.ContextCallback, System.Object, Boolean)
00000022bd4cedb0 00007ffbd468cd75 System.Threading.ExecutionContext.Run(System.Threading.ExecutionContext, System.Threading.ContextCallback, System.Object, Boolean)
00000022bd4cede0 00007ffbd468cd45 System.Threading.ExecutionContext.Run(System.Threading.ExecutionContext, System.Threading.ContextCallback, System.Object)
00000022bd4cee30 00007ffbd4732ca5 System.Threading.ThreadHelper.ThreadStart()
00000022bd4cf080 00007ffbd6576c93 [GCFrame: 00000022bd4cf080]
00000022bd4cf3e0 00007ffbd6576c93 [DebuggerU2MCatchHandlerFrame: 00000022bd4cf3e0]
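For reference, a stack like the one above can be pulled from the uploaded dump with WinDbg and the SOS extension. A minimal sketch, assuming the dump was taken against the desktop .NET Framework CLR; the SOS load command and the thread index may differ for your setup:

.loadby sos clr    # load the SOS debugging extension that matches the CLR loaded in the dump
~5s                # switch to debugger thread 5 (OS thread 0x2edc in the output above)
!clrstack          # dump the managed call stack for the current thread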
@c4rlosmarin just to make sure I understand: your repro script runs successfully for a couple of minutes, so you're able to continuously get the status of your deployment and see it in the output, but at some point the call hangs on what seems like an unsuccessful attempt to get the status?
CC: @hyonholee @huangpf
@cormacpayne that's exactly right.
@hyonholee @huangpf would one of you mind taking a look at this issue? Could there be a throttling issue causing this hang to occur?
@c4rlosmarin it sounds like you are likely hitting throttling from calling the ASM (RDFE) interface repeatedly. With this older interface, you need to leave adequate time between calls for it to function properly. Please let us know whether the issue disappears when you add a wait of 10 seconds or more between calls. If the problem goes away with the additional wait, then we ask that you keep that wait between calls as a workaround.
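A minimal sketch of that workaround, based on the repro loop from this issue (the 10-second delay is the suggested spacing, not a guaranteed threshold):

while ($true) {
    Get-AzureDeployment -ServiceName myazurecloudservice001 -Slot Production -Verbose
    # Pause between ASM (RDFE) calls to avoid throttling; 10 seconds or more was suggested above.
    Start-Sleep -Seconds 10
}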
Closing this issue for now. Please let us know if the issue is not resolved after adding the wait between calls.
Hi there,
I'm getting no response from "Get-AzureDeployment" on a subscription with no other activity at all; there are no additional ARM requests being sent to it. I was able to repro with the following script:
while ($true) { Write-Host "`r`n `r`n"; Get-AzureDeployment -ServiceName myazurecloudservice001 -Slot Production -Verbose; }
After 1 or 2 minutes of successfully getting responses within a second each, the command hangs forever.
I've uploaded a memory dump here: https://1drv.ms/u/s!Aoa12lKyhAfZh7N5WUxDnXtBo30VVA?e=yNTe4J