Azure / azure-functions-durable-extension

Durable Task Framework extension for Azure Functions
MIT License

"Did not find any initialized language workers" causes failed Orchestrations #2621

Open RichardBurns1982 opened 1 year ago

RichardBurns1982 commented 1 year ago

Description

We are running multiple Azure Function apps on the Consumption plan within the same App Service plan. The documentation seems to recommend a single app per plan, but we are limited to 100 Consumption plans per region and have more than 100 function apps across all environments, so some plans host more than one app.

We have been consistently seeing the following error:

System.InvalidOperationException: Did not find any initialized language workers

When this happens, all of the orchestrator functions that are running fail and do not recover. On the 4th of October, 21 orchestration instances were marked as failed in one of our production environments within 1 minute, and the error occurred on every function that tried to run during that window.

Expected behavior

If an error occurs on the host, I would not expect the orchestration instances to be marked as failed; they should be able to continue (be retried) until they successfully run our code.

Actual behavior

Our code is never called: the logged stack trace is entirely from Microsoft.Azure.WebJobs, and there is no opportunity for us to attempt to recover the orchestration instance.
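A hedged sketch of what an external recovery attempt could look like (untested against this failure mode): it drives the documented Durable Functions HTTP management API to list failed instances and ask the extension to rewind them. Rewind is a preview operation, and the host URL, task hub name, and system key in the snippet are placeholders.

// Sketch only: list failed orchestration instances via the Durable Functions
// HTTP management API and request a rewind for each. Rewind is a preview
// operation; <your-function-app>, <TaskHubName> and <systemKey> are placeholders.
using System;
using System.Net.Http;
using System.Text;
using System.Text.Json;
using System.Threading.Tasks;

class FailedOrchestrationRecovery
{
    private static readonly HttpClient Http = new HttpClient();

    static async Task Main()
    {
        const string baseUrl = "https://<your-function-app>.azurewebsites.net/runtime/webhooks/durabletask";
        const string auth = "taskHub=<TaskHubName>&connection=Storage&code=<systemKey>";

        // List instances currently marked Failed (documented "list instances" endpoint).
        var listUrl = $"{baseUrl}/instances?runtimeStatus=Failed&createdTimeFrom=2023-10-04T00:00:00Z&{auth}";
        using var listResponse = await Http.GetAsync(listUrl);
        listResponse.EnsureSuccessStatusCode();

        using var doc = JsonDocument.Parse(await listResponse.Content.ReadAsStringAsync());
        foreach (var instance in doc.RootElement.EnumerateArray())
        {
            var instanceId = instance.GetProperty("instanceId").GetString();

            // Ask the extension to rewind the failed instance back into a running state.
            var rewindUrl = $"{baseUrl}/instances/{instanceId}/rewind?reason=LanguageWorkerLost&{auth}";
            using var rewindResponse = await Http.PostAsync(rewindUrl, new StringContent(string.Empty, Encoding.UTF8));
            Console.WriteLine($"{instanceId}: rewind returned {(int)rewindResponse.StatusCode}");
        }
    }
}

If rewind is not available on the installed extension version, restarting the instances with their original input would be the fallback; both paths are assumptions rather than something validated here.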

Relevant source code snippets

None

Known workarounds

None

App Details

Stack Trace from Orchestration Table Storage

{"message":"Function 'DataSetValidationOrchestrationFunction' failed with an unhandled exception.","details":"Microsoft.Azure.WebJobs.Host.FunctionInvocationException: Exception while executing function: Functions.DataSetValidationOrchestrationFunction\r\n ---> System.InvalidOperationException: Did not find any initialized language workers\r\n at Microsoft.Azure.WebJobs.Script.Workers.Rpc.RpcFunctionInvocationDispatcherLoadBalancer.GetLanguageWorkerChannel(IEnumerable1 rpcWorkerChannels) in /_/src/WebJobs.Script/Workers/Rpc/FunctionRegistration/RpcFunctionInvocationDispatcherLoadBalancer.cs:line 27\r\n at Microsoft.Azure.WebJobs.Script.Workers.Rpc.RpcFunctionInvocationDispatcher.InvokeAsync(ScriptInvocationContext invocationContext) in /_/src/WebJobs.Script/Workers/Rpc/FunctionRegistration/RpcFunctionInvocationDispatcher.cs:line 400\r\n at Microsoft.Azure.WebJobs.Script.Description.WorkerFunctionInvoker.InvokeCore(Object[] parameters, FunctionInvocationContext context) in /_/src/WebJobs.Script/Description/Workers/WorkerFunctionInvoker.cs:line 97\r\n at Microsoft.Azure.WebJobs.Script.Description.FunctionInvokerBase.Invoke(Object[] parameters) in /_/src/WebJobs.Script/Description/FunctionInvokerBase.cs:line 82\r\n at Microsoft.Azure.WebJobs.Script.Description.FunctionGenerator.Coerce[T](Task1 src) in /_/src/WebJobs.Script/Description/FunctionGenerator.cs:line 225\r\n at Microsoft.Azure.WebJobs.Host.Executors.FunctionInvoker2.InvokeAsync(Object instance, Object[] arguments) in D:\\a\\_work\\1\\s\\src\\Microsoft.Azure.WebJobs.Host\\Executors\\FunctionInvoker.cs:line 52\r\n at Microsoft.Azure.WebJobs.Extensions.DurableTask.OutOfProcMiddleware.<>c__DisplayClass10_0.<<CallOrchestratorAsync>b__0>d.MoveNext() in D:\\a\\_work\\1\\s\\src\\WebJobs.Extensions.DurableTask\\OutOfProcMiddleware.cs:line 124\r\n--- End of stack trace from previous location ---\r\n at Microsoft.Azure.WebJobs.Host.Executors.TriggeredFunctionExecutor1.<>c__DisplayClass7_0.<b__0>d.MoveNext() in D:\a\_work\1\s\src\Microsoft.Azure.WebJobs.Host\Executors\TriggeredFunctionExecutor.cs:line 50\r\n--- End of stack trace from previous location ---\r\n at Microsoft.Azure.WebJobs.Host.Executors.FunctionExecutor.InvokeWithTimeoutAsync(IFunctionInvoker invoker, ParameterHelper parameterHelper, CancellationTokenSource timeoutTokenSource, CancellationTokenSource functionCancellationTokenSource, Boolean throwOnTimeout, TimeSpan timerInterval, IFunctionInstance instance) in D:\a\_work\1\s\src\Microsoft.Azure.WebJobs.Host\Executors\FunctionExecutor.cs:line 581\r\n at Microsoft.Azure.WebJobs.Host.Executors.FunctionExecutor.ExecuteWithWatchersAsync(IFunctionInstanceEx instance, ParameterHelper parameterHelper, ILogger logger, CancellationTokenSource functionCancellationTokenSource) in D:\a\_work\1\s\src\Microsoft.Azure.WebJobs.Host\Executors\FunctionExecutor.cs:line 527\r\n at Microsoft.Azure.WebJobs.Host.Executors.FunctionExecutor.ExecuteWithLoggingAsync(IFunctionInstanceEx instance, FunctionStartedMessage message, FunctionInstanceLogEntry instanceLogEntry, ParameterHelper parameterHelper, ILogger logger, CancellationToken cancellationToken) in D:\a\_work\1\s\src\Microsoft.Azure.WebJobs.Host\Executors\FunctionExecutor.cs:line 306\r\n --- End of inner exception stack trace ---\r\n at Microsoft.Azure.WebJobs.Host.Executors.FunctionExecutor.ExecuteWithLoggingAsync(IFunctionInstanceEx instance, FunctionStartedMessage message, FunctionInstanceLogEntry instanceLogEntry, ParameterHelper parameterHelper, ILogger logger, CancellationToken 
cancellationToken) in D:\a\_work\1\s\src\Microsoft.Azure.WebJobs.Host\Executors\FunctionExecutor.cs:line 352\r\n at Microsoft.Azure.WebJobs.Host.Executors.FunctionExecutor.TryExecuteAsync(IFunctionInstance functionInstance, CancellationToken cancellationToken) in D:\a\_work\1\s\src\Microsoft.Azure.WebJobs.Host\Executors\FunctionExecutor.cs:line 108"}
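For context on where this comes from: the exception originates in the host's RPC worker load balancer, before the Durable Task middleware ever reaches user code. A rough, simplified paraphrase of that check (an assumption based on the frame at RpcFunctionInvocationDispatcherLoadBalancer.cs:line 27, not the actual host source):

using System;
using System.Collections.Generic;
using System.Linq;

// Placeholder stand-in for the host's worker channel type (assumption).
public interface IRpcWorkerChannel { }

public static class LoadBalancerSketch
{
    // Simplified paraphrase of the behavior behind the error message: if the
    // host has zero initialized language worker channels, the invocation fails
    // here, which is why the orchestrator body is never called.
    public static IRpcWorkerChannel GetLanguageWorkerChannel(IEnumerable<IRpcWorkerChannel> rpcWorkerChannels)
    {
        if (rpcWorkerChannels == null || !rpcWorkerChannels.Any())
        {
            throw new InvalidOperationException("Did not find any initialized language workers");
        }

        // The real implementation load-balances across available workers;
        // taking the first channel is enough for this sketch.
        return rpcWorkerChannels.First();
    }
}

The OutOfProcMiddleware frame (OutOfProcMiddleware.cs:line 124) then appears to surface that host exception as an orchestration failure, which would match the behavior described above where instances are marked failed without our code ever running.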

If deployed to Azure

We have access to a lot of telemetry that can help with investigations. Please provide as much of the following information as you can to help us investigate!

lilyjma commented 9 months ago

@RichardBurns1982 - Has this problem happened again? If yes, when?

lxalln commented 4 months ago

We are experiencing this issue after updating to .NET 8 isolated functions. Did you ever resolve this issue, @RichardBurns1982?