Azure / azure-functions-dotnet-worker

Azure Functions out-of-process .NET language worker

Function invocation is canceled and will not be sent to the worker #1588

Open AGCodesDotNet opened 1 year ago

AGCodesDotNet commented 1 year ago

What do I need to look at to resolve this exception? Is this a known bug? Since the function doesn't trigger, I am trying to figure out where to look.

Azure Functions fail to trigger intermittently with a TaskCanceledException. Diagnostics show the exception is coming out of the runtime/worker before the function gets called.

(screenshot of the exception from diagnostics)

using System;
using System.IO;
using System.Net;
using System.Threading.Tasks;
using Microsoft.Azure.Functions.Worker;
using Microsoft.Azure.Functions.Worker.Http;
using Microsoft.Extensions.Logging;

public class AlpacaSample
{
    private readonly ILogger logger;
    private HttpRequestData requestData;

    public AlpacaSample(ILoggerFactory loggerFactory)
    {
        this.logger = loggerFactory.CreateLogger<AlpacaSample>();
    }

    [Function("AlpacaSample")]
    public async Task<HttpResponseData> Run([HttpTrigger(AuthorizationLevel.Anonymous, "get", "post")] HttpRequestData req)
    {
        try
        {
            this.requestData = req;
            string requestBody = await new StreamReader(req.Body).ReadToEndAsync();
            var message = $"Received a trigger {requestBody}";
            this.logger.LogInformation(message);

            // Business logic removed for brevity

            return await this.CreateStringResponse(req, "All Good");
        }
        catch (Exception ex)
        {
            string responseMessage =
                   $"Error occurred while handling the message. Error: {ex.Message} \n {ex}";
            this.logger.LogError(responseMessage);
            return await this.CreateStringResponse(req, responseMessage);
        }
    }

    private async Task<HttpResponseData> CreateStringResponse(HttpRequestData req, string msg, HttpStatusCode httpStatusCode = HttpStatusCode.OK)
    {
        var res = HttpResponseData.CreateResponse(req);
        res.StatusCode = httpStatusCode;
        await res.WriteStringAsync(msg);
        return res;
    }
}
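As an aside (not part of the original report, and it will not prevent the host-side cancellation reported here): the isolated worker can bind a CancellationToken parameter, which lets a function observe cancellations that do reach the worker, for example during scale-in or host shutdown. A minimal sketch of that signature, assuming the standard HttpRequestData helpers:

using System.Net;
using System.Threading;
using System.Threading.Tasks;
using Microsoft.Azure.Functions.Worker;
using Microsoft.Azure.Functions.Worker.Http;
using Microsoft.Extensions.Logging;

public class AlpacaSampleWithCancellation
{
    private readonly ILogger logger;

    public AlpacaSampleWithCancellation(ILoggerFactory loggerFactory)
    {
        this.logger = loggerFactory.CreateLogger<AlpacaSampleWithCancellation>();
    }

    // Sketch only: the worker populates the CancellationToken, so the function can
    // stop work cleanly if the invocation is canceled after it has been dispatched.
    // It cannot help when the host cancels before the request reaches the worker.
    [Function("AlpacaSampleWithCancellation")]
    public async Task<HttpResponseData> Run(
        [HttpTrigger(AuthorizationLevel.Anonymous, "get", "post")] HttpRequestData req,
        CancellationToken cancellationToken)
    {
        string requestBody = await req.ReadAsStringAsync() ?? string.Empty;
        this.logger.LogInformation("Received a trigger {Body}", requestBody);

        cancellationToken.ThrowIfCancellationRequested();

        var res = req.CreateResponse(HttpStatusCode.OK);
        await res.WriteStringAsync("All Good");
        return res;
    }
}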

Full call stack:

   at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw (System.Private.CoreLib, Version=6.0.0.0, Culture=neutral, PublicKeyToken=7cec85d7bea7798e)
   at Microsoft.Azure.WebJobs.Host.Executors.FunctionExecutor+<ExecuteWithLoggingAsync>d__26.MoveNext (Microsoft.Azure.WebJobs.Host, Version=3.0.37.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35: D:\a\_work\1\s\src\Microsoft.Azure.WebJobs.Host\Executors\FunctionExecutor.cs:352)
   at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw (System.Private.CoreLib, Version=6.0.0.0, Culture=neutral, PublicKeyToken=7cec85d7bea7798e)
   at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess (System.Private.CoreLib, Version=6.0.0.0, Culture=neutral, PublicKeyToken=7cec85d7bea7798e)
   at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification (System.Private.CoreLib, Version=6.0.0.0, Culture=neutral, PublicKeyToken=7cec85d7bea7798e)
   at System.Runtime.CompilerServices.TaskAwaiter`1.GetResult (System.Private.CoreLib, Version=6.0.0.0, Culture=neutral, PublicKeyToken=7cec85d7bea7798e)
   at Microsoft.Azure.WebJobs.Host.Executors.FunctionExecutor+<TryExecuteAsync>d__18.MoveNext (Microsoft.Azure.WebJobs.Host, Version=3.0.37.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35: D:\a\_work\1\s\src\Microsoft.Azure.WebJobs.Host\Executors\FunctionExecutor.cs:108)
Inner exception Microsoft.Azure.WebJobs.Script.Description.FunctionInvocationCanceledException handled at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw:
   at Microsoft.Azure.WebJobs.Script.Description.WorkerFunctionInvoker+<InvokeCore>d__9.MoveNext (Microsoft.Azure.WebJobs.Script, Version=4.21.0.0, Culture=neutral, PublicKeyToken=null: /_/src/WebJobs.Script/Description/Workers/WorkerFunctionInvoker.cs:106)
   at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw (System.Private.CoreLib, Version=6.0.0.0, Culture=neutral, PublicKeyToken=7cec85d7bea7798e)
   at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess (System.Private.CoreLib, Version=6.0.0.0, Culture=neutral, PublicKeyToken=7cec85d7bea7798e)
   at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification (System.Private.CoreLib, Version=6.0.0.0, Culture=neutral, PublicKeyToken=7cec85d7bea7798e)
   at System.Runtime.CompilerServices.TaskAwaiter`1.GetResult (System.Private.CoreLib, Version=6.0.0.0, Culture=neutral, PublicKeyToken=7cec85d7bea7798e)
   at Microsoft.Azure.WebJobs.Script.Description.FunctionInvokerBase+<Invoke>d__24.MoveNext (Microsoft.Azure.WebJobs.Script, Version=4.21.0.0, Culture=neutral, PublicKeyToken=null: /_/src/WebJobs.Script/Description/FunctionInvokerBase.cs:82)
   at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw (System.Private.CoreLib, Version=6.0.0.0, Culture=neutral, PublicKeyToken=7cec85d7bea7798e)
   at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess (System.Private.CoreLib, Version=6.0.0.0, Culture=neutral, PublicKeyToken=7cec85d7bea7798e)
   at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification (System.Private.CoreLib, Version=6.0.0.0, Culture=neutral, PublicKeyToken=7cec85d7bea7798e)
   at System.Runtime.CompilerServices.TaskAwaiter`1.GetResult (System.Private.CoreLib, Version=6.0.0.0, Culture=neutral, PublicKeyToken=7cec85d7bea7798e)
   at Microsoft.Azure.WebJobs.Script.Description.FunctionGenerator+<Coerce>d__3`1.MoveNext (Microsoft.Azure.WebJobs.Script, Version=4.21.0.0, Culture=neutral, PublicKeyToken=null: /_/src/WebJobs.Script/Description/FunctionGenerator.cs:225)
   at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw (System.Private.CoreLib, Version=6.0.0.0, Culture=neutral, PublicKeyToken=7cec85d7bea7798e)
   at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess (System.Private.CoreLib, Version=6.0.0.0, Culture=neutral, PublicKeyToken=7cec85d7bea7798e)
   at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification (System.Private.CoreLib, Version=6.0.0.0, Culture=neutral, PublicKeyToken=7cec85d7bea7798e)
   at System.Runtime.CompilerServices.TaskAwaiter`1.GetResult (System.Private.CoreLib, Version=6.0.0.0, Culture=neutral, PublicKeyToken=7cec85d7bea7798e)
   at Microsoft.Azure.WebJobs.Host.Executors.FunctionInvoker`2+<InvokeAsync>d__10.MoveNext (Microsoft.Azure.WebJobs.Host, Version=3.0.37.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35: D:\a\_work\1\s\src\Microsoft.Azure.WebJobs.Host\Executors\FunctionInvoker.cs:52)
   at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw (System.Private.CoreLib, Version=6.0.0.0, Culture=neutral, PublicKeyToken=7cec85d7bea7798e)
   at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess (System.Private.CoreLib, Version=6.0.0.0, Culture=neutral, PublicKeyToken=7cec85d7bea7798e)
   at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification (System.Private.CoreLib, Version=6.0.0.0, Culture=neutral, PublicKeyToken=7cec85d7bea7798e)
   at System.Runtime.CompilerServices.TaskAwaiter`1.GetResult (System.Private.CoreLib, Version=6.0.0.0, Culture=neutral, PublicKeyToken=7cec85d7bea7798e)
   at Microsoft.Azure.WebJobs.Host.Executors.FunctionExecutor+<InvokeWithTimeoutAsync>d__33.MoveNext (Microsoft.Azure.WebJobs.Host, Version=3.0.37.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35: D:\a\_work\1\s\src\Microsoft.Azure.WebJobs.Host\Executors\FunctionExecutor.cs:581)
   at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw (System.Private.CoreLib, Version=6.0.0.0, Culture=neutral, PublicKeyToken=7cec85d7bea7798e)
   at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess (System.Private.CoreLib, Version=6.0.0.0, Culture=neutral, PublicKeyToken=7cec85d7bea7798e)
   at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification (System.Private.CoreLib, Version=6.0.0.0, Culture=neutral, PublicKeyToken=7cec85d7bea7798e)
   at System.Runtime.CompilerServices.TaskAwaiter`1.GetResult (System.Private.CoreLib, Version=6.0.0.0, Culture=neutral, PublicKeyToken=7cec85d7bea7798e)
   at Microsoft.Azure.WebJobs.Host.Executors.FunctionExecutor+<ExecuteWithWatchersAsync>d__32.MoveNext (Microsoft.Azure.WebJobs.Host, Version=3.0.37.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35: D:\a\_work\1\s\src\Microsoft.Azure.WebJobs.Host\Executors\FunctionExecutor.cs:527)
   at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw (System.Private.CoreLib, Version=6.0.0.0, Culture=neutral, PublicKeyToken=7cec85d7bea7798e)
   at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess (System.Private.CoreLib, Version=6.0.0.0, Culture=neutral, PublicKeyToken=7cec85d7bea7798e)
   at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification (System.Private.CoreLib, Version=6.0.0.0, Culture=neutral, PublicKeyToken=7cec85d7bea7798e)
   at System.Runtime.CompilerServices.TaskAwaiter.GetResult (System.Private.CoreLib, Version=6.0.0.0, Culture=neutral, PublicKeyToken=7cec85d7bea7798e)
   at Microsoft.Azure.WebJobs.Host.Executors.FunctionExecutor+<ExecuteWithLoggingAsync>d__26.MoveNext (Microsoft.Azure.WebJobs.Host, Version=3.0.37.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35: D:\a\_work\1\s\src\Microsoft.Azure.WebJobs.Host\Executors\FunctionExecutor.cs:306)
Inner exception System.Threading.Tasks.TaskCanceledException handled at Microsoft.Azure.WebJobs.Script.Description.WorkerFunctionInvoker+<InvokeCore>d__9.MoveNext:
   at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess (System.Private.CoreLib, Version=6.0.0.0, Culture=neutral, PublicKeyToken=7cec85d7bea7798e)
   at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification (System.Private.CoreLib, Version=6.0.0.0, Culture=neutral, PublicKeyToken=7cec85d7bea7798e)
   at System.Runtime.CompilerServices.TaskAwaiter`1.GetResult (System.Private.CoreLib, Version=6.0.0.0, Culture=neutral, PublicKeyToken=7cec85d7bea7798e)
   at Microsoft.Azure.WebJobs.Script.Description.WorkerFunctionInvoker+<InvokeCore>d__9.MoveNext (Microsoft.Azure.WebJobs.Script, Version=4.21.0.0, Culture=neutral, PublicKeyToken=null: /_/src/WebJobs.Script/Description/Workers/WorkerFunctionInvoker.cs:101)

Just My Code call stack

   at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw (System.Private.CoreLib, Version=6.0.0.0, Culture=neutral, PublicKeyToken=7cec85d7bea7798e)
   at Microsoft.Azure.WebJobs.Host.Executors.FunctionExecutor+<ExecuteWithLoggingAsync>d__26.MoveNext (Microsoft.Azure.WebJobs.Host, Version=3.0.37.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35: D:\a\_work\1\s\src\Microsoft.Azure.WebJobs.Host\Executors\FunctionExecutor.cs:352)
   at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw (System.Private.CoreLib, Version=6.0.0.0, Culture=neutral, PublicKeyToken=7cec85d7bea7798e)
   at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess (System.Private.CoreLib, Version=6.0.0.0, Culture=neutral, PublicKeyToken=7cec85d7bea7798e)
   at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification (System.Private.CoreLib, Version=6.0.0.0, Culture=neutral, PublicKeyToken=7cec85d7bea7798e)
   at System.Runtime.CompilerServices.TaskAwaiter`1.GetResult (System.Private.CoreLib, Version=6.0.0.0, Culture=neutral, PublicKeyToken=7cec85d7bea7798e)
   at Microsoft.Azure.WebJobs.Host.Executors.FunctionExecutor+<TryExecuteAsync>d__18.MoveNext (Microsoft.Azure.WebJobs.Host, Version=3.0.37.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35: D:\a\_work\1\s\src\Microsoft.Azure.WebJobs.Host\Executors\FunctionExecutor.cs:108)
Inner exception Microsoft.Azure.WebJobs.Script.Description.FunctionInvocationCanceledException handled at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw:
   at Microsoft.Azure.WebJobs.Script.Description.WorkerFunctionInvoker+<InvokeCore>d__9.MoveNext (Microsoft.Azure.WebJobs.Script, Version=4.21.0.0, Culture=neutral, PublicKeyToken=null: /_/src/WebJobs.Script/Description/Workers/WorkerFunctionInvoker.cs:106)
   at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw (System.Private.CoreLib, Version=6.0.0.0, Culture=neutral, PublicKeyToken=7cec85d7bea7798e)
   at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess (System.Private.CoreLib, Version=6.0.0.0, Culture=neutral, PublicKeyToken=7cec85d7bea7798e)
   at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification (System.Private.CoreLib, Version=6.0.0.0, Culture=neutral, PublicKeyToken=7cec85d7bea7798e)
   at System.Runtime.CompilerServices.TaskAwaiter`1.GetResult (System.Private.CoreLib, Version=6.0.0.0, Culture=neutral, PublicKeyToken=7cec85d7bea7798e)
   at Microsoft.Azure.WebJobs.Script.Description.FunctionInvokerBase+<Invoke>d__24.MoveNext (Microsoft.Azure.WebJobs.Script, Version=4.21.0.0, Culture=neutral, PublicKeyToken=null: /_/src/WebJobs.Script/Description/FunctionInvokerBase.cs:82)
   at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw (System.Private.CoreLib, Version=6.0.0.0, Culture=neutral, PublicKeyToken=7cec85d7bea7798e)
   at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess (System.Private.CoreLib, Version=6.0.0.0, Culture=neutral, PublicKeyToken=7cec85d7bea7798e)
   at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification (System.Private.CoreLib, Version=6.0.0.0, Culture=neutral, PublicKeyToken=7cec85d7bea7798e)
   at System.Runtime.CompilerServices.TaskAwaiter`1.GetResult (System.Private.CoreLib, Version=6.0.0.0, Culture=neutral, PublicKeyToken=7cec85d7bea7798e)
   at Microsoft.Azure.WebJobs.Script.Description.FunctionGenerator+<Coerce>d__3`1.MoveNext (Microsoft.Azure.WebJobs.Script, Version=4.21.0.0, Culture=neutral, PublicKeyToken=null: /_/src/WebJobs.Script/Description/FunctionGenerator.cs:225)
   at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw (System.Private.CoreLib, Version=6.0.0.0, Culture=neutral, PublicKeyToken=7cec85d7bea7798e)
   at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess (System.Private.CoreLib, Version=6.0.0.0, Culture=neutral, PublicKeyToken=7cec85d7bea7798e)
   at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification (System.Private.CoreLib, Version=6.0.0.0, Culture=neutral, PublicKeyToken=7cec85d7bea7798e)
   at System.Runtime.CompilerServices.TaskAwaiter`1.GetResult (System.Private.CoreLib, Version=6.0.0.0, Culture=neutral, PublicKeyToken=7cec85d7bea7798e)
   at Microsoft.Azure.WebJobs.Host.Executors.FunctionInvoker`2+<InvokeAsync>d__10.MoveNext (Microsoft.Azure.WebJobs.Host, Version=3.0.37.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35: D:\a\_work\1\s\src\Microsoft.Azure.WebJobs.Host\Executors\FunctionInvoker.cs:52)
   at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw (System.Private.CoreLib, Version=6.0.0.0, Culture=neutral, PublicKeyToken=7cec85d7bea7798e)
   at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess (System.Private.CoreLib, Version=6.0.0.0, Culture=neutral, PublicKeyToken=7cec85d7bea7798e)
   at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification (System.Private.CoreLib, Version=6.0.0.0, Culture=neutral, PublicKeyToken=7cec85d7bea7798e)
   at System.Runtime.CompilerServices.TaskAwaiter`1.GetResult (System.Private.CoreLib, Version=6.0.0.0, Culture=neutral, PublicKeyToken=7cec85d7bea7798e)
   at Microsoft.Azure.WebJobs.Host.Executors.FunctionExecutor+<InvokeWithTimeoutAsync>d__33.MoveNext (Microsoft.Azure.WebJobs.Host, Version=3.0.37.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35: D:\a\_work\1\s\src\Microsoft.Azure.WebJobs.Host\Executors\FunctionExecutor.cs:581)
   at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw (System.Private.CoreLib, Version=6.0.0.0, Culture=neutral, PublicKeyToken=7cec85d7bea7798e)
   at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess (System.Private.CoreLib, Version=6.0.0.0, Culture=neutral, PublicKeyToken=7cec85d7bea7798e)
   at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification (System.Private.CoreLib, Version=6.0.0.0, Culture=neutral, PublicKeyToken=7cec85d7bea7798e)
   at System.Runtime.CompilerServices.TaskAwaiter`1.GetResult (System.Private.CoreLib, Version=6.0.0.0, Culture=neutral, PublicKeyToken=7cec85d7bea7798e)
   at Microsoft.Azure.WebJobs.Host.Executors.FunctionExecutor+<ExecuteWithWatchersAsync>d__32.MoveNext (Microsoft.Azure.WebJobs.Host, Version=3.0.37.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35: D:\a\_work\1\s\src\Microsoft.Azure.WebJobs.Host\Executors\FunctionExecutor.cs:527)
   at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw (System.Private.CoreLib, Version=6.0.0.0, Culture=neutral, PublicKeyToken=7cec85d7bea7798e)
   at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess (System.Private.CoreLib, Version=6.0.0.0, Culture=neutral, PublicKeyToken=7cec85d7bea7798e)
   at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification (System.Private.CoreLib, Version=6.0.0.0, Culture=neutral, PublicKeyToken=7cec85d7bea7798e)
   at System.Runtime.CompilerServices.TaskAwaiter.GetResult (System.Private.CoreLib, Version=6.0.0.0, Culture=neutral, PublicKeyToken=7cec85d7bea7798e)
   at Microsoft.Azure.WebJobs.Host.Executors.FunctionExecutor+<ExecuteWithLoggingAsync>d__26.MoveNext (Microsoft.Azure.WebJobs.Host, Version=3.0.37.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35: D:\a\_work\1\s\src\Microsoft.Azure.WebJobs.Host\Executors\FunctionExecutor.cs:306)
Inner exception System.Threading.Tasks.TaskCanceledException handled at Microsoft.Azure.WebJobs.Script.Description.WorkerFunctionInvoker+<InvokeCore>d__9.MoveNext:
   at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess (System.Private.CoreLib, Version=6.0.0.0, Culture=neutral, PublicKeyToken=7cec85d7bea7798e)
   at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification (System.Private.CoreLib, Version=6.0.0.0, Culture=neutral, PublicKeyToken=7cec85d7bea7798e)
   at System.Runtime.CompilerServices.TaskAwaiter`1.GetResult (System.Private.CoreLib, Version=6.0.0.0, Culture=neutral, PublicKeyToken=7cec85d7bea7798e)
   at Microsoft.Azure.WebJobs.Script.Description.WorkerFunctionInvoker+<InvokeCore>d__9.MoveNext (Microsoft.Azure.WebJobs.Script, Version=4.21.0.0, Culture=neutral, PublicKeyToken=null: /_/src/WebJobs.Script/Description/Workers/WorkerFunctionInvoker.cs:101)

Trace


6/1/2023, 7:45:03 AM
-
REQUEST
URL: https://******.azurewebsites.net/api/******
Response code: 0
Response time: 3.1 s
6/1/2023, 7:45:03 AM
-
TRACE
FUNCTIONS_WORKER_RUNTIME set to dotnet-isolated. Skipping WorkerConfig for language: java
Severity level: Information
6/1/2023, 7:45:03 AM
-
TRACE
FUNCTIONS_WORKER_RUNTIME set to dotnet-isolated. Skipping WorkerConfig for language: node
Severity level: Information
6/1/2023, 7:45:03 AM
-
TRACE
FUNCTIONS_WORKER_RUNTIME set to dotnet-isolated. Skipping WorkerConfig for language: powershell
Severity level: Information
6/1/2023, 7:45:03 AM
-
TRACE
Initializing Warmup Extension.
Severity level: Information
6/1/2023, 7:45:03 AM
-
TRACE
Initializing Host. OperationId: 'fd2e77ef-c3b6-4a31-9d77-24c0de283e1f'.
Severity level: Information
6/1/2023, 7:45:03 AM
-
TRACE
Host initialization: ConsecutiveErrors=0, StartupCount=4, OperationId=fd2e77ef-c3b6-4a31-9d77-24c0de283e1f
Severity level: Information
6/1/2023, 7:45:03 AM
-
TRACE
Loading functions metadata
Severity level: Information
6/1/2023, 7:45:03 AM
-
TRACE
Reading functions metadata
Severity level: Information
6/1/2023, 7:45:04 AM
-
TRACE
2 functions found
Severity level: Information
6/1/2023, 7:45:04 AM
-
TRACE
1 functions loaded
Severity level: Information
6/1/2023, 7:45:04 AM
-
TRACE
ApplicationInsightsLoggerOptions { "SamplingSettings": { "EvaluationInterval": "00:00:15", "InitialSamplingPercentage": 100.0, "MaxSamplingPercentage": 100.0, "MaxTelemetryItemsPerSecond": 20.0, "MinSamplingPercentage": 0.1, "MovingAverageRatio": 0.25, "SamplingPercentageDecreaseTimeout": "00:02:00", "SamplingPercentageIncreaseTimeout": "00:15:00" }, "SamplingExcludedTypes": "Request", "SamplingIncludedTypes": null, "SnapshotConfiguration": null, "EnablePerformanceCountersCollection": true, "HttpAutoCollectionOptions": { "EnableHttpTriggerExtendedInfoCollection": true, "EnableW3CDistributedTracing": true, "EnableResponseHeaderInjection": true }, "LiveMetricsInitializationDelay": "00:00:15", "EnableLiveMetrics": true, "EnableDependencyTracking": true, "DependencyTrackingOptions": null }
Severity level: Information
6/1/2023, 7:45:04 AM
-
TRACE
LoggerFilterOptions { "MinLevel": "None", "Rules": [ { "ProviderName": null, "CategoryName": null, "LogLevel": null, "Filter": "<AddFilter>b__0" }, { "ProviderName": "Microsoft.Azure.WebJobs.Script.WebHost.Diagnostics.SystemLoggerProvider", "CategoryName": null, "LogLevel": "None", "Filter": null }, { "ProviderName": "Microsoft.Azure.WebJobs.Script.WebHost.Diagnostics.SystemLoggerProvider", "CategoryName": null, "LogLevel": null, "Filter": "<AddFilter>b__0" }, { "ProviderName": "Microsoft.Azure.WebJobs.Logging.ApplicationInsights.ApplicationInsightsLoggerProvider", "CategoryName": null, "LogLevel": "Trace", "Filter": null } ] }
Severity level: Information
6/1/2023, 7:45:04 AM
-
TRACE
LoggerFilterOptions { "MinLevel": "None", "Rules": [ { "ProviderName": null, "CategoryName": null, "LogLevel": null, "Filter": "<AddFilter>b__0" }, { "ProviderName": "Microsoft.Azure.WebJobs.Script.WebHost.Diagnostics.SystemLoggerProvider", "CategoryName": null, "LogLevel": "None", "Filter": null }, { "ProviderName": "Microsoft.Azure.WebJobs.Script.WebHost.Diagnostics.SystemLoggerProvider", "CategoryName": null, "LogLevel": null, "Filter": "<AddFilter>b__0" }, { "ProviderName": "Microsoft.Azure.WebJobs.Logging.ApplicationInsights.ApplicationInsightsLoggerProvider", "CategoryName": null, "LogLevel": "Trace", "Filter": null } ] }
Severity level: Information
6/1/2023, 7:45:04 AM
-
TRACE
ConcurrencyOptions { "DynamicConcurrencyEnabled": false, "MaximumFunctionConcurrency": 500, "CPUThreshold": 0.8, "SnapshotPersistenceEnabled": true }
Severity level: Information
6/1/2023, 7:45:04 AM
-
TRACE
FunctionResultAggregatorOptions { "BatchSize": 1000, "FlushTimeout": "00:00:30", "IsEnabled": true }
Severity level: Information
6/1/2023, 7:45:04 AM
-
TRACE
SingletonOptions { "LockPeriod": "00:00:15", "ListenerLockPeriod": "00:01:00", "LockAcquisitionTimeout": "10675199.02:48:05.4775807", "LockAcquisitionPollingInterval": "00:00:05", "ListenerLockRecoveryPollingInterval": "00:01:00" }
Severity level: Information
6/1/2023, 7:45:04 AM
-
TRACE
ScaleOptions { "ScaleMetricsMaxAge": "00:02:00", "ScaleMetricsSampleInterval": "00:00:10", "MetricsPurgeEnabled": true, "IsTargetScalingEnabled": true, "IsRuntimeScalingEnabled": false }
Severity level: Information
6/1/2023, 7:45:04 AM
-
TRACE
Starting JobHost
Severity level: Information
6/1/2023, 7:45:04 AM
-
TRACE
Starting Host (HostId=******, InstanceId=8f937b40-6d44-4919-******-ec66a372643a, Version=4.21.2.20699, ProcessId=5844, AppDomainId=1, InDebugMode=False, InDiagnosticMode=False, FunctionsExtensionVersion=~4)
Severity level: Information
6/1/2023, 7:45:04 AM
-
TRACE
Generating 1 job function(s)
Severity level: Information
6/1/2023, 7:45:04 AM
-
TRACE
Found the following functions: Host.Functions.******
Severity level: Information
6/1/2023, 7:45:04 AM
-
TRACE
Initializing function HTTP routes
Mapped function route 'api/******' [get,post] to '******'
Severity level: Information
6/1/2023, 7:45:04 AM
-
TRACE
Host initialized (12ms)
Severity level: Information
6/1/2023, 7:45:04 AM
-
TRACE
Host started (12ms)
Severity level: Information
6/1/2023, 7:45:04 AM
-
TRACE
Job host started
Severity level: Information
6/1/2023, 7:45:04 AM
-
TRACE
HttpOptions { "DynamicThrottlesEnabled": true, "EnableChunkedRequestBinding": false, "MaxConcurrentRequests": 100, "MaxOutstandingRequests": 200, "RoutePrefix": "api" }
Severity level: Information
6/1/2023, 7:45:04 AM
-
TRACE
Executing 'Functions.******' (Reason='This function was programmatically called via the host APIs.', Id=1fd81ebd-e455-4a79-a9e8-5a02972a124d)
Severity level: Information
6/1/2023, 7:45:06 AM
-
TRACE
Worker process started and initialized.
Severity level: Information
6/1/2023, 7:45:06 AM
-
EXCEPTION
Exception while executing function: Functions.******
Problem Id: System.Threading.Tasks.TaskCanceledException at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess
6/1/2023, 7:45:06 AM
-
TRACE
Executed 'Functions.******' (Failed, Id=1fd81ebd-e455-4a79-a9e8-5a02972a124d, Duration=2825ms)
Severity level: Error
6/1/2023, 7:45:06 AM
-
EXCEPTION
Exception while executing function: Functions.******
Problem Id: System.Threading.Tasks.TaskCanceledException at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess
6/1/2023, 7:45:09 AM
-
TRACE
Host lock lease acquired by instance ID '3caa8b34bd20508b527345686484c415'.
Severity level: Information
martingottlander commented 1 year ago

We got this problem as well, specifically: "The invocation request with id '' was canceled before the request was sent to the worker."

AGDoesCode commented 1 year ago

In my case

We got this problem as well, specifically: "The invocation request with id '' was canceled before the request was sent to the worker."

If I call the function via Postman, everything works fine, but it fails if a third party calls the endpoint. I do not have any IP restrictions. Is your case similar?

torepaulsson commented 1 year ago

We also get this error. Could it be happening when the function host scales down? We've seen a host get started and then immediately canceled later on. It is annoying because it triggers error logs that we have alert rules on. It would be great if this exception were handled in the worker and a 503 returned to the caller so a retry could be performed.
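In the spirit of that suggestion, callers can apply a retry on their side today. A hedged caller-side sketch; the class name and retry policy are illustrative and not part of the Functions runtime or this repository:

using System;
using System.Net;
using System.Net.Http;
using System.Threading.Tasks;

// Hypothetical caller-side helper: retry when the function app returns 503 or the
// request times out (which HttpClient surfaces as TaskCanceledException) during a cold start.
public static class ColdStartRetry
{
    public static async Task<HttpResponseMessage> GetWithRetryAsync(
        HttpClient client, string url, int maxAttempts = 3)
    {
        for (int attempt = 1; ; attempt++)
        {
            try
            {
                HttpResponseMessage response = await client.GetAsync(url);

                // Return on success, on a non-transient status, or once retries are exhausted.
                if (response.StatusCode != HttpStatusCode.ServiceUnavailable || attempt == maxAttempts)
                {
                    return response;
                }
            }
            catch (TaskCanceledException) when (attempt < maxAttempts)
            {
                // Client-side timeout; fall through and retry.
            }

            await Task.Delay(TimeSpan.FromSeconds(2 * attempt)); // simple linear backoff
        }
    }
}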

AGDoesCode commented 1 year ago

It turned out to be a bug introduced in the latest version of the Azure Functions runtime, released at the end of May. Pinning the Functions runtime version to 4.17.3 fixes the issue. https://learn.microsoft.com/en-us/azure/azure-functions/set-runtime-version?tabs=portal

Puzjak commented 1 year ago

Hi. We are also getting this error on both HTTP- and Service Bus-triggered functions. When it happens on a Service Bus-triggered function, the message ends up in the dead-letter queue (we use a max retry count of 1). It happens at random intervals and is not predictable at all. We tried setting the Functions runtime version to 4.17.3, but it did not help.

AGDoesCode commented 1 year ago

The issue is back, and Functions runtime 4.17.3 is no longer available.

vladkasianenko commented 1 year ago

The same for me. Locally everything works fine with func start, but once deployed to Azure, I get this error.

jon-mey commented 1 year ago

Seeing this as well. Can it happen if the client has a short timeout and cancels the request during cold start?

fabricciotc commented 1 year ago

The same for me. I'm getting a lot of these messages in App Insights in Azure.

SergiyKostenko commented 1 year ago

Same for me, any idea when it will be fixed?

fabricciotc commented 1 year ago

Hi gentlemen. My team has identified a potential cause for the error you're experiencing. It might be related to a hostId collision.

In our findings, we noticed that while Azure documentation suggests a hostId length of 32 characters or less, using the maximum 32 characters led to an error. To rectify this, in your Azure Function app settings within the Azure Portal, please set a custom hostId. You can do this by adding the property AzureFunctionsWebHost__hostid and ensuring the value's length is less than 32 characters. In our case, using a value with 24 characters resolved the issue.

https://github.com/Azure/azure-functions-host/wiki/Host-IDs#host-id-override https://learn.microsoft.com/en-us/azure/azure-functions/errors-diagnostics/diagnostic-events/azfd0004

Petit-Pas commented 1 year ago

Facing the same issue on an isolated function app (mcr.microsoft.com/azure-functions/dotnet-isolated:4-dotnet-isolated7.0). The hostId trick from @fabricciotc did not change anything for me :(

fabricciotc commented 1 year ago

@Petit-Pas You can also try defining FUNCTIONS_EXTENSION_VERSION in the app settings within the Azure Portal. You should set it to dotnet-isolated.

Eniemela commented 1 year ago

Facing the same issue. When using .NET 7 isolated, the first request on a cold start hangs and throws, while subsequent requests go fine. With quick testing of a dummy application, it seems that when using

using Microsoft.Extensions.Hosting;

var host = new HostBuilder()
    .ConfigureFunctionsWorkerDefaults()
    .Build();

host.Run();

in Program.cs, the error does not appear; however, when changing to the ASP.NET Core integration with

using Microsoft.Extensions.Hosting;
using Microsoft.Azure.Functions.Worker;

var host = new HostBuilder()
    .ConfigureFunctionsWebApplication()
    .Build();

host.Run();

as recommended here, the issue appears.

Holler-Jason commented 1 year ago

We're seeing the exact same issue @Eniemela

Hardt-Coded commented 11 months ago

Facing the same issue. When using .NET 7 isolated, the first request on a cold start hangs and throws, while subsequent requests go fine. With quick testing of a dummy application, it seems that when using

using Microsoft.Extensions.Hosting;

var host = new HostBuilder()
    .ConfigureFunctionsWorkerDefaults()
    .Build();

host.Run();

in Program.cs, the error does not appear; however, when changing to the ASP.NET Core integration with

using Microsoft.Extensions.Hosting;
using Microsoft.Azure.Functions.Worker;

var host = new HostBuilder()
    .ConfigureFunctionsWebApplication()
    .Build();

host.Run();

as recommended here, the issue appears.

I use the old version with ConfigureFunctionsWorkerDefaults, and the error also occurs on my end.

brettsam commented 9 months ago

This error typically happens when an HttpClient has a timeout faster than the time it takes to dispatch the request to the worker. There has been a new configuration setting added recently (not yet in production) that should give more control over these scenarios -- https://github.com/Azure/azure-functions-host/pull/9523.

I'm going to move this to our next Sprint so someone can have a look and make sure the scenarios are covered.
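To illustrate the caller-side behavior described above (a hedged sketch; HttpClient's 100-second default timeout is documented, and the extended value below is only an example):

using System;
using System.Net.Http;

// HttpClient.Timeout defaults to 100 seconds. A caller that lowers it below the time
// the platform needs to cold-start and dispatch the request can cancel the invocation
// before it is sent to the worker, producing the error discussed in this issue.
var client = new HttpClient
{
    Timeout = TimeSpan.FromMinutes(3)
};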

7rakir commented 9 months ago

This error typically happens when an HttpClient has a timeout faster than the time it takes to dispatch the request to the worker. There has been a new configuration setting added recently (not yet in production) that should give more control over these scenarios -- Azure/azure-functions-host#9523.

I'm going to move this to our next Sprint so someone can have a look and make sure the scenarios are covered.

Considering the comment by @Eniemela, I don't see why this should be the case, given that it works with ConfigureFunctionsWorkerDefaults but does not work with ConfigureFunctionsWebApplication.

steveland83 commented 9 months ago

This error typically happens when an HttpClient has a timeout faster than the time it takes to dispatch the request to the worker. There has been a new configuration setting added recently (not yet in production) that should give more control over these scenarios -- Azure/azure-functions-host#9523.

I'm going to move this to our next Sprint so someone can have a look and make sure the scenarios are covered.

@brettsam @fabiocav While your summary above is likely correct on the surface, I think there may be more to this, particularly with respect to why the requests are not being served before the client times out. We've done some investigation into this, as we've seen these exceptions for quite some time and across a varied group of function apps (all Linux, running various .NET versions).

Our findings:

We presume that there may be various potential root causes, but based on the above we wondered whether there may be some degree of problematic load balancing, where nodes are being added prematurely/removed from rotation too late.

We would be happy to provide corroborating logs for your investigation if that would be helpful.

fabiocav commented 8 months ago

The fix for this is being deployed (in host version 4.30). We'll share guidance here once the changes are out.

fabiocav commented 7 months ago

4.30 has been globally deployed. We'll share a link to the documentation covering the new configuration options for this once that is available.

Adrian-Wennberg commented 7 months ago

We are currently facing what seems to be the same issue with a lot of our functions. We have the same situation where the first call of a function does not respond at all, but we are not getting exactly the same error message as above. One of our function apps has several functions that need to be called once a day, and the first function called always fails after 59.9 seconds with the following error. After that failure, the function app works as expected. This seems to happen only when the function is cold starting, so it only affects function apps that are used infrequently and function apps in our test and dev environments.

Exception while executing function: ... Result: Failure
Exception: System.Threading.Tasks.TaskCanceledException: A task was canceled.
   at Microsoft.Azure.Functions.Worker.Extensions.Http.AspNetCore.DefaultHttpCoordinator.SetFunctionContextAsync(String invocationId, FunctionContext context) in D:\a\_work\1\s\extensions\Worker.Extensions.Http.AspNetCore\src\Coordinator\DefaultHttpCoordinator.cs:line 35
   at Microsoft.Azure.Functions.Worker.Extensions.Http.AspNetCore.FunctionsHttpProxyingMiddleware.Invoke(FunctionContext context, FunctionExecutionDelegate next) in D:\a\_work\1\s\extensions\Worker.Extensions.Http.AspNetCore\src\FunctionsMiddleware\FunctionsHttpProxyingMiddleware.cs:line 41
   at Microsoft.Azure.Functions.Worker.FunctionsApplication.InvokeFunctionAsync(FunctionContext context) in D:\a\_work\1\s\src\DotNetWorker.Core\FunctionsApplication.cs:line 77
   at Microsoft.Azure.Functions.Worker.Handlers.InvocationHandler.InvokeAsync(InvocationRequest request) in D:\a\_work\1\s\src\DotNetWorker.Grpc\Handlers\InvocationHandler.cs:line 88
Stack:    at Microsoft.Azure.Functions.Worker.Extensions.Http.AspNetCore.DefaultHttpCoordinator.SetFunctionContextAsync(String invocationId, FunctionContext context) in D:\a\_work\1\s\extensions\Worker.Extensions.Http.AspNetCore\src\Coordinator\DefaultHttpCoordinator.cs:line 35
   at Microsoft.Azure.Functions.Worker.Extensions.Http.AspNetCore.FunctionsHttpProxyingMiddleware.Invoke(FunctionContext context, FunctionExecutionDelegate next) in D:\a\_work\1\s\extensions\Worker.Extensions.Http.AspNetCore\src\FunctionsMiddleware\FunctionsHttpProxyingMiddleware.cs:line 41
   at Microsoft.Azure.Functions.Worker.FunctionsApplication.InvokeFunctionAsync(FunctionContext context) in D:\a\_work\1\s\src\DotNetWorker.Core\FunctionsApplication.cs:line 77
   at Microsoft.Azure.Functions.Worker.Handlers.InvocationHandler.InvokeAsync(InvocationRequest request) in D:\a\_work\1\s\src\DotNetWorker.Grpc\Handlers\InvocationHandler.cs:line 88 

This is a new issue that started occurring after updating our functions from in-process to isolated, and from .NET 6 to .NET 8. We have tested creating a new function app with a simple dummy function, and that does not cause the same issue. We are currently testing to figure out what part of our project causes this issue to appear.

innominate227 commented 7 months ago

We are also having the issue described by @Adrian-Wennberg: the first call of the day fails after a timeout. The invocation details show the same exception and stack trace posted above. The very first line of the function's user code writes a log message, and that log message never gets written. The function is isolated .NET 6.

Invocation details for the first call of the day, which times out: (screenshot)

Invocation details for a call that works: (screenshot)

liliankasem commented 6 months ago

@Adrian-Wennberg and @innominate227 - the issue you're describing looks to be different to what this issue was opened for - can you please open a new issue for what you're describing? Thanks!

rhythmnewt commented 6 months ago

One more with the same issue as @Adrian-Wennberg, same upgrade path from in-process .NET 6 to isolated .NET 8. The first cold-start HTTP trigger request times out.

vmartinlantek commented 6 months ago

We are getting the exact same behaviour in our HTTP-triggered function (Consumption plan) after we upgraded it to .NET 8 isolated. The first request after the host has been drained times out; subsequent requests work fine until the host is drained again. When can we expect a fix for this?

efernandesvieira commented 6 months ago

I'm experiencing exactly the same thing here.

Just let me share that when using the function app's linux-x64 runtime and setting

<PublishReadyToRun>true</PublishReadyToRun>
<RuntimeIdentifier>linux-x64</RuntimeIdentifier>

in the project file, the issue appears to disappear. Anyway, I'm looking forward to having a solution for the Windows runtime.

vmartinlantek commented 6 months ago

OMG @efernandesvieira your fix worked for me! The first request is now running correctly. I can tell that it is working by looking at the log stream, but also because it is significantly slower than subsequent requests. Thanks a lot for sharing!!

erkenP commented 6 months ago

We are having the same issue as described by @Adrian-Wennberg and @innominate227. The function is isolated .NET 8, and the first call returns the same exception as described before, while subsequent calls work as expected.

@liliankasem Has there been any new updates about this issue?

filip-be commented 6 months ago

Hi! We are also experiencing the same issue on the isolated model with .NET 8. The function is on a Consumption plan and not used very much; about one request comes in every 10 minutes, and it sometimes fails during a cold start.

We have tried running the function on Windows and Linux, using the ASP.NET Core integration or just the pure function worker, and we are facing timeouts in each of these cases.

Here are some logs from the Azure Function running on Windows with the ASP.NET Core integration:

By the way, this issue was also raised on Microsoft Learn: https://learn.microsoft.com/en-us/answers/questions/1609713/azure-function-fails-with-functioninvocationexcept?page=1&orderby=helpful

vmartinlantek commented 6 months ago

@filip-be @erkenP I was also experiencing the same problem and the workaround proposed by @efernandesvieira worked for me.

areriklq commented 5 months ago

I just deployed a fresh .NET Core 8 isolated function on Linux and I am seeing this as well. Consumption plan.

Exception while executing function: Functions.GetImportantMessages Result: Failure
Exception: System.Threading.Tasks.TaskCanceledException: A task was canceled.
   at Microsoft.Azure.Functions.Worker.Extensions.Http.AspNetCore.DefaultHttpCoordinator.SetFunctionContextAsync(String invocationId, FunctionContext context) in D:\a\_work\1\s\extensions\Worker.Extensions.Http.AspNetCore\src\Coordinator\DefaultHttpCoordinator.cs:line 35

Are there any working remedies or a long term fix in progress?

efernandesvieira commented 5 months ago

I just deployed a fresh .NET Core 8 isolated function on Linux and I am seeing this as well. Consumption plan.

Exception while executing function: Functions.GetImportantMessages Result: Failure
Exception: System.Threading.Tasks.TaskCanceledException: A task was canceled.
   at Microsoft.Azure.Functions.Worker.Extensions.Http.AspNetCore.DefaultHttpCoordinator.SetFunctionContextAsync(String invocationId, FunctionContext context) in D:\a\_work\1\s\extensions\Worker.Extensions.Http.AspNetCore\src\Coordinator\DefaultHttpCoordinator.cs:line 35

Are there any working remedies or a long term fix in progress?

@areriklq After many tests, I never saw that error on the Linux runtime. It was really easy to reproduce on the Windows runtime. Maybe in your case the task is really being canceled due to some channel being closed or similar? Do you have any other errors?

areriklq commented 5 months ago

I just deployed a fresh .NET Core 8 isolated function on Linux and I am seeing this as well. Consumption plan.

Exception while executing function: Functions.GetImportantMessages Result: Failure
Exception: System.Threading.Tasks.TaskCanceledException: A task was canceled.
   at Microsoft.Azure.Functions.Worker.Extensions.Http.AspNetCore.DefaultHttpCoordinator.SetFunctionContextAsync(String invocationId, FunctionContext context) in D:\a\_work\1\s\extensions\Worker.Extensions.Http.AspNetCore\src\Coordinator\DefaultHttpCoordinator.cs:line 35

Are there any working remedies or a long term fix in progress?

@areriklq After many tests, I never saw that error on the Linux runtime. It was really easy to reproduce on the Windows runtime. Maybe in your case the task is really being canceled due to some channel being closed or similar? Do you have any other errors?

No, subsequent calls work. The function is a wrapper API for a Cosmos DB database. The function method is an HTTP trigger for a GET endpoint, so it's as simple as it gets.

efernandesvieira commented 5 months ago

@areriklq Ok. Are you using a 64-bit environment?

areriklq commented 5 months ago

@areriklq Ok. Are you using a 64-bit environment?

@efernandesvieira I am using the Consumption plan; I don't think 64-bit Linux is available? However, this is the site config:

          "linuxFxVersion": "DOTNET-ISOLATED|8.0",
          "use32BitWorkerProcess": null,
efernandesvieira commented 5 months ago

I'm using the Consumption plan (Y1) with the Linux 64-bit runtime.

areriklq commented 5 months ago

<RuntimeIdentifier>linux-x64</RuntimeIdentifier>

@efernandesvieira According to the docs there is nothing like a Linux 32bit RID, so I guess I run 64bit already?

And PublishReadyToRun doesn't exist for .NET Core 8

efernandesvieira commented 5 months ago

@areriklq I have this in the project:

<PublishReadyToRun>true</PublishReadyToRun>
<RuntimeIdentifier>linux-x64</RuntimeIdentifier>

And in the deployment:

vmartinlantek commented 5 months ago

<RuntimeIdentifier>linux-x64</RuntimeIdentifier>

@efernandesvieira According to the docs there is nothing like a Linux 32bit RID, so I guess I run 64bit already?

And PublishReadyToRun doesn't exist for .NET Core 8

PublishReadyToRun is documented here

efernandesvieira commented 5 months ago

<RuntimeIdentifier>linux-x64</RuntimeIdentifier>

@efernandesvieira According to the docs there is nothing like a Linux 32bit RID, so I guess I run 64bit already?

And PublishReadyToRun doesn't exist for .NET Core 8

Yep, supported Linux is 64-bit already.

areriklq commented 5 months ago

<RuntimeIdentifier>linux-x64</RuntimeIdentifier>

@efernandesvieira According to the docs there is nothing like a Linux 32bit RID, so I guess I run 64bit already? And PublishReadyToRun doesn't exist for .NET Core 8

PublishReadyToRun is documented here

Thanks for the reference and trying to help out. I will make sure to test this property and report back!

marcroussy commented 5 months ago

I've had the same issue as mentioned above by everyone else, with a project that was upgraded from .NET 6 in-process to .NET 8 isolated.

I've found the source of the problem in our case. The Functions project references a custom NuGet package (which references a few ASP.NET Core assemblies), and if I remove that custom NuGet package from the Functions project, I no longer get the timeout on cold start.

The NuGet package that causes the issue has references to several ASP.NET Core assemblies.

I hope that can help someone at Microsoft figure out the issue, because ideally I would like to be able to continue using our NuGet package for our custom logger.

filip-be commented 5 months ago

I can confirm @marcroussy's finding! We have removed all references to ASP.NET Core from our code, and we are no longer seeing canceled executions during app cold start.

So it looks like the issue is tied to the ASP.NET Core libraries. @liliankasem, could you address this issue? Should we create another issue for that?

ben-graffle commented 5 months ago

Removing references to the packages mentioned by @marcroussy has resolved this for me as well. .NET 8 isolated function apps running on a Windows Consumption plan.

liliankasem commented 5 months ago

I can confirm @marcroussy's finding! We have removed all references to ASP.NET Core from our code, and we are no longer seeing canceled executions during app cold start.

So it looks like the issue is tied to the ASP.NET Core libraries. @liliankasem, could you address this issue? Should we create another issue for that?

Please create a new issue for this - thank you!

areriklq commented 5 months ago

I just wanted to chime in here that my issues completely disappeared starting May 17, 18:00 CET (region North Europe). I have done no redeploy of, or change to, my .NET 8 function. My best guess is that the issues I had were due to problems in the underlying Azure infrastructure, and the maintenance window in May (BS08-PPG) resolved them.

efernandesvieira commented 5 months ago

@areriklq I'm aligned with that. I'm trying my original project configuration, where I could reproduce the issue very easily, and now I just can't get the original error, without any tuning or package removal.

adreed-msft commented 4 months ago

Entirely removing ASP.NET from our project seems to resolve this for us. Without that, the function never begins running in the first place.

marcroussy commented 4 months ago

@adreed-msft Any chance you could open a new issue for it as requested by @liliankasem? I've been meaning to do it but I'm a bit swamped at the moment.