Azure / azure-functions-dotnet-worker

Azure Functions out-of-process .NET language worker

Multiple Issues of FunctionTimeOut and ServiceBusMessages not Consuming in dotnet isolated ServiceBusTriggered function -- Urgent #2169

Closed: inishantmishra closed this issue 6 months ago

inishantmishra commented 8 months ago

Hi,

I have a function app with a Service Bus triggered function.

The Service Bus triggered function sometimes gets stuck and stops consuming messages from a session-based queue. When I check Application Insights I can't see any of my own logs, but I do see a FunctionTimeoutException together with "Language worker process restart" or "Did not find any initialized language workers".

The function doesn't run, and in Application Insights I can't even see complete logs explaining why it timed out; it seems my code is never actually invoked. In my code, the first line logs a trace saying "Message received" along with the message itself, and when it completes or throws it logs another trace saying it completed or threw an exception. None of these traces appear. All I see is "Severity level: Information, Message: Executing 'Functions.MyServiceTriggered'" and nothing else, and then it times out after 30 minutes, which is the default since we run on a Dedicated plan.

Please help me figure out what to do. It seems like every day I have to come online because some messages get stuck in the active queue and aren't consumed, so I either have to restart the whole function app or disable/enable the function to get the messages consumed.
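For reference, the function follows this shape (a minimal sketch only; the queue name, connection setting name, and processing call are placeholders rather than the actual code):

```csharp
using System;
using System.Threading.Tasks;
using Microsoft.Azure.Functions.Worker;
using Microsoft.Extensions.Logging;

public class MyServiceTriggered
{
    private readonly ILogger<MyServiceTriggered> _logger;

    public MyServiceTriggered(ILogger<MyServiceTriggered> logger) => _logger = logger;

    [Function("MyServiceTriggered")]
    public async Task Run(
        [ServiceBusTrigger("my-session-queue", Connection = "ServiceBusConnection", IsSessionsEnabled = true)]
        string message)
    {
        // First line of the function: this trace never appears when the hang occurs.
        _logger.LogInformation("Message received: {Message}", message);

        try
        {
            await ProcessAsync(message); // placeholder for the real processing logic
            _logger.LogInformation("Message processing completed.");
        }
        catch (Exception ex)
        {
            _logger.LogError(ex, "Message processing threw an exception.");
            throw;
        }
    }

    private static Task ProcessAsync(string message) => Task.CompletedTask;
}
```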

Exception examples:

```
Full Exception : Microsoft.Azure.WebJobs.Host.FunctionTimeoutException /Timeout value of 00:30:00 was exceeded by function /Functions.MyFunction
async Microsoft.Azure.WebJobs.Host.Executors.FunctionExecutor.TryHandleTimeoutAsync(Task invokeTask,CancellationToken shutdownToken,Boolean throwOnTimeout,CancellationToken timeoutToken,TimeSpan timeoutInterval,IFunctionInstance instance,Action onTimeout) D:\a_work\1\s\src\Microsoft.Azure.WebJobs.Host\Executors\FunctionExecutor.cs /663
System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw()
async Microsoft.Azure.WebJobs.Host.Executors.FunctionExecutor.InvokeWithTimeoutAsync(IFunctionInvoker invoker,ParameterHelper parameterHelper,CancellationTokenSource timeoutTokenSource,CancellationTokenSource functionCancellationTokenSource,Boolean throwOnTimeout,TimeSpan timerInterval,IFunctionInstance instance) D:\a_work\1\s\src\Microsoft.Azure.WebJobs.Host\Executors\FunctionExecutor.cs /571
System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw()
async Microsoft.Azure.WebJobs.Host.Executors.FunctionExecutor.ExecuteWithWatchersAsync(IFunctionInstanceEx instance,ParameterHelper parameterHelper,ILogger logger,CancellationTokenSource functionCancellationTokenSource) D:\a_work\1\s\src\Microsoft.Azure.WebJobs.Host\Executors\FunctionExecutor.cs /527
System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw()
async Microsoft.Azure.WebJobs.Host.Executors.FunctionExecutor.ExecuteWithLoggingAsync(IFunctionInstanceEx instance,FunctionStartedMessage message,FunctionInstanceLogEntry instanceLogEntry,ParameterHelper parameterHelper,ILogger logger,CancellationToken cancellationToken) D:\a_work\1\s\src\Microsoft.Azure.WebJobs.Host\Executors\FunctionExecutor.cs /306
System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw()
async Microsoft.Azure.WebJobs.Host.Executors.FunctionExecutor.ExecuteWithLoggingAsync(IFunctionInstanceEx instance,FunctionStartedMessage message,FunctionInstanceLogEntry instanceLogEntry,ParameterHelper parameterHelper,ILogger logger,CancellationToken cancellationToken) D:\a_work\1\s\src\Microsoft.Azure.WebJobs.Host\Executors\FunctionExecutor.cs /352
System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw()
async Microsoft.Azure.WebJobs.Host.Executors.FunctionExecutor.TryExecuteAsync(IFunctionInstance functionInstance,CancellationToken cancellationToken) D:\a_work\1\s\src\Microsoft.Azure.WebJobs.Host\Executors\FunctionExecutor.cs /108
```

Another exception:

```
Full Exception : Exception while executing function /Functions.MyFunction ---> System.InvalidOperationException /Did not find any initialized language workers
Microsoft.Azure.WebJobs.Script.Workers.Rpc.RpcFunctionInvocationDispatcherLoadBalancer.GetLanguageWorkerChannel(IEnumerable`1 rpcWorkerChannels) /_/src/WebJobs.Script/Workers/Rpc/FunctionRegistration/RpcFunctionInvocationDispatcherLoadBalancer.cs /27
async Microsoft.Azure.WebJobs.Script.Workers.Rpc.RpcFunctionInvocationDispatcher.InvokeAsync(ScriptInvocationContext invocationContext) /_/src/WebJobs.Script/Workers/Rpc/FunctionRegistration/RpcFunctionInvocationDispatcher.cs /400
System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw()
async Microsoft.Azure.WebJobs.Script.Description.WorkerFunctionInvoker.InvokeCore(Object[] parameters,FunctionInvocationContext context) /_/src/WebJobs.Script/Description/Workers/WorkerFunctionInvoker.cs /97
System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw()
async Microsoft.Azure.WebJobs.Script.Description.FunctionInvokerBase.Invoke(Object[] parameters) /_/src/WebJobs.Script/Description/FunctionInvokerBase.cs /82
System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw()
async Microsoft.Azure.WebJobs.Host.Executors.VoidTaskMethodInvoker`2.InvokeAsync[TReflected,TReturnType](TReflected instance,Object[] arguments) D:\a_work\1\s\src\Microsoft.Azure.WebJobs.Host\Executors\VoidTaskMethodInvoker.cs /20
System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw()
async Microsoft.Azure.WebJobs.Host.Executors.FunctionInvoker`2.InvokeAsync[TReflected,TReturnValue](Object instance,Object[] arguments)
System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw()
async Microsoft.Azure.WebJobs.Host.Executors.FunctionExecutor.InvokeWithTimeoutAsync(IFunctionInvoker invoker,ParameterHelper parameterHelper,CancellationTokenSource timeoutTokenSource,CancellationTokenSource functionCancellationTokenSource,Boolean throwOnTimeout,TimeSpan timerInterval,IFunctionInstance instance) D:\a_work\1\s\src\Microsoft.Azure.WebJobs.Host\Executors\FunctionExecutor.cs /581
System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw()
async Microsoft.Azure.WebJobs.Host.Executors.FunctionExecutor.ExecuteWithWatchersAsync(IFunctionInstanceEx instance,ParameterHelper parameterHelper,ILogger logger,CancellationTokenSource functionCancellationTokenSource) D:\a_work\1\s\src\Microsoft.Azure.WebJobs.Host\Executors\FunctionExecutor.cs /527
System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw()
async Microsoft.Azure.WebJobs.Host.Executors.FunctionExecutor.ExecuteWithLoggingAsync(IFunctionInstanceEx instance,FunctionStartedMessage message,FunctionInstanceLogEntry instanceLogEntry,ParameterHelper parameterHelper,ILogger logger,CancellationToken cancellationToken) D:\a_work\1\s\src\Microsoft.Azure.WebJobs.Host\Executors\FunctionExecutor.cs /306
End of inner exception
Microsoft.Azure.WebJobs.ServiceBus.SessionMessageProcessor.CompleteProcessingMessageAsync(ServiceBusSessionMessageActions actions,ServiceBusReceivedMessage message,FunctionResult result,CancellationToken cancellationToken)
async Microsoft.Azure.WebJobs.ServiceBus.Listeners.ServiceBusListener.ProcessSessionMessageAsync(ProcessSessionMessageEventArgs args)
System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw()
async Azure.Messaging.ServiceBus.ServiceBusProcessor.OnProcessSessionMessageAsync(ProcessSessionMessageEventArgs args)
System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw()
async Azure.Messaging.ServiceBus.ServiceBusSessionProcessor.OnProcessSessionMessageAsync(ProcessSessionMessageEventArgs args)
System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw()
async Azure.Messaging.ServiceBus.SessionReceiverManager.OnMessageHandler(EventArgs args)
System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw()
async Azure.Messaging.ServiceBus.ReceiverManager.ProcessOneMessage(ServiceBusReceivedMessage triggerMessage,CancellationToken cancellationToken)
```

Packages added in the project:

```xml
<PackageReference Include="Microsoft.ApplicationInsights.WorkerService" Version="2.21.0" />
<PackageReference Include="Microsoft.Azure.Functions.Worker.ApplicationInsights" Version="1.0.0" />
<PackageReference Include="Microsoft.Azure.Functions.Worker" Version="1.19.0" />
<PackageReference Include="Microsoft.Azure.Functions.Worker.Extensions.ServiceBus" Version="5.11.0" />
<PackageReference Include="Microsoft.Azure.Functions.Worker.Sdk" Version="1.15.1" />
<PackageReference Include="Newtonsoft.Json" Version="13.0.1" />
```

We are using a .NET 7 isolated function. We are on a Dedicated App Service plan with Always On. Please find the link for the minimal repro: Link for Repo

host.json is:

```json
{
  "version": "2.0",
  "logging": {
    "applicationInsights": {
      "samplingSettings": {
        "isEnabled": true,
        "excludedTypes": "Request"
      },
      "enableLiveMetricsFilters": true
    }
  },
  "extensions": {
    "serviceBus": {
      "sessionIdleTimeout": "00:00:01",
      "maxAutoLockRenewalDuration": "01:05:00"
    }
  },
  "functionTimeout": "01:05:00"
}
```

Please help me with what could be the issue. We are facing this repeatedly.

@brettsam @kshyju @satvu @fabiocav

inishantmishra commented 8 months ago

Can I get anyone's attention here? I also upgraded the packages for the Service Bus triggered function but still faced the issue.

kshyju commented 8 months ago

@inishantmishra Is this a dupe of your other issue #2168 ?

inishantmishra commented 8 months ago

Hi @kshyju

No, this is not a duplicate. This one is a different issue: here we are seeing the Service Bus trigger getting stuck intermittently and messages not being consumed from the Service Bus queue. And in these scenarios, we don't see any of our code traces saying the message was received, the operation started, etc.

See the example of the log we get when it gets stuck (screenshot attached).

After restarting the function app or disabling/enabling the function, it starts executing smoothly and there is no issue with the messages it processes after that.

Matthewsre commented 8 months ago

I have been having similar issues with my ServiceBusTrigger functions not running and messages just sitting on the queue. We kept restarting the function when this happened, which seemed to solve the issue, but then I found, buried somewhere in the documentation, that sometimes you have to "view" the Function App in the Azure Portal to refresh the functions schema. Now, when it is stuck with messages on the queue not being processed, I just load up the Function App page in the Azure Portal, and sure enough, the mere act of viewing the function makes it start processing the Service Bus messages. I am also not getting any valuable logging when this scenario occurs.

This started happening a lot more recently, especially after we perform new releases, and I suspect it started happening more after updating the azure function worker/SDK. We stayed on Microsoft.Azure.Functions.Worker 1.10.0 for a long time because we had issues upgrading to anything else and always ended up rolling back to this version. We finally upgraded to 1.19.0 (and all related Microsoft.Azure.Functions.Worker.* packages to latest) in August 2023 to try and get better logging and upgraded to 1.20.0 in November 2023 to use the new bindings we had been waiting for (ServiceBusMessageActions).

This issue never happened when we were on 1.10.0, but we are no longer in a position to roll back and have a terrible workaround of "viewing" the Azure Function in the portal when this happens. We have considered having something load this page regularly (hourly?) to make sure our ServiceBusTriggers are working, but we are hoping the root issue ends up getting resolved.

My issue might be different, but I wanted to share my context in case it helps.

inishantmishra commented 8 months ago

@Matthewsre Your issue is also similar to mine.

bvmeer commented 8 months ago

@Matthewsre @inishantmishra

We have had the same issue for a long time. We have a .NET 7 isolated function app which stops consuming messages from a session-enabled Service Bus queue. However, after a couple of hours the consumption resumes, probably because an HTTP trigger in the same function app fires (but we are not sure this is the real reason the consumption resumes).

After a lot of trying, we isolated one NuGet update as the cause of this issue: Microsoft.Azure.Functions.Worker.Sdk. When we upgrade it from 1.15.1 to 1.16.4 the issue occurs; when we roll back to SDK 1.15.1 the issue disappears.

We are still working with Azure support to find the root cause, but downgrading only the SDK helped for us. Maybe it can help you too. The other Functions.Worker packages are on the latest release versions.

inishantmishra commented 8 months ago

It seems like this issue is faced by multiple people.

SymbioticKilla commented 8 months ago

Same here

inishantmishra commented 8 months ago

It seems like multiple people are facing this. If there is an issue with a specific version, please let us know which versions we should use for .NET 7 and .NET 8.

kshyju commented 8 months ago

> I have been having similar issues with my ServiceBusTrigger functions not running and messages just sitting on the queue. [...] My issue might be different, but I wanted to share my context in case it helps.

@Matthewsre The information shared in the GH issue comment may not be sufficient to understand the issue with your application. A more in-depth investigation is necessary to pinpoint the cause of this unexpected behavior.

We're here to assist you in resolving this issue. To streamline the investigation process, please open a support ticket. We look forward to working with you to address this matter promptly.

kshyju commented 8 months ago

@inishantmishra The information shared in the GH issue may not be sufficient to understand the issue with your app. A more in-depth investigation is necessary to pinpoint the cause of this unexpected behavior. Could you please open a support ticket?

inishantmishra commented 8 months ago

@kshyju It's been more than a month and I am not getting any luck from anywhere, not even from support. Please help.

hentie commented 8 months ago

We also ran into the exact same issue. We created a new .NET 8 isolated function with ServiceBusTriggers:

  • Microsoft.Azure.Functions.Worker.Sdk 1.16.4
  • Microsoft.Azure.Functions.Worker.Extensions.ServiceBus 5.15.0

The ServiceBusTriggers stop triggering on new messages when the function has been idle for 15 to 30 minutes. After downgrading Microsoft.Azure.Functions.Worker.Sdk to version 1.15.1 (as mentioned above), the ServiceBusTriggers keep working as expected.

kshyju commented 8 months ago

> We also ran into the exact same issue. We created a new .NET 8 isolated function with ServiceBusTriggers. [...] After downgrading Microsoft.Azure.Functions.Worker.Sdk to version 1.15.1 (as mentioned above), the ServiceBusTriggers keep working as expected.

@hentie What SKU are you deploying your apps to? Do you see any interesting log messages in your Application Insights logs during the time when the trigger is not working?

hentie commented 8 months ago

> What SKU are you deploying your apps to? Do you see any interesting log messages in your Application Insights logs during the time when the trigger is not working?

Hi @kshyju we are using a Consumption Plan (Y1) and nothing special in Application Insights other than it stops logging as well after about 15~30mins.

kshyju commented 8 months ago

@inishantmishra Can you share the app name, region, and a timestamp where you observed the failure, so that we can take a look?

Berthelmaster commented 8 months ago

@hentie We had similar problems over the last couple of days, however this was with Azure Queues using:

It seems to be running smoothly again this morning, so I'm very confused about what has changed 🤔

inishantmishra commented 8 months ago

Hi @kshyju, I can't share the app name and details publicly. Can I get your email ID so that I can add you to the support ticket email trail, or share the details with you there?

kshyju commented 8 months ago

@inishantmishra Does this work? Sharing Your Function App name privately

inishantmishra commented 8 months ago

Hi @kshyju, please find the details: Id=72ae2fa4-5f53-47be-bc51-0a336c556269, Central US, 2024-01-12 16:01:49.921

kshyju commented 8 months ago

> Hi @kshyju, please find the details: Id=72ae2fa4-5f53-47be-bc51-0a336c556269, Central US, 2024-01-12 16:01:49.921

I took a quick look at your app and can see that your function executions run into timeout issues; this means your function code is taking more than 30 minutes to complete. I recommend you review your function code to determine why it takes more than 30 minutes. If this execution time is expected, you may consider increasing the timeout value for your application (30 minutes is the default for the Premium plan). See https://learn.microsoft.com/en-us/azure/azure-functions/functions-scale#timeout
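For reference, the timeout is configured via `functionTimeout` in host.json; the value below is only an illustration, so pick whatever your workload actually needs:

```json
{
  "version": "2.0",
  "functionTimeout": "01:00:00"
}
```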

I also recommend exploring the "Diagnose and solve problems" blade in the Azure portal. That will allow you to see these kinds of errors yourself.

SeanFeldman commented 8 months ago

> I also recommend exploring the "Diagnose and solve problems" blade in the Azure portal. That will allow you to see these kinds of errors yourself.

That. It should be in the user's face when abnormal behavior is detected, IMO. Not everyone is aware of this blade, @kshyju.

inishantmishra commented 8 months ago

Hi @kshyju, whenever the function timeout happens, if you look at the logs from that time, it never even invokes our source code, because not a single log says the source code is running. It just gets stuck at "Executing" and then times out after the default 30 minutes. My source code does not run whenever these timeouts occur.

Please check and help.

savbace commented 7 months ago

I am experiencing the same issue. The function does not consume Service Bus messages; it seems to be sleeping. My setup:

After downgrading to Microsoft.Azure.Functions.Worker.Sdk 1.15.1 the issue disappears.

inishantmishra commented 7 months ago

Hi @kshyju, can you confirm whether there is any issue with the 1.20.0 + 1.16.4 versions of Functions.Worker and Functions.Worker.Sdk respectively? It seems the combination of 1.19.0 and 1.15.1 works for most users. Please confirm: shall we downgrade?

Bandgren commented 7 months ago

Hi, @inishantmishra, @kshyju

I gave some more details in https://github.com/Azure/azure-functions-dotnet-worker/issues/2202#issuecomment-1910900627; the issue was resolved by solely downgrading the worker SDK from 1.16.4 to 1.15.1. I'm still on 1.20.1 for the worker package itself.
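In csproj terms, the pin described here looks roughly like this (a sketch based only on the versions mentioned in this thread; keep your other package references as they are):

```xml
<ItemGroup>
  <!-- Worker package stays on the current stable release (1.20.1 at the time of this thread) -->
  <PackageReference Include="Microsoft.Azure.Functions.Worker" Version="1.20.1" />
  <!-- Only the Sdk package is held back to 1.15.1 as a workaround for the stalled Service Bus trigger -->
  <PackageReference Include="Microsoft.Azure.Functions.Worker.Sdk" Version="1.15.1" />
</ItemGroup>
```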

inishantmishra commented 7 months ago

Hi @Bandgren, @kshyju, I am still seeing the issue of the function becoming idle and not consuming messages from Service Bus (using a session-based queue). I am using the following packages (screenshot attached).

Also, see the queue's incoming and outgoing messages metric (screenshot attached): we can see there were no outgoing messages for at least 28 minutes.

kshyju commented 7 months ago

@inishantmishra Did this recommendation work in your case?

inishantmishra commented 7 months ago

@kshyju No, it did not work for me. Please see this comment of mine.

kshyju commented 7 months ago

@inishantmishra As I mentioned in one of my previous comments, with the limited information shared in the GH issue, it might be hard to understand what is causing your issue in prod. Hopefully you are working with the support team and making progress.

I want to share another update: we recently identified an issue with our scaling component which was caused by a glitch in a recent worker.sdk change (introduced in 1.16.x). A preview version of the worker.sdk package was released with this fix. Would you mind trying that out?

Try the latest preview version: https://www.nuget.org/packages/Microsoft.Azure.Functions.Worker.Sdk/1.17.0-preview3
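If it helps, trying the preview is just a matter of bumping the Sdk package reference in the project file, for example:

```xml
<PackageReference Include="Microsoft.Azure.Functions.Worker.Sdk" Version="1.17.0-preview3" />
```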

inishantmishra commented 7 months ago

Hi @kshyju, support is suggesting downgrading Functions.Worker.Sdk to 1.15.1 and Microsoft.Azure.Functions.Worker.Extensions.ServiceBus to 5.11.0.

Please advise: is it okay to use these package versions?

kshyju commented 7 months ago

> Hi @kshyju, support is suggesting downgrading Functions.Worker.Sdk to 1.15.1 and Microsoft.Azure.Functions.Worker.Extensions.ServiceBus to 5.11.0.
>
> Please advise: is it okay to use these package versions?

Yes, please follow the recommendations from the support team that is investigating your case.

inishantmishra commented 7 months ago

@kshyju If I upgrade my function to .NET 8, which package versions should I use for the following?

  • Microsoft.Azure.Functions.Worker
  • Microsoft.Azure.Functions.Worker.Sdk
  • Microsoft.Azure.Functions.Worker.Extensions.ServiceBus

kshyju commented 7 months ago

@inishantmishra

> Microsoft.Azure.Functions.Worker
> Microsoft.Azure.Functions.Worker.Sdk

Use the latest stable versions of these packages.

> Microsoft.Azure.Functions.Worker.Extensions.ServiceBus

See https://github.com/Azure/azure-functions-dotnet-worker/issues/2248#issuecomment-1921847568

jaliyaudagedara commented 6 months ago

/bot not-stale

See https://github.com/Azure/azure-functions-dotnet-worker/issues/2248#issuecomment-1953640845

microsoft-github-policy-service[bot] commented 6 months ago

Thank you jaliyaudagedara. This issue will not be automatically closed and a member of the team will review it soon.

mattchenderson commented 6 months ago

Closing this issue as intended by the automation. Issue 2248 is unrelated to this. Here, all questions appear to have been answered, and there doesn't seem to be any work to track here at this point.

jaliyaudagedara commented 6 months ago

@mattchenderson

I kept it open because of this comment: "Use latest stable version for these packages."

With the latest stable versions of the packages, there is the issue mentioned in https://github.com/Azure/azure-functions-dotnet-worker/issues/2248 (but that depends on the Azure Functions host).

But I am good with closing as long as all the issues are tracked.

gr-ale commented 6 months ago

Hello all,

Just wanting to share that, from my point of view, this is still not resolved and I am still experiencing issues. I initially followed the workaround given here and downgraded Microsoft.Azure.Functions.Worker.Sdk to version 1.15.1. After doing that, the initial invocation:

Reason='Timer fired at 2024-03-01T00:30:00.0595884+00:00', Id=d237d830-a7ab-4c10-b2f8-b509daefa94f)

worked fine and I thought that was the end of it.

But over the weekend (Australia Southeast):

"Reason='Timer fired at 2024-03-03T00:30:01.3616435+00:00', Id=75bee33c-80e4-45c8-9eba-9a10404f699f"
"Reason='Timer fired at 2024-03-04T00:30:01.3826565+00:00', Id=20540c26-45d4-4cec-8763-ac22da0377b8"

Both failed with the exact same behaviour.

Adding Stack trace

Microsoft.Azure.WebJobs.Host.FunctionTimeoutException:
   at Microsoft.Azure.WebJobs.Host.Executors.FunctionExecutor+<TryHandleTimeoutAsync>d__35.MoveNext (Microsoft.Azure.WebJobs.Host, Version=3.0.39.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35: D:\a\_work\1\s\src\Microsoft.Azure.WebJobs.Host\Executors\FunctionExecutor.cs:663)
   at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw (System.Private.CoreLib, Version=6.0.0.0, Culture=neutral, PublicKeyToken=7cec85d7bea7798e)
   at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess (System.Private.CoreLib, Version=6.0.0.0, Culture=neutral, PublicKeyToken=7cec85d7bea7798e)
   at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification (System.Private.CoreLib, Version=6.0.0.0, Culture=neutral, PublicKeyToken=7cec85d7bea7798e)
   at System.Runtime.CompilerServices.TaskAwaiter`1.GetResult (System.Private.CoreLib, Version=6.0.0.0, Culture=neutral, PublicKeyToken=7cec85d7bea7798e)
   at Microsoft.Azure.WebJobs.Host.Executors.FunctionExecutor+<InvokeWithTimeoutAsync>d__33.MoveNext (Microsoft.Azure.WebJobs.Host, Version=3.0.39.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35: D:\a\_work\1\s\src\Microsoft.Azure.WebJobs.Host\Executors\FunctionExecutor.cs:571)
   at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw (System.Private.CoreLib, Version=6.0.0.0, Culture=neutral, PublicKeyToken=7cec85d7bea7798e)
   at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess (System.Private.CoreLib, Version=6.0.0.0, Culture=neutral, PublicKeyToken=7cec85d7bea7798e)
   at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification (System.Private.CoreLib, Version=6.0.0.0, Culture=neutral, PublicKeyToken=7cec85d7bea7798e)
   at System.Runtime.CompilerServices.TaskAwaiter`1.GetResult (System.Private.CoreLib, Version=6.0.0.0, Culture=neutral, PublicKeyToken=7cec85d7bea7798e)
   at Microsoft.Azure.WebJobs.Host.Executors.FunctionExecutor+<ExecuteWithWatchersAsync>d__32.MoveNext (Microsoft.Azure.WebJobs.Host, Version=3.0.39.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35: D:\a\_work\1\s\src\Microsoft.Azure.WebJobs.Host\Executors\FunctionExecutor.cs:527)
   at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw (System.Private.CoreLib, Version=6.0.0.0, Culture=neutral, PublicKeyToken=7cec85d7bea7798e)
   at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess (System.Private.CoreLib, Version=6.0.0.0, Culture=neutral, PublicKeyToken=7cec85d7bea7798e)
   at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification (System.Private.CoreLib, Version=6.0.0.0, Culture=neutral, PublicKeyToken=7cec85d7bea7798e)
   at System.Runtime.CompilerServices.TaskAwaiter.GetResult (System.Private.CoreLib, Version=6.0.0.0, Culture=neutral, PublicKeyToken=7cec85d7bea7798e)
   at Microsoft.Azure.WebJobs.Host.Executors.FunctionExecutor+<ExecuteWithLoggingAsync>d__26.MoveNext (Microsoft.Azure.WebJobs.Host, Version=3.0.39.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35: D:\a\_work\1\s\src\Microsoft.Azure.WebJobs.Host\Executors\FunctionExecutor.cs:306)
   at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw (System.Private.CoreLib, Version=6.0.0.0, Culture=neutral, PublicKeyToken=7cec85d7bea7798e)
   at Microsoft.Azure.WebJobs.Host.Executors.FunctionExecutor+<ExecuteWithLoggingAsync>d__26.MoveNext (Microsoft.Azure.WebJobs.Host, Version=3.0.39.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35: D:\a\_work\1\s\src\Microsoft.Azure.WebJobs.Host\Executors\FunctionExecutor.cs:352)
   at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw (System.Private.CoreLib, Version=6.0.0.0, Culture=neutral, PublicKeyToken=7cec85d7bea7798e)
   at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess (System.Private.CoreLib, Version=6.0.0.0, Culture=neutral, PublicKeyToken=7cec85d7bea7798e)
   at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification (System.Private.CoreLib, Version=6.0.0.0, Culture=neutral, PublicKeyToken=7cec85d7bea7798e)
   at System.Runtime.CompilerServices.TaskAwaiter`1.GetResult (System.Private.CoreLib, Version=6.0.0.0, Culture=neutral, PublicKeyToken=7cec85d7bea7798e)
   at Microsoft.Azure.WebJobs.Host.Executors.FunctionExecutor+<TryExecuteAsync>d__18.MoveNext (Microsoft.Azure.WebJobs.Host, Version=3.0.39.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35: D:\a\_work\1\s\src\Microsoft.Azure.WebJobs.Host\Executors\FunctionExecutor.cs:108)

.csproj file

<PropertyGroup>
  <TargetFramework>net8.0</TargetFramework>
  <AzureFunctionsVersion>v4</AzureFunctionsVersion>
  <OutputType>Exe</OutputType>
  <ImplicitUsings>enable</ImplicitUsings>
  <Nullable>enable</Nullable>
</PropertyGroup>
<ItemGroup>
  <PackageReference Include="AutoMapper" Version="10.1.1" />
  <PackageReference Include="AutoMapper.Extensions.Microsoft.DependencyInjection" Version="8.1.1" />
  <PackageReference Include="Handlebars.Net" Version="2.1.4" />
  <PackageReference Include="Microsoft.ApplicationInsights.WorkerService" Version="2.22.0" />
  <PackageReference Include="Microsoft.Azure.Functions.Worker" Version="1.20.1" />
  <PackageReference Include="Microsoft.Azure.Functions.Worker.ApplicationInsights" Version="1.1.0" />
  <PackageReference Include="Microsoft.Azure.Functions.Worker.Extensions.ServiceBus" Version="5.13.0" />
  <PackageReference Include="Microsoft.Azure.Functions.Worker.Extensions.Timer" Version="4.3.0" />
  <PackageReference Include="Microsoft.Azure.Functions.Worker.Sdk" Version="1.15.1" />
  <PackageReference Include="Microsoft.Extensions.Caching.StackExchangeRedis" Version="8.0.1" />
  <PackageReference Include="Microsoft.Extensions.Http" Version="8.0.0" />
  <PackageReference Include="Mindscape.Raygun4Net.AspNetCore" Version="8.2.0" />
  <PackageReference Include="Newtonsoft.Json" Version="13.0.3" />
  <PackageReference Include="Stripe.net" Version="39.76.0" />
  <PackageReference Include="System.ServiceModel.Http" Version="8.0.0" />
</ItemGroup>
<ItemGroup>
  <Using Include="System.Threading.ExecutionContext" Alias="ExecutionContext" />
</ItemGroup>

Runtime version is v4, using .NET 8 (dotnet-isolated).

I have opened a support ticket with Azure; I just wanted to flag here that this issue still seems to be ongoing. This is just one of the functions failing with this error in my app. It is running on Windows with .NET 8 in isolated mode.

jaliyaudagedara commented 6 months ago

@gr-ale,

Could you please share some more information?

@inishantmishra,

if you can, could you please update your issue description to use Markdown (especially put the stack trace inside ```) so it's easy to read?

Thanks in advance!

gr-ale commented 6 months ago

@jaliyaudagedara updated as requested!

inishantmishra commented 5 months ago

I can confirm that after upgrading the packages to the following, we don't see the issue anymore and can say our issue is solved now.

```xml
<PackageReference Include="Microsoft.Azure.Functions.Worker" Version="1.21.0" />
<PackageReference Include="Microsoft.Azure.Functions.Worker.Extensions.ServiceBus" Version="5.16.0" />
<PackageReference Include="Microsoft.Azure.Functions.Worker.Sdk" Version="1.17.0" />
```

I also upgraded the target framework to net8.0.

Or, if you want to use older versions, here are the ones that were stable for me:

.NET target framework: net7.0

```xml
<PackageReference Include="Microsoft.Azure.Functions.Worker" Version="1.19.0" />
<PackageReference Include="Microsoft.Azure.Functions.Worker.Extensions.ServiceBus" Version="5.11.0" />
<PackageReference Include="Microsoft.Azure.Functions.Worker.Sdk" Version="1.15.1" />
```