Azure / azure-sdk-for-net

This repository is for active development of the Azure SDK for .NET. For consumers of the SDK we recommend visiting our public developer docs at https://learn.microsoft.com/dotnet/azure/ or our versioned developer docs at https://azure.github.io/azure-sdk-for-net.

[BUG] ServiceBus MessageLock Exception on CompleteAsync #25275

Closed IvanG96 closed 1 year ago

IvanG96 commented 2 years ago

Describe the bug I'm using Azure.Messaging.ServiceBus.dll version 7.4.0 with a ServiceBusReceiver. It is used to get messages out of the queue in peek-lock mode and then process them (sometimes processing takes up to 1 hour). The lock duration is 5 minutes, and I have a timer that renews the lock every 4 minutes. The issue is that lock renewal works the whole time without any exceptions from the library side, but CompleteAsync then fails with: The lock supplied is invalid. Either the lock expired, or the message has already been removed from the queue, or was received by a different receiver instance. (MessageLockLost)

According to my logging, the application renewed the lock for another 5 minutes and then failed to complete one minute later: 11/11/2021, 9:47:38.650 AM | Message lock on '18a3d6f3-0173-4e02-a457-9b118874473b' is renewed. Locked until: '11/11/2021 09:52:38 +00:00'

11/11/2021, 9:48:38.681 AM | Azure.Messaging.ServiceBus.ServiceBusException at Azure.Messaging.ServiceBus.Amqp.AmqpReceiver.ThrowLockLostException

Points of interest:
1. It happens randomly.
2. After the exception, the message remains in the queue for the remaining 4 minutes and cannot be received by any other receiver.
3. I have one receiver per pod/application.
4. I have 8 pods running in parallel.
5. Each receiver consumes only 1 message at a time.

Stack trace (from App Insights); every frame is in Azure.Messaging.ServiceBus, Version=7.4.0.0, Culture=neutral, PublicKeyToken=92742159e12e44c8 (the odd names are compiler-generated async state machines):

Azure.Messaging.ServiceBus.Amqp.AmqpReceiver.ThrowLockLostException
Azure.Messaging.ServiceBus.Amqp.AmqpReceiver+d__42.MoveNext
Azure.Messaging.ServiceBus.Amqp.AmqpReceiver+d__41.MoveNext
Azure.Messaging.ServiceBus.Amqp.AmqpReceiver+<>c+<b__40_0>d.MoveNext
Azure.Messaging.ServiceBus.ServiceBusRetryPolicy+<>c__20`1+<b__20_0>d.MoveNext
Azure.Messaging.ServiceBus.ServiceBusRetryPolicy+<RunOperation>d__21`2.MoveNext
Azure.Messaging.ServiceBus.ServiceBusRetryPolicy+<RunOperation>d__21`2.MoveNext
Azure.Messaging.ServiceBus.ServiceBusRetryPolicy+d__20`1.MoveNext
Azure.Messaging.ServiceBus.Amqp.AmqpReceiver+d__40.MoveNext
Azure.Messaging.ServiceBus.ServiceBusReceiver+d__47.MoveNext
Azure.Messaging.ServiceBus.ServiceBusReceiver+d__46.MoveNext
[My Code]

Environment:
- Azure.Messaging.ServiceBus.dll version 7.4.0
- Application is running in an Azure AKS cluster; wasn't able to reproduce on a local Windows machine
- Docker containers FROM mcr.microsoft.com/dotnet/aspnet:3.1
- Kubernetes version 1.19.11
- Message lock time = 5 min
- Receiver settings: "TransportType": "AmqpWebSockets", "Retry": { "MaxRetries": 5, "TryTimeout": 1200000, "Delay": 30000 }
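For reference, the receive / renew / complete pattern described above boils down to roughly the following. This is a minimal sketch only; the connection string variable, queue name, and stand-in processing delay are illustrative, not the actual application code.

using System;
using System.Threading;
using System.Threading.Tasks;
using Azure.Messaging.ServiceBus;

// Illustrative connection string source; the real application reads its own configuration.
string connectionString = Environment.GetEnvironmentVariable("AzureServiceBusConnectionString");

await using var client = new ServiceBusClient(connectionString, new ServiceBusClientOptions
{
    TransportType = ServiceBusTransportType.AmqpWebSockets
});
await using var receiver = client.CreateReceiver("somequeue");

ServiceBusReceivedMessage message = await receiver.ReceiveMessageAsync();

// Renew the 5-minute lock every 4 minutes while the long-running processing is in flight.
using var cts = new CancellationTokenSource();
Task renewal = Task.Run(async () =>
{
    try
    {
        while (true)
        {
            await Task.Delay(TimeSpan.FromMinutes(4), cts.Token);
            await receiver.RenewMessageLockAsync(message, cts.Token);
        }
    }
    catch (OperationCanceledException)
    {
        // Renewal loop stopped once processing finished.
    }
});

await Task.Delay(TimeSpan.FromMinutes(50)); // stand-in for processing that can take up to an hour

cts.Cancel();
await renewal;

// This is the call that intermittently fails with MessageLockLost even though every renewal succeeded.
await receiver.CompleteMessageAsync(message);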

jsquire commented 2 years ago

Thank you for your feedback. Tagging and routing to the team member best able to assist.

JoshLove-msft commented 2 years ago

It's possible that the renew lock could succeed whereas the message completion could fail if the link was temporarily disconnected and had to reconnect. Can you check in your logs to see if this happened? It would be logged in the SDK logging as a Verbose event.

IvanG96 commented 2 years ago

Thank you for your quick reply. I've enabled logging and will let you know once this issue happens again (unfortunately it hasn't reproduced today).

It's possible that the renew lock could succeed whereas the message completion could fail if the link was temporarily disconnected and had to reconnect.

I'm wondering whether this is a known SDK issue, or normal AMQP behaviour written into the spec?

JoshLove-msft commented 2 years ago

This isn't part of the AMQP spec, but it is a known issue in the SDK/service. We will be looking at ways to improve the lock lost experience in the coming months.

shlomiassaf commented 2 years ago

@JoshLove-msft This is also happening to us.

I've looked at the statistics from Application Insights and this is what I see:

Of course, once the complete fails, we have to wait out the remaining lock time before the message comes back. In our scenario the lock duration is 5 minutes, processing takes about 200 ms, and it takes about 2 more seconds for the complete call to return the lock failure, which means we wait nearly 5 minutes for the message to come back.

Is there a workaround here? Manually completing? Checking something?

JoshLove-msft commented 2 years ago

If the complete is failing due to the message lock being lost, you shouldn't have to wait for 5 minutes to receive the message again. Can you include logs from an instance where this occurs?
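As an aside, this particular failure can be distinguished from other completion errors because it surfaces as a ServiceBusException whose Reason is ServiceBusFailureReason.MessageLockLost. A minimal sketch of detecting it, assuming a receiver and message are already in scope along with the usual Azure.Messaging.ServiceBus usings:

try
{
    await receiver.CompleteMessageAsync(message);
}
catch (ServiceBusException ex) when (ex.Reason == ServiceBusFailureReason.MessageLockLost)
{
    // The lock was lost on the client side (expired, settled elsewhere, or invalidated by a reconnect).
    // Log or otherwise handle it here; the message shows up again once it becomes receivable on the service.
    Console.WriteLine($"Lock lost while completing message {message.MessageId}: {ex.Message}");
}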

shlomiassaf commented 2 years ago

@JoshLove-msft

I don't think that's the case. The lock is lost on the client side, but it did not get revoked.
The message was neither completed nor abandoned.

The client was disconnected and reconnected, thus losing the lock, so the server has no idea and we just wait for the lock to be released due to the timeout.

The metrics in Application Insights here are quite clear.
We have an intensive 2-minute processing flow, and we lose about 1% of it to this error.
After 4 to 5 minutes we see these messages pop back up to get handled in App Insights.

JoshLove-msft commented 2 years ago

@JoshLove-msft

I don't think that's the case. The lock is lost on the client side, but it did not get revoked. The message was neither completed nor abandoned.

The client was disconnected and reconnected, thus losing the lock, so the server has no idea and we just wait for the lock to be released due to the timeout.

The metrics in Application Insights here are quite clear. We have an intensive 2-minute processing flow, and we lose about 1% of it to this error. After 4 to 5 minutes we see these messages pop back up to get handled in App Insights.

I see. Can you confirm from the logs that the connection was dropped?

shlomiassaf commented 2 years ago

I'll try to reproduce it locally, since verbose level on a live host will just fill up our logs...

For the time being, any hints on a workaround? Can we check something before completing?

JoshLove-msft commented 2 years ago

I don't think so - there are some things we may be able to do in the library to help with this - namely trying to complete over the mgmt link if the receive link gets disconnected.

I think it is odd that you would be having the links/connections dropped so frequently, so it would be great to get a repro.

JoshLove-msft commented 2 years ago

Also, which version are you using? There was a fix added in 7.4.0 involving unlocking messages when the processor closes.

shlomiassaf commented 2 years ago

@JoshLove-msft I'll try reproducing it.

Do you know the diagnostics source key I can use to catch this event? I'll send it to application insights so I can get a time series.

I'm using 7.3, I'll try 7.4.

JoshLove-msft commented 2 years ago

@JoshLove-msft I'll try reproducing it.

Do you know the diagnostics source key I can use to catch this event? I'll send it to application insights so I can get a time series.

I'm using 7.3, I'll try 7.4.

May as well go to 7.5 😄

I believe it would be this event - https://github.com/Azure/azure-sdk-for-net/blob/main/sdk/servicebus/Azure.Messaging.ServiceBus/src/Diagnostics/ServiceBusEventSource.cs#L1030
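For reference, those SDK events can be captured in-process with the AzureEventSourceListener from Azure.Core.Diagnostics (the same helper used in the console repro later in this thread). A rough sketch that filters on the Service Bus event source by name and writes to the console; the sink (for example an Application Insights trace call) is up to you, and the event source name "Azure-Messaging-ServiceBus" is the one the linked ServiceBusEventSource declares:

using System;
using System.Diagnostics.Tracing;
using Azure.Core.Diagnostics;

// Subscribe to SDK EventSource output at Verbose level and keep only Service Bus events.
using var listener = new AzureEventSourceListener(
    (eventData, formattedMessage) =>
    {
        if (eventData.EventSource.Name == "Azure-Messaging-ServiceBus")
        {
            Console.WriteLine($"[{eventData.EventId}] {formattedMessage}");
        }
    },
    EventLevel.Verbose);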

shlomiassaf commented 2 years ago

@JoshLove-msft

I might have found something. It appears that before every error there is a warning:

RunOperation encountered an exception and will retry. Exception: Azure.Messaging.ServiceBus.ServiceBusException: The request was terminated because the namespace sharegain-perf-queue is being throttled. Error code : 50009. Please wait 2 seconds and try again. To know more visit https://aka.ms/sbResourceMgrExceptions

This fits perfectly with the 2-second delay I get for every complete that fails.

For some reason it loses the lock when being throttled, which is strange.
I would expect it to just wait and commit; now we need some funky logic to spread out the requests?

Also strange: the entire service (9 instances) is only throttled on a single queue; other queues work just fine and do not get throttled.

[image]

JoshLove-msft commented 2 years ago

Ah yes, there is an existing issue about throttling resulting in losing the locks (https://github.com/Azure/azure-sdk-for-net/issues/24060). This shouldn't be happening.

@DorothySun216 any update on the investigation into this?

JoshLove-msft commented 2 years ago

There was an issue in the SDK where we were not backing off when being throttled. Fixing in https://github.com/Azure/azure-sdk-for-net/pull/25700
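For context, the client-side back-off being discussed is configured through ServiceBusRetryOptions on the client. A sketch with explicit exponential back-off values, purely illustrative and not by itself a workaround for the bug being fixed here:

using System;
using Azure.Messaging.ServiceBus;

string connectionString = Environment.GetEnvironmentVariable("AzureServiceBusConnectionString");

// Explicit exponential back-off between retry attempts; the values are only an example.
var client = new ServiceBusClient(connectionString, new ServiceBusClientOptions
{
    RetryOptions = new ServiceBusRetryOptions
    {
        Mode = ServiceBusRetryMode.Exponential,
        MaxRetries = 5,
        Delay = TimeSpan.FromSeconds(2),     // echoes the "wait 2 seconds" hint in the throttling error
        MaxDelay = TimeSpan.FromSeconds(30),
        TryTimeout = TimeSpan.FromMinutes(1)
    }
});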

JoshLove-msft commented 2 years ago

The fix has been merged and we will be shipping a patch release tomorrow. Please reopen if you still encounter the issue.

JoshLove-msft commented 2 years ago

Reopening - while the fix seems to have made it less likely to occur, there still seems to be an issue on the service side.

JoshLove-msft commented 2 years ago

I talked to @xinchen10, and he mentioned that the service should not be returning settled = true in this case:

[6420.017 22:53:30.910] RECV FRM(01B6|0002 disposition(role:False,first:37,settled:True,state:rejected(error:error(condition:com.microsoft:server-busy,description:The request was terminated because the namespace is being throttled. Error code : 50009. Please wait 2 seconds and try again. To know more visit https://aka.ms/sbResourceMgrExceptions TrackingId:c14075bc00007e6e0834367861afe5e9_G63_B31, SystemTracker:sb-jolovservicebus:Queue:1e41b56f-084c-serverbusyrespe, Timestamp:2021-12-07T22:53:30))))

ghost commented 2 years ago

Thanks for the feedback! We are routing this to the appropriate team for follow-up. cc @shankarsama, @DorothySun216.

Author: IvanG96
Assignees: JoshLove-msft
Labels: `Service Bus`, `Service Attention`, `Client`, `customer-reported`, `question`
Milestone: Backlog
Blackbaud-IanThomas commented 2 years ago

@shankarsama @DorothySun216 can you provide any updates on this bug? I see it was last routed towards both of you for "service follow-up". We are seeing exactly the issue described here when attempting to complete messages when the service bus instance is getting throttled.

JoshLove-msft commented 2 years ago

@EldertGrootenboer can you help follow up on this issue?

CodeTherapist commented 2 years ago

We possibly have the same issue. Is there any workaround to mitigate it?

EldertGrootenboer commented 2 years ago

We have assigned this to an engineer for investigation, and will update this issue when more information is available.

Rayzbam commented 2 years ago

It seems we have the same issue.

Another thing on our side: this behavior got worse after we upgraded to the Premium tier. We used to see this kind of error sometimes, but not really often; then we switched to the Premium tier and now we see it every day without any code change.

An example from last night: [image]

We have a bunch of error logs in the same period (around 1-2 seconds with roughly 100 LockLostExceptions), then nothing. We have 2 nodes listening on the namespace, and we can see that these 2 machines throw this exception at the same time (in the image, half is on nodeA and the other half is on nodeB), so it seems like some kind of server-side error, I guess. In the Azure portal we can't see any "Throttling requests".

We can't enable verbose logs in our production environment, so I can't really say whether it was throttling at that exact time. To be honest, I don't think we can throttle the instance, especially on the Premium tier, but I might be wrong (it would be sad); we do have a lot of traffic through send and receive operations.

Rayzbam commented 2 years ago

@shankarsama, @DorothySun216 Any update? Workaround? Still a big issue.

CodeTherapist commented 2 years ago

After further investigation of the issue (also with @danielmarbach - thanks a lot 😃), it seems likely to be a deadlock within the Azure Service Bus SDK or its dependencies.

We observed that a CompleteAsync start is logged but the corresponding CompleteAsync done never appears. After this, the receiver link is somehow "broken" and the connection to the Service Bus backend is closed after 10 minutes. The logs show exactly that behavior:

2022-09-10T01:01:18.153025072Z       por******-5e5e0ea1-7b22-4070-b78c-b6bc90361d67-Receiver: CompleteAsync start. MessageCount = 1, LockToken = 37ad7e01-437f-4a53-a008-5a2a574d1a64
2022-09-10T01:01:18.168381560Z info: Azure.Messaging.ServiceBus[17]
2022-09-10T01:01:18.168401162Z       por******-5e5e0ea1-7b22-4070-b78c-b6bc90361d67-Receiver: CompleteAsync done. LockToken = 37ad7e01-437f-4a53-a008-5a2a574d1a64
2022-09-10T01:01:18.169076401Z info: Azure.Messaging.ServiceBus[103]
2022-09-10T01:01:18.169090701Z       por******-5e5e0ea1-7b22-4070-b78c-b6bc90361d67: User message handler complete: Message: SequenceNumber: 8170616, LockToken: 37ad7e01-437f-4a53-a008-5a2a574d1a64
2022-09-10T01:01:18.169095302Z info: Azure.Messaging.ServiceBus[7]
2022-09-10T01:01:18.169099402Z       por******-5e5e0ea1-7b22-4070-b78c-b6bc90361d67-Receiver: ReceiveBatchAsync start. MessageCount = 1
2022-09-10T01:01:18.306455047Z info: Azure.Messaging.ServiceBus[8]
2022-09-10T01:01:18.306537152Z       por******-5e5e0ea1-7b22-4070-b78c-b6bc90361d67-Receiver: ReceiveBatchAsync done. Received '1' messages. LockTokens = <LockToken>7e3274f0-2990-4205-8431-47b651769f64</LockToken>
2022-09-10T01:01:18.307425903Z info: Azure.Messaging.ServiceBus[56]
2022-09-10T01:01:18.307441704Z       por******-5e5e0ea1-7b22-4070-b78c-b6bc90361d67: Processor RenewMessageLock start. MessageCount = 1, LockToken = 7e3274f0-2990-4205-8431-47b651769f64
2022-09-10T01:01:18.308291354Z info: Azure.Messaging.ServiceBus[102]
2022-09-10T01:01:18.308305554Z       por******-5e5e0ea1-7b22-4070-b78c-b6bc90361d67: User message handler start: Message: SequenceNumber: 8170617, LockToken: 7e3274f0-2990-4205-8431-47b651769f64
2022-09-10T01:01:18.308319055Z trce: Ene****.Messaging.ServiceBus.IotMessageConsumerService[0]
2022-09-10T01:01:18.308323955Z       Message '45f00a0108134eedbb642a31aa3f64f8' from device 'e758a762-93f6-462e-a9ca-0579153a95d7' of type 'ConsumptionMessageV2'.
2022-09-10T01:01:18.308336456Z info: Azure.Messaging.ServiceBus[16]
2022-09-10T01:01:18.308340756Z       por******-5e5e0ea1-7b22-4070-b78c-b6bc90361d67-Receiver: CompleteAsync start. MessageCount = 1, LockToken = 7e3274f0-2990-4205-8431-47b651769f64
2022-09-10T01:01:18.326996736Z info: Azure.Messaging.ServiceBus[17]
2022-09-10T01:01:18.327013437Z       por******-5e5e0ea1-7b22-4070-b78c-b6bc90361d67-Receiver: CompleteAsync done. LockToken = 7e3274f0-2990-4205-8431-47b651769f64
2022-09-10T01:01:18.327809583Z info: Azure.Messaging.ServiceBus[103]
2022-09-10T01:01:18.327823183Z       por******-5e5e0ea1-7b22-4070-b78c-b6bc90361d67: User message handler complete: Message: SequenceNumber: 8170617, LockToken: 7e3274f0-2990-4205-8431-47b651769f64
2022-09-10T01:01:18.328569527Z info: Azure.Messaging.ServiceBus[7]
2022-09-10T01:01:18.328585027Z       por******-5e5e0ea1-7b22-4070-b78c-b6bc90361d67-Receiver: ReceiveBatchAsync start. MessageCount = 1
2022-09-10T01:01:18.618963824Z info: Azure.Messaging.ServiceBus[8]
2022-09-10T01:01:18.618981726Z       por******-5e5e0ea1-7b22-4070-b78c-b6bc90361d67-Receiver: ReceiveBatchAsync done. Received '1' messages. LockTokens = <LockToken>9f932789-7427-4480-96d5-188e5cc82f21</LockToken>
2022-09-10T01:01:18.619967083Z info: Azure.Messaging.ServiceBus[56]
2022-09-10T01:01:18.619983183Z       por******-5e5e0ea1-7b22-4070-b78c-b6bc90361d67: Processor RenewMessageLock start. MessageCount = 1, LockToken = 9f932789-7427-4480-96d5-188e5cc82f21
2022-09-10T01:01:18.619987884Z info: Azure.Messaging.ServiceBus[102]
2022-09-10T01:01:18.619999584Z       por******-5e5e0ea1-7b22-4070-b78c-b6bc90361d67: User message handler start: Message: SequenceNumber: 8170618, LockToken: 9f932789-7427-4480-96d5-188e5cc82f21
2022-09-10T01:01:18.620004185Z trce: Ene****.Messaging.ServiceBus.IotMessageConsumerService[0]
2022-09-10T01:01:18.620008385Z       Message '46f0171c4e914cd59f9d7bd524c171c5' from device '9f0c33b6-3c50-4019-a05d-fa16b61e1cfa' of type 'ConsumptionMessageV2'.
2022-09-10T01:01:18.620594519Z info: Azure.Messaging.ServiceBus[16]
2022-09-10T01:01:18.620608120Z       por******-5e5e0ea1-7b22-4070-b78c-b6bc90361d67-Receiver: CompleteAsync start. MessageCount = 1, LockToken = 9f932789-7427-4480-96d5-188e5cc82f21
2022-09-10T01:01:18.638961781Z info: Azure.Messaging.ServiceBus[17]
2022-09-10T01:01:18.639040486Z       por******-5e5e0ea1-7b22-4070-b78c-b6bc90361d67-Receiver: CompleteAsync done. LockToken = 9f932789-7427-4480-96d5-188e5cc82f21
2022-09-10T01:01:18.639285400Z info: Azure.Messaging.ServiceBus[103]
2022-09-10T01:01:18.639296601Z       por******-5e5e0ea1-7b22-4070-b78c-b6bc90361d67: User message handler complete: Message: SequenceNumber: 8170618, LockToken: 9f932789-7427-4480-96d5-188e5cc82f21
2022-09-10T01:01:18.639301701Z info: Azure.Messaging.ServiceBus[7]
2022-09-10T01:01:18.639306301Z       por******-5e5e0ea1-7b22-4070-b78c-b6bc90361d67-Receiver: ReceiveBatchAsync start. MessageCount = 1
2022-09-10T01:01:18.908258159Z info: Azure.Messaging.ServiceBus[8]
2022-09-10T01:01:18.908278060Z       por******-5e5e0ea1-7b22-4070-b78c-b6bc90361d67-Receiver: ReceiveBatchAsync done. Received '1' messages. LockTokens = <LockToken>c8149c85-639e-4cf1-b240-28b8730ae9ed</LockToken>
2022-09-10T01:01:18.908283960Z info: Azure.Messaging.ServiceBus[56]
2022-09-10T01:01:18.908288661Z       por******-5e5e0ea1-7b22-4070-b78c-b6bc90361d67: Processor RenewMessageLock start. MessageCount = 1, LockToken = c8149c85-639e-4cf1-b240-28b8730ae9ed
2022-09-10T01:01:18.908293361Z info: Azure.Messaging.ServiceBus[102]
2022-09-10T01:01:18.908297661Z       por******-5e5e0ea1-7b22-4070-b78c-b6bc90361d67: User message handler start: Message: SequenceNumber: 8170619, LockToken: c8149c85-639e-4cf1-b240-28b8730ae9ed
2022-09-10T01:01:18.908302061Z trce: Ene****.Messaging.ServiceBus.IotMessageConsumerService[0]
2022-09-10T01:01:18.908306562Z       Message 'e593a5fbe3234cd99a41b6da9cd334f7' from device '7595416e-50e6-4949-a821-112ae5df787e' of type 'ConsumptionMessageV1'.
2022-09-10T01:01:18.908316462Z info: Azure.Messaging.ServiceBus[16]
2022-09-10T01:01:18.908320962Z       por******-5e5e0ea1-7b22-4070-b78c-b6bc90361d67-Receiver: CompleteAsync start. MessageCount = 1, LockToken = c8149c85-639e-4cf1-b240-28b8730ae9ed
2022-09-10T01:01:18.918614758Z info: Azure.Messaging.ServiceBus[17]
2022-09-10T01:01:18.918632959Z       por******-5e5e0ea1-7b22-4070-b78c-b6bc90361d67-Receiver: CompleteAsync done. LockToken = c8149c85-639e-4cf1-b240-28b8730ae9ed
2022-09-10T01:01:18.925228440Z info: Azure.Messaging.ServiceBus[103]
2022-09-10T01:01:18.925246441Z       por******-5e5e0ea1-7b22-4070-b78c-b6bc90361d67: User message handler complete: Message: SequenceNumber: 8170619, LockToken: c8149c85-639e-4cf1-b240-28b8730ae9ed
2022-09-10T01:01:18.925251942Z info: Azure.Messaging.ServiceBus[7]
2022-09-10T01:01:18.925256342Z       por******-5e5e0ea1-7b22-4070-b78c-b6bc90361d67-Receiver: ReceiveBatchAsync start. MessageCount = 1
2022-09-10T01:01:20.187015029Z info: Azure.Messaging.ServiceBus[8]
2022-09-10T01:01:20.187067832Z       por******-5e5e0ea1-7b22-4070-b78c-b6bc90361d67-Receiver: ReceiveBatchAsync done. Received '1' messages. LockTokens = <LockToken>363a750c-e4d6-4451-9ad2-532fe2aad042</LockToken>
2022-09-10T01:01:20.188309604Z info: Azure.Messaging.ServiceBus[56]
2022-09-10T01:01:20.188326205Z       por******-5e5e0ea1-7b22-4070-b78c-b6bc90361d67: Processor RenewMessageLock start. MessageCount = 1, LockToken = 363a750c-e4d6-4451-9ad2-532fe2aad042
2022-09-10T01:01:20.188331305Z info: Azure.Messaging.ServiceBus[102]
2022-09-10T01:01:20.188336105Z       por******-5e5e0ea1-7b22-4070-b78c-b6bc90361d67: User message handler start: Message: SequenceNumber: 8170620, LockToken: 363a750c-e4d6-4451-9ad2-532fe2aad042
2022-09-10T01:01:20.188386108Z trce: Ene****.Messaging.ServiceBus.IotMessageConsumerService[0]
2022-09-10T01:01:20.188391809Z       Message '7a16765dfb3145438702e1f09da9948b' from device 'de8bf214-1cb6-4e60-b35b-36ad1fe8ee37' of type 'ConsumptionMessageV2'.
2022-09-10T01:01:20.188396009Z info: Azure.Messaging.ServiceBus[16]
2022-09-10T01:01:20.188400109Z       por******-5e5e0ea1-7b22-4070-b78c-b6bc90361d67-Receiver: CompleteAsync start. MessageCount = 1, LockToken = 363a750c-e4d6-4451-9ad2-532fe2aad042
2022-09-10T01:01:20.269896723Z info: Azure.Messaging.ServiceBus[17]
2022-09-10T01:01:20.269945626Z       por******-5e5e0ea1-7b22-4070-b78c-b6bc90361d67-Receiver: CompleteAsync done. LockToken = 363a750c-e4d6-4451-9ad2-532fe2aad042
2022-09-10T01:01:20.270811276Z info: Azure.Messaging.ServiceBus[103]
2022-09-10T01:01:20.270826677Z       por******-5e5e0ea1-7b22-4070-b78c-b6bc90361d67: User message handler complete: Message: SequenceNumber: 8170620, LockToken: 363a750c-e4d6-4451-9ad2-532fe2aad042
2022-09-10T01:01:20.270831377Z info: Azure.Messaging.ServiceBus[7]
2022-09-10T01:01:20.270835478Z       por******-5e5e0ea1-7b22-4070-b78c-b6bc90361d67-Receiver: ReceiveBatchAsync start. MessageCount = 1
2022-09-10T01:01:20.522777951Z info: Azure.Messaging.ServiceBus[8]
2022-09-10T01:01:20.522807553Z       por******-5e5e0ea1-7b22-4070-b78c-b6bc90361d67-Receiver: ReceiveBatchAsync done. Received '1' messages. LockTokens = <LockToken>a468dcc4-e97f-45f4-88b3-7a70af1a4a56</LockToken>
2022-09-10T01:01:20.522815053Z info: Azure.Messaging.ServiceBus[56]
2022-09-10T01:01:20.522819854Z       por******-5e5e0ea1-7b22-4070-b78c-b6bc90361d67: Processor RenewMessageLock start. MessageCount = 1, LockToken = a468dcc4-e97f-45f4-88b3-7a70af1a4a56
2022-09-10T01:01:20.522824454Z info: Azure.Messaging.ServiceBus[102]
2022-09-10T01:01:20.522828854Z       por******-5e5e0ea1-7b22-4070-b78c-b6bc90361d67: User message handler start: Message: SequenceNumber: 8170621, LockToken: a468dcc4-e97f-45f4-88b3-7a70af1a4a56
2022-09-10T01:01:20.522844655Z trce: Ene****.Messaging.ServiceBus.IotMessageConsumerService[0]
2022-09-10T01:01:20.522849455Z       Message '37ae65179c634c6692a87bc7b913472e' from device 'f159b50f-25b7-41fe-87af-77967d525091' of type 'ConsumptionMessageV2'.
2022-09-10T01:01:20.522853756Z info: Azure.Messaging.ServiceBus[16]
2022-09-10T01:01:20.522858056Z       por******-5e5e0ea1-7b22-4070-b78c-b6bc90361d67-Receiver: CompleteAsync start. MessageCount = 1, LockToken = a468dcc4-e97f-45f4-88b3-7a70af1a4a56
2022-09-10T01:01:20.547231066Z info: Azure.Messaging.ServiceBus[17]
2022-09-10T01:01:20.547272568Z       por******-5e5e0ea1-7b22-4070-b78c-b6bc90361d67-Receiver: CompleteAsync done. LockToken = a468dcc4-e97f-45f4-88b3-7a70af1a4a56
2022-09-10T01:01:20.547278969Z info: Azure.Messaging.ServiceBus[103]
2022-09-10T01:01:20.547283869Z       por******-5e5e0ea1-7b22-4070-b78c-b6bc90361d67: User message handler complete: Message: SequenceNumber: 8170621, LockToken: a468dcc4-e97f-45f4-88b3-7a70af1a4a56
2022-09-10T01:01:20.547288469Z info: Azure.Messaging.ServiceBus[7]
2022-09-10T01:01:20.547292869Z       por******-5e5e0ea1-7b22-4070-b78c-b6bc90361d67-Receiver: ReceiveBatchAsync start. MessageCount = 1
2022-09-10T01:01:21.533658717Z info: Azure.Messaging.ServiceBus[8]
2022-09-10T01:01:21.534800183Z       por******-5e5e0ea1-7b22-4070-b78c-b6bc90361d67-Receiver: ReceiveBatchAsync done. Received '1' messages. LockTokens = <LockToken>23497f79-2333-402f-95bc-5030f357c14d</LockToken>
2022-09-10T01:01:21.535118901Z info: Azure.Messaging.ServiceBus[56]
2022-09-10T01:01:21.535139502Z       por******-5e5e0ea1-7b22-4070-b78c-b6bc90361d67: Processor RenewMessageLock start. MessageCount = 1, LockToken = 23497f79-2333-402f-95bc-5030f357c14d
2022-09-10T01:01:21.535319713Z info: Azure.Messaging.ServiceBus[102]
2022-09-10T01:01:21.535372216Z       por******-5e5e0ea1-7b22-4070-b78c-b6bc90361d67: User message handler start: Message: SequenceNumber: 8170622, LockToken: 23497f79-2333-402f-95bc-5030f357c14d
2022-09-10T01:01:21.536030954Z trce: Ene****.Messaging.ServiceBus.IotMessageConsumerService[0]
2022-09-10T01:01:21.536051055Z       Message '9ae69b8f51424dada7efbdefdab2f0e3' from device '390058e1-526c-4d63-89e0-76c29ea71fc2' of type 'ConsumptionMessageV2'.
2022-09-10T01:01:21.536059556Z info: Azure.Messaging.ServiceBus[16]
2022-09-10T01:01:21.536067856Z       por******-5e5e0ea1-7b22-4070-b78c-b6bc90361d67-Receiver: CompleteAsync start. MessageCount = 1, LockToken = 23497f79-2333-402f-95bc-5030f357c14d
2022-09-10T01:01:21.605630279Z info: Azure.Messaging.ServiceBus[17]
2022-09-10T01:01:21.605654680Z       por******-5e5e0ea1-7b22-4070-b78c-b6bc90361d67-Receiver: CompleteAsync done. LockToken = 23497f79-2333-402f-95bc-5030f357c14d
2022-09-10T01:01:21.605669781Z info: Azure.Messaging.ServiceBus[103]
2022-09-10T01:01:21.605674681Z       por******-5e5e0ea1-7b22-4070-b78c-b6bc90361d67: User message handler complete: Message: SequenceNumber: 8170622, LockToken: 23497f79-2333-402f-95bc-5030f357c14d
2022-09-10T01:01:21.605678881Z info: Azure.Messaging.ServiceBus[7]
2022-09-10T01:01:21.605692182Z       por******-5e5e0ea1-7b22-4070-b78c-b6bc90361d67-Receiver: ReceiveBatchAsync start. MessageCount = 1
2022-09-10T01:01:21.615313639Z info: Azure.Messaging.ServiceBus[8]
2022-09-10T01:01:21.615332140Z       por******-5e5e0ea1-7b22-4070-b78c-b6bc90361d67-Receiver: ReceiveBatchAsync done. Received '1' messages. LockTokens = <LockToken>8dd2385f-a27e-4da1-8003-9f1f63633eea</LockToken>
2022-09-10T01:01:21.615337740Z info: Azure.Messaging.ServiceBus[56]
2022-09-10T01:01:21.615341840Z       por******-5e5e0ea1-7b22-4070-b78c-b6bc90361d67: Processor RenewMessageLock start. MessageCount = 1, LockToken = 8dd2385f-a27e-4da1-8003-9f1f63633eea
2022-09-10T01:01:21.615346040Z info: Azure.Messaging.ServiceBus[102]
2022-09-10T01:01:21.615358141Z       por******-5e5e0ea1-7b22-4070-b78c-b6bc90361d67: User message handler start: Message: SequenceNumber: 8170623, LockToken: 8dd2385f-a27e-4da1-8003-9f1f63633eea
2022-09-10T01:01:21.615362441Z trce: Ene****.Messaging.ServiceBus.IotMessageConsumerService[0]
2022-09-10T01:01:21.615366342Z       Message '51c3022161aa49ee99e7da0c1ca3d049' from device '5abd3dab-8610-4868-b611-11cfce6fe4b7' of type 'ConsumptionMessageV2'.
2022-09-10T01:01:21.615370142Z info: Azure.Messaging.ServiceBus[16]
2022-09-10T01:01:21.615373942Z       por******-5e5e0ea1-7b22-4070-b78c-b6bc90361d67-Receiver: CompleteAsync start. MessageCount = 1, LockToken = 8dd2385f-a27e-4da1-8003-9f1f63633eea
2022-09-10T01:11:25.607430932Z info: Azure.Messaging.ServiceBus[38]
2022-09-10T01:11:25.611619278Z       Receive Link Closed. Identifier: por******-5e5e0ea1-7b22-4070-b78c-b6bc90361d67-Receiver, SessionId: , linkException: Azure.Messaging.ServiceBus.ServiceBusException: The link 'G2:27800721:amqps://en*****-prod.servicebus.windows.net/-18c9170c;30:37:38:source(address:/por******,filter:[])' is force detached. Code: consumer(link3048477). Details: AmqpMessageConsumer.IdleTimerExpired: Idle timeout: 00:10:00. TrackingId:875fb75300000014002e841d631bcf6c_G2_B4, SystemTracker:en*****-prod:Queue:por******, Timestamp:2022-09-10T01:11:25 (GeneralError).

Unfortunately, we are unable to reproduce this behavior on demand. Any help and hints to solve this issue would be highly appreciated.

danielmarbach commented 2 years ago

@JoshLove-msft @jsquire To add a bit more detail here: we have already gone through most of the custom code that was in place and are fairly certain there is no other potential source of deadlock, because all the custom code completes, but the actual underlying AMQP link interaction and completion seems to hang, as indicated in the logs.

We have already verified a few things like:

[image]

They are running 7.10, which uses version 2.5.10 of the AMQP library; that version already contains the fix for creating the singleton TaskCompletionSource with TaskCreationOptions.RunContinuationsAsynchronously (which would have been a potential source of deadlock). They have also tried switching from the processor to manual receives and completions and have seen the same pattern happen occasionally.

I have tried to reproduce it with a "silly" console application:

using Azure.Core.Diagnostics;
using Azure.Messaging.ServiceBus;
using Azure.Messaging.ServiceBus.Administration;

var queueName = "somequeue";
var managementClient = new ServiceBusAdministrationClient(Environment.GetEnvironmentVariable("AzureServiceBusConnectionString"));
if (await managementClient.QueueExistsAsync(queueName) == true)
{
    await managementClient.DeleteQueueAsync(queueName);
}
await managementClient.CreateQueueAsync(queueName);

var serviceBusClient = new ServiceBusClient(Environment.GetEnvironmentVariable("AzureServiceBusConnectionString"));
await using var sender = serviceBusClient.CreateSender(queueName);
await sender.SendMessageAsync(new ServiceBusMessage("FOOBAR"));
await sender.CloseAsync();

using AzureEventSourceListener listener = AzureEventSourceListener.CreateConsoleLogger();
await using var processor = serviceBusClient.CreateProcessor(queueName, new ServiceBusProcessorOptions
{
    AutoCompleteMessages = false,
    PrefetchCount = 10,
    MaxConcurrentCalls = 1,
    MaxAutoLockRenewalDuration = TimeSpan.FromMinutes(4)
});
processor.ProcessErrorAsync += args =>
{
    Console.WriteLine(args.Exception);
    return Task.CompletedTask;
};
var tcs = new TaskCompletionSource(TaskCreationOptions.RunContinuationsAsynchronously);
processor.ProcessMessageAsync += async args =>
{
    Console.WriteLine($"Got message with id: {args.Message.MessageId}");
    await tcs.Task;
    await args.CompleteMessageAsync(args.Message, args.CancellationToken);
};
await processor.StartProcessingAsync(CancellationToken.None);
Console.ReadLine();
tcs.TrySetResult();

Console.ReadLine();
await processor.StopProcessingAsync(CancellationToken.None);

The idea was, once the message had been received, to cut the network connection, wait for the link timeout and for the renewals to be attempted, and then reconnect to see whether we could reproduce the scenario, but we have failed so far. I was planning to have a more in-depth look into the underlying AMQP interaction at some point but haven't had time yet. To me, it looks like the receiver link is "broken" and doesn't recover, and then things come to a grinding halt.

danielmarbach commented 2 years ago

A quick update. We are trying to get a memory dump from production when this happens and will report back with further investigations. While I was spelunking in the code a bit, I observed two potential problem areas, but I'm not entirely sure whether they are just red herrings, because I'm looking at the code from the sidelines.

One of the prominent things in the stack trace is the ServiceBusRetryPolicy. Could it be that the retry policy goes into some kind of infinite loop due to the server-busy indicator reordering? Another thing I observed (which could be totally my fault :face_with_head_bandage:) is that disposing the message takes a buffer from the pool and then calls AMQP with batchable = true. Once the method returns, the buffer is returned to the pool. If I'm not mistaken, batchable = true means the AMQP IO writers will write the actual payload "behind the scenes". Could we run into edge cases where the lock token buffer is returned before the operation is executed in the background, and then all sorts of craziness happens?

EldertGrootenboer commented 1 year ago

We fixed a bug related to this recently, and the production rollout has been completed. Please reopen this issue if it reproduces in the future.