Azure / azure-service-bus-java

☁️ Java client library for Azure Service Bus
https://azure.microsoft.com/services/service-bus
MIT License

Invalid UUID error might cause OutOfMemoryError #336

Closed · jemag closed this issue 5 years ago

jemag commented 5 years ago

Describe the bug
I will try to be brief, since the issue, with all the necessary details, has already been filed on the spring-cloud-azure side: https://github.com/Microsoft/spring-cloud-azure/issues/472

Basically, there was a problem with the Service Bus topic stream binder, which would not accept message IDs without "-" (hyphens). This causes an exception in their message converter. Those specific exceptions are not the problem, however.

It seems that repeated occurrences of such exceptions eventually bring about an OutOfMemoryError. A quick glance at the heap dump (attached to this comment on that issue: https://github.com/Microsoft/spring-cloud-azure/issues/472#issuecomment-463732071) reports the following:

"144,204 instances of "org.apache.qpid.proton.engine.impl.DeliveryImpl", loaded by "sun.misc.Launcher$AppClassLoader @ 0xf7150000" occupy 154,626,752 (72.99%) bytes."

Since this seems to be part of the Service Bus Java SDK, the folks at spring-cloud-azure have redirected me here.

To Reproduce

Expected behavior
Memory does not steadily increase.

Observed behavior
Memory keeps increasing until an OutOfMemoryError occurs.

Environment (please complete the following information):

Additional context
Most of the information should be available in the spring-cloud-azure issue referenced at the beginning.

yvgopal commented 5 years ago

I have looked at the errors generated by the Spring integration library. Fixing that error will fix the memory issue too. Those DeliveryImpl objects you found on the heap are not leaks. AMQP receivers have to keep deliveries in memory so that they can be completed, dead-lettered, abandoned, or deferred. When a client is receiving messages but not acting on them (in your case, because of the message ID error), those deliveries pile up in memory. Fix the error and the memory issue will be gone.

If the message converter class of the Spring library cannot process a message, it must catch the exception and dead-letter or defer the message. It should not just throw an exception and leave it at that.
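For illustration, here is a minimal sketch of a handler that settles every delivery, dead-lettering messages it cannot process instead of rethrowing. It assumes the com.microsoft.azure.servicebus message-pump API (QueueClient, IMessageHandler); the class name and the process method are hypothetical placeholders for the application's own conversion logic.

```java
import java.util.UUID;
import java.util.concurrent.CompletableFuture;

import com.microsoft.azure.servicebus.ExceptionPhase;
import com.microsoft.azure.servicebus.IMessage;
import com.microsoft.azure.servicebus.IMessageHandler;
import com.microsoft.azure.servicebus.QueueClient;

// Settles every delivery: success -> complete, conversion failure -> dead-letter,
// so no delivery is left unsettled on the AMQP link.
public class DeadLetteringHandler implements IMessageHandler {
    private final QueueClient client;

    public DeadLetteringHandler(QueueClient client) {
        this.client = client;
    }

    @Override
    public CompletableFuture<Void> onMessageAsync(IMessage message) {
        UUID lockToken = message.getLockToken();
        try {
            process(message);                        // hypothetical application processing
            return client.completeAsync(lockToken);  // settle on success
        } catch (Exception e) {
            // Settle on failure too, instead of throwing and leaving the delivery in memory.
            return client.deadLetterAsync(lockToken);
        }
    }

    @Override
    public void notifyException(Throwable exception, ExceptionPhase phase) {
        System.err.println(phase + ": " + exception.getMessage());
    }

    private void process(IMessage message) {
        // application-specific conversion/handling; may throw on an invalid message ID
    }
}
```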

warrenzhu25 commented 5 years ago

I have used IMessageHandler to handle messages in an async way. I suppose this IMessageHandler should handle any exception thrown by onMessageAsync() properly. Do you have a code sample to show how to defer the message when an exception is thrown in onMessageAsync()?

yvgopal commented 5 years ago

IMessageHandler handles any exception thrown by the onMessageAsync() method. That's why the onMessageAsync loop keeps running, even with all the exceptions. But it doesn't abandon or complete the message if the AutoComplete option of MessageHandlerOptions is not true. That's why you are seeing those memory leaks.
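A minimal registration sketch, assuming the QueueClient / MessageHandlerOptions API from com.microsoft.azure.servicebus; the connection string and queue name are placeholders, and DeadLetteringHandler is the hypothetical handler sketched above. With autoComplete set to false, the handler itself has to settle every message; with it set to true, the pump completes or abandons on the handler's behalf.

```java
import java.time.Duration;

import com.microsoft.azure.servicebus.MessageHandlerOptions;
import com.microsoft.azure.servicebus.QueueClient;
import com.microsoft.azure.servicebus.ReceiveMode;
import com.microsoft.azure.servicebus.primitives.ConnectionStringBuilder;

public class RegisterHandlerSketch {
    public static void main(String[] args) throws Exception {
        QueueClient client = new QueueClient(
                new ConnectionStringBuilder("<connection-string>", "<queue-name>"),
                ReceiveMode.PEEKLOCK);

        // autoComplete = false: every message must be completed/abandoned/dead-lettered
        // by the handler, otherwise deliveries stay unsettled and accumulate in memory.
        MessageHandlerOptions options = new MessageHandlerOptions(
                1,                      // maxConcurrentCalls
                false,                  // autoComplete
                Duration.ofMinutes(5)); // maxAutoRenewDuration

        client.registerMessageHandler(new DeadLetteringHandler(client), options);
    }
}
```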

warrenzhu25 commented 5 years ago

I still don't get your point. Suppose that in my scenario AutoComplete is false, and inside onMessageAsync some exception is thrown. The Service Bus SDK should handle this exception in a robust way: if I neither abandon nor complete the message, the SDK shouldn't have any memory leak, and the message is supposed to reappear after the defined lock timeout.

yvgopal commented 5 years ago

What you say is correct. If you keep receiving messages without abandoning or completing them, a receiver is going to leak memory. But I think it is an application error, and the application will have to fix it. AMQP expects a link to keep a list of unsettled deliveries in memory. To fix this leak, I would have to impose some sort of hard limit on count, elapsed time, or something else. I will treat this as a low-priority item and will not fix it in the near future.
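For illustration, a sketch of a receive loop that settles every delivery, so unsettled deliveries cannot accumulate on the link. It assumes the ClientFactory / IMessageReceiver API from com.microsoft.azure.servicebus; the connection string, entity path, and handle method are placeholders.

```java
import com.microsoft.azure.servicebus.ClientFactory;
import com.microsoft.azure.servicebus.IMessage;
import com.microsoft.azure.servicebus.IMessageReceiver;
import com.microsoft.azure.servicebus.ReceiveMode;
import com.microsoft.azure.servicebus.primitives.ConnectionStringBuilder;

public class SettleEveryDelivery {
    public static void main(String[] args) throws Exception {
        IMessageReceiver receiver = ClientFactory.createMessageReceiverFromConnectionStringBuilder(
                new ConnectionStringBuilder("<connection-string>", "<queue-or-subscription-path>"),
                ReceiveMode.PEEKLOCK);

        IMessage message;
        while ((message = receiver.receive()) != null) {   // null when the wait times out
            try {
                handle(message);                            // hypothetical processing
                receiver.complete(message.getLockToken());  // settled: removed from the link
            } catch (Exception e) {
                // Also settled: the message becomes available again (or use deadLetter),
                // rather than remaining as an unsettled delivery held in memory.
                receiver.abandon(message.getLockToken());
            }
        }
        receiver.close();
    }

    private static void handle(IMessage message) {
        // application-specific processing
    }
}
```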

warrenzhu25 commented 5 years ago

Thanks for your clear explanation. It seems abandoning the message in my processing logic is the right way at the moment. It would be nice if the SDK could automatically abandon the message when an exception is thrown from the user's processing logic.