jetty / jetty.project


Memory leak in `ArrayRetainableByteBufferPool$RetainedBucket` #11858

Open KGOH opened 4 months ago

KGOH commented 4 months ago

Jetty version(s): 11.0.17

Java version/vendor:
openjdk version "21.0.3" 2024-04-16 LTS
OpenJDK Runtime Environment Temurin-21.0.3+9 (build 21.0.3+9-LTS)
OpenJDK 64-Bit Server VM Temurin-21.0.3+9 (build 21.0.3+9-LTS, mixed mode, sharing)

OS type/version: Inside an eclipse-temurin:21-alpine Docker container

Description: On several (but not all) instances of my application that use the same functionality, I observe constantly growing direct memory usage, as shown in the graph below (the saw-tooth pattern is due to manual application restarts).

[Graph: direct memory usage over time]

I created a heap dump using `jmap -dump:format=b,file=heap.hprof $PID` and inspected it with VisualVM and Eclipse MAT; both point to `org.eclipse.jetty.io.ArrayRetainableByteBufferPool$RetainedBucket`.

Here are screenshots of the dump inspection in VisualVM and Eclipse MAT:

[Screenshots: VisualVM and Eclipse MAT heap dump views]

How to reproduce? 🤷

lorban commented 4 months ago

Memory leaks are notoriously hard to track down, and sometimes hard to differentiate from normal (but large) memory consumption. In your MAT screenshot, I can see that the retained bucket has almost 1.3 million buffer entries in it, so I'm fairly positive that there is indeed some form of leak somewhere.

Unfortunately, almost everything in Jetty works with buffers, so just knowing that something, somewhere is not always releasing them isn't nearly enough information to stand a chance of tracking down the bug. We are going to need your help to narrow down the areas that may contain the leaky code.

The key to identifying what's causing the leak is to find some correlation between certain request types and the growth in memory usage, and then to try to reproduce the leak with a much narrower sample app.

Here's a list of places to start looking for correlations:

In the meantime, you can try limiting the maximum amount of memory retained by the buffer pool by configuring the `maxBucketSize`, `maxHeapMemory` and `maxDirectMemory` settings of the `ArrayByteBufferPool`. This could affect performance, but should cap the memory Jetty uses for its buffers.
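As a rough illustration of that suggestion, here is a minimal embedded-Jetty sketch. It assumes the six-argument `ArrayByteBufferPool` constructor (`minCapacity`, `factor`, `maxCapacity`, `maxBucketSize`, `maxHeapMemory`, `maxDirectMemory`) and that connectors pick up a pool registered as a bean on the `Server`; exact parameter names and wiring may vary between Jetty versions, and the 64 MiB limits are arbitrary example values.

```java
import org.eclipse.jetty.io.ArrayByteBufferPool;
import org.eclipse.jetty.server.Server;
import org.eclipse.jetty.server.ServerConnector;

public class BufferPoolLimits
{
    public static void main(String[] args) throws Exception
    {
        // Assumed argument order: minCapacity, factor, maxCapacity,
        // maxBucketSize, maxHeapMemory, maxDirectMemory (bytes).
        ArrayByteBufferPool bufferPool = new ArrayByteBufferPool(
            -1,                // minCapacity: use the default
            -1,                // factor: use the default bucket sizing factor
            -1,                // maxCapacity: use the default
            64,                // maxBucketSize: at most 64 pooled buffers per bucket
            64 * 1024 * 1024,  // maxHeapMemory: cap retained heap memory at 64 MiB
            64 * 1024 * 1024); // maxDirectMemory: cap retained direct memory at 64 MiB

        Server server = new Server();
        // Register the pool as a bean before creating connectors so they
        // discover the custom pool instead of creating a default one.
        server.addBean(bufferPool);

        ServerConnector connector = new ServerConnector(server);
        connector.setPort(8080);
        server.addConnector(connector);

        server.start();
        server.join();
    }
}
```

If you run the standalone distribution rather than embedded Jetty, the same limits should be settable through the byte buffer pool module/XML configuration instead.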

Finally, another thing worth mentioning: Jetty 12 is much more robust in the face of buffer leaks because, among other improvements, it contains code to detect and repair them most of the time, so you may want to give it a shot if you can.

KGOH commented 1 month ago

Hello, sorry for the silence. I've been investigating another memory leak occurring in Caddy, the proxy in front of my Jetty server. I've replaced Caddy with nginx and the memory consumption rate has changed significantly. I'll keep observing for another couple of weeks.

If you have any idea how Caddy could cause a memory leak in Jetty, let me know.

In the meantime, here's the related issue in the Caddy repo: https://github.com/caddyserver/caddy/issues/6322#issuecomment-2273220741

lorban commented 1 month ago

Caddy must be doing something slightly different from nginx that's triggering the leak in Jetty, but it's hard to be more precise than that.

Something else that may help: we have backported the leak-tracking buffer pool from Jetty 12, and it's going to land in the soon-to-be-released Jetty 11.0.23. Once that release is out, you may want to configure the leak-tracking pool, wait until you can confirm that memory consumption has reached an abnormal level, and then take a heap dump.

The tracking pool should then contain some fairly precise information about the root cause of the leak that should help us track it down.
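Once 11.0.23 is available, wiring in the tracking pool might look roughly like the sketch below. This is based on Jetty 12's `ArrayByteBufferPool.Tracking` and the Jetty 12 `Server` constructor that accepts a `ByteBufferPool`; the backported class name and the wiring in 11.0.23 may differ, so treat this as an assumption rather than a recipe.

```java
import org.eclipse.jetty.io.ArrayByteBufferPool;
import org.eclipse.jetty.server.Server;
import org.eclipse.jetty.server.ServerConnector;
import org.eclipse.jetty.util.thread.QueuedThreadPool;
import org.eclipse.jetty.util.thread.ScheduledExecutorScheduler;

public class LeakTrackingPool
{
    public static void main(String[] args) throws Exception
    {
        // The Tracking pool records where each buffer was acquired, so
        // buffers that are never released can be traced back to the
        // acquiring code when the heap dump is inspected.
        ArrayByteBufferPool.Tracking bufferPool = new ArrayByteBufferPool.Tracking();

        // Jetty 12 allows passing the buffer pool straight into the Server;
        // on Jetty 11 the pool would instead be registered with
        // server.addBean(bufferPool).
        Server server = new Server(new QueuedThreadPool(), new ScheduledExecutorScheduler(), bufferPool);

        ServerConnector connector = new ServerConnector(server);
        connector.setPort(8080);
        server.addConnector(connector);

        server.start();
        server.join();
    }
}
```

With something like this in place, a heap dump taken after memory has grown to an abnormal level should contain the recorded acquisition sites alongside the leaked buffers, which is the kind of information that would help pin down the root cause.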