elastic / elasticsearch-java

Official Elasticsearch Java Client
Apache License 2.0

Netty Buffer Leak when upgrading Java API client version 8.15.X #893

Closed Jangis93 closed 1 week ago

Jangis93 commented 1 week ago

Java API client version

8.15.2

Java version

OpenJDK Runtime Environment Temurin-21.0.2+13

Elasticsearch Version

8.13.4

Problem description

Hello,

We noticed that after upgrading the Java API client from 8.14.1 to 8.15.0–8.15.2 (Elasticsearch server still on 8.13.4), memory-leak warnings started appearing in our component's logs; the trace is attached to this issue. We saw that a Netty leak problem was fixed on the server side (https://github.com/elastic/elasticsearch/issues/108369), however our issue resides on the client side. Is there a known issue regarding this in the Java API client 8.15.x versions?
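
For context, a minimal sketch of how an 8.15.x Java API client is typically constructed on top of the low-level RestClient; the host, port and JacksonJsonpMapper choice here are placeholders, not the reporter's actual configuration. The reported leak trace follows below.

```java
import co.elastic.clients.elasticsearch.ElasticsearchClient;
import co.elastic.clients.json.jackson.JacksonJsonpMapper;
import co.elastic.clients.transport.ElasticsearchTransport;
import co.elastic.clients.transport.rest_client.RestClientTransport;
import org.apache.http.HttpHost;
import org.elasticsearch.client.RestClient;

public class ClientFactory {

    public static ElasticsearchClient create() {
        // Low-level REST client; its HTTP layer is Apache HttpAsyncClient,
        // not Netty. Host and port are placeholders.
        RestClient restClient = RestClient.builder(
                new HttpHost("localhost", 9200)).build();

        // Transport bridging the low-level client and the Java API client,
        // using Jackson for JSON (de)serialization.
        ElasticsearchTransport transport =
                new RestClientTransport(restClient, new JacksonJsonpMapper());

        return new ElasticsearchClient(transport);
    }
}
```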

LEAK: ByteBuf.release() was not called before it's garbage-collected. See https://netty.io/wiki/reference-counted-objects.html for more information.
Recent access records:
Created at:
    io.netty.buffer.PooledByteBufAllocator.newDirectBuffer(PooledByteBufAllocator.java:410)
    io.netty.buffer.AbstractByteBufAllocator.directBuffer(AbstractByteBufAllocator.java:188)
    io.netty.buffer.AbstractByteBufAllocator.directBuffer(AbstractByteBufAllocator.java:179)
    io.netty.buffer.AbstractByteBufAllocator.buffer(AbstractByteBufAllocator.java:116)
    org.springframework.core.io.buffer.NettyDataBufferFactory.allocateBuffer(NettyDataBufferFactory.java:72)
    org.springframework.core.io.buffer.NettyDataBufferFactory.allocateBuffer(NettyDataBufferFactory.java:39)
    org.springframework.http.codec.json.AbstractJackson2Encoder.encodeValue(AbstractJackson2Encoder.java:266)
    org.springframework.http.codec.json.AbstractJackson2Encoder.lambda$encode$0(AbstractJackson2Encoder.java:158)
    reactor.core.publisher.FluxMapFuseable$MapFuseableSubscriber.onNext(FluxMapFuseable.java:113)
    reactor.core.publisher.FluxHide$SuppressFuseableSubscriber.onNext(FluxHide.java:137)
    reactor.core.publisher.MonoFlatMap$FlatMapMain.secondComplete(MonoFlatMap.java:245)
    reactor.core.publisher.MonoFlatMap$FlatMapInner.onNext(MonoFlatMap.java:305)
    reactor.core.publisher.FluxOnErrorResume$ResumeSubscriber.onNext(FluxOnErrorResume.java:79)
    reactor.core.publisher.FluxMapFuseable$MapFuseableSubscriber.onNext(FluxMapFuseable.java:129)
    reactor.core.publisher.FluxHide$SuppressFuseableSubscriber.onNext(FluxHide.java:137)
    reactor.core.publisher.Operators$BaseFluxToMonoOperator.completePossiblyEmpty(Operators.java:2097)
    reactor.core.publisher.MonoReduceSeed$ReduceSeedSubscriber.onComplete(MonoReduceSeed.java:163)
    reactor.core.publisher.FluxMap$MapSubscriber.onComplete(FluxMap.java:144)
    reactor.core.publisher.FluxMergeSequential$MergeSequentialMain.drain(FluxMergeSequential.java:374)
    reactor.core.publisher.FluxMergeSequential$MergeSequentialMain.innerComplete(FluxMergeSequential.java:335)
    reactor.core.publisher.FluxMergeSequential$MergeSequentialInner.onComplete(FluxMergeSequential.java:591)
    reactor.core.publisher.MonoZip$ZipCoordinator.signal(MonoZip.java:298)
    reactor.core.publisher.MonoZip$ZipInner.onNext(MonoZip.java:478)
    reactor.core.publisher.MonoPeekTerminal$MonoTerminalPeekSubscriber.onNext(MonoPeekTerminal.java:180)
    reactor.core.publisher.SerializedSubscriber.onNext(SerializedSubscriber.java:99)
    reactor.core.publisher.FluxRetryWhen$RetryWhenMainSubscriber.onNext(FluxRetryWhen.java:178)
    reactor.core.publisher.FluxPeek$PeekSubscriber.onNext(FluxPeek.java:200)
    reactor.core.publisher.FluxOnErrorResume$ResumeSubscriber.onNext(FluxOnErrorResume.java:79)
    io.github.resilience4j.reactor.circuitbreaker.operator.CircuitBreakerSubscriber.hookOnNext(CircuitBreakerSubscriber.java:59)
    reactor.core.publisher.BaseSubscriber.onNext(BaseSubscriber.java:160)
    reactor.core.publisher.FluxPeek$PeekSubscriber.onNext(FluxPeek.java:200)
    reactor.core.publisher.MonoCompletionStage$MonoCompletionStageSubscription.apply(MonoCompletionStage.java:121)
    reactor.core.publisher.MonoCompletionStage$MonoCompletionStageSubscription.apply(MonoCompletionStage.java:67)
    java.base/java.util.concurrent.CompletableFuture.uniHandle(CompletableFuture.java:934)
    java.base/java.util.concurrent.CompletableFuture$UniHandle.tryFire(CompletableFuture.java:911)
    java.base/java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:510)
    java.base/java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:2179)
    co.elastic.clients.transport.ElasticsearchTransportBase.lambda$performRequestAsync$0(ElasticsearchTransportBase.java:213)
    java.base/java.util.concurrent.CompletableFuture.uniHandle(CompletableFuture.java:934)
    java.base/java.util.concurrent.CompletableFuture$UniHandle.tryFire(CompletableFuture.java:911)
    java.base/java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:510)
    java.base/java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:2179)
    co.elastic.clients.transport.rest_client.RestClientHttpClient$1.onSuccess(RestClientHttpClient.java:115)
    datadog.trace.instrumentation.elasticsearch7.RestResponseListener.onSuccess(RestResponseListener.java:26)
    org.elasticsearch.client.RestClient$FailureTrackingResponseListener.onSuccess(RestClient.java:680)
    org.elasticsearch.client.RestClient$1.completed(RestClient.java:403)
    org.elasticsearch.client.RestClient$1.completed(RestClient.java:397)
    datadog.trace.instrumentation.apachehttpasyncclient.TraceContinuedFutureCallback.completeDelegate(TraceContinuedFutureCallback.java:83)
    datadog.trace.instrumentation.apachehttpasyncclient.TraceContinuedFutureCallback.completed(TraceContinuedFutureCallback.java:43)
    org.apache.http.concurrent.BasicFuture.completed(BasicFuture.java:122)
    org.apache.http.impl.nio.client.DefaultClientExchangeHandlerImpl.responseCompleted(DefaultClientExchangeHandlerImpl.java:182)
    org.apache.http.nio.protocol.HttpAsyncRequestExecutor.processResponse(HttpAsyncRequestExecutor.java:448)
    org.apache.http.nio.protocol.HttpAsyncRequestExecutor.inputReady(HttpAsyncRequestExecutor.java:338)
    org.apache.http.impl.nio.DefaultNHttpClientConnection.consumeInput(DefaultNHttpClientConnection.java:265)
    org.apache.http.impl.nio.client.InternalIODispatch.onInputReady(InternalIODispatch.java:87)
    org.apache.http.impl.nio.client.InternalIODispatch.onInputReady(InternalIODispatch.java:40)
    org.apache.http.impl.nio.reactor.AbstractIODispatch.inputReady(AbstractIODispatch.java:121)
    org.apache.http.impl.nio.reactor.BaseIOReactor.readable(BaseIOReactor.java:162)
    org.apache.http.impl.nio.reactor.AbstractIOReactor.processEvent(AbstractIOReactor.java:337)
    org.apache.http.impl.nio.reactor.AbstractIOReactor.processEvents(AbstractIOReactor.java:315)
    org.apache.http.impl.nio.reactor.AbstractIOReactor.execute(AbstractIOReactor.java:276)
    org.apache.http.impl.nio.reactor.BaseIOReactor.execute(BaseIOReactor.java:104)
    org.apache.http.impl.nio.reactor.AbstractMultiworkerIOReactor$Worker.run(AbstractMultiworkerIOReactor.java:591)
    java.base/java.lang.Thread.run(Thread.java:1583)
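
Reports like the one above can be made more detailed by raising Netty's leak-detection level, so that the access records show every touch point of the buffer rather than only the creation site. A minimal sketch, assuming the level is set programmatically at startup; the equivalent JVM flag is -Dio.netty.leakDetection.level=paranoid:

```java
import io.netty.util.ResourceLeakDetector;

public class LeakDebugging {

    public static void main(String[] args) {
        // PARANOID tracks every allocated buffer and records every access, so a
        // subsequent "LEAK: ByteBuf.release() was not called" report shows the
        // complete path the buffer took before being garbage-collected.
        // This is expensive; enable it only while reproducing the leak.
        ResourceLeakDetector.setLevel(ResourceLeakDetector.Level.PARANOID);

        // ... start the application as usual (placeholder).
    }
}
```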

l-trotta commented 1 week ago

Hello, the Java client does not use Netty; in particular, it does not allocate any ByteBuf, which is a Netty-specific class. Most probably the leak occurs in other components that consume the response the client provides.
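
The trace itself is consistent with this: the lower frames (co.elastic.clients.transport.rest_client.RestClientHttpClient, org.apache.http.impl.nio.*) show the client running on Apache HttpAsyncClient, while the buffer in the "Created at" section is allocated by Spring's NettyDataBufferFactory inside the WebFlux Jackson encoder. A minimal sketch of that kind of pipeline, assuming a WebFlux controller that returns the client's async result as a Mono; the controller, the "products" index and the Product type are hypothetical and not taken from the reporter's application:

```java
import co.elastic.clients.elasticsearch.ElasticsearchAsyncClient;
import co.elastic.clients.elasticsearch.core.SearchResponse;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.RestController;
import reactor.core.publisher.Mono;

// Hypothetical WebFlux controller mirroring the frames in the leak report:
// the client completes a CompletableFuture (ElasticsearchTransportBase /
// RestClientHttpClient in the trace), Reactor adapts it via MonoCompletionStage,
// and Spring's AbstractJackson2Encoder serializes the result into a Netty
// ByteBuf obtained from NettyDataBufferFactory. That buffer belongs to the
// WebFlux response pipeline, not to the Elasticsearch Java client.
@RestController
public class ProductSearchController {

    private final ElasticsearchAsyncClient esClient;

    public ProductSearchController(ElasticsearchAsyncClient esClient) {
        this.esClient = esClient;
    }

    @GetMapping("/products")
    public Mono<SearchResponse<Product>> search() {
        // "products" and Product are made-up names for illustration.
        return Mono.fromCompletionStage(
                esClient.search(s -> s.index("products"), Product.class));
    }

    record Product(String name) {}
}
```

In a setup along these lines, a leaked buffer reported by Netty would point at the reactive response-encoding path (or at other Netty-backed usage elsewhere in the service), rather than at the Elasticsearch transport.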

Jangis93 commented 1 week ago

Hi Laura,

Thank you for your fast reply and the clarification.

I guess the hunt continues for what the issue is.

Kind regards,

Michaela Jangefalk


l-trotta commented 1 week ago

good luck!