fluent / fluent-bit

Fast and Lightweight Logs and Metrics processor for Linux, BSD, OSX and Windows
https://fluentbit.io
Apache License 2.0

fluent-bit memory not released #8274

Closed: littlejoyo closed this issue 1 week ago

littlejoyo commented 9 months ago

Bug Report

Describe the bug
I have a problem with memory. My fluent-bit build has jemalloc enabled. I found that while fluent-bit processes logs, memory usage gradually increases, and when the traffic stops, the memory is not released right away. Is this normal? Is it related to jemalloc?

After fluent-bit had processed the logs, memory usage was about 340 MiB. I then stopped the traffic, and the next day memory usage was still about 340 MiB. Why doesn't fluent-bit release that memory, even though it is no longer processing any data?

PID USER PR NI VIRT    RES    SHR  S %CPU %MEM TIME+    COMMAND
206 root 20  0 6295508 346876 6424 S  0.7  1.1 73:17.02 fluent-bit

Your Environment

drbugfinder-work commented 9 months ago

Which pulsar input/output plugin are you using? I cannot see any built-in plugin with that name. Maybe there is an issue in a custom plugin.

github-actions[bot] commented 6 months ago

This issue is stale because it has been open 90 days with no activity. Remove stale label or comment or this will be closed in 5 days. Maintainers can add the exempt-stale label.

littlejoyo commented 5 months ago

Yes, the pulsar plugin is a custom Golang input and output plugin built with fluent-bit/src/proxy/go.

thipokch commented 3 months ago

I’m having the same issue with OpenTelemetry too

drbugfinder-work commented 3 months ago

As this is not an official Fluent Bit plugin (at least to my knowledge), I don't see a reason why this issue was opened here. My first guess would be that there is a memory leak in the implementation of that third-party plugin.

In case it really is an issue with the Fluent Bit golang interface, I think the best way to narrow it down would be to create a very simple dummy golang plugin as a reproducer.
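A minimal sketch of such a dummy plugin, assuming the fluent-bit-go output package and a hypothetical plugin name "gostub" (an illustration only, not an officially provided plugin), could look like this:

```go
package main

import (
	"C"
	"unsafe"

	"github.com/fluent/fluent-bit-go/output"
)

//export FLBPluginRegister
func FLBPluginRegister(def unsafe.Pointer) int {
	// "gostub" is a placeholder name used only for this reproducer.
	return output.FLBPluginRegister(def, "gostub", "No-op Go output plugin")
}

//export FLBPluginInit
func FLBPluginInit(plugin unsafe.Pointer) int {
	return output.FLB_OK
}

//export FLBPluginFlush
func FLBPluginFlush(data unsafe.Pointer, length C.int, tag *C.char) int {
	// Decode every record and immediately discard it, so the plugin itself
	// holds no references; any remaining RSS growth would then point at the
	// Go proxy interface or the allocator rather than at plugin logic.
	dec := output.NewDecoder(data, int(length))
	for {
		ret, _, _ := output.GetRecord(dec)
		if ret != 0 {
			break
		}
	}
	return output.FLB_OK
}

//export FLBPluginExit
func FLBPluginExit() int {
	return output.FLB_OK
}

func main() {}
```

Assuming a standard cgo build (go build -buildmode=c-shared -o out_gostub.so .), it could then be loaded with fluent-bit's -e flag and driven with the built-in dummy input, e.g. fluent-bit -e ./out_gostub.so -i dummy -p rate=5000 -o gostub -m '*', while watching RSS before and after the load stops.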

@patrick-stephens What's your opinion about this?

patrick-stephens commented 3 months ago

Yeah, I think it needs a reproducer that is simple and uses something officially supported by the OSS project. We can't be debugging every possible implementation. It may be that there is some general issue with the golang interface, but then it should be reproducible.

We need a reproducible problem report with the provided plugins or a way to do so. I'm not sure if @thipokch is saying there is a problem using the official OTEL input or output plugin either.

thipokch commented 3 months ago

I’m using OTEL as output.

P.S. I'm not sure if this is relevant, but it might be useful: in Grafana Alloy, they implemented a memory limiter for OTEL:

https://grafana.com/docs/alloy/latest/reference/components/otelcol.processor.memory_limiter/

https://github.com/open-telemetry/opentelemetry-collector/blob/main/processor/memorylimiterprocessor/README.md

github-actions[bot] commented 2 weeks ago

This issue is stale because it has been open 90 days with no activity. Remove stale label or comment or this will be closed in 5 days. Maintainers can add the exempt-stale label.

github-actions[bot] commented 1 week ago

This issue was closed because it has been stalled for 5 days with no activity.