fluent / fluent-bit

Fast and Lightweight Logs and Metrics processor for Linux, BSD, OSX and Windows
https://fluentbit.io
Apache License 2.0

Multiline filter crashes after buffer limit is reached, causing high CPU and memory usage #4940

Closed · staniondaniel closed this 4 months ago

staniondaniel commented 2 years ago

Bug Report

Describe the bug

Hello

The multiline filter crashes on pods that generate a large volume of logs once Emitter_Mem_Buf_Limit is reached. On pods with a normal/low log volume it works without problems.

To Reproduce

    [2022/02/25 15:40:06] [ info] [input:tail:tail.0] inode=4540032 handle rotation(): /var/log/containers/xxxxx.log => /var/lib/docker/containers/xxxxx/xxxxx-json.log.5
    [2022/02/25 15:40:06] [ info] [input:tail:tail.0] inotify_fs_add(): inode=4540032 watch_fd=113 name=/var/lib/docker/containers/xxxxx/xxxxx-json.log.5
    [2022/02/25 15:40:07] [ info] [input:tail:tail.0] inotify_fs_remove(): inode=4592513 watch_fd=111
    [2022/02/25 15:40:12] [ info] [input:tail:tail.0] inotify_fs_remove(): inode=3353782 watch_fd=112
    [2022/02/25 15:40:12] [ info] [input:tail:tail.0] inotify_fs_remove(): inode=4540032 watch_fd=113
    [2022/02/25 15:40:27] [ info] [input:tail:tail.0] inode=3353789 handle rotation(): /var/log/containers/xxxxx.log => /var/lib/docker/containers/xxxxx/xxxxx-json.log.7
    [2022/02/25 15:40:27] [ info] [input:tail:tail.0] inotify_fs_add(): inode=3353789 watch_fd=114 name=/var/lib/docker/containers/xxxxx/xxxxx-json.log.7
    [2022/02/25 15:40:27] [ warn] [input] emitter.3 paused (mem buf overlimit)
    [2022/02/25 15:40:27] [error] [input:emitter:emitter_for_multiline.0] error registering chunk with tag: kube.var.log.containers.xxxxx.log
    [2022/02/25 15:40:27] [error] [input:emitter:emitter_for_multiline.0] error registering chunk with tag: kube.var.log.containers.xxxxx.log
    [2022/02/25 15:40:27] [error] [input:emitter:emitter_for_multiline.0] error registering chunk with tag: kube.var.log.containers.xxxxx.log
    [2022/02/25 15:40:27] [error] [input:emitter:emitter_for_multiline.0] error registering chunk with tag: kube.var.log.containers.xxxxx.log
    [2022/02/25 15:40:27] [error] [input:emitter:emitter_for_multiline.0] error registering chunk with tag: kube.var.log.containers.xxxxx.log
    [2022/02/25 15:40:27] [error] [input:emitter:emitter_for_multiline.0] error registering chunk with tag: kube.var.log.containers.xxxxx.log
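
I cannot share the real pod, but a hypothetical load generator along these lines (the stack trace content and the rate are stand-ins of mine, not the actual workload) drives enough multiline events through the filter to hit the emitter limit within seconds:

    # Hypothetical load generator: prints Java-style multiline stack traces
    # to stdout at a high rate, so the tail input plus the multiline filter
    # fill the emitter buffer quickly when this runs in a pod.
    import sys
    import time

    STACK_TRACE = (
        "java.lang.RuntimeException: synthetic error for load testing\n"
        "\tat com.example.Demo.run(Demo.java:42)\n"
        "\tat com.example.Demo.main(Demo.java:13)\n"
    )

    while True:
        sys.stdout.write(STACK_TRACE)  # one multiline event per iteration
        sys.stdout.flush()
        time.sleep(0.001)  # roughly 1000 events/s; tune upward if needed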

This is my configuration (I left only the relevant parts):

    [INPUT]
        Name tail
        Path /var/log/containers/*.log
        DB /var/log/containers/fluentbit_db.db
        Parser docker
        Tag kube.*
        Mem_Buf_Limit 10MB
        Buffer_Chunk_Size 256k
        Buffer_Max_Size 256k
        Skip_Long_Lines On
        Refresh_Interval 1
        multiline.parser docker,cri
......

    [FILTER]
        Name multiline
        Match kube.*
        Emitter_Mem_Buf_Limit 2.4GB
        multiline.key_content log
        multiline.parser java,go,python
    [FILTER]
        Name kubernetes
        Buffer_Size 512k
        Match kube.*
        Merge_Log On
        Merge_Log_Key log_json
        Merge_Log_Trim On
        Keep_Log On
        K8S-Logging.Parser On
        K8S-Logging.Exclude On

....
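
If it helps, this is the variant I plan to try as a workaround, assuming the documented emitter_storage.type option really moves the emitter's buffering to disk instead of memory (I have not verified this yet, and the storage.path value is just an example):

    [SERVICE]
        # filesystem buffering requires a storage path
        storage.path /var/log/flb-storage/

    [FILTER]
        Name multiline
        Match kube.*
        multiline.key_content log
        multiline.parser java,go,python
        # keep emitter chunks on disk; the memory limit then only caps the
        # in-memory portion instead of the whole backlog
        emitter_storage.type filesystem
        Emitter_Mem_Buf_Limit 64MB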

Your Environment

Additional context

The Fluent Bit container keeps crashing after it reaches the memory limit configured for that container. Also, a lot of messages like

    [error] [input:emitter:emitter_for_multiline.0] error registering chunk with tag:

are flooding the Fluent Bit logs.
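
In case it is useful for debugging, the emitter's buffer growth can be watched through the built-in HTTP server (not enabled in my config above; the endpoint below is the standard metrics API as far as I know):

    [SERVICE]
        HTTP_Server On
        HTTP_Listen 0.0.0.0
        HTTP_Port   2020

With that enabled, curl -s http://127.0.0.1:2020/api/v1/metrics shows per-plugin record and byte counters, and the counters for the emitter_for_multiline.0 input should climb steadily until the "paused (mem buf overlimit)" message appears.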