fluent / fluent-bit

Fast and Lightweight Logs and Metrics processor for Linux, BSD, OSX and Windows
https://fluentbit.io
Apache License 2.0

fluentbit_filter_drop_records_total metric is increasing when using multiline filter #8923

Closed: ashishmodi7 closed this issue 1 month ago

ashishmodi7 commented 5 months ago

Bug Report

Describe the bug The fluentbit_filter_drop_records_total metric is increasing when using the multiline filter. Records are flowing properly to Splunk, but the filter drop metric is still increasing. This issue is reproducible in Fluent Bit v3.0.6.

To Reproduce Steps to reproduce the problem:

  1. Deploy Fluent Bit in Kubernetes (https://docs.fluentbit.io/manual/installation/kubernetes#installing-with-helm-chart)
  2. Configure port forwarding to view the Prometheus metrics using the commands below:
     export POD_NAME=$(kubectl get pods --namespace default -l "app.kubernetes.io/name=fluent-bit,app.kubernetes.io/instance=fluent-bit" -o jsonpath="{.items[0].metadata.name}")
     kubectl --namespace default port-forward $POD_NAME 2020:2020
  3. Configure the Fluent Bit output to send logs to the Splunk server
  4. Configure the multiline filter following https://docs.fluentbit.io/manual/administration/configuring-fluent-bit/multiline-parsing (a configuration sketch covering steps 3 and 4 is shown after this list)
  5. Check the Prometheus metric "fluentbit_filter_drop_records_total". Records are flowing properly to Splunk, but the filter drop metric is still increasing: curl -s http://127.0.0.1:2020/api/v2/metrics/prometheus | grep drop
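
For reference, a minimal sketch of what steps 3 and 4 could look like in classic Fluent Bit configuration format. It assumes container logs that carry a "log" key and the built-in docker/cri multiline parsers; the match pattern, Splunk host, and token are placeholders, not values taken from this report:

     # Step 4 (sketch): multiline filter, assuming the log payload is in the "log" key
     [FILTER]
         name                  multiline
         match                 kube.*
         multiline.key_content log
         multiline.parser      docker, cri

     # Step 3 (sketch): Splunk output, host and token are placeholders
     [OUTPUT]
         name         splunk
         match        kube.*
         host         splunk.example.com
         port         8088
         splunk_token YOUR_SPLUNK_HEC_TOKEN
         tls          on
         tls.verify   off

With a configuration along these lines in place, the counter can be watched using the curl command from step 5.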

Expected behavior The Prometheus metric "fluentbit_filter_drop_records_total" should stay at 0.

Screenshots: (screenshot of the metric output attached in the original issue)

Your Environment
Version used: Fluent Bit 3.0.6
Configuration: Default configuration
Environment name and version (e.g. Kubernetes? What version?): Kubernetes Client Version v1.30.1, Kustomize Version v5.0.4-0.20230601165947-6ce0bf390ce3, Server Version v1.30.0
Server type and version: Linux
Operating System and version: Rocky Linux 8.9
Filters and plugins: Splunk

Additional context

patrick-stephens commented 5 months ago

I think this is a duplicate of #6699

github-actions[bot] commented 2 months ago

This issue is stale because it has been open 90 days with no activity. Remove the stale label or add a comment, or this will be closed in 5 days. Maintainers can add the exempt-stale label.

ashishmodi7 commented 2 months ago

Hello Team, Any update on this ticket?

RohitKhurana88 commented 1 month ago

Hello Team, May we get an update on this ticket?

patrick-stephens commented 1 month ago

Please watch the duplicate https://github.com/fluent/fluent-bit/issues/6699