resmoio / kubernetes-event-exporter

Export Kubernetes events to multiple destinations with routing and filtering
Apache License 2.0

A large number of events are missing; the same event can only be received once #197

Open · wangjinxiang0522 opened this issue 3 months ago

wangjinxiang0522 commented 3 months ago

Hi, I'm running version 1.7.

My configuration is:

config:
  logLevel: debug
  kubeQPS: 100
  kubeBurst: 500
  maxEventAgeSeconds: 600
  metricsNamePrefix: 'event_exporter_'
  logFormat: json
  receivers:
    - name: "dump"
      file:
        path: "/dev/stdout"
        layout:
           message: "{{ .Message }}"
           reason: "{{ .Reason }}"
           type: "{{ .Type }}"
           count: "{{ .Count }}"
           kind: "{{ .InvolvedObject.Kind }}"
           name: "{{ .InvolvedObject.Name }}"
           namespace: "{{ .Namespace }}"
           component: "{{ .Source.Component }}"
           host: "{{ .Source.Host }}"

    - name: "loki"
      loki:
        streamLabels:
          application: kube-api
          container: event-exporter
        url: http://loki-gateway.stage.sprucetec.com/loki/api/v1/push
        tls:
          insecure: true
  route:
    routes:
      - match:
          - receiver: "loki"
          - receiver: "dump"

nodeSelector:
  node-type: monitoring

image:
  registry: reg.sprucetec.com
  repository: monitor/kubernetes-event-exporter
  tag: 1.6.1-debian-12-r16

I reviewed the source code and found that the OnUpdate method is not implemented. Is this the reason why the same event can only be received once?
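
For reference, here is a minimal, self-contained sketch of why a missing update handler drops repeated events. This is not the exporter's actual code; the client-go informer wiring and the printed output are illustrative assumptions:

```go
package main

import (
	"fmt"

	corev1 "k8s.io/api/core/v1"
	"k8s.io/client-go/informers"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/rest"
	"k8s.io/client-go/tools/cache"
)

func main() {
	cfg, err := rest.InClusterConfig()
	if err != nil {
		panic(err)
	}
	client := kubernetes.NewForConfigOrDie(cfg)

	factory := informers.NewSharedInformerFactory(client, 0)
	eventInformer := factory.Core().V1().Events().Informer()

	eventInformer.AddEventHandler(cache.ResourceEventHandlerFuncs{
		// The first occurrence of an event arrives as an ADDED
		// watch notification and is handled here.
		AddFunc: func(obj interface{}) {
			ev := obj.(*corev1.Event)
			fmt.Printf("added: %s %s count=%d\n", ev.Reason, ev.Message, ev.Count)
		},
		// Repeated occurrences do not create new Event objects: the
		// API server increments count/lastTimestamp on the existing
		// object and sends a MODIFIED notification. If UpdateFunc is
		// left nil, every occurrence after the first is silently
		// dropped, which matches the behaviour described here.
		UpdateFunc: func(oldObj, newObj interface{}) {
			ev := newObj.(*corev1.Event)
			fmt.Printf("updated: %s %s count=%d\n", ev.Reason, ev.Message, ev.Count)
		},
	})

	stop := make(chan struct{})
	defer close(stop)
	factory.Start(stop)
	cache.WaitForCacheSync(stop, eventInformer.HasSynced)
	select {} // block forever; the handlers above do the work
}
```

Since Kubernetes deduplicates repeated occurrences into a single Event object and bumps its count, a watcher that only handles adds will emit each event once, no matter how often it recurs.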

jarnfast commented 2 months ago

You're correct. The problem has been addressed in multiple PRs: a quick fix in https://github.com/resmoio/kubernetes-event-exporter/pull/168 and, with a bit more configuration, in https://github.com/resmoio/kubernetes-event-exporter/pull/167

Until either PR gets merged, I opted to build the image myself with the fix from https://github.com/resmoio/kubernetes-event-exporter/pull/168 ;-)

tuxerrante commented 6 days ago

Hi @jarnfast, it seems #168 has also been stuck for a while. Any plans to merge this fix soon? Thanks

jarnfast commented 6 days ago

> Hi @jarnfast, it seems #168 has also been stuck for a while. Any plans to merge this fix soon? Thanks

All PRs seem to be stuck - unfortunately I cannot help with merging them. Maybe pinging @mustafaakin can do the trick?