logdna / logdna-agent-v2

The blazingly fast, resource efficient log collection client
https://logdna.com
MIT License

Potential memory leak in 3.2.0 #179

Open maxfliri opened 2 years ago

maxfliri commented 2 years ago

I'm running logdna-agent in Kubernetes. After upgrading from version 2.2.4 to 3.2.0, I noticed that the container's memory usage constantly increases over time, up to the configured memory limit; after running close to the limit for a while, the container finally crashes.

Here is a graph taken from our monitoring system showing the memory usage of a container running with the image logdna/logdna-agent:3.2.0 over a period of 9 days – the drops in the graph show when the container crashed and restarted.

[image: memory usage of logdna/logdna-agent:3.2.0 over 9 days]

For comparison, here is a graph showing the memory usage of logdna/logdna-agent:2.2.4 on the same cluster over a period of 7 days - the working set size is mostly flat and sits between 18 and 26 MB.

[image: memory usage of logdna/logdna-agent:2.2.4 over 7 days]

jorgebay commented 2 years ago

Thanks for filing the ticket @maxfliri

Certainly the memory growth is unusual.

Can you retrieve a log message, from the agent on a pod that has been running for more than 2 days, with the following format?

Removed file information, currently tracking %d files and directories

For example, you could use the following query in the web UI: host:logdna-agent-xyz currently tracking

Also, the metrics log messages from the same pod would help (some of the latest messages): host:logdna-agent-xyz metrics

Are you setting any particular settings besides the ingestion key?

jorgebay commented 2 years ago

Any more info @maxfliri?

jorgebay commented 2 years ago

I was able to reproduce it on K8s; there's no need to provide further info.

I'll try to find the underlying cause and keep you posted.

pbadenski commented 2 years ago

Any updates on this? Can't really move from 2.X to 3.X because of this issue :(

SuperQ commented 2 years ago

We tried to upgrade from 3.0.x to 3.3.x and it looks like we're seeing something similar. The process starts up and bursts to 1500Mi memory for a few minutes, then settles down to 300Mi or so.

We have not bisected the versions yet to see what change caused it.

dhable commented 2 years ago

After merging LOG-11566 (Bump rocksdb crate version to 0.17 and switch to tikv-jemallocator) into the master branch, I recompiled and ran a benchmark profile under heaptrack for 1 hr 52 min, during which the agent streamed data from 4-6 log files that were continuously being written to. (NOTE: I had to compile heaptrack from source on my system since the existing RPM packages were incompatible with the DWARF format emitted by a fresh build of the agent.) The heaptrack test was run with the --debug flag in order to capture detailed code metrics and track down any issues. The following is the heap usage during the entire run.

[image: heap usage during the full heaptrack run]

There is a steep initial climb on startup, but otherwise the heap usage chart appears to be flat for the duration of the test. This suggests that the agent is not leaking memory under these conditions, even though the agent is constantly performing memory allocations as part of its normal operation.

[image]

heaptrack did flag two allocations by rocksdb as leaked memory.

[image: allocations flagged as leaked by heaptrack]

The stack traces for both of these allocations originate in the main() function, where the agent initializes its internal state and then holds on to these allocations for the life of the application. I suspect that heaptrack flags them as leaks simply because the allocations still exist when the application terminates, rather than recognizing the global nature of these sections of code. There is nothing in the rest of the captured data that makes me believe these allocations are a problem.
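As a tiny illustration of the pattern being described (not the agent's actual code): an allocation made at startup and intentionally held for the life of the process is reachable global state, yet a profiler that only inspects what is still allocated at termination may report it as a leak.

use std::sync::OnceLock;

// Global state that is initialized once at startup and kept for the whole run.
static STATE: OnceLock<Vec<u8>> = OnceLock::new();

fn main() {
    // Still allocated when the process exits, so an end-of-run heap report
    // can flag it as "leaked" even though it is deliberately long-lived.
    let state = STATE.get_or_init(|| vec![0u8; 1024 * 1024]);
    println!("holding {} bytes of global state", state.len());
}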

Internally, rocksdb is going to use memory to avoid constantly hitting disk for every operation. We may want to consider making some of the memory buffer parameters within rocksdb configurable for customers who want to control the memory usage of the agent. I've opened LOG-11609 so we can flesh out/prioritize that effort.
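For illustration only, a minimal sketch of the kind of knobs this could expose, using the rust rocksdb crate's Options API; the sizes and path below are made-up example values, not the agent's defaults:

use rocksdb::{DB, Options};

fn main() {
    let mut opts = Options::default();
    opts.create_if_missing(true);
    // Limit the size of each in-memory memtable and how many are kept around.
    opts.set_write_buffer_size(8 * 1024 * 1024); // 8 MiB per memtable (example value)
    opts.set_max_write_buffer_number(2);
    // Cap total memtable memory across the whole database.
    opts.set_db_write_buffer_size(16 * 1024 * 1024); // 16 MiB overall (example value)
    let db = DB::open(&opts, "/tmp/agent-state-demo").expect("open rocksdb");
    db.put(b"offset:/var/log/example.log", b"12345").expect("write offset");
}

Exposing values like these through agent configuration would let users trade some rocksdb performance for a tighter memory ceiling.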

dhable commented 2 years ago

I also completed a 10-hour endurance test of the agent. I saw the heap usage climb, but noted that as compaction occurred within rocksdb, the heap usage would drop back to the starting value without crashing.

[image: heap usage during the 10-hour endurance test]

This testing was done on the upcoming 3.4 release that is going to beta soon. LOG-11609 was updated to include additional config params and tuning around compaction as well.
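Along the same lines, a rough sketch of compaction-related tuning with the same crate; the trigger value is an arbitrary example, not a recommendation:

use rocksdb::{DB, Options};

fn main() {
    let mut opts = Options::default();
    opts.create_if_missing(true);
    // Start compacting after fewer level-0 files accumulate (example value).
    opts.set_level_zero_file_num_compaction_trigger(2);
    let db = DB::open(&opts, "/tmp/agent-state-demo").expect("open rocksdb");
    // A manual compaction over the full key range drops space held by obsolete entries.
    db.compact_range::<&[u8], &[u8]>(None, None);
}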

maxfliri commented 1 year ago

Is there any update on this?

I have been testing this periodically when new versions of the agent become available, and recently I tested it with version 3.5.0. The problem is still there: the memory grows gradually over time until it reaches the resource limit, then the pod crashes and is restarted. Here are some new charts showing memory usage and restarts from last week over a 7-day period; these metrics are from our non-production Kubernetes cluster, where we are running logdna-agent:3.5.0 and where we have a relatively low volume of logs.

[image: memory usage] [image: pod restarts]

These are the only logs coming from the pod at restart time – I can see these every time a pod gets restarted, but it's unclear whether these are logged immediately before the crash, or immediately after the restart:

Aug 20 03:41:19 logdna-agent INFO fs::tail] initialized symlink "logdna-agent-5sn4s_logdna-agent_logdna-agent-2b4ffecfbf86ffd5d655f43df259589297473ede69d308aa94ebdd7cb61c2c8f.log" as DefaultKey(74v1)
Aug 20 03:41:19 logdna-agent INFO fs::cache::tailed_file] "/var/log/containers/logdna-agent-5sn4s_logdna-agent_logdna-agent-2b4ffecfbf86ffd5d655f43df259589297473ede69d308aa94ebdd7cb61c2c8f.log" was truncated from 2278885 to 68375
Aug 20 03:41:19 logdna-agent INFO fs::tail] initialize event for symlink "logdna-agent-5sn4s_logdna-agent_logdna-agent-481b4a9f438e0fc2a72904f2e4dada407bc35adeeb0d52255fa9fbec37716fae.log", target "/var/log/pods/logdna-agent_logdna-agent-5sn4s_5507fb3f-f1be-4dcb-8a13-535ec14dd94c/logdna-agent/22.log", final target DefaultKey(114v1)
Aug 20 03:41:19 logdna-agent INFO fs::tail] initialized symlink "logdna-agent-5sn4s_logdna-agent_logdna-agent-481b4a9f438e0fc2a72904f2e4dada407bc35adeeb0d52255fa9fbec37716fae.log" as DefaultKey(114v1)

These are the logs showing that the pod was killed due to OOM:

Aug 20 03:41:19 kern.log [8166762.889650] [1460516]     0 1460516   246508    23289   434176        0           984 logdna-agent
Aug 20 03:41:19 kern.log [8166762.890814] Memory cgroup out of memory: Kill process 1460516 (logdna-agent) score 1912 or sacrifice child
Aug 20 03:41:19 kern.log [8166762.891978] Killed process 1460516 (logdna-agent) total-vm:986032kB, anon-rss:75500kB, file-rss:17656kB, shmem-rss:0kB
Aug 20 03:41:19 kern.log [8166762.896673] oom_reaper: reaped process 1460516 (logdna-agent), now anon-rss:0kB, file-rss:0kB, shmem-rss:0kB
toddweb commented 1 year ago

This is still happening on version 3.6.0 for us. Any word on fixing this?

stdmje commented 1 year ago

Any news? I can't believe that a year and a half later the issue still exists and no progress has been made.

netanel-sayada commented 1 year ago

Any update on this? The LogDNA agent keeps getting OOMKilled on version 3.7.0 as well.

y-batsianouski commented 1 year ago

This applies to us too. Are there any updates on this? Maybe it's possible to specify some flag to set a period for resetting memory usage?

dkhokhlov commented 1 year ago

The workaround for the memory leak is to disable the persistent agent state - the file offsets database (rocksdb):

LOGDNA_DB_PATH=/dev/null

Note: this change causes the agent to behave as if there is no previous file offset state after start. This can be partially mitigated by the lookback option.

Uvedale commented 1 year ago

We're also experiencing this. We've just had to accept logdna-agent getting regularly OOMKilled while we wait for a fix. Could this be prioritised?

dkhokhlov commented 1 year ago

@Uvedale This issue has been prioritized and work is ongoing.

Is this workaround not working for you? https://github.com/logdna/logdna-agent-v2/issues/179#issuecomment-1446871271

Uvedale commented 1 year ago

Thanks @dkhokhlov. I could be mistaken, but my understanding is that the workaround options would result in potentially duplicated or missing logs, depending on how you choose to handle a restart when there is no DB. We'd rather keep the DB if that is the case.

Afikoman commented 1 year ago

Any updates? We have been facing this for quite some time.

JunliWang commented 10 months ago

We are also facing the same issue, even after upgrading to the latest 3.8.8. I saw that commits on Aug 2 may contain a potential fix to reduce the memory footprint, but they are not included in any release. What's the plan for releasing the fix?

dkhokhlov commented 10 months ago

Re: 3.8.8 - is it k8s? What image name do you use?

JunliWang commented 10 months ago

Yes, we build the source code from v3.8.8 using Rust 1.68, use the ubi8.7 base image, and run it with k8s 1.25 on IBM Cloud, using the same cpu and memory limit settings as https://github.com/logdna/logdna-agent-v2/blob/master/k8s/agent-resources.yaml#L163

dkhokhlov commented 10 months ago

What is the RES memory usage? The agent prints 'metrics' every minute. What is the rate of the 'writes' value (writes per minute)? How long does it take to get OOM after start?
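For context: the memory values in the agent's metrics lines (active, allocated, resident) appear to be jemalloc statistics, since the agent uses tikv-jemallocator. A minimal sketch, assuming the companion tikv-jemalloc-ctl crate, of how such values can be read:

use tikv_jemalloc_ctl::{epoch, stats};

// Use jemalloc as the global allocator so the statistics are meaningful.
#[global_allocator]
static ALLOC: tikv_jemallocator::Jemalloc = tikv_jemallocator::Jemalloc;

fn main() {
    // jemalloc caches its statistics; advancing the epoch refreshes them.
    epoch::advance().unwrap();
    let allocated = stats::allocated::read().unwrap(); // bytes handed to the application
    let active = stats::active::read().unwrap();       // bytes in active pages
    let resident = stats::resident::read().unwrap();   // bytes physically resident
    println!("allocated={allocated} active={active} resident={resident}");
}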

JunliWang commented 10 months ago

Captured a few lines before the OOM happened; this is from a dev cluster.

2023-09-22T18:40:41.625183Z  INFO metrics: {"fs":{"events":816548,"creates":21,"deletes":28,"writes":816499,"lines":310,"bytes":29985,"files_tracked":282},"memory":{"active":541609984,"allocated":526899256,"resident":550993920},"ingest":{"requests":45,"requests_size":206933,"rate_limits":0,"retries":0,"retries_success":0,"retries_failure":0,"requests_duration":1275.252,"requests_timed_out":0,"requests_failed":0,"requests_succeeded":45},"k8s":{"lines":0,"creates":369,"deletes":1,"events":370},"journald":{"lines":0,"bytes":0},"retry":{"pending":0,"storage_used":0}}
2023-09-22T18:41:41.626438Z  INFO metrics: {"fs":{"events":816973,"creates":21,"deletes":28,"writes":816924,"lines":310,"bytes":29985,"files_tracked":282},"memory":{"active":541601792,"allocated":526890968,"resident":550985728},"ingest":{"requests":45,"requests_size":206933,"rate_limits":0,"retries":0,"retries_success":0,"retries_failure":0,"requests_duration":1275.252,"requests_timed_out":0,"requests_failed":0,"requests_succeeded":45},"k8s":{"lines":0,"creates":369,"deletes":1,"events":370},"journald":{"lines":0,"bytes":0},"retry":{"pending":0,"storage_used":0}}
2023-09-22T18:42:41.627303Z  INFO metrics: {"fs":{"events":817315,"creates":21,"deletes":28,"writes":817266,"lines":310,"bytes":29985,"files_tracked":282},"memory":{"active":541609984,"allocated":526899256,"resident":550993920},"ingest":{"requests":45,"requests_size":206933,"rate_limits":0,"retries":0,"retries_success":0,"retries_failure":0,"requests_duration":1275.252,"requests_timed_out":0,"requests_failed":0,"requests_succeeded":45},"k8s":{"lines":0,"creates":369,"deletes":1,"events":370},"journald":{"lines":0,"bytes":0},"retry":{"pending":0,"storage_used":0}}
2023-09-22T18:43:41.628544Z  INFO metrics: {"fs":{"events":817634,"creates":21,"deletes":28,"writes":817585,"lines":310,"bytes":29985,"files_tracked":282},"memory":{"active":541609984,"allocated":526899256,"resident":550993920},"ingest":{"requests":45,"requests_size":206933,"rate_limits":0,"retries":0,"retries_success":0,"retries_failure":0,"requests_duration":1275.252,"requests_timed_out":0,"requests_failed":0,"requests_succeeded":45},"k8s":{"lines":0,"creates":369,"deletes":1,"events":370},"journald":{"lines":0,"bytes":0},"retry":{"pending":0,"storage_used":0}}
2023-09-22T18:44:41.630341Z  INFO metrics: {"fs":{"events":817956,"creates":21,"deletes":28,"writes":817907,"lines":310,"bytes":29985,"files_tracked":282},"memory":{"active":543129600,"allocated":528404120,"resident":552525824},"ingest":{"requests":45,"requests_size":206933,"rate_limits":0,"retries":0,"retries_success":0,"retries_failure":0,"requests_duration":1275.252,"requests_timed_out":0,"requests_failed":0,"requests_succeeded":45},"k8s":{"lines":0,"creates":369,"deletes":1,"events":370},"journald":{"lines":0,"bytes":0},"retry":{"pending":0,"storage_used":0}}
2023-09-22T18:45:41.631049Z  INFO metrics: {"fs":{"events":818353,"creates":21,"deletes":28,"writes":818304,"lines":310,"bytes":29985,"files_tracked":282},"memory":{"active":543637504,"allocated":529031120,"resident":553033728},"ingest":{"requests":45,"requests_size":206933,"rate_limits":0,"retries":0,"retries_success":0,"retries_failure":0,"requests_duration":1275.252,"requests_timed_out":0,"requests_failed":0,"requests_succeeded":45},"k8s":{"lines":0,"creates":369,"deletes":1,"events":370},"journald":{"lines":0,"bytes":0},"retry":{"pending":0,"storage_used":0}}
2023-09-22T18:46:41.632619Z  INFO metrics: {"fs":{"events":818647,"creates":21,"deletes":28,"writes":818598,"lines":310,"bytes":29985,"files_tracked":282},"memory":{"active":543490048,"allocated":528891808,"resident":552886272},"ingest":{"requests":45,"requests_size":206933,"rate_limits":0,"retries":0,"retries_success":0,"retries_failure":0,"requests_duration":1275.252,"requests_timed_out":0,"requests_failed":0,"requests_succeeded":45},"k8s":{"lines":0,"creates":369,"deletes":1,"events":370},"journald":{"lines":0,"bytes":0},"retry":{"pending":0,"storage_used":0}}
2023-09-22T18:47:41.634010Z  INFO metrics: {"fs":{"events":818986,"creates":21,"deletes":28,"writes":818937,"lines":310,"bytes":29985,"files_tracked":282},"memory":{"active":543490048,"allocated":528891856,"resident":552886272},"ingest":{"requests":45,"requests_size":206933,"rate_limits":0,"retries":0,"retries_success":0,"retries_failure":0,"requests_duration":1275.252,"requests_timed_out":0,"requests_failed":0,"requests_succeeded":45},"k8s":{"lines":0,"creates":369,"deletes":1,"events":370},"journald":{"lines":0,"bytes":0},"retry":{"pending":0,"storage_used":0}}
2023-09-22T18:48:41.634864Z  INFO metrics: {"fs":{"events":819254,"creates":21,"deletes":28,"writes":819205,"lines":310,"bytes":29985,"files_tracked":282},"memory":{"active":543490048,"allocated":528891856,"resident":552886272},"ingest":{"requests":45,"requests_size":206933,"rate_limits":0,"retries":0,"retries_success":0,"retries_failure":0,"requests_duration":1275.252,"requests_timed_out":0,"requests_failed":0,"requests_succeeded":45},"k8s":{"lines":0,"creates":369,"deletes":1,"events":370},"journald":{"lines":0,"bytes":0},"retry":{"pending":0,"storage_used":0}}
2023-09-22T18:49:41.635931Z  INFO metrics: {"fs":{"events":819548,"creates":21,"deletes":28,"writes":819499,"lines":310,"bytes":29985,"files_tracked":282},"memory":{"active":543494144,"allocated":528891856,"resident":552890368},"ingest":{"requests":45,"requests_size":206933,"rate_limits":0,"retries":0,"retries_success":0,"retries_failure":0,"requests_duration":1275.252,"requests_timed_out":0,"requests_failed":0,"requests_succeeded":45},"k8s":{"lines":0,"creates":369,"deletes":1,"events":370},"journald":{"lines":0,"bytes":0},"retry":{"pending":0,"storage_used":0}}
2023-09-22T18:50:41.637150Z  INFO metrics: {"fs":{"events":819841,"creates":21,"deletes":28,"writes":819792,"lines":310,"bytes":29985,"files_tracked":282},"memory":{"active":543498240,"allocated":528960328,"resident":552894464},"ingest":{"requests":45,"requests_size":206933,"rate_limits":0,"retries":0,"retries_success":0,"retries_failure":0,"requests_duration":1275.252,"requests_timed_out":0,"requests_failed":0,"requests_succeeded":45},"k8s":{"lines":0,"creates":369,"deletes":1,"events":370},"journald":{"lines":0,"bytes":0},"retry":{"pending":0,"storage_used":0}}
2023-09-22T18:51:41.637999Z  INFO metrics: {"fs":{"events":820151,"creates":21,"deletes":28,"writes":820102,"lines":310,"bytes":29985,"files_tracked":282},"memory":{"active":543490048,"allocated":528952040,"resident":552886272},"ingest":{"requests":45,"requests_size":206933,"rate_limits":0,"retries":0,"retries_success":0,"retries_failure":0,"requests_duration":1275.252,"requests_timed_out":0,"requests_failed":0,"requests_succeeded":45},"k8s":{"lines":0,"creates":369,"deletes":1,"events":370},"journald":{"lines":0,"bytes":0},"retry":{"pending":0,"storage_used":0}}
2023-09-22T18:52:41.639584Z  INFO metrics: {"fs":{"events":820441,"creates":21,"deletes":28,"writes":820392,"lines":310,"bytes":29985,"files_tracked":282},"memory":{"active":543498240,"allocated":528960328,"resident":552894464},"ingest":{"requests":45,"requests_size":206933,"rate_limits":0,"retries":0,"retries_success":0,"retries_failure":0,"requests_duration":1275.252,"requests_timed_out":0,"requests_failed":0,"requests_succeeded":45},"k8s":{"lines":0,"creates":369,"deletes":1,"events":370},"journald":{"lines":0,"bytes":0},"retry":{"pending":0,"storage_used":0}}

And a 30-day view of when it reaches the limit; it does not seem to happen at the same pace, sometimes 2 days, sometimes a few hours. [image: 30-day memory usage view]

JunliWang commented 10 months ago

Metrics logs after the OOM and restart:

2023-09-22T18:53:41.641727Z  INFO metrics: {"fs":{"events":821128,"creates":21,"deletes":28,"writes":821079,"lines":310,"bytes":29985,"files_tracked":338},"memory":{"active":543903744,"allocated":528960328,"resident":553312256},"ingest":{"requests":45,"requests_size":206933,"rate_limits":0,"retries":0,"retries_success":0,"retries_failure":0,"requests_duration":1275.252,"requests_timed_out":0,"requests_failed":0,"requests_succeeded":45},"k8s":{"lines":0,"creates":369,"deletes":1,"events":370},"journald":{"lines":0,"bytes":0},"retry":{"pending":0,"storage_used":0}}
2023-09-22T18:54:41.643316Z  INFO metrics: {"fs":{"events":821420,"creates":21,"deletes":28,"writes":821371,"lines":310,"bytes":29985,"files_tracked":338},"memory":{"active":543903744,"allocated":528960328,"resident":553312256},"ingest":{"requests":45,"requests_size":206933,"rate_limits":0,"retries":0,"retries_success":0,"retries_failure":0,"requests_duration":1275.252,"requests_timed_out":0,"requests_failed":0,"requests_succeeded":45},"k8s":{"lines":0,"creates":369,"deletes":1,"events":370},"journald":{"lines":0,"bytes":0},"retry":{"pending":0,"storage_used":0}}
2023-09-22T18:55:41.644821Z  INFO metrics: {"fs":{"events":821754,"creates":21,"deletes":28,"writes":821705,"lines":310,"bytes":29985,"files_tracked":338},"memory":{"active":543903744,"allocated":528960328,"resident":553312256},"ingest":{"requests":45,"requests_size":206933,"rate_limits":0,"retries":0,"retries_success":0,"retries_failure":0,"requests_duration":1275.252,"requests_timed_out":0,"requests_failed":0,"requests_succeeded":45},"k8s":{"lines":0,"creates":369,"deletes":1,"events":370},"journald":{"lines":0,"bytes":0},"retry":{"pending":0,"storage_used":0}}
2023-09-22T18:56:41.646640Z  INFO metrics: {"fs":{"events":822037,"creates":21,"deletes":28,"writes":821988,"lines":310,"bytes":29985,"files_tracked":338},"memory":{"active":543895552,"allocated":528952040,"resident":553304064},"ingest":{"requests":45,"requests_size":206933,"rate_limits":0,"retries":0,"retries_success":0,"retries_failure":0,"requests_duration":1275.252,"requests_timed_out":0,"requests_failed":0,"requests_succeeded":45},"k8s":{"lines":0,"creates":369,"deletes":1,"events":370},"journald":{"lines":0,"bytes":0},"retry":{"pending":0,"storage_used":0}}
2023-09-22T18:57:41.648169Z  INFO metrics: {"fs":{"events":822326,"creates":21,"deletes":28,"writes":822277,"lines":310,"bytes":29985,"files_tracked":338},"memory":{"active":543903744,"allocated":528960328,"resident":553312256},"ingest":{"requests":45,"requests_size":206933,"rate_limits":0,"retries":0,"retries_success":0,"retries_failure":0,"requests_duration":1275.252,"requests_timed_out":0,"requests_failed":0,"requests_succeeded":45},"k8s":{"lines":0,"creates":369,"deletes":1,"events":370},"journald":{"lines":0,"bytes":0},"retry":{"pending":0,"storage_used":0}}
2023-09-22T18:58:41.648995Z  INFO metrics: {"fs":{"events":822602,"creates":21,"deletes":28,"writes":822553,"lines":310,"bytes":29985,"files_tracked":338},"memory":{"active":543903744,"allocated":528960328,"resident":553312256},"ingest":{"requests":45,"requests_size":206933,"rate_limits":0,"retries":0,"retries_success":0,"retries_failure":0,"requests_duration":1275.252,"requests_timed_out":0,"requests_failed":0,"requests_succeeded":45},"k8s":{"lines":0,"creates":369,"deletes":1,"events":370},"journald":{"lines":0,"bytes":0},"retry":{"pending":0,"storage_used":0}}
2023-09-22T18:59:41.650624Z  INFO metrics: {"fs":{"events":822925,"creates":21,"deletes":28,"writes":822876,"lines":310,"bytes":29985,"files_tracked":338},"memory":{"active":544120832,"allocated":529121232,"resident":553529344},"ingest":{"requests":45,"requests_size":206933,"rate_limits":0,"retries":0,"retries_success":0,"retries_failure":0,"requests_duration":1275.252,"requests_timed_out":0,"requests_failed":0,"requests_succeeded":45},"k8s":{"lines":0,"creates":369,"deletes":1,"events":370},"journald":{"lines":0,"bytes":0},"retry":{"pending":0,"storage_used":0}}
2023-09-22T19:00:41.652687Z  INFO metrics: {"fs":{"events":823246,"creates":21,"deletes":28,"writes":823197,"lines":310,"bytes":29985,"files_tracked":338},"memory":{"active":548155392,"allocated":533122608,"resident":557592576},"ingest":{"requests":45,"requests_size":206933,"rate_limits":0,"retries":0,"retries_success":0,"retries_failure":0,"requests_duration":1275.252,"requests_timed_out":0,"requests_failed":0,"requests_succeeded":45},"k8s":{"lines":0,"creates":369,"deletes":1,"events":370},"journald":{"lines":0,"bytes":0},"retry":{"pending":0,"storage_used":0}}
2023-09-22T19:01:41.654139Z  INFO metrics: {"fs":{"events":823574,"creates":21,"deletes":28,"writes":823525,"lines":310,"bytes":29985,"files_tracked":338},"memory":{"active":548155392,"allocated":533121576,"resident":557592576},"ingest":{"requests":45,"requests_size":206933,"rate_limits":0,"retries":0,"retries_success":0,"retries_failure":0,"requests_duration":1275.252,"requests_timed_out":0,"requests_failed":0,"requests_succeeded":45},"k8s":{"lines":0,"creates":369,"deletes":1,"events":370},"journald":{"lines":0,"bytes":0},"retry":{"pending":0,"storage_used":0}}
2023-09-22T19:02:41.655566Z  INFO metrics: {"fs":{"events":823859,"creates":21,"deletes":28,"writes":823810,"lines":310,"bytes":29985,"files_tracked":338},"memory":{"active":548134912,"allocated":533105192,"resident":557572096},"ingest":{"requests":45,"requests_size":206933,"rate_limits":0,"retries":0,"retries_success":0,"retries_failure":0,"requests_duration":1275.252,"requests_timed_out":0,"requests_failed":0,"requests_succeeded":45},"k8s":{"lines":0,"creates":369,"deletes":1,"events":370},"journald":{"lines":0,"bytes":0},"retry":{"pending":0,"storage_used":0}}
2023-09-22T19:03:41.657724Z  INFO metrics: {"fs":{"events":824199,"creates":21,"deletes":28,"writes":824150,"lines":310,"bytes":29985,"files_tracked":338},"memory":{"active":548134912,"allocated":533105192,"resident":557572096},"ingest":{"requests":45,"requests_size":206933,"rate_limits":0,"retries":0,"retries_success":0,"retries_failure":0,"requests_duration":1275.252,"requests_timed_out":0,"requests_failed":0,"requests_succeeded":45},"k8s":{"lines":0,"creates":369,"deletes":1,"events":370},"journald":{"lines":0,"bytes":0},"retry":{"pending":0,"storage_used":0}}
2023-09-22T19:04:41.659534Z  INFO metrics: {"fs":{"events":824556,"creates":21,"deletes":28,"writes":824507,"lines":310,"bytes":29985,"files_tracked":338},"memory":{"active":548134912,"allocated":533105192,"resident":557572096},"ingest":{"requests":45,"requests_size":206933,"rate_limits":0,"retries":0,"retries_success":0,"retries_failure":0,"requests_duration":1275.252,"requests_timed_out":0,"requests_failed":0,"requests_succeeded":45},"k8s":{"lines":0,"creates":369,"deletes":1,"events":370},"journald":{"lines":0,"bytes":0},"retry":{"pending":0,"storage_used":0}}
2023-09-22T19:05:41.660980Z  INFO metrics: {"fs":{"events":824899,"creates":21,"deletes":28,"writes":824850,"lines":310,"bytes":29985,"files_tracked":338},"memory":{"active":548081664,"allocated":533097024,"resident":557518848},"ingest":{"requests":45,"requests_size":206933,"rate_limits":0,"retries":0,"retries_success":0,"retries_failure":0,"requests_duration":1275.252,"requests_timed_out":0,"requests_failed":0,"requests_succeeded":45},"k8s":{"lines":0,"creates":369,"deletes":1,"events":370},"journald":{"lines":0,"bytes":0},"retry":{"pending":0,"storage_used":0}}
2023-09-22T19:06:41.661937Z  INFO metrics: {"fs":{"events":825187,"creates":21,"deletes":28,"writes":825138,"lines":310,"bytes":29985,"files_tracked":338},"memory":{"active":548069376,"allocated":533088832,"resident":557506560},"ingest":{"requests":45,"requests_size":206933,"rate_limits":0,"retries":0,"retries_success":0,"retries_failure":0,"requests_duration":1275.252,"requests_timed_out":0,"requests_failed":0,"requests_succeeded":45},"k8s":{"lines":0,"creates":369,"deletes":1,"events":370},"journald":{"lines":0,"bytes":0},"retry":{"pending":0,"storage_used":0}}
2023-09-22T19:07:41.663453Z  INFO metrics: {"fs":{"events":825437,"creates":21,"deletes":28,"writes":825388,"lines":310,"bytes":29985,"files_tracked":338},"memory":{"active":548069376,"allocated":533088832,"resident":557506560},"ingest":{"requests":45,"requests_size":206933,"rate_limits":0,"retries":0,"retries_success":0,"retries_failure":0,"requests_duration":1275.252,"requests_timed_out":0,"requests_failed":0,"requests_succeeded":45},"k8s":{"lines":0,"creates":369,"deletes":1,"events":370},"journald":{"lines":0,"bytes":0},"retry":{"pending":0,"storage_used":0}}
2023-09-22T19:08:41.665724Z  INFO metrics: {"fs":{"events":825683,"creates":21,"deletes":28,"writes":825634,"lines":310,"bytes":29985,"files_tracked":338},"memory":{"active":548069376,"allocated":533088832,"resident":557506560},"ingest":{"requests":45,"requests_size":206933,"rate_limits":0,"retries":0,"retries_success":0,"retries_failure":0,"requests_duration":1275.252,"requests_timed_out":0,"requests_failed":0,"requests_succeeded":45},"k8s":{"lines":0,"creates":369,"deletes":1,"events":370},"journald":{"lines":0,"bytes":0},"retry":{"pending":0,"storage_used":0}}
2023-09-22T19:09:41.667062Z  INFO metrics: {"fs":{"events":825914,"creates":21,"deletes":28,"writes":825865,"lines":310,"bytes":29985,"files_tracked":338},"memory":{"active":548069376,"allocated":533088832,"resident":557506560},"ingest":{"requests":45,"requests_size":206933,"rate_limits":0,"retries":0,"retries_success":0,"retries_failure":0,"requests_duration":1275.252,"requests_timed_out":0,"requests_failed":0,"requests_succeeded":45},"k8s":{"lines":0,"creates":369,"deletes":1,"events":370},"journald":{"lines":0,"bytes":0},"retry":{"pending":0,"storage_used":0}}
2023-09-22T19:10:41.668576Z  INFO metrics: {"fs":{"events":826172,"creates":21,"deletes":28,"writes":826123,"lines":310,"bytes":29985,"files_tracked":338},"memory":{"active":548352000,"allocated":533351192,"resident":557793280},"ingest":{"requests":45,"requests_size":206933,"rate_limits":0,"retries":0,"retries_success":0,"retries_failure":0,"requests_duration":1275.252,"requests_timed_out":0,"requests_failed":0,"requests_succeeded":45},"k8s":{"lines":0,"creates":369,"deletes":1,"events":370},"journald":{"lines":0,"bytes":0},"retry":{"pending":0,"storage_used":0}}
dkhokhlov commented 10 months ago

What is the effective config that is printed after the agent starts?

JunliWang commented 10 months ago

I'm not seeing any config in the log, just the lines below showing files being watched or ignored based on the env LOGDNA_EXCLUSION_RULES. This piece sits between the two metrics pieces I posted earlier.

2023-09-22T18:52:47.344212Z  INFO fs::tail: restarting stream, interval=21600
2023-09-22T18:52:47.345030Z  INFO fs::cache: watching "/var/log/"
2023-09-22T18:52:47.345136Z  INFO fs::cache: ignoring "/var/log/syslog"
2023-09-22T18:52:47.345181Z  INFO fs::cache: ignoring "/var/log/falcon-sensor.log.3"
2023-09-22T18:52:47.345241Z  INFO fs::cache: ignoring "/var/log/containerd.log"
2023-09-22T18:52:47.345460Z  INFO fs::cache: watching "/var/log/calico"
2023-09-22T18:52:47.345669Z  INFO fs::cache: watching "/var/log/calico/cni"
2023-09-22T18:52:47.345984Z  INFO fs::cache: watching "/var/log/at-no-rotate"
2023-09-22T18:52:47.346064Z  INFO fs::cache: ignoring "/var/log/falcon-sensor.log.2"
2023-09-22T18:52:47.346085Z  INFO fs::cache: ignoring "/var/log/dmesg"
2023-09-22T18:52:47.346313Z  INFO fs::cache: watching "/var/log/journal"
2023-09-22T18:52:47.346477Z  INFO fs::cache: watching "/var/log/journal/8ebcb27021c24eb79f906530d2418392"
2023-09-22T18:52:47.346533Z  INFO fs::cache: ignoring "/var/log/journal/8ebcb27021c24eb79f906530d2418392/system@c9cf30df7c174854a7186154b01631ae-0000000000012f8f-0006056cc4bd4190.journal"
2023-09-22T18:52:47.346564Z  INFO fs::cache: ignoring "/var/log/journal/8ebcb27021c24eb79f906530d2418392/system@c9cf30df7c174854a7186154b01631ae-0000000000035494-000605c361a83579.journal"
2023-09-22T18:52:47.346616Z  INFO fs::cache: ignoring "/var/log/journal/8ebcb27021c24eb79f906530d2418392/system@c9cf30df7c174854a7186154b01631ae-0000000000024a02-0006059724304574.journal"
2023-09-22T18:52:47.346646Z  INFO fs::cache: ignoring "/var/log/journal/8ebcb27021c24eb79f906530d2418392/system@c9cf30df7c174854a7186154b01631ae-0000000000000001-00060548add959f1.journal"
2023-09-22T18:52:47.346671Z  INFO fs::cache: ignoring "/var/log/journal/8ebcb27021c24eb79f906530d2418392/system.journal"
2023-09-22T18:52:47.346791Z  INFO fs::cache: watching "/var/log/journal/fafbcfdcd5e54268be0e9613f8831331"
2023-09-22T18:52:47.346837Z  INFO fs::cache: ignoring "/var/log/journal/fafbcfdcd5e54268be0e9613f8831331/system.journal"
2023-09-22T18:52:47.347115Z  INFO fs::cache: watching "/var/log/ibmc-block.log"
2023-09-22T18:52:47.347154Z  INFO fs::cache: ignoring "/var/log/btmp"
2023-09-22T18:52:47.347299Z  INFO fs::cache: watching "/var/log/falcon-sensor.log"
2023-09-22T18:52:47.347332Z  INFO fs::cache: ignoring "/var/log/wtmp"
2023-09-22T18:52:47.347507Z  INFO fs::cache: watching "/var/log/landscape"
2023-09-22T18:52:47.347852Z  INFO fs::cache: watching "/var/log/landscape/sysinfo.log"
2023-09-22T18:52:47.347903Z  INFO fs::cache: ignoring "/var/log/falcon-sensor.log.6"
2023-09-22T18:52:47.348085Z  INFO fs::cache: watching "/var/log/containers"
2023-09-22T18:52:47.348271Z  INFO fs::cache: watching "/var/log/containers/falco-cbh4d_default_falco-f09c34ca9a1d2797e37d34d6a9718b67d19c055d998842c97194ff3146195e02.log"
2023-09-22T18:52:47.348487Z  INFO fs::cache: watching "/var/log/pods/default_falco-cbh4d_4f33ec1d-f8ee-4926-b2e1-0978fbeafd02/falco"
2023-09-22T18:52:47.348788Z  INFO fs::cache: watching "/var/log/pods/default_falco-cbh4d_4f33ec1d-f8ee-4926-b2e1-0978fbeafd02/falco/0.log"
2023-09-22T18:52:47.348868Z  INFO fs::cache: watching "/var/log/containers/node-exporter-d82x9_default_node-exporter-5dcdf4dee1b14eef09934df9a3a7436e062bef8627494c6d3954d05c1848824d.log"
2023-09-22T18:52:47.349059Z  INFO fs::cache: watching "/var/log/pods/default_node-exporter-d82x9_b0173568-f127-4c99-8d87-3313dba27fd8/node-exporter"
2023-09-22T18:52:47.349343Z  INFO fs::cache: watching "/var/log/pods/default_node-exporter-d82x9_b0173568-f127-4c99-8d87-3313dba27fd8/node-exporter/1.log"
2023-09-22T18:52:47.349425Z  INFO fs::cache: ignoring "/var/log/containers/ibm-master-proxy-static-10.171.175.246_kube-system_ibm-master-proxy-static-2d5ae571af50b44f56a20cf1c284d45687a817b8d0337269bb2a694a2434d9fa.log"
2023-09-22T18:52:47.349527Z  INFO fs::cache: watching "/var/log/containers/eventstreams-update-libblkid-wv97n_default_pause-5f82b7979e672b37d4e6be84df0605a23b07477097d289aa2165138d1100960b.log"
2023-09-22T18:52:47.349698Z  INFO fs::cache: watching "/var/log/pods/default_eventstreams-update-libblkid-wv97n_5fd865b6-ee01-49e8-bb96-c9942e411321/pause"
2023-09-22T18:52:47.350093Z  INFO fs::cache: watching "/var/log/pods/default_eventstreams-update-libblkid-wv97n_5fd865b6-ee01-49e8-bb96-c9942e411321/pause/0.log"
2023-09-22T18:52:47.350173Z  INFO fs::cache: ignoring "/var/log/containers/syslog-configurator-s5hqk_ibm-services-system_syslog-configurator-fb4fd30946fa1dc168f0fc98acb326ebb59d5800ef5c675c993f51bf7c153d74.log"
2023-09-22T18:52:47.350269Z  INFO fs::cache: ignoring "/var/log/containers/calico-node-vp8lt_kube-system_calico-extension-c618cfa09da0a33ff3c390503688d0613b42375df7b4d67a438d6941726db5d4.log"
2023-09-22T18:52:47.350371Z  INFO fs::cache: watching "/var/log/containers/node-local-dns-dxlrd_kube-system_node-cache-b2a5da4af6c0b4cb371f3722f37a11108459cf4152459e0159652cb7965f3adb.log"
2023-09-22T18:52:47.350512Z  INFO fs::cache: watching "/var/log/pods/kube-system_node-local-dns-dxlrd_6fb7d8a3-1055-4fd2-9f09-e639f26572a5/node-cache"
2023-09-22T18:52:47.350773Z  INFO fs::cache: watching "/var/log/pods/kube-system_node-local-dns-dxlrd_6fb7d8a3-1055-4fd2-9f09-e639f26572a5/node-cache/0.log"
2023-09-22T18:52:47.350923Z  INFO fs::cache: ignoring "/var/log/containers/ibm-keepalived-watcher-rhphb_kube-system_keepalived-init-52f8de992b823171d351d7dcb2933984a909a3f7e2a56bd164b514a7fcfc18e5.log"
2023-09-22T18:52:47.351021Z  INFO fs::cache: watching "/var/log/containers/crowdstrike-44w5v_ibm-services-system_crowdstrike-8fbd9260f63b76b1b1b8f681a3bfe0a0ba58a6969f2b3aca06eda4f55151ce7f.log"
2023-09-22T18:52:47.351186Z  INFO fs::cache: watching "/var/log/pods/ibm-services-system_crowdstrike-44w5v_3d931696-d2a9-4944-bf59-d1cad4768345/crowdstrike"
2023-09-22T18:52:47.351474Z  INFO fs::cache: watching "/var/log/pods/ibm-services-system_crowdstrike-44w5v_3d931696-d2a9-4944-bf59-d1cad4768345/crowdstrike/0.log"
2023-09-22T18:52:47.351594Z  INFO fs::cache: ignoring "/var/log/containers/calico-node-vp8lt_kube-system_calico-node-e675c1b4e449acaccf0da11a37863c3b187dfa4d6d627909d7edf8d84a5dbc95.log"
2023-09-22T18:52:47.351708Z  INFO fs::cache: watching "/var/log/containers/log-http-exposer-dxmmb_default_log-http-exposer-9692a2cf3303e6b59af87d11f44aca2b27fcb2c1f19f31f39b051f07c3377934.log"
2023-09-22T18:52:47.351855Z  INFO fs::cache: watching "/var/log/pods/default_log-http-exposer-dxmmb_a8656a71-4d27-4a9d-8f68-41434805c6af/log-http-exposer"
2023-09-22T18:52:47.352150Z  INFO fs::cache: watching "/var/log/pods/default_log-http-exposer-dxmmb_a8656a71-4d27-4a9d-8f68-41434805c6af/log-http-exposer/0.log"
2023-09-22T18:52:47.352262Z  INFO fs::cache: watching "/var/log/containers/falco-cbh4d_default_falco-prom-monitor-fb761e5bd273bf3734dd9725642850d713f8495538a345c04b48a0ccb156a599.log"
2023-09-22T18:52:47.352391Z  INFO fs::cache: watching "/var/log/pods/default_falco-cbh4d_4f33ec1d-f8ee-4926-b2e1-0978fbeafd02/falco-prom-monitor"
2023-09-22T18:52:47.352678Z  INFO fs::cache: watching "/var/log/pods/default_falco-cbh4d_4f33ec1d-f8ee-4926-b2e1-0978fbeafd02/falco-prom-monitor/0.log"
2023-09-22T18:52:47.352812Z  INFO fs::cache: ignoring "/var/log/containers/logdna-agent-22pkm_default_logdna-agent-06940ca788bfd7a2c799adc2db36e8b73a1f6fefd02d9ee4d62645493d87057e.log"
2023-09-22T18:52:47.352907Z  INFO fs::cache: ignoring "/var/log/containers/syslog-configurator-s5hqk_ibm-services-system_dlc-init-e39e99149d531c0d81bc73d0532f86f5842326a1cf1aa03027a5935557c5e266.log"
2023-09-22T18:52:47.353021Z  INFO fs::cache: watching "/var/log/containers/falco-cbh4d_default_falco-metrics-2747e03464d80f578577c2e4c816dbd6aebc1242a7c1c215be6068518f3bdebd.log"
2023-09-22T18:52:47.353155Z  INFO fs::cache: watching "/var/log/pods/default_falco-cbh4d_4f33ec1d-f8ee-4926-b2e1-0978fbeafd02/falco-metrics"
2023-09-22T18:52:47.353370Z  INFO fs::cache: watching "/var/log/pods/default_falco-cbh4d_4f33ec1d-f8ee-4926-b2e1-0978fbeafd02/falco-metrics/0.log"
2023-09-22T18:52:47.353480Z  INFO fs::cache: watching "/var/log/containers/sos-nessus-agent-v9rgd_ibm-services-system_sos-nessus-agent-627feefed422ac344856d0592eb172ff5702d01922f2192e77e97103ef40a447.log"
2023-09-22T18:52:47.353677Z  INFO fs::cache: watching "/var/log/pods/ibm-services-system_sos-nessus-agent-v9rgd_c9180591-e5cd-40b2-a911-cc2fbb42e8a9/sos-nessus-agent"
2023-09-22T18:52:47.353921Z  INFO fs::cache: watching "/var/log/pods/ibm-services-system_sos-nessus-agent-v9rgd_c9180591-e5cd-40b2-a911-cc2fbb42e8a9/sos-nessus-agent/0.log"
2023-09-22T18:52:47.354026Z  INFO fs::cache: watching "/var/log/containers/falco-cbh4d_default_falco-init-1efd496b1ca690551307e94f48175a53b53097045e0092c5ebf8bbb5ffc722e2.log"
2023-09-22T18:52:47.354162Z  INFO fs::cache: watching "/var/log/pods/default_falco-cbh4d_4f33ec1d-f8ee-4926-b2e1-0978fbeafd02/falco-init"
2023-09-22T18:52:47.354418Z  INFO fs::cache: watching "/var/log/pods/default_falco-cbh4d_4f33ec1d-f8ee-4926-b2e1-0978fbeafd02/falco-init/0.log"
2023-09-22T18:52:47.354549Z  INFO fs::cache: ignoring "/var/log/containers/ibm-master-proxy-static-10.171.175.246_kube-system_pause-2449f8bfd5b367f2428f97fa85ea3ca942b751cfb6c12bd6fda7e847b0a908b4.log"
2023-09-22T18:52:47.354653Z  INFO fs::cache: watching "/var/log/containers/node-exporter-d82x9_default_node-exporter-8a23cef3a58ef7f9cb6786fc97ac0506f37914ad4b4acf3a004ae6503575dbee.log"
2023-09-22T18:52:47.355070Z  INFO fs::cache: watching "/var/log/pods/default_node-exporter-d82x9_b0173568-f127-4c99-8d87-3313dba27fd8/node-exporter/0.log"
2023-09-22T18:52:47.355177Z  INFO fs::cache: watching "/var/log/containers/eventstreams-update-libblkid-wv97n_default_init-0adb2abfae4f74ecd55a9b1c93550fa0baddeb0f354118e97b75004b6e9fb5ee.log"
2023-09-22T18:52:47.355297Z  INFO fs::cache: watching "/var/log/pods/default_eventstreams-update-libblkid-wv97n_5fd865b6-ee01-49e8-bb96-c9942e411321/init"
2023-09-22T18:52:47.355627Z  INFO fs::cache: watching "/var/log/pods/default_eventstreams-update-libblkid-wv97n_5fd865b6-ee01-49e8-bb96-c9942e411321/init/0.log"
2023-09-22T18:52:47.355750Z  INFO fs::cache: watching "/var/log/containers/ibmcloud-block-storage-driver-v6pqc_kube-system_ibmcloud-block-storage-driver-container-5f624639d8d55c344f6241cbe6b774aeecc203424807601a1a86a3f4737f2ae6.log"
2023-09-22T18:52:47.355956Z  INFO fs::cache: watching "/var/log/pods/kube-system_ibmcloud-block-storage-driver-v6pqc_95703900-97c7-4d59-abc3-5aea479ace79/ibmcloud-block-storage-driver-container"
2023-09-22T18:52:47.357086Z  INFO fs::cache: watching "/var/log/pods/kube-system_ibmcloud-block-storage-driver-v6pqc_95703900-97c7-4d59-abc3-5aea479ace79/ibmcloud-block-storage-driver-container/0.log"
2023-09-22T18:52:47.357255Z  INFO fs::cache: ignoring "/var/log/containers/logdna-agent-22pkm_default_istio-init-9aefe642bb24d7dd74b2ab56bf5f53aea540b0c3497caf87d60e50d40be3cee1.log"
2023-09-22T18:52:47.357464Z  INFO fs::cache: ignoring "/var/log/containers/falco-cbh4d_default_istio-proxy-c1a697e07cddf66fc6f160e3ff219657c3f4aefe8af409e93ffe48a672f658aa.log"
2023-09-22T18:52:47.357589Z  INFO fs::cache: ignoring "/var/log/containers/ibm-keepalived-watcher-rhphb_kube-system_keepalived-watcher-3d62c008631289412bcf4159706c429a22b74ed0849667b4067344d9ea0e1736.log"
2023-09-22T18:52:47.357653Z  INFO fs::cache: watching "/var/log/containers/change-tracker-9zr7p_ibm-services-system_change-tracker-7e6bcb7755850028a340d8af14f166489f0b3dd80fd4cf2118664bfceac43221.log"
2023-09-22T18:52:47.357848Z  INFO fs::cache: watching "/var/log/pods/ibm-services-system_change-tracker-9zr7p_6effde86-0adc-4836-bb75-8a0d300faba0/change-tracker"
2023-09-22T18:52:47.358162Z  INFO fs::cache: watching "/var/log/pods/ibm-services-system_change-tracker-9zr7p_6effde86-0adc-4836-bb75-8a0d300faba0/change-tracker/0.log"
2023-09-22T18:52:47.358308Z  INFO fs::cache: ignoring "/var/log/containers/calico-node-vp8lt_kube-system_install-cni-1441737bcd4cbe8525b7139a17d77b0670673ce35174efa544a5491d42133d89.log"
2023-09-22T18:52:47.358469Z  INFO fs::cache: ignoring "/var/log/containers/logdna-agent-22pkm_default_istio-proxy-d6e4d6090d5b2e78d20f40ef8b58655c4aa9001ebba369650cabf59ba9c7f6f3.log"
2023-09-22T18:52:47.358640Z  INFO fs::cache: ignoring "/var/log/containers/falco-cbh4d_default_istio-init-416f6c589c2dcd1bb6eb0029e9ca9407a8a7a797d515e8470e6e53822118c082.log"
2023-09-22T18:52:47.358789Z  INFO fs::cache: ignoring "/var/log/containers/konnectivity-agent-t22tv_kube-system_konnectivity-agent-b103b670c4cfba903ef894972520318b6642a5ea011d451ac4f9343535a65ea9.log"
2023-09-22T18:52:47.358925Z  INFO fs::cache: ignoring "/var/log/tallylog"
2023-09-22T18:52:47.358989Z  INFO fs::cache: ignoring "/var/log/docker.log.1.gz"
2023-09-22T18:52:47.359145Z  INFO fs::cache: ignoring "/var/log/ubuntu-advantage.log"
2023-09-22T18:52:47.359230Z  INFO fs::cache: ignoring "/var/log/falcon-sensor.log.5"
2023-09-22T18:52:47.359365Z  INFO fs::cache: ignoring "/var/log/auth.log"
2023-09-22T18:52:47.359452Z  INFO fs::cache: ignoring "/var/log/kubelet.log"
2023-09-22T18:52:47.359561Z  INFO fs::cache: ignoring "/var/log/dpkg.log"
2023-09-22T18:52:47.359695Z  INFO fs::cache: ignoring "/var/log/sudo.log"
2023-09-22T18:52:47.359901Z  INFO fs::cache: watching "/var/log/ubuntu-advantage-timer.log"
2023-09-22T18:52:47.360202Z  INFO fs::cache: watching "/var/log/apt"
2023-09-22T18:52:47.360322Z  INFO fs::cache: ignoring "/var/log/apt/history.log"
2023-09-22T18:52:47.360363Z  INFO fs::cache: ignoring "/var/log/apt/eipp.log.xz"
2023-09-22T18:52:47.360407Z  INFO fs::cache: ignoring "/var/log/apt/term.log"
2023-09-22T18:52:47.360571Z  INFO fs::cache: watching "/var/log/docker.log"
2023-09-22T18:52:47.360684Z  INFO fs::cache: ignoring "/var/log/falconctl.log"
2023-09-22T18:52:47.360906Z  INFO fs::cache: watching "/var/log/unattended-upgrades"
2023-09-22T18:52:47.361049Z  INFO fs::cache: ignoring "/var/log/unattended-upgrades/unattended-upgrades-shutdown.log"
2023-09-22T18:52:47.361099Z  INFO fs::cache: ignoring "/var/log/firstboot.flag"
2023-09-22T18:52:47.361131Z  INFO fs::cache: ignoring "/var/log/intrusion"
2023-09-22T18:52:47.361365Z  INFO fs::cache: watching "/var/log/kern.log"
2023-09-22T18:52:47.361590Z  INFO fs::cache: watching "/var/log/ntpstats"
2023-09-22T18:52:47.361732Z  INFO fs::cache: ignoring "/var/log/dmesg.0"
2023-09-22T18:52:47.361810Z  INFO fs::cache: ignoring "/var/log/bootstrap_base.flag"
2023-09-22T18:52:47.361886Z  INFO fs::cache: ignoring "/var/log/kube-proxy.log"
2023-09-22T18:52:47.362018Z  INFO fs::cache: ignoring "/var/log/cloud-init.log"
2023-09-22T18:52:47.362193Z  INFO fs::cache: watching "/var/log/kubernetes"
2023-09-22T18:52:47.362342Z  INFO fs::cache: ignoring "/var/log/cloud-init-output.log"
2023-09-22T18:52:47.362587Z  INFO fs::cache: watching "/var/log/alternatives.log"
2023-09-22T18:52:47.362698Z  INFO fs::cache: ignoring "/var/log/firstboot.log"
2023-09-22T18:52:47.362737Z  INFO fs::cache: ignoring "/var/log/falcon-sensor.log.7"
2023-09-22T18:52:47.362938Z  INFO fs::cache: watching "/var/log/haproxy.log"
2023-09-22T18:52:47.363037Z  INFO fs::cache: ignoring "/var/log/lastlog"
2023-09-22T18:52:47.363121Z  INFO fs::cache: ignoring "/var/log/falcon-sensor.log.4"
2023-09-22T18:52:47.363178Z  INFO fs::cache: watching "/var/log/pods"
2023-09-22T18:52:47.363336Z  INFO fs::cache: watching "/var/data"
2023-09-22T18:52:47.363517Z  INFO fs::cache: watching "/var/data/kubeletlogs"
2023-09-22T18:52:47.363764Z  INFO fs::cache: watching "/var/data/kubeletlogs/kube-system_ibmcloud-block-storage-driver-v6pqc_95703900-97c7-4d59-abc3-5aea479ace79"
2023-09-22T18:52:47.364019Z  INFO fs::cache: watching "/var/data/kubeletlogs/kube-system_ibmcloud-block-storage-driver-v6pqc_95703900-97c7-4d59-abc3-5aea479ace79/ibmcloud-block-storage-driver-container"
2023-09-22T18:52:47.364324Z  INFO fs::cache: watching "/var/data/kubeletlogs/kube-system_ibmcloud-block-storage-driver-v6pqc_95703900-97c7-4d59-abc3-5aea479ace79/ibmcloud-block-storage-driver-container/0.log"
2023-09-22T18:52:47.364556Z  INFO fs::cache: watching "/var/data/kubeletlogs/ibm-services-system_syslog-configurator-s5hqk_047b731e-d2e6-48a3-bd67-c85aa1ddb2bd"
2023-09-22T18:52:47.364799Z  INFO fs::cache: watching "/var/data/kubeletlogs/ibm-services-system_syslog-configurator-s5hqk_047b731e-d2e6-48a3-bd67-c85aa1ddb2bd/syslog-configurator"
2023-09-22T18:52:47.365082Z  INFO fs::cache: watching "/var/data/kubeletlogs/ibm-services-system_syslog-configurator-s5hqk_047b731e-d2e6-48a3-bd67-c85aa1ddb2bd/syslog-configurator/0.log"
2023-09-22T18:52:47.365368Z  INFO fs::cache: watching "/var/data/kubeletlogs/ibm-services-system_syslog-configurator-s5hqk_047b731e-d2e6-48a3-bd67-c85aa1ddb2bd/dlc-init"
2023-09-22T18:52:47.365621Z  INFO fs::cache: watching "/var/data/kubeletlogs/ibm-services-system_syslog-configurator-s5hqk_047b731e-d2e6-48a3-bd67-c85aa1ddb2bd/dlc-init/0.log"
2023-09-22T18:52:47.365920Z  INFO fs::cache: watching "/var/data/kubeletlogs/default_logdna-agent-22pkm_5bd339f4-80c5-46f6-85a2-867019677953"
2023-09-22T18:52:47.366108Z  INFO fs::cache: watching "/var/data/kubeletlogs/default_logdna-agent-22pkm_5bd339f4-80c5-46f6-85a2-867019677953/istio-init"
2023-09-22T18:52:47.366319Z  INFO fs::cache: watching "/var/data/kubeletlogs/default_logdna-agent-22pkm_5bd339f4-80c5-46f6-85a2-867019677953/istio-init/0.log"
2023-09-22T18:52:47.366507Z  INFO fs::cache: watching "/var/data/kubeletlogs/default_logdna-agent-22pkm_5bd339f4-80c5-46f6-85a2-867019677953/logdna-agent"
2023-09-22T18:52:47.366718Z  INFO fs::cache: watching "/var/data/kubeletlogs/default_logdna-agent-22pkm_5bd339f4-80c5-46f6-85a2-867019677953/logdna-agent/0.log"
2023-09-22T18:52:47.366944Z  INFO fs::cache: watching "/var/data/kubeletlogs/default_logdna-agent-22pkm_5bd339f4-80c5-46f6-85a2-867019677953/istio-proxy"
2023-09-22T18:52:47.367166Z  INFO fs::cache: watching "/var/data/kubeletlogs/default_logdna-agent-22pkm_5bd339f4-80c5-46f6-85a2-867019677953/istio-proxy/0.log"
2023-09-22T18:52:47.367352Z  INFO fs::cache: watching "/var/data/kubeletlogs/kube-system_ibm-master-proxy-static-10.171.175.246_e9335e53e1d0afc4e4a4010bc67adc4d"
2023-09-22T18:52:47.367529Z  INFO fs::cache: watching "/var/data/kubeletlogs/kube-system_ibm-master-proxy-static-10.171.175.246_e9335e53e1d0afc4e4a4010bc67adc4d/ibm-master-proxy-static"
2023-09-22T18:52:47.367754Z  INFO fs::cache: watching "/var/data/kubeletlogs/kube-system_ibm-master-proxy-static-10.171.175.246_e9335e53e1d0afc4e4a4010bc67adc4d/ibm-master-proxy-static/0.log"
2023-09-22T18:52:47.367967Z  INFO fs::cache: watching "/var/data/kubeletlogs/kube-system_ibm-master-proxy-static-10.171.175.246_e9335e53e1d0afc4e4a4010bc67adc4d/ibm-master-proxy-static/0.log.20230919-131554.gz"
2023-09-22T18:52:47.368188Z  INFO fs::cache: watching "/var/data/kubeletlogs/kube-system_ibm-master-proxy-static-10.171.175.246_e9335e53e1d0afc4e4a4010bc67adc4d/ibm-master-proxy-static/0.log.20230921-225749"
2023-09-22T18:52:47.368374Z  INFO fs::cache: watching "/var/data/kubeletlogs/kube-system_ibm-master-proxy-static-10.171.175.246_e9335e53e1d0afc4e4a4010bc67adc4d/pause"
2023-09-22T18:52:47.368594Z  INFO fs::cache: watching "/var/data/kubeletlogs/kube-system_ibm-master-proxy-static-10.171.175.246_e9335e53e1d0afc4e4a4010bc67adc4d/pause/0.log"
2023-09-22T18:52:47.368829Z  INFO fs::cache: watching "/var/data/kubeletlogs/ibm-services-system_crowdstrike-44w5v_3d931696-d2a9-4944-bf59-d1cad4768345"
2023-09-22T18:52:47.369007Z  INFO fs::cache: watching "/var/data/kubeletlogs/ibm-services-system_crowdstrike-44w5v_3d931696-d2a9-4944-bf59-d1cad4768345/crowdstrike"
2023-09-22T18:52:47.369219Z  INFO fs::cache: watching "/var/data/kubeletlogs/ibm-services-system_crowdstrike-44w5v_3d931696-d2a9-4944-bf59-d1cad4768345/crowdstrike/0.log"
2023-09-22T18:52:47.369400Z  INFO fs::cache: watching "/var/data/kubeletlogs/ibm-services-system_change-tracker-9zr7p_6effde86-0adc-4836-bb75-8a0d300faba0"
2023-09-22T18:52:47.369584Z  INFO fs::cache: watching "/var/data/kubeletlogs/ibm-services-system_change-tracker-9zr7p_6effde86-0adc-4836-bb75-8a0d300faba0/change-tracker"
2023-09-22T18:52:47.369807Z  INFO fs::cache: watching "/var/data/kubeletlogs/ibm-services-system_change-tracker-9zr7p_6effde86-0adc-4836-bb75-8a0d300faba0/change-tracker/0.log"
2023-09-22T18:52:47.370000Z  INFO fs::cache: watching "/var/data/kubeletlogs/kube-system_calico-node-vp8lt_43fb6396-5151-49b4-91b0-fab12178d747"
2023-09-22T18:52:47.370183Z  INFO fs::cache: watching "/var/data/kubeletlogs/kube-system_calico-node-vp8lt_43fb6396-5151-49b4-91b0-fab12178d747/calico-node"
2023-09-22T18:52:47.370403Z  INFO fs::cache: watching "/var/data/kubeletlogs/kube-system_calico-node-vp8lt_43fb6396-5151-49b4-91b0-fab12178d747/calico-node/0.log"
2023-09-22T18:52:47.370595Z  INFO fs::cache: watching "/var/data/kubeletlogs/kube-system_calico-node-vp8lt_43fb6396-5151-49b4-91b0-fab12178d747/calico-extension"
2023-09-22T18:52:47.370809Z  INFO fs::cache: watching "/var/data/kubeletlogs/kube-system_calico-node-vp8lt_43fb6396-5151-49b4-91b0-fab12178d747/calico-extension/0.log"
2023-09-22T18:52:47.371063Z  INFO fs::cache: watching "/var/data/kubeletlogs/kube-system_calico-node-vp8lt_43fb6396-5151-49b4-91b0-fab12178d747/install-cni"
2023-09-22T18:52:47.371283Z  INFO fs::cache: watching "/var/data/kubeletlogs/kube-system_calico-node-vp8lt_43fb6396-5151-49b4-91b0-fab12178d747/install-cni/0.log"
2023-09-22T18:52:47.371491Z  INFO fs::cache: watching "/var/data/kubeletlogs/kube-system_ibm-keepalived-watcher-rhphb_3bdf1ec4-39a5-4b3c-853b-e6e62078ce7e"
2023-09-22T18:52:47.371683Z  INFO fs::cache: watching "/var/data/kubeletlogs/kube-system_ibm-keepalived-watcher-rhphb_3bdf1ec4-39a5-4b3c-853b-e6e62078ce7e/keepalived-init"
2023-09-22T18:52:47.371904Z  INFO fs::cache: watching "/var/data/kubeletlogs/kube-system_ibm-keepalived-watcher-rhphb_3bdf1ec4-39a5-4b3c-853b-e6e62078ce7e/keepalived-init/0.log"
2023-09-22T18:52:47.372108Z  INFO fs::cache: watching "/var/data/kubeletlogs/kube-system_ibm-keepalived-watcher-rhphb_3bdf1ec4-39a5-4b3c-853b-e6e62078ce7e/keepalived-watcher"
2023-09-22T18:52:47.372328Z  INFO fs::cache: watching "/var/data/kubeletlogs/kube-system_ibm-keepalived-watcher-rhphb_3bdf1ec4-39a5-4b3c-853b-e6e62078ce7e/keepalived-watcher/0.log"
2023-09-22T18:52:47.372519Z  INFO fs::cache: watching "/var/data/kubeletlogs/ibm-services-system_sos-nessus-agent-v9rgd_c9180591-e5cd-40b2-a911-cc2fbb42e8a9"
2023-09-22T18:52:47.372710Z  INFO fs::cache: watching "/var/data/kubeletlogs/ibm-services-system_sos-nessus-agent-v9rgd_c9180591-e5cd-40b2-a911-cc2fbb42e8a9/sos-nessus-agent"
2023-09-22T18:52:47.372932Z  INFO fs::cache: watching "/var/data/kubeletlogs/ibm-services-system_sos-nessus-agent-v9rgd_c9180591-e5cd-40b2-a911-cc2fbb42e8a9/sos-nessus-agent/0.log"
2023-09-22T18:52:47.373119Z  INFO fs::cache: watching "/var/data/kubeletlogs/default_falco-cbh4d_4f33ec1d-f8ee-4926-b2e1-0978fbeafd02"
2023-09-22T18:52:47.373300Z  INFO fs::cache: watching "/var/data/kubeletlogs/default_falco-cbh4d_4f33ec1d-f8ee-4926-b2e1-0978fbeafd02/istio-init"
2023-09-22T18:52:47.373519Z  INFO fs::cache: watching "/var/data/kubeletlogs/default_falco-cbh4d_4f33ec1d-f8ee-4926-b2e1-0978fbeafd02/istio-init/0.log"
2023-09-22T18:52:47.373705Z  INFO fs::cache: watching "/var/data/kubeletlogs/default_falco-cbh4d_4f33ec1d-f8ee-4926-b2e1-0978fbeafd02/falco-metrics"
2023-09-22T18:52:47.373934Z  INFO fs::cache: watching "/var/data/kubeletlogs/default_falco-cbh4d_4f33ec1d-f8ee-4926-b2e1-0978fbeafd02/falco-metrics/0.log"
2023-09-22T18:52:47.374158Z  INFO fs::cache: watching "/var/data/kubeletlogs/default_falco-cbh4d_4f33ec1d-f8ee-4926-b2e1-0978fbeafd02/falco-prom-monitor"
2023-09-22T18:52:47.374381Z  INFO fs::cache: watching "/var/data/kubeletlogs/default_falco-cbh4d_4f33ec1d-f8ee-4926-b2e1-0978fbeafd02/falco-prom-monitor/0.log"
2023-09-22T18:52:47.374566Z  INFO fs::cache: watching "/var/data/kubeletlogs/default_falco-cbh4d_4f33ec1d-f8ee-4926-b2e1-0978fbeafd02/falco"
2023-09-22T18:52:47.374781Z  INFO fs::cache: watching "/var/data/kubeletlogs/default_falco-cbh4d_4f33ec1d-f8ee-4926-b2e1-0978fbeafd02/falco/0.log"
2023-09-22T18:52:47.378295Z  INFO fs::cache: watching "/var/data/kubeletlogs/default_falco-cbh4d_4f33ec1d-f8ee-4926-b2e1-0978fbeafd02/falco-init"
2023-09-22T18:52:47.378551Z  INFO fs::cache: watching "/var/data/kubeletlogs/default_falco-cbh4d_4f33ec1d-f8ee-4926-b2e1-0978fbeafd02/falco-init/0.log"
2023-09-22T18:52:47.378922Z  INFO fs::cache: watching "/var/data/kubeletlogs/default_falco-cbh4d_4f33ec1d-f8ee-4926-b2e1-0978fbeafd02/istio-proxy"
2023-09-22T18:52:47.379228Z  INFO fs::cache: watching "/var/data/kubeletlogs/default_falco-cbh4d_4f33ec1d-f8ee-4926-b2e1-0978fbeafd02/istio-proxy/0.log"
2023-09-22T18:52:47.379511Z  INFO fs::cache: watching "/var/data/kubeletlogs/kube-system_konnectivity-agent-t22tv_9c7d3fe6-ca7d-4d8b-a6f1-4fa8c224bc91"
2023-09-22T18:52:47.379712Z  INFO fs::cache: watching "/var/data/kubeletlogs/kube-system_konnectivity-agent-t22tv_9c7d3fe6-ca7d-4d8b-a6f1-4fa8c224bc91/konnectivity-agent"
2023-09-22T18:52:47.379962Z  INFO fs::cache: watching "/var/data/kubeletlogs/kube-system_konnectivity-agent-t22tv_9c7d3fe6-ca7d-4d8b-a6f1-4fa8c224bc91/konnectivity-agent/0.log"
2023-09-22T18:52:47.380246Z  INFO fs::cache: watching "/var/data/kubeletlogs/kube-system_konnectivity-agent-t22tv_9c7d3fe6-ca7d-4d8b-a6f1-4fa8c224bc91/konnectivity-agent/0.log.20230921-003213"
2023-09-22T18:52:47.380481Z  INFO fs::cache: watching "/var/data/kubeletlogs/kube-system_konnectivity-agent-t22tv_9c7d3fe6-ca7d-4d8b-a6f1-4fa8c224bc91/konnectivity-agent/0.log.20230917-181405.gz"
2023-09-22T18:52:47.380691Z  INFO fs::cache: watching "/var/data/kubeletlogs/kube-system_node-local-dns-dxlrd_6fb7d8a3-1055-4fd2-9f09-e639f26572a5"
2023-09-22T18:52:47.381072Z  INFO fs::cache: watching "/var/data/kubeletlogs/kube-system_node-local-dns-dxlrd_6fb7d8a3-1055-4fd2-9f09-e639f26572a5/node-cache"
2023-09-22T18:52:47.381329Z  INFO fs::cache: watching "/var/data/kubeletlogs/kube-system_node-local-dns-dxlrd_6fb7d8a3-1055-4fd2-9f09-e639f26572a5/node-cache/0.log"
2023-09-22T18:52:47.381592Z  INFO fs::cache: watching "/var/data/kubeletlogs/default_log-http-exposer-dxmmb_a8656a71-4d27-4a9d-8f68-41434805c6af"
2023-09-22T18:52:47.381873Z  INFO fs::cache: watching "/var/data/kubeletlogs/default_log-http-exposer-dxmmb_a8656a71-4d27-4a9d-8f68-41434805c6af/log-http-exposer"
2023-09-22T18:52:47.382142Z  INFO fs::cache: watching "/var/data/kubeletlogs/default_log-http-exposer-dxmmb_a8656a71-4d27-4a9d-8f68-41434805c6af/log-http-exposer/0.log"
2023-09-22T18:52:47.382364Z  INFO fs::cache: watching "/var/data/kubeletlogs/default_node-exporter-d82x9_b0173568-f127-4c99-8d87-3313dba27fd8"
2023-09-22T18:52:47.382567Z  INFO fs::cache: watching "/var/data/kubeletlogs/default_node-exporter-d82x9_b0173568-f127-4c99-8d87-3313dba27fd8/node-exporter"
2023-09-22T18:52:47.382814Z  INFO fs::cache: watching "/var/data/kubeletlogs/default_node-exporter-d82x9_b0173568-f127-4c99-8d87-3313dba27fd8/node-exporter/0.log"
2023-09-22T18:52:47.383093Z  INFO fs::cache: watching "/var/data/kubeletlogs/default_node-exporter-d82x9_b0173568-f127-4c99-8d87-3313dba27fd8/node-exporter/1.log"
2023-09-22T18:52:47.383391Z  INFO fs::cache: watching "/var/data/kubeletlogs/default_eventstreams-update-libblkid-wv97n_5fd865b6-ee01-49e8-bb96-c9942e411321"
2023-09-22T18:52:47.383617Z  INFO fs::cache: watching "/var/data/kubeletlogs/default_eventstreams-update-libblkid-wv97n_5fd865b6-ee01-49e8-bb96-c9942e411321/init"
2023-09-22T18:52:47.383931Z  INFO fs::cache: watching "/var/data/kubeletlogs/default_eventstreams-update-libblkid-wv97n_5fd865b6-ee01-49e8-bb96-c9942e411321/init/0.log"
2023-09-22T18:52:47.384161Z  INFO fs::cache: watching "/var/data/kubeletlogs/default_eventstreams-update-libblkid-wv97n_5fd865b6-ee01-49e8-bb96-c9942e411321/pause"
2023-09-22T18:52:47.384377Z  INFO fs::cache: watching "/var/data/kubeletlogs/default_eventstreams-update-libblkid-wv97n_5fd865b6-ee01-49e8-bb96-c9942e411321/pause/0.log"
2023-09-22T18:52:47.384526Z  INFO fs::cache: watching "/var/log/private"
2023-09-22T18:52:47.384630Z  INFO fs::cache: ignoring "/var/log/falcon-sensor.log.1"
2023-09-22T18:52:47.384889Z  INFO fs::cache: watching "/var/log/falcond.log"
2023-09-22T18:52:47.385035Z  INFO fs::cache: watching "/var/log/dist-upgrade"
2023-09-22T18:52:47.385253Z  INFO fs::cache: watching "/var/log/at"
2023-09-22T18:52:47.392358Z  WARN fs::cache: watch descriptor for /var/log/ already exists...
2023-09-22T18:52:47.392676Z  WARN fs::cache: watch descriptor for /var/log/ibmc-block.log already exists...
2023-09-22T18:52:47.392771Z  WARN fs::cache: watch descriptor for /var/log/falcon-sensor.log already exists...
2023-09-22T18:52:47.393095Z  WARN fs::cache: watch descriptor for /var/log/ubuntu-advantage-timer.log already exists...
2023-09-22T18:52:47.393112Z  WARN fs::cache: watch descriptor for /var/log/docker.log already exists...
2023-09-22T18:52:47.393205Z  WARN fs::cache: watch descriptor for /var/log/kern.log already exists...
2023-09-22T18:52:47.393368Z  WARN fs::cache: watch descriptor for /var/log/alternatives.log already exists...
2023-09-22T18:52:47.393440Z  WARN fs::cache: watch descriptor for /var/log/haproxy.log already exists...
2023-09-22T18:52:47.393525Z  WARN fs::cache: watch descriptor for /var/log/falcond.log already exists...
2023-09-22T18:52:47.393537Z  WARN fs::cache: watch descriptor for /var/log/calico already exists...
2023-09-22T18:52:47.393547Z  WARN fs::cache: watch descriptor for /var/log/calico/cni already exists...
2023-09-22T18:52:47.393557Z  WARN fs::cache: watch descriptor for /var/log/at-no-rotate already exists...
2023-09-22T18:52:47.393567Z  WARN fs::cache: watch descriptor for /var/log/journal already exists...
2023-09-22T18:52:47.393579Z  WARN fs::cache: watch descriptor for /var/log/journal/8ebcb27021c24eb79f906530d2418392 already exists...
2023-09-22T18:52:47.393799Z  WARN fs::cache: watch descriptor for /var/log/journal/fafbcfdcd5e54268be0e9613f8831331 already exists...
2023-09-22T18:52:47.393847Z  WARN fs::cache: watch descriptor for /var/log/landscape already exists...
2023-09-22T18:52:47.393863Z  WARN fs::cache: watch descriptor for /var/log/landscape/sysinfo.log already exists...
2023-09-22T18:52:47.393874Z  WARN fs::cache: watch descriptor for /var/log/containers already exists...
2023-09-22T18:52:47.393904Z  WARN fs::cache: watch descriptor for /var/log/containers/falco-cbh4d_default_falco-f09c34ca9a1d2797e37d34d6a9718b67d19c055d998842c97194ff3146195e02.log already exists...
2023-09-22T18:52:47.393937Z  WARN fs::cache: watch descriptor for /var/log/containers/node-exporter-d82x9_default_node-exporter-5dcdf4dee1b14eef09934df9a3a7436e062bef8627494c6d3954d05c1848824d.log already exists...
2023-09-22T18:52:47.394029Z  WARN fs::cache: watch descriptor for /var/log/containers/eventstreams-update-libblkid-wv97n_default_pause-5f82b7979e672b37d4e6be84df0605a23b07477097d289aa2165138d1100960b.log already exists...
2023-09-22T18:52:47.394172Z  WARN fs::cache: watch descriptor for /var/log/containers/node-local-dns-dxlrd_kube-system_node-cache-b2a5da4af6c0b4cb371f3722f37a11108459cf4152459e0159652cb7965f3adb.log already exists...
2023-09-22T18:52:47.394258Z  WARN fs::cache: watch descriptor for /var/log/containers/crowdstrike-44w5v_ibm-services-system_crowdstrike-8fbd9260f63b76b1b1b8f681a3bfe0a0ba58a6969f2b3aca06eda4f55151ce7f.log already exists...
2023-09-22T18:52:47.394343Z  WARN fs::cache: watch descriptor for /var/log/containers/log-http-exposer-dxmmb_default_log-http-exposer-9692a2cf3303e6b59af87d11f44aca2b27fcb2c1f19f31f39b051f07c3377934.log already exists...
2023-09-22T18:52:47.394376Z  WARN fs::cache: watch descriptor for /var/log/containers/falco-cbh4d_default_falco-prom-monitor-fb761e5bd273bf3734dd9725642850d713f8495538a345c04b48a0ccb156a599.log already exists...
2023-09-22T18:52:47.394525Z  WARN fs::cache: watch descriptor for /var/log/containers/falco-cbh4d_default_falco-metrics-2747e03464d80f578577c2e4c816dbd6aebc1242a7c1c215be6068518f3bdebd.log already exists...
2023-09-22T18:52:47.394564Z  WARN fs::cache: watch descriptor for /var/log/containers/sos-nessus-agent-v9rgd_ibm-services-system_sos-nessus-agent-627feefed422ac344856d0592eb172ff5702d01922f2192e77e97103ef40a447.log already exists...
2023-09-22T18:52:47.394596Z  WARN fs::cache: watch descriptor for /var/log/containers/falco-cbh4d_default_falco-init-1efd496b1ca690551307e94f48175a53b53097045e0092c5ebf8bbb5ffc722e2.log already exists...
2023-09-22T18:52:47.394683Z  WARN fs::cache: watch descriptor for /var/log/containers/node-exporter-d82x9_default_node-exporter-8a23cef3a58ef7f9cb6786fc97ac0506f37914ad4b4acf3a004ae6503575dbee.log already exists...
2023-09-22T18:52:47.394716Z  WARN fs::cache: watch descriptor for /var/log/containers/eventstreams-update-libblkid-wv97n_default_init-0adb2abfae4f74ecd55a9b1c93550fa0baddeb0f354118e97b75004b6e9fb5ee.log already exists...
2023-09-22T18:52:47.394754Z  WARN fs::cache: watch descriptor for /var/log/containers/ibmcloud-block-storage-driver-v6pqc_kube-system_ibmcloud-block-storage-driver-container-5f624639d8d55c344f6241cbe6b774aeecc203424807601a1a86a3f4737f2ae6.log already exists...
2023-09-22T18:52:47.395002Z  WARN fs::cache: watch descriptor for /var/log/containers/change-tracker-9zr7p_ibm-services-system_change-tracker-7e6bcb7755850028a340d8af14f166489f0b3dd80fd4cf2118664bfceac43221.log already exists...
2023-09-22T18:52:47.395242Z  WARN fs::cache: watch descriptor for /var/log/apt already exists...
2023-09-22T18:52:47.395376Z  WARN fs::cache: watch descriptor for /var/log/unattended-upgrades already exists...
2023-09-22T18:52:47.395439Z  WARN fs::cache: watch descriptor for /var/log/ntpstats already exists...
2023-09-22T18:52:47.395449Z  WARN fs::cache: watch descriptor for /var/log/kubernetes already exists...
2023-09-22T18:52:47.395459Z  WARN fs::cache: watch descriptor for /var/log/pods already exists...
2023-09-22T18:52:47.395880Z  INFO fs::cache: watching "/var/log/pods/kube-system_ibmcloud-block-storage-driver-v6pqc_95703900-97c7-4d59-abc3-5aea479ace79"
2023-09-22T18:52:47.395939Z  WARN fs::cache: watch descriptor for /var/log/pods/kube-system_ibmcloud-block-storage-driver-v6pqc_95703900-97c7-4d59-abc3-5aea479ace79/ibmcloud-block-storage-driver-container already exists...
2023-09-22T18:52:47.396003Z  WARN fs::cache: watch descriptor for /var/log/pods/kube-system_ibmcloud-block-storage-driver-v6pqc_95703900-97c7-4d59-abc3-5aea479ace79/ibmcloud-block-storage-driver-container/0.log already exists...
2023-09-22T18:52:47.396160Z  INFO fs::cache: watching "/var/log/pods/ibm-services-system_syslog-configurator-s5hqk_047b731e-d2e6-48a3-bd67-c85aa1ddb2bd"
2023-09-22T18:52:47.396360Z  INFO fs::cache: watching "/var/log/pods/ibm-services-system_syslog-configurator-s5hqk_047b731e-d2e6-48a3-bd67-c85aa1ddb2bd/syslog-configurator"
2023-09-22T18:52:47.396489Z  INFO fs::cache: ignoring "/var/log/pods/ibm-services-system_syslog-configurator-s5hqk_047b731e-d2e6-48a3-bd67-c85aa1ddb2bd/syslog-configurator/0.log"
2023-09-22T18:52:47.396784Z  INFO fs::cache: watching "/var/log/pods/ibm-services-system_syslog-configurator-s5hqk_047b731e-d2e6-48a3-bd67-c85aa1ddb2bd/dlc-init"
2023-09-22T18:52:47.396918Z  INFO fs::cache: ignoring "/var/log/pods/ibm-services-system_syslog-configurator-s5hqk_047b731e-d2e6-48a3-bd67-c85aa1ddb2bd/dlc-init/0.log"
2023-09-22T18:52:47.397193Z  INFO fs::cache: watching "/var/log/pods/default_logdna-agent-22pkm_5bd339f4-80c5-46f6-85a2-867019677953"
2023-09-22T18:52:47.397478Z  INFO fs::cache: watching "/var/log/pods/default_logdna-agent-22pkm_5bd339f4-80c5-46f6-85a2-867019677953/istio-init"
2023-09-22T18:52:47.397631Z  INFO fs::cache: ignoring "/var/log/pods/default_logdna-agent-22pkm_5bd339f4-80c5-46f6-85a2-867019677953/istio-init/0.log"
2023-09-22T18:52:47.397832Z  INFO fs::cache: watching "/var/log/pods/default_logdna-agent-22pkm_5bd339f4-80c5-46f6-85a2-867019677953/logdna-agent"
2023-09-22T18:52:47.397957Z  INFO fs::cache: ignoring "/var/log/pods/default_logdna-agent-22pkm_5bd339f4-80c5-46f6-85a2-867019677953/logdna-agent/0.log"
2023-09-22T18:52:47.400391Z  INFO fs::cache: watching "/var/log/pods/default_logdna-agent-22pkm_5bd339f4-80c5-46f6-85a2-867019677953/istio-proxy"
2023-09-22T18:52:47.400472Z  INFO fs::cache: ignoring "/var/log/pods/default_logdna-agent-22pkm_5bd339f4-80c5-46f6-85a2-867019677953/istio-proxy/0.log"
2023-09-22T18:52:47.402047Z  INFO fs::cache: watching "/var/log/pods/kube-system_ibm-master-proxy-static-10.171.175.246_e9335e53e1d0afc4e4a4010bc67adc4d"
2023-09-22T18:52:47.402393Z  INFO fs::cache: watching "/var/log/pods/kube-system_ibm-master-proxy-static-10.171.175.246_e9335e53e1d0afc4e4a4010bc67adc4d/ibm-master-proxy-static"
2023-09-22T18:52:47.402482Z  INFO fs::cache: ignoring "/var/log/pods/kube-system_ibm-master-proxy-static-10.171.175.246_e9335e53e1d0afc4e4a4010bc67adc4d/ibm-master-proxy-static/0.log"
2023-09-22T18:52:47.402536Z  INFO fs::cache: ignoring "/var/log/pods/kube-system_ibm-master-proxy-static-10.171.175.246_e9335e53e1d0afc4e4a4010bc67adc4d/ibm-master-proxy-static/0.log.20230919-131554.gz"
2023-09-22T18:52:47.402631Z  INFO fs::cache: ignoring "/var/log/pods/kube-system_ibm-master-proxy-static-10.171.175.246_e9335e53e1d0afc4e4a4010bc67adc4d/ibm-master-proxy-static/0.log.20230921-225749"
2023-09-22T18:52:47.403089Z  INFO fs::cache: watching "/var/log/pods/kube-system_ibm-master-proxy-static-10.171.175.246_e9335e53e1d0afc4e4a4010bc67adc4d/pause"
2023-09-22T18:52:47.403172Z  INFO fs::cache: ignoring "/var/log/pods/kube-system_ibm-master-proxy-static-10.171.175.246_e9335e53e1d0afc4e4a4010bc67adc4d/pause/0.log"
2023-09-22T18:52:47.403515Z  INFO fs::cache: watching "/var/log/pods/ibm-services-system_crowdstrike-44w5v_3d931696-d2a9-4944-bf59-d1cad4768345"
2023-09-22T18:52:47.403557Z  WARN fs::cache: watch descriptor for /var/log/pods/ibm-services-system_crowdstrike-44w5v_3d931696-d2a9-4944-bf59-d1cad4768345/crowdstrike already exists...
2023-09-22T18:52:47.403591Z  WARN fs::cache: watch descriptor for /var/log/pods/ibm-services-system_crowdstrike-44w5v_3d931696-d2a9-4944-bf59-d1cad4768345/crowdstrike/0.log already exists...
2023-09-22T18:52:47.404033Z  INFO fs::cache: watching "/var/log/pods/ibm-services-system_change-tracker-9zr7p_6effde86-0adc-4836-bb75-8a0d300faba0"
2023-09-22T18:52:47.404090Z  WARN fs::cache: watch descriptor for /var/log/pods/ibm-services-system_change-tracker-9zr7p_6effde86-0adc-4836-bb75-8a0d300faba0/change-tracker already exists...
2023-09-22T18:52:47.404149Z  WARN fs::cache: watch descriptor for /var/log/pods/ibm-services-system_change-tracker-9zr7p_6effde86-0adc-4836-bb75-8a0d300faba0/change-tracker/0.log already exists...
2023-09-22T18:52:47.404682Z  INFO fs::cache: watching "/var/log/pods/kube-system_calico-node-vp8lt_43fb6396-5151-49b4-91b0-fab12178d747"
2023-09-22T18:52:47.405004Z  INFO fs::cache: watching "/var/log/pods/kube-system_calico-node-vp8lt_43fb6396-5151-49b4-91b0-fab12178d747/calico-node"
2023-09-22T18:52:47.405094Z  INFO fs::cache: ignoring "/var/log/pods/kube-system_calico-node-vp8lt_43fb6396-5151-49b4-91b0-fab12178d747/calico-node/0.log"
2023-09-22T18:52:47.405428Z  INFO fs::cache: watching "/var/log/pods/kube-system_calico-node-vp8lt_43fb6396-5151-49b4-91b0-fab12178d747/calico-extension"
2023-09-22T18:52:47.405512Z  INFO fs::cache: ignoring "/var/log/pods/kube-system_calico-node-vp8lt_43fb6396-5151-49b4-91b0-fab12178d747/calico-extension/0.log"
2023-09-22T18:52:47.405856Z  INFO fs::cache: watching "/var/log/pods/kube-system_calico-node-vp8lt_43fb6396-5151-49b4-91b0-fab12178d747/install-cni"
2023-09-22T18:52:47.405937Z  INFO fs::cache: ignoring "/var/log/pods/kube-system_calico-node-vp8lt_43fb6396-5151-49b4-91b0-fab12178d747/install-cni/0.log"
2023-09-22T18:52:47.406286Z  INFO fs::cache: watching "/var/log/pods/kube-system_ibm-keepalived-watcher-rhphb_3bdf1ec4-39a5-4b3c-853b-e6e62078ce7e"
2023-09-22T18:52:47.406582Z  INFO fs::cache: watching "/var/log/pods/kube-system_ibm-keepalived-watcher-rhphb_3bdf1ec4-39a5-4b3c-853b-e6e62078ce7e/keepalived-init"
2023-09-22T18:52:47.406682Z  INFO fs::cache: ignoring "/var/log/pods/kube-system_ibm-keepalived-watcher-rhphb_3bdf1ec4-39a5-4b3c-853b-e6e62078ce7e/keepalived-init/0.log"
2023-09-22T18:52:47.407122Z  INFO fs::cache: watching "/var/log/pods/kube-system_ibm-keepalived-watcher-rhphb_3bdf1ec4-39a5-4b3c-853b-e6e62078ce7e/keepalived-watcher"
2023-09-22T18:52:47.407207Z  INFO fs::cache: ignoring "/var/log/pods/kube-system_ibm-keepalived-watcher-rhphb_3bdf1ec4-39a5-4b3c-853b-e6e62078ce7e/keepalived-watcher/0.log"
2023-09-22T18:52:47.407737Z  INFO fs::cache: watching "/var/log/pods/ibm-services-system_sos-nessus-agent-v9rgd_c9180591-e5cd-40b2-a911-cc2fbb42e8a9"
2023-09-22T18:52:47.407784Z  WARN fs::cache: watch descriptor for /var/log/pods/ibm-services-system_sos-nessus-agent-v9rgd_c9180591-e5cd-40b2-a911-cc2fbb42e8a9/sos-nessus-agent already exists...
2023-09-22T18:52:47.407819Z  WARN fs::cache: watch descriptor for /var/log/pods/ibm-services-system_sos-nessus-agent-v9rgd_c9180591-e5cd-40b2-a911-cc2fbb42e8a9/sos-nessus-agent/0.log already exists...
2023-09-22T18:52:47.408164Z  INFO fs::cache: watching "/var/log/pods/default_falco-cbh4d_4f33ec1d-f8ee-4926-b2e1-0978fbeafd02"
2023-09-22T18:52:47.408664Z  INFO fs::cache: watching "/var/log/pods/default_falco-cbh4d_4f33ec1d-f8ee-4926-b2e1-0978fbeafd02/istio-init"
2023-09-22T18:52:47.408747Z  INFO fs::cache: ignoring "/var/log/pods/default_falco-cbh4d_4f33ec1d-f8ee-4926-b2e1-0978fbeafd02/istio-init/0.log"
2023-09-22T18:52:47.408820Z  WARN fs::cache: watch descriptor for /var/log/pods/default_falco-cbh4d_4f33ec1d-f8ee-4926-b2e1-0978fbeafd02/falco-metrics already exists...
2023-09-22T18:52:47.408878Z  WARN fs::cache: watch descriptor for /var/log/pods/default_falco-cbh4d_4f33ec1d-f8ee-4926-b2e1-0978fbeafd02/falco-metrics/0.log already exists...
2023-09-22T18:52:47.408902Z  WARN fs::cache: watch descriptor for /var/log/pods/default_falco-cbh4d_4f33ec1d-f8ee-4926-b2e1-0978fbeafd02/falco-prom-monitor already exists...
2023-09-22T18:52:47.408948Z  WARN fs::cache: watch descriptor for /var/log/pods/default_falco-cbh4d_4f33ec1d-f8ee-4926-b2e1-0978fbeafd02/falco-prom-monitor/0.log already exists...
2023-09-22T18:52:47.408973Z  WARN fs::cache: watch descriptor for /var/log/pods/default_falco-cbh4d_4f33ec1d-f8ee-4926-b2e1-0978fbeafd02/falco already exists...
2023-09-22T18:52:47.409026Z  WARN fs::cache: watch descriptor for /var/log/pods/default_falco-cbh4d_4f33ec1d-f8ee-4926-b2e1-0978fbeafd02/falco/0.log already exists...
2023-09-22T18:52:47.409052Z  WARN fs::cache: watch descriptor for /var/log/pods/default_falco-cbh4d_4f33ec1d-f8ee-4926-b2e1-0978fbeafd02/falco-init already exists...
2023-09-22T18:52:47.409096Z  WARN fs::cache: watch descriptor for /var/log/pods/default_falco-cbh4d_4f33ec1d-f8ee-4926-b2e1-0978fbeafd02/falco-init/0.log already exists...
2023-09-22T18:52:47.409584Z  INFO fs::cache: watching "/var/log/pods/default_falco-cbh4d_4f33ec1d-f8ee-4926-b2e1-0978fbeafd02/istio-proxy"
2023-09-22T18:52:47.409664Z  INFO fs::cache: ignoring "/var/log/pods/default_falco-cbh4d_4f33ec1d-f8ee-4926-b2e1-0978fbeafd02/istio-proxy/0.log"
2023-09-22T18:52:47.410078Z  INFO fs::cache: watching "/var/log/pods/kube-system_konnectivity-agent-t22tv_9c7d3fe6-ca7d-4d8b-a6f1-4fa8c224bc91"
2023-09-22T18:52:47.410446Z  INFO fs::cache: watching "/var/log/pods/kube-system_konnectivity-agent-t22tv_9c7d3fe6-ca7d-4d8b-a6f1-4fa8c224bc91/konnectivity-agent"
2023-09-22T18:52:47.410538Z  INFO fs::cache: ignoring "/var/log/pods/kube-system_konnectivity-agent-t22tv_9c7d3fe6-ca7d-4d8b-a6f1-4fa8c224bc91/konnectivity-agent/0.log"
2023-09-22T18:52:47.410592Z  INFO fs::cache: ignoring "/var/log/pods/kube-system_konnectivity-agent-t22tv_9c7d3fe6-ca7d-4d8b-a6f1-4fa8c224bc91/konnectivity-agent/0.log.20230921-003213"
2023-09-22T18:52:47.410695Z  INFO fs::cache: ignoring "/var/log/pods/kube-system_konnectivity-agent-t22tv_9c7d3fe6-ca7d-4d8b-a6f1-4fa8c224bc91/konnectivity-agent/0.log.20230917-181405.gz"
2023-09-22T18:52:47.411243Z  INFO fs::cache: watching "/var/log/pods/kube-system_node-local-dns-dxlrd_6fb7d8a3-1055-4fd2-9f09-e639f26572a5"
2023-09-22T18:52:47.411286Z  WARN fs::cache: watch descriptor for /var/log/pods/kube-system_node-local-dns-dxlrd_6fb7d8a3-1055-4fd2-9f09-e639f26572a5/node-cache already exists...
2023-09-22T18:52:47.411317Z  WARN fs::cache: watch descriptor for /var/log/pods/kube-system_node-local-dns-dxlrd_6fb7d8a3-1055-4fd2-9f09-e639f26572a5/node-cache/0.log already exists...
2023-09-22T18:52:47.411732Z  INFO fs::cache: watching "/var/log/pods/default_log-http-exposer-dxmmb_a8656a71-4d27-4a9d-8f68-41434805c6af"
2023-09-22T18:52:47.411798Z  WARN fs::cache: watch descriptor for /var/log/pods/default_log-http-exposer-dxmmb_a8656a71-4d27-4a9d-8f68-41434805c6af/log-http-exposer already exists...
2023-09-22T18:52:47.411829Z  WARN fs::cache: watch descriptor for /var/log/pods/default_log-http-exposer-dxmmb_a8656a71-4d27-4a9d-8f68-41434805c6af/log-http-exposer/0.log already exists...
2023-09-22T18:52:47.414920Z  INFO fs::cache: watching "/var/log/pods/default_node-exporter-d82x9_b0173568-f127-4c99-8d87-3313dba27fd8"
2023-09-22T18:52:47.414966Z  WARN fs::cache: watch descriptor for /var/log/pods/default_node-exporter-d82x9_b0173568-f127-4c99-8d87-3313dba27fd8/node-exporter already exists...
2023-09-22T18:52:47.415000Z  WARN fs::cache: watch descriptor for /var/log/pods/default_node-exporter-d82x9_b0173568-f127-4c99-8d87-3313dba27fd8/node-exporter/0.log already exists...
2023-09-22T18:52:47.415027Z  WARN fs::cache: watch descriptor for /var/log/pods/default_node-exporter-d82x9_b0173568-f127-4c99-8d87-3313dba27fd8/node-exporter/1.log already exists...
2023-09-22T18:52:47.415276Z  INFO fs::cache: watching "/var/log/pods/default_eventstreams-update-libblkid-wv97n_5fd865b6-ee01-49e8-bb96-c9942e411321"
2023-09-22T18:52:47.415325Z  WARN fs::cache: watch descriptor for /var/log/pods/default_eventstreams-update-libblkid-wv97n_5fd865b6-ee01-49e8-bb96-c9942e411321/init already exists...
2023-09-22T18:52:47.415388Z  WARN fs::cache: watch descriptor for /var/log/pods/default_eventstreams-update-libblkid-wv97n_5fd865b6-ee01-49e8-bb96-c9942e411321/init/0.log already exists...
2023-09-22T18:52:47.415415Z  WARN fs::cache: watch descriptor for /var/log/pods/default_eventstreams-update-libblkid-wv97n_5fd865b6-ee01-49e8-bb96-c9942e411321/pause already exists...
2023-09-22T18:52:47.415467Z  WARN fs::cache: watch descriptor for /var/log/pods/default_eventstreams-update-libblkid-wv97n_5fd865b6-ee01-49e8-bb96-c9942e411321/pause/0.log already exists...
2023-09-22T18:52:47.415486Z  WARN fs::cache: watch descriptor for /var/log/private already exists...
2023-09-22T18:52:47.415500Z  WARN fs::cache: watch descriptor for /var/log/dist-upgrade already exists...
2023-09-22T18:52:47.415524Z  WARN fs::cache: watch descriptor for /var/log/at already exists...
2023-09-22T18:52:47.415986Z  INFO notify_stream: Shutting down watcher
2023-09-22T18:52:47.417225Z  INFO notify_stream: Shutting down watcher
2023-09-22T18:52:47.418246Z  INFO fs::tail: initialize event for file /var/log/ibmc-block.log
2023-09-22T18:52:47.418585Z  INFO fs::tail: initialize event for file /var/log/falcon-sensor.log
2023-09-22T18:52:47.418719Z  INFO fs::tail: initialize event for file /var/log/landscape/sysinfo.log
2023-09-22T18:52:47.419016Z  INFO fs::tail: initialize event for symlink /var/log/containers/falco-cbh4d_default_falco-f09c34ca9a1d2797e37d34d6a9718b67d19c055d998842c97194ff3146195e02.log, final target /var/log/pods/default_falco-cbh4d_4f33ec1d-f8ee-4926-b2e1-0978fbeafd02/falco/0.log
2023-09-22T18:52:47.419058Z  INFO fs::tail: initialized symlink "/var/log/containers/falco-cbh4d_default_falco-f09c34ca9a1d2797e37d34d6a9718b67d19c055d998842c97194ff3146195e02.log" as DefaultKey(15v1)
2023-09-22T18:52:47.419184Z  INFO fs::tail: initialize event for file /var/log/pods/default_falco-cbh4d_4f33ec1d-f8ee-4926-b2e1-0978fbeafd02/falco/0.log
2023-09-22T18:52:47.419298Z  INFO fs::tail: initialize event for symlink /var/log/containers/node-exporter-d82x9_default_node-exporter-5dcdf4dee1b14eef09934df9a3a7436e062bef8627494c6d3954d05c1848824d.log, final target /var/log/pods/default_node-exporter-d82x9_b0173568-f127-4c99-8d87-3313dba27fd8/node-exporter/1.log
2023-09-22T18:52:47.419306Z  INFO fs::tail: initialized symlink "/var/log/containers/node-exporter-d82x9_default_node-exporter-5dcdf4dee1b14eef09934df9a3a7436e062bef8627494c6d3954d05c1848824d.log" as DefaultKey(18v1)
2023-09-22T18:52:47.419412Z  INFO fs::tail: initialize event for file /var/log/pods/default_node-exporter-d82x9_b0173568-f127-4c99-8d87-3313dba27fd8/node-exporter/1.log
2023-09-22T18:52:47.419524Z  INFO fs::tail: initialize event for symlink /var/log/containers/eventstreams-update-libblkid-wv97n_default_pause-5f82b7979e672b37d4e6be84df0605a23b07477097d289aa2165138d1100960b.log, final target /var/log/pods/default_eventstreams-update-libblkid-wv97n_5fd865b6-ee01-49e8-bb96-c9942e411321/pause/0.log
2023-09-22T18:52:47.419532Z  INFO fs::tail: initialized symlink "/var/log/containers/eventstreams-update-libblkid-wv97n_default_pause-5f82b7979e672b37d4e6be84df0605a23b07477097d289aa2165138d1100960b.log" as DefaultKey(21v1)
2023-09-22T18:52:47.419636Z  INFO fs::tail: initialize event for file /var/log/pods/default_eventstreams-update-libblkid-wv97n_5fd865b6-ee01-49e8-bb96-c9942e411321/pause/0.log
2023-09-22T18:52:47.419750Z  INFO fs::tail: initialize event for symlink /var/log/containers/node-local-dns-dxlrd_kube-system_node-cache-b2a5da4af6c0b4cb371f3722f37a11108459cf4152459e0159652cb7965f3adb.log, final target /var/log/pods/kube-system_node-local-dns-dxlrd_6fb7d8a3-1055-4fd2-9f09-e639f26572a5/node-cache/0.log
2023-09-22T18:52:47.419757Z  INFO fs::tail: initialized symlink "/var/log/containers/node-local-dns-dxlrd_kube-system_node-cache-b2a5da4af6c0b4cb371f3722f37a11108459cf4152459e0159652cb7965f3adb.log" as DefaultKey(24v1)
2023-09-22T18:52:47.419860Z  INFO fs::tail: initialize event for file /var/log/pods/kube-system_node-local-dns-dxlrd_6fb7d8a3-1055-4fd2-9f09-e639f26572a5/node-cache/0.log
2023-09-22T18:52:47.419974Z  INFO fs::tail: initialize event for symlink /var/log/containers/crowdstrike-44w5v_ibm-services-system_crowdstrike-8fbd9260f63b76b1b1b8f681a3bfe0a0ba58a6969f2b3aca06eda4f55151ce7f.log, final target /var/log/pods/ibm-services-system_crowdstrike-44w5v_3d931696-d2a9-4944-bf59-d1cad4768345/crowdstrike/0.log
2023-09-22T18:52:47.419982Z  INFO fs::tail: initialized symlink "/var/log/containers/crowdstrike-44w5v_ibm-services-system_crowdstrike-8fbd9260f63b76b1b1b8f681a3bfe0a0ba58a6969f2b3aca06eda4f55151ce7f.log" as DefaultKey(27v1)
2023-09-22T18:52:47.420086Z  INFO fs::tail: initialize event for file /var/log/pods/ibm-services-system_crowdstrike-44w5v_3d931696-d2a9-4944-bf59-d1cad4768345/crowdstrike/0.log
2023-09-22T18:52:47.420197Z  INFO fs::tail: initialize event for symlink /var/log/containers/log-http-exposer-dxmmb_default_log-http-exposer-9692a2cf3303e6b59af87d11f44aca2b27fcb2c1f19f31f39b051f07c3377934.log, final target /var/log/pods/default_log-http-exposer-dxmmb_a8656a71-4d27-4a9d-8f68-41434805c6af/log-http-exposer/0.log
2023-09-22T18:52:47.420205Z  INFO fs::tail: initialized symlink "/var/log/containers/log-http-exposer-dxmmb_default_log-http-exposer-9692a2cf3303e6b59af87d11f44aca2b27fcb2c1f19f31f39b051f07c3377934.log" as DefaultKey(30v1)
2023-09-22T18:52:47.420307Z  INFO fs::tail: initialize event for file /var/log/pods/default_log-http-exposer-dxmmb_a8656a71-4d27-4a9d-8f68-41434805c6af/log-http-exposer/0.log
2023-09-22T18:52:47.420415Z  INFO fs::tail: initialize event for symlink /var/log/containers/falco-cbh4d_default_falco-prom-monitor-fb761e5bd273bf3734dd9725642850d713f8495538a345c04b48a0ccb156a599.log, final target /var/log/pods/default_falco-cbh4d_4f33ec1d-f8ee-4926-b2e1-0978fbeafd02/falco-prom-monitor/0.log
2023-09-22T18:52:47.420423Z  INFO fs::tail: initialized symlink "/var/log/containers/falco-cbh4d_default_falco-prom-monitor-fb761e5bd273bf3734dd9725642850d713f8495538a345c04b48a0ccb156a599.log" as DefaultKey(33v1)
2023-09-22T18:52:47.420523Z  INFO fs::tail: initialize event for file /var/log/pods/default_falco-cbh4d_4f33ec1d-f8ee-4926-b2e1-0978fbeafd02/falco-prom-monitor/0.log
2023-09-22T18:52:47.420629Z  INFO fs::tail: initialize event for symlink /var/log/containers/falco-cbh4d_default_falco-metrics-2747e03464d80f578577c2e4c816dbd6aebc1242a7c1c215be6068518f3bdebd.log, final target /var/log/pods/default_falco-cbh4d_4f33ec1d-f8ee-4926-b2e1-0978fbeafd02/falco-metrics/0.log
2023-09-22T18:52:47.420637Z  INFO fs::tail: initialized symlink "/var/log/containers/falco-cbh4d_default_falco-metrics-2747e03464d80f578577c2e4c816dbd6aebc1242a7c1c215be6068518f3bdebd.log" as DefaultKey(36v1)
2023-09-22T18:52:47.420735Z  INFO fs::tail: initialize event for file /var/log/pods/default_falco-cbh4d_4f33ec1d-f8ee-4926-b2e1-0978fbeafd02/falco-metrics/0.log
2023-09-22T18:52:47.420850Z  INFO fs::tail: initialize event for symlink /var/log/containers/sos-nessus-agent-v9rgd_ibm-services-system_sos-nessus-agent-627feefed422ac344856d0592eb172ff5702d01922f2192e77e97103ef40a447.log, final target /var/log/pods/ibm-services-system_sos-nessus-agent-v9rgd_c9180591-e5cd-40b2-a911-cc2fbb42e8a9/sos-nessus-agent/0.log
2023-09-22T18:52:47.420858Z  INFO fs::tail: initialized symlink "/var/log/containers/sos-nessus-agent-v9rgd_ibm-services-system_sos-nessus-agent-627feefed422ac344856d0592eb172ff5702d01922f2192e77e97103ef40a447.log" as DefaultKey(39v1)
2023-09-22T18:52:47.420961Z  INFO fs::tail: initialize event for file /var/log/pods/ibm-services-system_sos-nessus-agent-v9rgd_c9180591-e5cd-40b2-a911-cc2fbb42e8a9/sos-nessus-agent/0.log
2023-09-22T18:52:47.421067Z  INFO fs::tail: initialize event for symlink /var/log/containers/falco-cbh4d_default_falco-init-1efd496b1ca690551307e94f48175a53b53097045e0092c5ebf8bbb5ffc722e2.log, final target /var/log/pods/default_falco-cbh4d_4f33ec1d-f8ee-4926-b2e1-0978fbeafd02/falco-init/0.log
2023-09-22T18:52:47.421074Z  INFO fs::tail: initialized symlink "/var/log/containers/falco-cbh4d_default_falco-init-1efd496b1ca690551307e94f48175a53b53097045e0092c5ebf8bbb5ffc722e2.log" as DefaultKey(42v1)
2023-09-22T18:52:47.421172Z  INFO fs::tail: initialize event for file /var/log/pods/default_falco-cbh4d_4f33ec1d-f8ee-4926-b2e1-0978fbeafd02/falco-init/0.log
2023-09-22T18:52:47.421280Z  INFO fs::tail: initialize event for symlink /var/log/containers/node-exporter-d82x9_default_node-exporter-8a23cef3a58ef7f9cb6786fc97ac0506f37914ad4b4acf3a004ae6503575dbee.log, final target /var/log/pods/default_node-exporter-d82x9_b0173568-f127-4c99-8d87-3313dba27fd8/node-exporter/0.log
2023-09-22T18:52:47.421287Z  INFO fs::tail: initialized symlink "/var/log/containers/node-exporter-d82x9_default_node-exporter-8a23cef3a58ef7f9cb6786fc97ac0506f37914ad4b4acf3a004ae6503575dbee.log" as DefaultKey(44v1)
2023-09-22T18:52:47.421388Z  INFO fs::tail: initialize event for file /var/log/pods/default_node-exporter-d82x9_b0173568-f127-4c99-8d87-3313dba27fd8/node-exporter/0.log
2023-09-22T18:52:47.421499Z  INFO fs::tail: initialize event for symlink /var/log/containers/eventstreams-update-libblkid-wv97n_default_init-0adb2abfae4f74ecd55a9b1c93550fa0baddeb0f354118e97b75004b6e9fb5ee.log, final target /var/log/pods/default_eventstreams-update-libblkid-wv97n_5fd865b6-ee01-49e8-bb96-c9942e411321/init/0.log
2023-09-22T18:52:47.421507Z  INFO fs::tail: initialized symlink "/var/log/containers/eventstreams-update-libblkid-wv97n_default_init-0adb2abfae4f74ecd55a9b1c93550fa0baddeb0f354118e97b75004b6e9fb5ee.log" as DefaultKey(47v1)
2023-09-22T18:52:47.421616Z  INFO fs::tail: initialize event for file /var/log/pods/default_eventstreams-update-libblkid-wv97n_5fd865b6-ee01-49e8-bb96-c9942e411321/init/0.log
2023-09-22T18:52:47.421741Z  INFO fs::tail: initialize event for symlink /var/log/containers/ibmcloud-block-storage-driver-v6pqc_kube-system_ibmcloud-block-storage-driver-container-5f624639d8d55c344f6241cbe6b774aeecc203424807601a1a86a3f4737f2ae6.log, final target /var/log/pods/kube-system_ibmcloud-block-storage-driver-v6pqc_95703900-97c7-4d59-abc3-5aea479ace79/ibmcloud-block-storage-driver-container/0.log
2023-09-22T18:52:47.421749Z  INFO fs::tail: initialized symlink "/var/log/containers/ibmcloud-block-storage-driver-v6pqc_kube-system_ibmcloud-block-storage-driver-container-5f624639d8d55c344f6241cbe6b774aeecc203424807601a1a86a3f4737f2ae6.log" as DefaultKey(50v1)
2023-09-22T18:52:47.421860Z  INFO fs::tail: initialize event for file /var/log/pods/kube-system_ibmcloud-block-storage-driver-v6pqc_95703900-97c7-4d59-abc3-5aea479ace79/ibmcloud-block-storage-driver-container/0.log
2023-09-22T18:52:47.421975Z  INFO fs::tail: initialize event for symlink /var/log/containers/change-tracker-9zr7p_ibm-services-system_change-tracker-7e6bcb7755850028a340d8af14f166489f0b3dd80fd4cf2118664bfceac43221.log, final target /var/log/pods/ibm-services-system_change-tracker-9zr7p_6effde86-0adc-4836-bb75-8a0d300faba0/change-tracker/0.log
2023-09-22T18:52:47.421983Z  INFO fs::tail: initialized symlink "/var/log/containers/change-tracker-9zr7p_ibm-services-system_change-tracker-7e6bcb7755850028a340d8af14f166489f0b3dd80fd4cf2118664bfceac43221.log" as DefaultKey(53v1)
2023-09-22T18:52:47.422087Z  INFO fs::tail: initialize event for file /var/log/pods/ibm-services-system_change-tracker-9zr7p_6effde86-0adc-4836-bb75-8a0d300faba0/change-tracker/0.log
2023-09-22T18:52:47.422124Z  INFO fs::tail: initialize event for file /var/log/ubuntu-advantage-timer.log
2023-09-22T18:52:47.422204Z  INFO fs::tail: initialize event for file /var/log/docker.log
2023-09-22T18:52:47.422281Z  INFO fs::tail: initialize event for file /var/log/kern.log
2023-09-22T18:52:47.422369Z  INFO fs::tail: initialize event for file /var/log/alternatives.log
2023-09-22T18:52:47.422441Z  INFO fs::tail: initialize event for file /var/log/haproxy.log
2023-09-22T18:52:47.422737Z  INFO fs::tail: initialize event for file /var/data/kubeletlogs/kube-system_ibmcloud-block-storage-driver-v6pqc_95703900-97c7-4d59-abc3-5aea479ace79/ibmcloud-block-storage-driver-container/0.log
2023-09-22T18:52:47.422815Z  INFO fs::tail: initialize event for file /var/data/kubeletlogs/ibm-services-system_syslog-configurator-s5hqk_047b731e-d2e6-48a3-bd67-c85aa1ddb2bd/syslog-configurator/0.log
2023-09-22T18:52:47.422916Z  INFO fs::tail: initialize event for file /var/data/kubeletlogs/ibm-services-system_syslog-configurator-s5hqk_047b731e-d2e6-48a3-bd67-c85aa1ddb2bd/dlc-init/0.log
2023-09-22T18:52:47.422984Z  INFO fs::tail: initialize event for file /var/data/kubeletlogs/default_logdna-agent-22pkm_5bd339f4-80c5-46f6-85a2-867019677953/istio-init/0.log
2023-09-22T18:52:47.423029Z  INFO fs::tail: initialize event for file /var/data/kubeletlogs/default_logdna-agent-22pkm_5bd339f4-80c5-46f6-85a2-867019677953/logdna-agent/0.log
2023-09-22T18:52:47.423073Z  INFO fs::tail: initialize event for file /var/data/kubeletlogs/default_logdna-agent-22pkm_5bd339f4-80c5-46f6-85a2-867019677953/istio-proxy/0.log
2023-09-22T18:52:47.423134Z  INFO fs::tail: initialize event for file /var/data/kubeletlogs/kube-system_ibm-master-proxy-static-10.171.175.246_e9335e53e1d0afc4e4a4010bc67adc4d/ibm-master-proxy-static/0.log
2023-09-22T18:52:47.423162Z  INFO fs::tail: initialize event for file /var/data/kubeletlogs/kube-system_ibm-master-proxy-static-10.171.175.246_e9335e53e1d0afc4e4a4010bc67adc4d/ibm-master-proxy-static/0.log.20230919-131554.gz
2023-09-22T18:52:47.423189Z  INFO fs::tail: initialize event for file /var/data/kubeletlogs/kube-system_ibm-master-proxy-static-10.171.175.246_e9335e53e1d0afc4e4a4010bc67adc4d/ibm-master-proxy-static/0.log.20230921-225749
2023-09-22T18:52:47.423234Z  INFO fs::tail: initialize event for file /var/data/kubeletlogs/kube-system_ibm-master-proxy-static-10.171.175.246_e9335e53e1d0afc4e4a4010bc67adc4d/pause/0.log
2023-09-22T18:52:47.423338Z  INFO fs::tail: initialize event for file /var/data/kubeletlogs/ibm-services-system_crowdstrike-44w5v_3d931696-d2a9-4944-bf59-d1cad4768345/crowdstrike/0.log
2023-09-22T18:52:47.423443Z  INFO fs::tail: initialize event for file /var/data/kubeletlogs/ibm-services-system_change-tracker-9zr7p_6effde86-0adc-4836-bb75-8a0d300faba0/change-tracker/0.log
2023-09-22T18:52:47.423550Z  INFO fs::tail: initialize event for file /var/data/kubeletlogs/kube-system_calico-node-vp8lt_43fb6396-5151-49b4-91b0-fab12178d747/calico-node/0.log
2023-09-22T18:52:47.423598Z  INFO fs::tail: initialize event for file /var/data/kubeletlogs/kube-system_calico-node-vp8lt_43fb6396-5151-49b4-91b0-fab12178d747/calico-extension/0.log
2023-09-22T18:52:47.423653Z  INFO fs::tail: initialize event for file /var/data/kubeletlogs/kube-system_calico-node-vp8lt_43fb6396-5151-49b4-91b0-fab12178d747/install-cni/0.log
2023-09-22T18:52:47.423715Z  INFO fs::tail: initialize event for file /var/data/kubeletlogs/kube-system_ibm-keepalived-watcher-rhphb_3bdf1ec4-39a5-4b3c-853b-e6e62078ce7e/keepalived-init/0.log
2023-09-22T18:52:47.423804Z  INFO fs::tail: initialize event for file /var/data/kubeletlogs/kube-system_ibm-keepalived-watcher-rhphb_3bdf1ec4-39a5-4b3c-853b-e6e62078ce7e/keepalived-watcher/0.log
2023-09-22T18:52:47.423920Z  INFO fs::tail: initialize event for file /var/data/kubeletlogs/ibm-services-system_sos-nessus-agent-v9rgd_c9180591-e5cd-40b2-a911-cc2fbb42e8a9/sos-nessus-agent/0.log
2023-09-22T18:52:47.423984Z  INFO fs::tail: initialize event for file /var/data/kubeletlogs/default_falco-cbh4d_4f33ec1d-f8ee-4926-b2e1-0978fbeafd02/istio-init/0.log
2023-09-22T18:52:47.424066Z  INFO fs::tail: initialize event for file /var/data/kubeletlogs/default_falco-cbh4d_4f33ec1d-f8ee-4926-b2e1-0978fbeafd02/falco-metrics/0.log
2023-09-22T18:52:47.424151Z  INFO fs::tail: initialize event for file /var/data/kubeletlogs/default_falco-cbh4d_4f33ec1d-f8ee-4926-b2e1-0978fbeafd02/falco-prom-monitor/0.log
2023-09-22T18:52:47.424232Z  INFO fs::tail: initialize event for file /var/data/kubeletlogs/default_falco-cbh4d_4f33ec1d-f8ee-4926-b2e1-0978fbeafd02/falco/0.log
2023-09-22T18:52:47.424314Z  INFO fs::tail: initialize event for file /var/data/kubeletlogs/default_falco-cbh4d_4f33ec1d-f8ee-4926-b2e1-0978fbeafd02/falco-init/0.log
2023-09-22T18:52:47.424360Z  INFO fs::tail: initialize event for file /var/data/kubeletlogs/default_falco-cbh4d_4f33ec1d-f8ee-4926-b2e1-0978fbeafd02/istio-proxy/0.log
2023-09-22T18:52:47.424421Z  INFO fs::tail: initialize event for file /var/data/kubeletlogs/kube-system_konnectivity-agent-t22tv_9c7d3fe6-ca7d-4d8b-a6f1-4fa8c224bc91/konnectivity-agent/0.log
2023-09-22T18:52:47.424449Z  INFO fs::tail: initialize event for file /var/data/kubeletlogs/kube-system_konnectivity-agent-t22tv_9c7d3fe6-ca7d-4d8b-a6f1-4fa8c224bc91/konnectivity-agent/0.log.20230921-003213
2023-09-22T18:52:47.424476Z  INFO fs::tail: initialize event for file /var/data/kubeletlogs/kube-system_konnectivity-agent-t22tv_9c7d3fe6-ca7d-4d8b-a6f1-4fa8c224bc91/konnectivity-agent/0.log.20230917-181405.gz
2023-09-22T18:52:47.424574Z  INFO fs::tail: initialize event for file /var/data/kubeletlogs/kube-system_node-local-dns-dxlrd_6fb7d8a3-1055-4fd2-9f09-e639f26572a5/node-cache/0.log
2023-09-22T18:52:47.424678Z  INFO fs::tail: initialize event for file /var/data/kubeletlogs/default_log-http-exposer-dxmmb_a8656a71-4d27-4a9d-8f68-41434805c6af/log-http-exposer/0.log
2023-09-22T18:52:47.424779Z  INFO fs::tail: initialize event for file /var/data/kubeletlogs/default_node-exporter-d82x9_b0173568-f127-4c99-8d87-3313dba27fd8/node-exporter/0.log
2023-09-22T18:52:47.424845Z  INFO fs::tail: initialize event for file /var/data/kubeletlogs/default_node-exporter-d82x9_b0173568-f127-4c99-8d87-3313dba27fd8/node-exporter/1.log
2023-09-22T18:52:47.424945Z  INFO fs::tail: initialize event for file /var/data/kubeletlogs/default_eventstreams-update-libblkid-wv97n_5fd865b6-ee01-49e8-bb96-c9942e411321/init/0.log
2023-09-22T18:52:47.425053Z  INFO fs::tail: initialize event for file /var/data/kubeletlogs/default_eventstreams-update-libblkid-wv97n_5fd865b6-ee01-49e8-bb96-c9942e411321/pause/0.log
2023-09-22T18:52:47.425089Z  INFO fs::tail: initialize event for file /var/log/falcond.log
dkhokhlov commented 10 months ago

That is not the beginning of the log. The config follows this message right after the agent is started:

INFO config: effective configuration collected from cli, env and config:
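
If the agent pod has already restarted, the startup lines (including the config dump) can usually be recovered from the previous container instance. A minimal sketch, assuming the agent runs in the default namespace; substitute your own pod name:

# Show the config dump from the current container's log
kubectl logs <agent-pod-name> -n default | grep -A 120 "INFO config:"

# If the container already restarted, read the previous instance's log instead
kubectl logs <agent-pod-name> -n default --previous | head -n 200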

JunliWang commented 10 months ago

Found it after another restart:

2023-09-22T20:24:59.064833Z  INFO logdna_agent: running version: 3.8.8
2023-09-22T20:24:59.066821Z  INFO logdna_agent: Uid:    5000    5000    5000    5000
2023-09-22T20:24:59.066847Z  INFO logdna_agent: Gid:    5000    5000    5000    5000
2023-09-22T20:24:59.066917Z  INFO logdna_agent: Groups: 5000
2023-09-22T20:24:59.066933Z  INFO logdna_agent: CapInh: 0000000000000004
2023-09-22T20:24:59.066942Z  INFO logdna_agent: CapPrm: 0000000000000004
2023-09-22T20:24:59.066951Z  INFO logdna_agent: CapEff: 0000000000000004
2023-09-22T20:24:59.066959Z  INFO logdna_agent: CapBnd: 0000000000000004
2023-09-22T20:24:59.066967Z  INFO logdna_agent: CapAmb: 0000000000000000
2023-09-22T20:24:59.067002Z  INFO logdna_agent: Seccomp:    0
2023-09-22T20:24:59.067018Z  INFO logdna_agent: Cpus_allowed:   00000000,0000000f
2023-09-22T20:24:59.067023Z  INFO logdna_agent: Cpus_allowed_list:  0-3
2023-09-22T20:24:59.067134Z  INFO logdna_agent: Open Files limits in the system: (1048576, 1048576)
2023-09-22T20:24:59.068467Z  INFO config: using settings defined in env variables and command line options
2023-09-22T20:24:59.068704Z  INFO config: read the following options from cli, env and config:
---
http:
  host: logs.private.us-south.logging.cloud.ibm.com
  endpoint: /supertenant/logs/ingest
  use_ssl: true
  timeout: 10000
  use_compression: true
  gzip_level: 2
  ingestion_key: REDACTED
  params:
    hostname: ~
    mac: ~
    ip: ~
    tags: "mh-integration-control-us-south,us-south,control,servicelogs"
  body_size: 2097152
  retry_dir: /tmp/logdna
log:
  dirs:
    - /var/log/
  include:
    glob:
      - "*.log"
    regex: []
  exclude:
    glob:
      - /var/log/wtmp
      - /var/log/btmp
      - /var/log/utmp
      - /var/log/wtmpx
      - /var/log/btmpx
      - /var/log/utmpx
      - /var/log/asl/**
      - /var/log/sa/**
      - /var/log/sar*
      - /var/log/tallylog
      - /var/log/fluentd-buffers/**/*
      - /var/log/pods/**/*
      - /var/log/alb/customerlogs/customerlog*
      - /var/log/containers/public-cr*
      - /var/log/containers/private-cr*
      - /var/log/containers/ibm-cloud-provider-ip-*
      - /var/log/containers/calico*
      - /var/log/containers/coredns*
      - /var/log/containers/ibm-file-plugin-*
      - /var/log/containers/ibm-keepalived-watcher-*
      - /var/log/containers/ibm-kube-fluentd*
      - /var/log/containers/ibm-master-proxy-static*
      - /var/log/containers/ibm-storage-watcher-*
      - /var/log/containers/konnectivity-agent-*
      - /var/log/containers/kube-auditlog*
      - /var/log/containers/kube-dns-*
      - /var/log/containers/kubernetes-dashboard*
      - /var/log/containers/metrics-server*
      - /var/log/containers/dashboard-metrics-scraper-*
      - /var/log/containers/bes-local-client*
      - /var/log/containers/syslog-configurator-*
      - /var/log/containers/sos-tools-*
      - /var/log/containers/olm-operator-*
      - /var/log/containers/istiod-*
      - /var/log/containers/weave*
      - /var/log/containers/sysdig-agent*
      - /var/log/containers/logdna-agent*
      - /var/log/trace/kafkaServer-gc*
      - /var/log/apt/*
      - /var/log/auth.log
      - /var/log/cloud-init.log
      - /var/log/cloud-init-output.log
      - /var/log/containerd.log
      - /var/log/cni.log
      - /var/log/dpkg.log
      - /var/log/falconctl.log
      - /var/log/firstboot.log
      - /var/log/kubelet.log
      - /var/log/kube-proxy.log
      - /var/log/syslog
      - /var/log/sudo.log
      - /var/log/unattended-upgrades/*
      - /var/log/ubuntu-advantage.log
      - /var/log/ntpstats/*
      - /var/log/at/mhlogs/*
      - /var/log/containers/*_istio-proxy*
      - /var/log/containers/*_istio-init*
      - /var/log/crio.log
      - /var/log/audit/audit.log
      - /var/log/containers/*-probe
      - /var/log/calico/cni*
      - /var/log/containers/*csi*
      - /var/log/containers/*operator*
      - /var/log/containers/ebs*
      - /var/log/containers/openshift-*
      - /var/log/containers/service-ca-*
      - /var/log/containers/network-metrics-*
      - /var/log/containers/multus*
      - /var/log/containers/migrator-*
      - /var/log/containers/ingress-*
      - /var/log/containers/downloads-*
    regex:
      - TRACE
  log_metric_server_stats: ~
  clear_cache_interval: 21600
journald:
  systemd_journal_tailer: true
startup: {}

2023-09-22T20:24:59.070667Z ERROR logdna_agent::_main: Failed to open agent state db Read-only file system (os error 30)
2023-09-22T20:24:59.075129Z  INFO logdna_agent::_main: K8s Config Startup Option: Never
2023-09-22T20:24:59.075156Z  INFO logdna_agent::_main: Kubernetes cluster initialized, K8s startup lease set to: Some(Never)
2023-09-22T20:24:59.075191Z  INFO logdna_agent::_main: No K8s lease claimed during startup.
2023-09-22T20:24:59.086031Z  INFO journald::journalctl: Listening to journalctl
dkhokhlov commented 10 months ago

clear_cache_interval: 21600

Try lowering it to 10 minutes using an env var:

MZ_CACHE_CLEAR_INTERVAL=600

I assume you did not have OOMs during the first 10 minutes.

JunliWang commented 10 months ago

I tried MZ_CACHE_CLEAR_INTERVAL, CACHE_CLEAR_INTERVAL, LOGDNA_MZ_CACHE_CLEAR_INTERVAL, and LOGDNA_CACHE_CLEAR_INTERVAL; none of them work. The startup log still prints the default 21600. Not sure if it is configurable: https://github.com/logdna/logdna-agent-v2#configuration

JunliWang commented 10 months ago

LOGDNA_CLEAR_CACHE_INTERVAL works; found it in this commit: https://github.com/logdna/logdna-agent-v2/commit/0b0cb1d58068e84cfd284720ac2f8130db428fb4
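
For reference, a minimal sketch of setting it without editing the manifest by hand, assuming the agent runs as a DaemonSet named logdna-agent in the default namespace and uses the app=logdna-agent label seen elsewhere in this thread (adjust to match your deployment):

# Lower the cache clear interval to 10 minutes; this triggers a rolling restart of the agent pods
kubectl set env daemonset/logdna-agent -n default LOGDNA_CLEAR_CACHE_INTERVAL=600

# Confirm the new value shows up in a restarted pod's startup config dump
kubectl logs -l app=logdna-agent -n default --tail=300 | grep clear_cache_interval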

JunliWang commented 10 months ago

Not optimistic about this setting. Below is a 6h view, and the setting was applied 4h ago; I will update again in 1 or 2 days. [screenshot: 6h memory usage view]

JunliWang commented 9 months ago

Over the weekend, memory kept increasing even with LOGDNA_CLEAR_CACHE_INTERVAL=600. [screenshot: memory usage over the weekend]

dkhokhlov commented 9 months ago

Keep going. Initially it always grows; then there should be memory drops like the ones on your plot.

JunliWang commented 9 months ago

Still monitoring; OOMs happened again yesterday and today during US hours (the timezone on the chart is UTC), as the cluster is busier on workdays than on weekends.

[screenshot: memory usage chart showing the OOM restarts]

dkhokhlov commented 9 months ago

How many nodes? Does the OOM kill happen on one specific node? Could you grep the agent metrics of that node (or those nodes)?

dkhokhlov commented 9 months ago

How many nodes? Is the OOM kill happening on all nodes? Could you grep the metrics in the agent log around that time?
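
A short sketch of grabbing those metrics lines, assuming the app=logdna-agent label used elsewhere in this thread and substituting a real pod name:

# Map agent pods to nodes to see which ones approach the memory limit
kubectl get pods -l app=logdna-agent -o wide

# Pull the most recent periodic metrics lines from one pod
kubectl logs <agent-pod-name> | grep "INFO metrics:" | tail -n 5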

JunliWang commented 9 months ago

11 nodes in this cluster, almost all of them are heavily loaded.

$ kubectl top pod --sort-by=memory -l app=logdna-agent
NAME                 CPU(cores)   MEMORY(bytes)
logdna-agent-pkvk6   51m          600Mi
logdna-agent-f5jlf   32m          548Mi
logdna-agent-gdprq   42m          546Mi
logdna-agent-h94xj   29m          516Mi
logdna-agent-z428t   29m          514Mi
logdna-agent-cpjz7   30m          512Mi
logdna-agent-jb6wq   39m          512Mi
logdna-agent-hsgzl   31m          509Mi
logdna-agent-5j4vw   25m          500Mi
logdna-agent-8c9bl   33m          496Mi
logdna-agent-2tkms   39m          464Mi

From the largest memory consumer right now (not the exact time it got OOMKilled). I do not retain logdna-agent logs on the server side, so I just grabbed this from the pod:

2023-09-27T19:02:15.854833Z  INFO metrics: {"fs":{"events":1662810,"creates":5538,"deletes":4946,"writes":1652326,"lines":181972,"bytes":79587065,"files_tracked":32374},"memory":{"active":523362304,"allocated":511087968,"resident":535453696},"ingest":{"requests":100588,"requests_size":727900771,"rate_limits":239499,"retries":42441,"retries_success":4280,"retries_failure":38161,"requests_duration":1303057628.8050066,"requests_timed_out":41885,"requests_failed":556,"requests_succeeded":58147},"k8s":{"lines":0,"creates":1036,"deletes":104,"events":1140},"journald":{"lines":0,"bytes":0},"retry":{"pending":0,"storage_used":0}}
dkhokhlov commented 9 months ago

stats to notice:

"retries":42441    -- means http send to Mezmo failed many times
"retries_failure":38161   -- and then http retries failed too
"requests_timed_out":41885 -- even more - some got timed out

This may explain the memory usage; in ideal conditions you do not want to see them. It may be related to the endpoint you use, /supertenant/logs/ingest. One way to isolate the problem is to switch to the regular endpoint.

Also:

"files_tracked":32374

That means many log files. You may want to narrow the scope to specific pods using inclusion rules to see if that brings memory usage down.
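
A rough sketch of both experiments (the DaemonSet name and namespace are assumptions); LOGDNA_ENDPOINT is the variable already used in this deployment, while the inclusion-rule variable name is an assumption that should be checked against the agent README:

# Drop the supertenant endpoint override so the agent goes back to the regular ingest endpoint
kubectl set env daemonset/logdna-agent -n default LOGDNA_ENDPOINT-

# Narrow the tracked files to specific pods (assumed variable name; verify before relying on it)
kubectl set env daemonset/logdna-agent -n default "LOGDNA_INCLUSION_RULES=/var/log/containers/my-app*.log"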

Uvedale commented 9 months ago

@dkhokhlov have there been any deliberate fixes for this issue since it was reported by multiple users after the last fix attempt around August 2022? We've just had to accept this memory leak since then and have been waiting for a fix. We are not on the latest version, but we'd like to know whether going through the effort of upgrading is expected to make a difference. JunliWang's reports seem to suggest otherwise.

dkhokhlov commented 9 months ago

@Uvedale yes, related commits (hover over for quick details):

JunliWang commented 9 months ago

@dkhokhlov I will keep monitoring after removing LOGDNA_ENDPOINT=/supertenant/logs/ingest. I do not think our usage caused the issue: looking at the usage in our dev account (about 20-30 clusters), the overall data volume has not increased in the past 2 months, but OOMs happen more often than before. We have not changed any inclusion/exclusion rules, nor do we have new code writing logs excessively. But I'm not sure whether the agent has reached its capacity and is unable to send out more data. [screenshot: data volume in our dev account]

We appreciate the efforts to fix it. Could you release a new version with the commits from early August so we could give it a try and see how it goes? Right now, this OOMKill happens in several hundred clusters in our production every day (I cannot recall exactly when this started; maybe sporadically since March, then more often and in more clusters since September).

dkhokhlov commented 9 months ago

It may not be about volume in the agent but about ingestion server performance. The agent will retry on errors, and this has some memory overhead. Could you check the retry folder size around the bad time? In your config it is at:

retry_dir: /tmp/logdna

on each node. Also note: if /tmp is mounted on shared memory (tmpfs), it will eat memory too.

A 500MB limit is recommended, but the actual limit depends on many factors. You can always try doubling it to see whether it holds.
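
A quick sketch of checking the retry buffer and bumping the limit from outside the pod, assuming the DaemonSet name and namespace, and assuming the agent image ships basic coreutils:

# Size of the on-disk retry buffer inside a given agent pod
kubectl exec <agent-pod-name> -n default -- du -sh /tmp/logdna

# Double the agent's memory limit as an experiment
kubectl set resources daemonset/logdna-agent -n default --limits=memory=1000Mi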

All fixes are in master. 3.8 may not have all of them because of the big distance (more than a year) between the branches.
The latest 3.8 image in ICR:

icr.io/ext/logdna-agent                                                 3.8.8-20230922.cc92b95fe91f7384-amd64                        15fc7135b524   ext         5 days ago

You can try the latest 3.9 dev image from ICR on your test cluster:

icr.io/ext/logdna-agent                                                 3.9.0-alpha.2-4-x86_64                                       2adf81ce9aa9   ext         1 week ago
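
Switching a cluster to one of the images above can be done in place; a sketch, assuming the DaemonSet and its container are both named logdna-agent in the default namespace:

# Point the agents at the latest 3.8 build listed above
kubectl set image daemonset/logdna-agent -n default logdna-agent=icr.io/ext/logdna-agent:3.8.8-20230922.cc92b95fe91f7384-amd64

# Or try the 3.9 dev image on a test cluster
kubectl set image daemonset/logdna-agent -n default logdna-agent=icr.io/ext/logdna-agent:3.9.0-alpha.2-4-x86_64

# Watch the rollout complete
kubectl rollout status daemonset/logdna-agent -n default
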
JunliWang commented 9 months ago

After removing LOGDNA_ENDPOINT=/supertenant/logs/ingest, retries_failure became zero, and thus retry_dir is empty. Our /tmp is not mounted from memory. Will keep monitoring.

Uvedale commented 9 months ago

@Uvedale yes, related commits (hover over for quick details):

Great, we will give the latest version a shot. Thanks.

JunliWang commented 9 months ago

I captured some logs from the agent a few minutes before the OOM happened, and I think what happened was: the agent got rate-limited by the ingestion server, so it had to cache the data and retry; the retries failed due to timeouts, so it accumulated the data in memory and eventually ran out of memory within a few minutes. So can we conclude this is the ingestion server's issue rather than the agent's? [screenshot: agent logs showing rate-limit errors shortly before the OOM]

dkhokhlov commented 9 months ago

This is very useful. The server is the trigger, but usually it should not cause an OOM in a short period of time. The "hit rate limit" log flood has been fixed by this recent commit, which minimizes the short-term memory overhead caused by a high write rate to the agent log: https://github.com/logdna/logdna-agent-v2/commit/d28f51adc29d677c5cf1224bea15f872327db606#diff-707e56895fbf736cfe382d4bc04a74ccbe35c9b0855b9b99e7cc05b839b07f2fR131 It looks like you are not running the latest 3.8 version. Could you try again with the latest 3.8?

JunliWang commented 9 months ago

I'm running v3.8.8 from 3 weeks ago, but it looks like there is a new v3.8.8 build from last week; I will rebuild.

dkhokhlov commented 9 months ago

@JunliWang fyi: v3.8.9 has been published to ICR https://github.com/logdna/logdna-agent-v2/compare/3.8.8...3.8.9

sergeykrulikovskiy commented 7 months ago

@JunliWang Hello, did you fix this issue on your side?

I saw those OOMs before (on k8s 1.24.x), but they didn't cause any issues for me. Recently I upgraded the clusters to 1.26.6, and it looks like those OOMs now cause some memory saturation. I cannot find any other reason why memory usage on the node has increased since the AKS upgrade. The currently installed agent version is 3.7.0.