Open dlmather opened 9 months ago
Hi @dlmather, thank you for submitting this issue. Could you provide a sanitized sample of the full log line in question? It contains hints about the place(s) in the code that generate this output.
Hello @dlmather,
Normally we store only a select list of env variables from other pods, namely those used for tagging via container_env_as_tags (plus some standard ones), since we do not expect secrets inside tag values.
We would be interested in getting a flare so we can troubleshoot this issue faster (we'll need to open a support case for that).
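For context, the container_env_as_tags allowlisting mentioned above is set in the agent's datadog.yaml as a mapping from env var name to tag name; a minimal sketch (the env var and tag names here are illustrative, not from this issue):

```yaml
# datadog.yaml — only env vars listed here are turned into container tags
container_env_as_tags:
  ENVIRONMENT: env
  TEAM_NAME: team
```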
Agent Environment
Agent version: 7.49.0
Describe what happened: After turning on debug logs for the Datadog agent, it started printing certain container-level information, including the env vars of containers running on the same host. In a Kubernetes setting it is a common pattern to store secrets in env vars, so once the agent started printing these logs, it leaked container-level secrets into our logging system. It would be good to either document this behavior or make it selectively enabled, since there can be legitimate reasons to enable Datadog Agent debug logs without wanting all container secrets leaked :)
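One way the selective behavior suggested above could work is a redaction pass over env var values before they reach debug logs. This is only a sketch of the idea in Python (the real agent is written in Go, and `SECRET_PATTERNS` is an illustrative name, not part of Datadog's code):

```python
import re

# Illustrative pattern of secret-looking env var names; a real agent
# would likely make this configurable rather than hard-coded.
SECRET_PATTERNS = re.compile(r"SECRET|TOKEN|PASSWORD|KEY|CREDENTIAL", re.IGNORECASE)

def redact_env(env: dict) -> dict:
    """Replace values of secret-looking env vars before they are logged."""
    return {
        name: "***REDACTED***" if SECRET_PATTERNS.search(name) else value
        for name, value in env.items()
    }

env = {"PATH": "/usr/bin", "DB_PASSWORD": "hunter2", "API_TOKEN": "abc123"}
# DB_PASSWORD and API_TOKEN values are replaced; PATH is left as-is.
print(redact_env(env))
```

Matching on the variable *name* rather than the value keeps the redaction cheap and deterministic, at the cost of missing secrets stored under innocuous names.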
Describe what you expected: Turning on debug logs would not suddenly introduce logs exposing the env vars set for other pods running on the same host.
Steps to reproduce the issue: Enable debug logs on the agent in a Kubernetes setting where secrets are mounted as env vars. In our case the log looked like:
""" DUMP OF ENV VARS including secrets not available through collector "system", err: containerID not found """
Additional environment details (Operating System, Cloud provider, etc): EKS