ansible / awx

AWX provides a web-based user interface, REST API, and task engine built on top of Ansible. It is one of the upstream projects for Red Hat Ansible Automation Platform.

Job output is not respecting the verbosity, even after setting verbosity to Normal (0); it is showing all env variables and secrets in the JOB STDOUT output. #13613

Open rupadhy3 opened 1 year ago

rupadhy3 commented 1 year ago

Please confirm the following

Bug Summary

Executing a job template with verbosity set to Normal results in verbose output, with env variables and secrets being exposed.

Job output is not respecting the verbosity, even after setting verbosity to Normal (0); it is showing all env variables and secrets in the JOB STDOUT output.

Job Details:
  Started: 2/20/2023, 7:05:39 AM
  Finished: 2/20/2023, 7:06:40 AM
  Job Template: configure cfg project
  Job Type: Playbook Run
  Inventory: local_inventory
  Revision: 76xxxxx38
  Playbook: customer_cfg_repo.yml
  Verbosity: 0 (Normal)

JOB OUTPUT: time="2023-02-20T06:06:31-06:00" level=warning msg="Failed to decode the keys ["storage.options.override_kernel_check"] from "/etc/containers/storage.conf"." {"status": "starting", "runner_ident": "227", "command": ["ansible-playbook", "-u", "root", "--ask-vault-pass", "-e", "@/runner/env/tmp08p7vj5u", "-i", "/runner/inventory/hosts", "-e", "@/runner/env/extravars", "customer_cfg_repo.yml"], "env": {"KUBERNETES_SERVICE_PORT_HTTPS": "443", "KUBERNETES_SERVICE_PORT": "443", "AWX_NP_SERVICE_PORT_80_TCP": "tcp://172.21.84.78:80", "AWX_OPERATOR_CONTROLLER_MANAGER_METRICS_SERVICE_SERVICE_HOST": "172.21.34.213", "HOSTNAME": "automation-job-227-2km82", "PWD": "/runner", "AWX_OPERATOR_CONTROLLER_MANAGER_METRICS_SERVICE_PORT_8443_TCP_PORT": "8443", "AWX_NP_SERVICE_PORT_80_TCP_PORT": "80", "AWX_NP_SERVICE_SERVICE_PORT": "80", "AWX_OPERATOR_CONTROLLER_MANAGER_METRICS_SERVICE_PORT_8443_TCP_PROTO": "tcp", "HOME": "/home/runner", "KUBERNETES_PORT_443_TCP": "tcp://172.21.0.1:443", "AWX_NP_SERVICE_SERVICE_PORT_HTTP": "80", "AWX_OPERATOR_CONTROLLER_MANAGER_METRICS_SERVICE_PORT": "tcp://172.21.34.213:8443", "TERM": "xterm", "AWX_NP_SERVICE_PORT": "tcp://172.21.84.78:80", "AWX_OPERATOR_CONTROLLER_MANAGER_METRICS_SERVICE_SERVICE_PORT_HTTPS": "8443", "AWX_NP_SERVICE_SERVICE_HOST": "172.21.84.78", "AWX_OPERATOR_CONTROLLER_MANAGER_METRICS_SERVICE_SERVICE_PORT": "8443", "SHLVL": "0", "KUBERNETES_PORT_443_TCP_PROTO": "tcp", "KUBERNETES_PORT_443_TCP_ADDR": "172.21.0.1", "AWX_NP_SERVICE_PORT_80_TCP_ADDR": "172.21.84.78", "AWX_OPERATOR_CONTROLLER_MANAGER_METRICS_SERVICE_PORT_8443_TCP": "tcp://172.21.34.213:8443", "KUBERNETES_SERVICE_HOST": "172.21.0.1", "KUBERNETES_PORT": "tcp://172.21.0.1:443", "KUBERNETES_PORT_443_TCP_PORT": "443", "AWX_NP_SERVICE_PORT_80_TCP_PROTO": "tcp", "PATH": "/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin", "AWX_OPERATOR_CONTROLLER_MANAGER_METRICS_SERVICE_PORT_8443_TCP_ADDR": "172.21.34.213", "LC_CTYPE": "C.UTF-8", "ANSIBLE_FORCE_COLOR": "True", "ANSIBLE_HOST_KEY_CHECKING": "False", "ANSIBLE_INVENTORY_UNPARSED_FAILED": "True", "ANSIBLE_PARAMIKO_RECORD_HOST_KEYS": "False", "AWX_PRIVATE_DATA_DIR": "/tmp/awx_227_5w6ficbc", "JOB_ID": "227", "INVENTORY_ID": "35", "PROJECT_REVISION": "76xxxxx38", "ANSIBLE_RETRY_FILES_ENABLED": "False", "MAX_EVENT_RES": "700000", "AWX_HOST": "https://awx-XXXXXXXXXXXXXX.cloud", "ANSIBLE_SSH_CONTROL_PATH_DIR": "/runner/cp", "ANSIBLE_COLLECTIONS_PATHS": "/runner/requirements_collections:/.ansible/collections:/usr/share/ansible/collections", "ANSIBLE_ROLES_PATH": "/runner/requirements_roles:/.ansible/roles:/usr/share/ansible/roles:/etc/ansible/roles", "AV_TOKEN": "D4xxxxx99", "LOGDNA_KEY": "4axxxxxab", "AV_TENANT_ID": "85xxxxx25", "AWX_AUTH_KEY": "bDxxxxxME", "GITHUB_TOKEN": "0fxxxxxd3", "FLASK_API_KEY": "eyxxxxxX0.

AWX version

21.5.0

Select the relevant components

Installation method

kubernetes

Modifications

no

Ansible version

2.12.5

Operating system

Linux

Web browser

Firefox

Steps to reproduce

Executing any job template with verbosity set to Normal results in output with env variables, as well as all extra variables and secrets, being exposed in the job stdout output.

Expected results

Executing any job template with verbosity set to Normal should result in Ansible playbook output without any env variables or secrets being exposed in the job output.

Actual results

Executing any job template with verbosity set to Normal results in output with env variables, as well as all extra variables and secrets, being exposed in the job stdout output.

Additional information

No response

djyasin commented 1 year ago

@rupadhy3 we would like to gather a little bit more information from you.

Could you provide us with the custom resource you applied? Please be sure to remove any confidential information.

rupadhy3 commented 1 year ago

@djyasin please find the AWX custom resource that we have applied. I tried to tweak the logging a little by changing the console logger to WARNING, but it is not helping either:

apiVersion: awx.ansible.com/v1beta1
kind: AWX
metadata:
  annotations:
    kubectl.kubernetes.io/last-applied-configuration: |
      {"apiVersion":"awx.ansible.com/v1beta1","kind":"AWX","metadata":{"annotations":{},"labels":{"app.kubernetes.io/instance":"awx-instance"},"name":"awx-np","namespace":"awx"},"spec":{"admin_user":"admin","bundle_cacert_secret":"awx-np-custom-certs","create_preload_data":true,"ee_extra_env":"- name: HTTP_PROXY\n  value: http://proxy.xxxxx:3128\n- name: HTTPS_PROXY\n  value: http://proxy.xxxxx:3128\n- name: NO_PROXY\n  value: localhost,.svc,.cluster.local,127.0.0.1,172.16.0.0/16,172.17.0.0/18,172.18.0.0/16,172.19.0.0/16,172.20.0.0/16,172.21.0.0/16,172.31.252.0/24,172.31.253.0/24,172.31.254.0/24,161.26.0.0/14,172.24.0.0/16,166.8.0.0/14,icr.io,.icr.io,registry.au-syd.bluemix.net,registry.eu-de.bluemix.net,registry.eu-gb.bluemix.net,registry.ng.bluemix.net,*.eu-de.containers.cloud.ibm.com\n","garbage_collect_secrets":false,"image_pull_policy":"IfNotPresent","ingress_type":"route","loadbalancer_port":80,"loadbalancer_protocol":"http","nodeport_port":30080,"postgres_configuration_secret":"awx-np-postgres-configuration","projects_persistence":false,"projects_storage_access_mode":"ReadWriteMany","projects_storage_size":"8Gi","replicas":1,"route_tls_termination_mechanism":"Edge","service_type":"clusterip","task_extra_env":"- name: HTTP_PROXY\n  value: http://proxy.xxxxx:3128\n- name: HTTPS_PROXY\n  value: http://proxy.xxxxx:3128\n- name: NO_PROXY\n  value: localhost,.svc,.cluster.local,127.0.0.1,172.16.0.0/16,172.17.0.0/18,172.18.0.0/16,172.19.0.0/16,172.20.0.0/16,172.21.0.0/16,172.31.252.0/24,172.31.253.0/24,172.31.254.0/24,161.26.0.0/14,172.24.0.0/16,166.8.0.0/14,icr.io,.icr.io,registry.au-syd.bluemix.net,registry.eu-de.bluemix.net,registry.eu-gb.bluemix.net,registry.ng.bluemix.net,*.eu-de.containers.cloud.ibm.com\n","task_privileged":false,"web_extra_env":"- name: HTTP_PROXY\n  value: http://proxy.xxxxx:3128\n- name: HTTPS_PROXY\n  value: http://proxy.xxxxx:3128\n- name: NO_PROXY\n  value: localhost,.svc,.cluster.local,127.0.0.1,172.16.0.0/16,172.17.0.0/18,172.18.0.0/16,172.19.0.0/16,172.20.0.0/16,172.21.0.0/16,172.31.252.0/24,172.31.253.0/24,172.31.254.0/24,161.26.0.0/14,172.24.0.0/16,166.8.0.0/14,icr.io,.icr.io,registry.au-syd.bluemix.net,registry.eu-de.bluemix.net,registry.eu-gb.bluemix.net,registry.ng.bluemix.net,*.eu-de.containers.cloud.ibm.com\n"}}
  creationTimestamp: "2022-11-07T20:16:02Z"
  generation: 9
  labels:
    app.kubernetes.io/component: awx
    app.kubernetes.io/instance: awx-instance
    app.kubernetes.io/managed-by: awx-operator
    app.kubernetes.io/name: awx-np
    app.kubernetes.io/operator-version: 0.28.0
    app.kubernetes.io/part-of: awx-np
  name: awx-np
  namespace: awx
  resourceVersion: "145082784"
  uid: 32f4bfa5-8c92-4426-a94d-6ac96a7e9936
spec:
  admin_user: admin
  auto_upgrade: true
  bundle_cacert_secret: awx-np-custom-certs
  create_preload_data: true
  ee_extra_env: |
    - name: HTTP_PROXY
      value: http://proxy.xxxxx:3128
    - name: HTTPS_PROXY
      value: http://proxy.xxxxx:3128
    - name: NO_PROXY
      value: localhost,.svc,.cluster.local,127.0.0.1,172.16.0.0/16,172.17.0.0/18,172.18.0.0/16,172.19.0.0/16,172.20.0.0/16,172.21.0.0/16,172.31.252.0/24,172.31.253.0/24,172.31.254.0/24,161.26.0.0/14,172.24.0.0/16,166.8.0.0/14,icr.io,.icr.io,registry.au-syd.bluemix.net,registry.eu-de.bluemix.net,registry.eu-gb.bluemix.net,registry.ng.bluemix.net,*.eu-de.containers.cloud.ibm.com
  extra_settings:
  - setting: LOGGING['handlers']['console']
    value: '{"()": "logging.StreamHandler", "level": "WARNING", "formatter": "simple"}'
  - setting: LOGGING['loggers']['awx']['level']
    value: '"WARNING"'
  - setting: LOG_AGGREGATOR_LEVEL
    value: '"WARNING"'
  - setting: LOGGING['loggers']['awx']['propagate']
    value: '"False"'
  garbage_collect_secrets: false
  image_pull_policy: IfNotPresent
  ingress_type: route
  loadbalancer_port: 80
  loadbalancer_protocol: http
  nodeport_port: 30080
  postgres_configuration_secret: awx-np-postgres-configuration
  projects_persistence: false
  projects_storage_access_mode: ReadWriteMany
  projects_storage_size: 8Gi
  replicas: 1
  route_tls_termination_mechanism: Edge
  service_type: clusterip
  set_self_labels: true
  task_extra_env: |
    - name: HTTP_PROXY
      value: http://proxy.xxxxx:3128
    - name: HTTPS_PROXY
      value: http://proxy.xxxxx:3128
    - name: NO_PROXY
      value: localhost,.svc,.cluster.local,127.0.0.1,172.16.0.0/16,172.17.0.0/18,172.18.0.0/16,172.19.0.0/16,172.20.0.0/16,172.21.0.0/16,172.31.252.0/24,172.31.253.0/24,172.31.254.0/24,161.26.0.0/14,172.24.0.0/16,166.8.0.0/14,icr.io,.icr.io,registry.au-syd.bluemix.net,registry.eu-de.bluemix.net,registry.eu-gb.bluemix.net,registry.ng.bluemix.net,*.eu-de.containers.cloud.ibm.com
  task_privileged: false
  web_extra_env: |
    - name: HTTP_PROXY
      value: http://proxy.xxxxx:3128
    - name: HTTPS_PROXY
      value: http://proxy.xxxxx:3128
    - name: NO_PROXY
      value: localhost,.svc,.cluster.local,127.0.0.1,172.16.0.0/16,172.17.0.0/18,172.18.0.0/16,172.19.0.0/16,172.20.0.0/16,172.21.0.0/16,172.31.252.0/24,172.31.253.0/24,172.31.254.0/24,161.26.0.0/14,172.24.0.0/16,166.8.0.0/14,icr.io,.icr.io,registry.au-syd.bluemix.net,registry.eu-de.bluemix.net,registry.eu-gb.bluemix.net,registry.ng.bluemix.net,*.eu-de.containers.cloud.ibm.com
status:
  URL: https://awx-np-awx.pes-nonprod-e6615681e74d1ae8dcbb77cfdab239d6-i000.eu-de.containers.appdomain.cloud
  adminPasswordSecret: awx-np-admin-password
  adminUser: admin
  broadcastWebsocketSecret: awx-np-broadcast-websocket
  conditions:
  - lastTransitionTime: "2023-02-22T20:04:50Z"
    reason: ""
    status: "False"
    type: Failure
  - lastTransitionTime: "2023-02-22T20:04:50Z"
    reason: Successful
    status: "True"
    type: Running
  - lastTransitionTime: "2023-02-22T20:52:11Z"
    reason: Successful
    status: "True"
    type: Successful
  image: quay.io/ansible/awx:21.5.0
  postgresConfigurationSecret: awx-np-postgres-configuration
  secretKeySecret: awx-np-secret-key
  version: 21.5.0
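
My understanding (an assumption on my part, not verified against the operator code) is that each extra_settings entry above is rendered as a Python assignment in AWX's Django settings, roughly as below, and that these tune AWX's own service logging rather than what a job writes to its stdout, which would explain why they have no effect on the Output tab:

# Rough illustration (assumption) of how the extra_settings above end up in
# AWX's Django settings. LOGGING here is only a placeholder for AWX's
# existing logging configuration dict.
LOGGING = {"handlers": {"console": {}}, "loggers": {"awx": {}}}

LOGGING['handlers']['console'] = {"()": "logging.StreamHandler", "level": "WARNING", "formatter": "simple"}
LOGGING['loggers']['awx']['level'] = "WARNING"
LOG_AGGREGATOR_LEVEL = "WARNING"
LOGGING['loggers']['awx']['propagate'] = "False"
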
rupadhy3 commented 1 year ago

It uses an external Postgres database; the Postgres connection details are provided in the postgres configuration secret (awx-np-postgres-configuration) and the certificate in the custom certificate secret (awx-np-custom-certs).

kurokobo commented 1 year ago

JOB OUTPUT: time="2023-02-20T06:06:31-06:00" level=warning msg="Failed to decode the keys ["storage.options.override_kernel_check"] from "/etc/containers/storage.conf"." {"status": "starting", "runner_ident": "227", "command": ["ansible-playbook", "-u", "root", "--ask-vault-pass", "-e"...

The "Verbosity" for Job Template is similar to the number of -v options for the ansible-playbook command, and does not control the verbosity of error or stack trace. Nevertheless, there is something wrong with the situation that makes these variables visible.

How did you obtain these logs? Is it in the Output tab of a Job in AWX's Web UI?

rupadhy3 commented 1 year ago

@kurokobo Thank you. Yes, these logs are visible in the Output tab of the AWX web UI whenever any playbook fails. Even in the error condition, there should be a way to stop the env variables and sensitive data from being leaked to the job Output tab.

kurokobo commented 1 year ago

@rupadhy3 Thanks for providing the background. To run a playbook, various components such as Receptor and Ansible Runner are involved, so the errors and their traces will vary depending on how the job fails, but I don't know of a pattern where all the environment variables are dumped.

So, if possible, could you please attach not just a single log line but the complete log from the Output tab (with any confidential information removed), along with the playbook that is causing the problem and, if you know it, the reason it failed?

rupadhy3 commented 1 year ago

@kurokobo Please find the complete job output as well as the inventory sync script that we are running to sync/import the inventory from IBM Cloud. This is just an example; it is happening for any job run (inventory sync, playbook run, etc.), even when the playbook is just a connectivity test using the ping module (job_output_playbook.txt).

Attachments: inventory script.txt, job_output.txt, job_output_playbook.txt

kurokobo commented 1 year ago

@rupadhy3 Thanks for the update. Just a precautionary note (you have already modified the attachment so it cannot be decoded): Base64-encoded strings in the log can be decoded back into ZIP files, which contain sensitive information as plain text inside.
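
To illustrate the risk (all names here are hypothetical placeholders), a few lines of Python are enough to decode such a string back into an archive and list its members:

import base64
import io
import zipfile

def list_zip_members(encoded_blob: str) -> list:
    # Decode a base64 string copied from a job log and list the file names
    # inside the resulting ZIP archive; the members are readable plain text.
    data = base64.b64decode(encoded_blob)
    with zipfile.ZipFile(io.BytesIO(data)) as zf:
        return zf.namelist()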

Anyway, the job output that you provided appears to be the raw log from the automation job pod (the stdout of Ansible Runner's worker process). These JSONL logs are usually formatted and displayed by AWX as shown below; they are not normally displayed as-is.

[screenshot: a job's Output tab in the AWX Web UI showing the normally formatted playbook output]
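
Roughly speaking (the field names below follow ansible-runner's event format as I understand it, so treat them as an assumption), AWX parses each JSONL line as an event and renders only its human-readable "stdout" text, not the surrounding metadata:

import json

# One raw event line, similar to what the worker process writes to stdout.
raw_line = '{"event": "playbook_on_task_start", "uuid": "...", "counter": 5, "stdout": "TASK [ping] *****"}'

event = json.loads(raw_line)
print(event["stdout"])  # only this text is normally shown in the Output tab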

yes these logs are visible in the output tab of AWX web UI whenever any playbook fails, even in the error condition

There is definitely something wrong if such raw JSONL logs are being displayed as-is in the Web UI, but I have no idea what the cause is... One thing is for sure: this is completely out of the scope of the Job Template's "Verbosity". Do you have any special settings for output or logging in AWX, ansible.cfg, or ansible_* variables?

This issue could be more accurately titled "JSONL logs from Ansible Runner are displayed as-is in the job's Output tab".

@TheRealHaoLiu @shanemcd @AlanCoding Sorry for mentioning and sorry if I'm missing something, but any clue about this? I think this could be a security issue.

AlanCoding commented 1 year ago

In some error cases we do wind up putting the Receptor output into the job details, and this can be risky. Lately we have changed that so we don't show it if any events have been received up to that point, which I believe should address the security concerns.
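
A minimal sketch of that guard (illustrative only, not the actual AWX code):

def traceback_to_display(receptor_output: str, job_event_count: int) -> str:
    # Only surface the raw Receptor/runner output when the job produced no
    # events at all; otherwise suppress it so ordinary failures don't dump
    # environment variables into the job details.
    return receptor_output if job_event_count == 0 else ""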