sassoftware / viya4-deployment

This project contains Ansible code that creates a baseline in an existing Kubernetes environment for use with the SAS Viya Platform, generates the manifest for an order, and can also deploy that order into the specified Kubernetes environment.

Error when deploying the viya4-monitoring-kubernetes #247

JimBrousseau closed this issue 2 years ago

JimBrousseau commented 2 years ago

I receive this error when installing the monitoring components using the Docker container for viya4-deployment. I am on the "master" branch of viya4-monitoring-kubernetes as of 6/28/2022.
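
For reference, the exact commit under test can be captured from the local clone (a hedged sketch; the clone path is hypothetical):

# Record the exact master commit being tested, so the failure can be
# tied to a specific revision of viya4-monitoring-kubernetes.
cd /path/to/viya4-monitoring-kubernetes   # hypothetical clone location
git rev-parse --short HEAD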

The log:

Wednesday 29 June 2022  15:56:35 +0000 (0:00:00.050)       0:02:33.596 ******** 
<127.0.0.1> ESTABLISH LOCAL CONNECTION FOR USER: viya4-deployment
<127.0.0.1> EXEC /bin/sh -c '( umask 77 && mkdir -p "` echo $HOME/.ansible/tmp `"&& mkdir "` echo $HOME/.ansible/tmp/ansible-tmp-1656518195.2261136-1066-44247930650391 `" && echo ansible-tmp-1656518195.2261136-1066-44247930650391="` echo $HOME/.ansible/tmp/ansible-tmp-1656518195.2261136-1066-44247930650391 `" ) && sleep 0'
Using module file /usr/local/lib/python3.8/dist-packages/ansible/modules/file.py
<127.0.0.1> PUT /viya4-deployment/.ansible/tmp/ansible-local-1mfozsvqr/tmpin06ro_h TO /viya4-deployment/.ansible/tmp/ansible-tmp-1656518195.2261136-1066-44247930650391/AnsiballZ_file.py
<127.0.0.1> EXEC /bin/sh -c 'chmod u+x /viya4-deployment/.ansible/tmp/ansible-tmp-1656518195.2261136-1066-44247930650391/ /viya4-deployment/.ansible/tmp/ansible-tmp-1656518195.2261136-1066-44247930650391/AnsiballZ_file.py && sleep 0'
<127.0.0.1> EXEC /bin/sh -c '/usr/bin/python3 /viya4-deployment/.ansible/tmp/ansible-tmp-1656518195.2261136-1066-44247930650391/AnsiballZ_file.py && sleep 0'
<127.0.0.1> EXEC /bin/sh -c 'rm -f -r /viya4-deployment/.ansible/tmp/ansible-tmp-1656518195.2261136-1066-44247930650391/ > /dev/null 2>&1 && sleep 0'

TASK [monitoring : cluster-logging - create userdir] ***************************
task path: /viya4-deployment/roles/monitoring/tasks/cluster-logging.yaml:2
changed: [localhost] => changed=true 
  diff:
    after:
      mode: '0770'
      path: /tmp/ansible.nvc9ptvd/logging/
      state: directory
    before:
      mode: '0755'
      path: /tmp/ansible.nvc9ptvd/logging/
      state: absent
  gid: 1000
  group: '1000'
  invocation:
    module_args:
      _diff_peek: null
      _original_basename: null
      access_time: null
      access_time_format: '%Y%m%d%H%M.%S'
      attributes: null
      follow: true
      force: false
      group: null
      mode: '0770'
      modification_time: null
      modification_time_format: '%Y%m%d%H%M.%S'
      owner: null
      path: /tmp/ansible.nvc9ptvd/logging/
      recurse: false
      selevel: null
      serole: null
      setype: null
      seuser: null
      src: null
      state: directory
      unsafe_writes: false
  mode: '0770'
  owner: viya4-deployment
  path: /tmp/ansible.nvc9ptvd/logging/
  size: 6
  state: directory
  uid: 1000
Wednesday 29 June 2022  15:56:35 +0000 (0:00:00.269)       0:02:33.865 ******** 
<127.0.0.1> ESTABLISH LOCAL CONNECTION FOR USER: viya4-deployment
<127.0.0.1> EXEC /bin/sh -c '( umask 77 && mkdir -p "` echo $HOME/.ansible/tmp `"&& mkdir "` echo $HOME/.ansible/tmp/ansible-tmp-1656518195.4949656-1089-273660138029517 `" && echo ansible-tmp-1656518195.4949656-1089-273660138029517="` echo $HOME/.ansible/tmp/ansible-tmp-1656518195.4949656-1089-273660138029517 `" ) && sleep 0'
Using module file /viya4-deployment/.ansible/collections/ansible_collections/community/kubernetes/plugins/modules/k8s_info.py
<127.0.0.1> PUT /viya4-deployment/.ansible/tmp/ansible-local-1mfozsvqr/tmppd6h0t2h TO /viya4-deployment/.ansible/tmp/ansible-tmp-1656518195.4949656-1089-273660138029517/AnsiballZ_k8s_info.py
<127.0.0.1> EXEC /bin/sh -c 'chmod u+x /viya4-deployment/.ansible/tmp/ansible-tmp-1656518195.4949656-1089-273660138029517/ /viya4-deployment/.ansible/tmp/ansible-tmp-1656518195.4949656-1089-273660138029517/AnsiballZ_k8s_info.py && sleep 0'
<127.0.0.1> EXEC /bin/sh -c '/usr/bin/python3 /viya4-deployment/.ansible/tmp/ansible-tmp-1656518195.4949656-1089-273660138029517/AnsiballZ_k8s_info.py && sleep 0'
<127.0.0.1> EXEC /bin/sh -c 'rm -f -r /viya4-deployment/.ansible/tmp/ansible-tmp-1656518195.4949656-1089-273660138029517/ > /dev/null 2>&1 && sleep 0'

TASK [monitoring : cluster-logging - lookup existing credentials] **************
task path: /viya4-deployment/roles/monitoring/tasks/cluster-logging.yaml:12
ok: [localhost] => changed=false 
  api_found: true
  invocation:
    module_args:
      api_key: null
      api_version: v1
      ca_cert: null
      client_cert: null
      client_key: null
      context: null
      field_selectors: []
      host: null
      kind: Secret
      kubeconfig: /tmp/ansible.nvc9ptvd/.kube
      label_selectors:
      - managed-by = v4m-es-script
      name: null
      namespace: logging
      password: null
      persist_config: null
      proxy: null
      username: null
      validate_certs: null
      wait: false
      wait_condition: null
      wait_sleep: 5
      wait_timeout: 120
  resources: []
Wednesday 29 June 2022  15:56:36 +0000 (0:00:00.809)       0:02:34.675 ******** 

TASK [monitoring : Set password facts] *****************************************
task path: /viya4-deployment/roles/monitoring/tasks/cluster-logging.yaml:24
ok: [localhost] => changed=false 
  ansible_facts:
    V4M_KIBANASERVER_PASSWORD: Myeye$0n1y##
    V4M_KIBANA_PASSWORD: Myeye$0n1y##
    V4M_LOGCOLLECTOR_PASSWORD: Myeye$0n1y##
    V4M_METRICGETTER_PASSWORD: Myeye$0n1y##
Wednesday 29 June 2022  15:56:36 +0000 (0:00:00.049)       0:02:34.724 ******** 
Wednesday 29 June 2022  15:56:36 +0000 (0:00:00.048)       0:02:34.772 ******** 

TASK [monitoring : cluster-logging - output credentials] ***********************
task path: /viya4-deployment/roles/monitoring/tasks/cluster-logging.yaml:44
ok: [localhost] => 
  msg:
  - 'OpenSearch admin  - username: admin,                   password: Myeye$0n1y##'
  - 'OpenSearch Dashboards Server - username: kibanaserver, password: Myeye$0n1y##'
  - 'Log Collector - username: logcollector,                password: Myeye$0n1y##'
  - 'Metric Getter - username: metricgetter,                password: Myeye$0n1y##'
Wednesday 29 June 2022  15:56:36 +0000 (0:00:00.048)       0:02:34.821 ******** 
<127.0.0.1> ESTABLISH LOCAL CONNECTION FOR USER: viya4-deployment
<127.0.0.1> EXEC /bin/sh -c '( umask 77 && mkdir -p "` echo $HOME/.ansible/tmp `"&& mkdir "` echo $HOME/.ansible/tmp/ansible-tmp-1656518196.455251-1119-186772324564683 `" && echo ansible-tmp-1656518196.455251-1119-186772324564683="` echo $HOME/.ansible/tmp/ansible-tmp-1656518196.455251-1119-186772324564683 `" ) && sleep 0'
Using module file /usr/local/lib/python3.8/dist-packages/ansible/modules/stat.py
<127.0.0.1> PUT /viya4-deployment/.ansible/tmp/ansible-local-1mfozsvqr/tmpak5n2lnk TO /viya4-deployment/.ansible/tmp/ansible-tmp-1656518196.455251-1119-186772324564683/AnsiballZ_stat.py
<127.0.0.1> EXEC /bin/sh -c 'chmod u+x /viya4-deployment/.ansible/tmp/ansible-tmp-1656518196.455251-1119-186772324564683/ /viya4-deployment/.ansible/tmp/ansible-tmp-1656518196.455251-1119-186772324564683/AnsiballZ_stat.py && sleep 0'
<127.0.0.1> EXEC /bin/sh -c '/usr/bin/python3 /viya4-deployment/.ansible/tmp/ansible-tmp-1656518196.455251-1119-186772324564683/AnsiballZ_stat.py && sleep 0'
<127.0.0.1> PUT /viya4-deployment/.ansible/tmp/ansible-local-1mfozsvqr/tmp9rqldj0s/user-values-elasticsearch-opensearch.yaml TO /viya4-deployment/.ansible/tmp/ansible-tmp-1656518196.455251-1119-186772324564683/source
<127.0.0.1> EXEC /bin/sh -c 'chmod u+x /viya4-deployment/.ansible/tmp/ansible-tmp-1656518196.455251-1119-186772324564683/ /viya4-deployment/.ansible/tmp/ansible-tmp-1656518196.455251-1119-186772324564683/source && sleep 0'
Using module file /usr/local/lib/python3.8/dist-packages/ansible/modules/copy.py
<127.0.0.1> PUT /viya4-deployment/.ansible/tmp/ansible-local-1mfozsvqr/tmp0jzsyszf TO /viya4-deployment/.ansible/tmp/ansible-tmp-1656518196.455251-1119-186772324564683/AnsiballZ_copy.py
<127.0.0.1> EXEC /bin/sh -c 'chmod u+x /viya4-deployment/.ansible/tmp/ansible-tmp-1656518196.455251-1119-186772324564683/ /viya4-deployment/.ansible/tmp/ansible-tmp-1656518196.455251-1119-186772324564683/AnsiballZ_copy.py && sleep 0'
<127.0.0.1> EXEC /bin/sh -c '/usr/bin/python3 /viya4-deployment/.ansible/tmp/ansible-tmp-1656518196.455251-1119-186772324564683/AnsiballZ_copy.py && sleep 0'
<127.0.0.1> EXEC /bin/sh -c 'rm -f -r /viya4-deployment/.ansible/tmp/ansible-tmp-1656518196.455251-1119-186772324564683/ > /dev/null 2>&1 && sleep 0'

TASK [monitoring : cluster-logging - opensearch user values] *******************
task path: /viya4-deployment/roles/monitoring/tasks/cluster-logging.yaml:54
changed: [localhost] => changed=true 
  checksum: dcae13a967390fb04b0f45d7d848d007650ed005
  dest: /tmp/ansible.nvc9ptvd/logging/user-values-opensearch.yaml
  diff: []
  gid: 1000
  group: '1000'
  invocation:
    module_args:
      _original_basename: user-values-elasticsearch-opensearch.yaml
      attributes: null
      backup: false
      checksum: dcae13a967390fb04b0f45d7d848d007650ed005
      content: null
      dest: /tmp/ansible.nvc9ptvd/logging/user-values-opensearch.yaml
      directory_mode: null
      follow: false
      force: true
      group: null
      local_follow: null
      mode: '0660'
      owner: null
      remote_src: null
      selevel: null
      serole: null
      setype: null
      seuser: null
      src: /viya4-deployment/.ansible/tmp/ansible-tmp-1656518196.455251-1119-186772324564683/source
      unsafe_writes: false
      validate: null
  md5sum: 59ef9d7e85196eb3edcbbfef4ee53eed
  mode: '0660'
  owner: viya4-deployment
  size: 337
  src: /viya4-deployment/.ansible/tmp/ansible-tmp-1656518196.455251-1119-186772324564683/source
  state: file
  uid: 1000
Wednesday 29 June 2022  15:56:36 +0000 (0:00:00.482)       0:02:35.303 ******** 
<127.0.0.1> ESTABLISH LOCAL CONNECTION FOR USER: viya4-deployment
<127.0.0.1> EXEC /bin/sh -c '( umask 77 && mkdir -p "` echo $HOME/.ansible/tmp `"&& mkdir "` echo $HOME/.ansible/tmp/ansible-tmp-1656518196.948137-1155-227400422423620 `" && echo ansible-tmp-1656518196.948137-1155-227400422423620="` echo $HOME/.ansible/tmp/ansible-tmp-1656518196.948137-1155-227400422423620 `" ) && sleep 0'
Using module file /usr/local/lib/python3.8/dist-packages/ansible/modules/stat.py
<127.0.0.1> PUT /viya4-deployment/.ansible/tmp/ansible-local-1mfozsvqr/tmpzdc3g9dt TO /viya4-deployment/.ansible/tmp/ansible-tmp-1656518196.948137-1155-227400422423620/AnsiballZ_stat.py
<127.0.0.1> EXEC /bin/sh -c 'chmod u+x /viya4-deployment/.ansible/tmp/ansible-tmp-1656518196.948137-1155-227400422423620/ /viya4-deployment/.ansible/tmp/ansible-tmp-1656518196.948137-1155-227400422423620/AnsiballZ_stat.py && sleep 0'
<127.0.0.1> EXEC /bin/sh -c '/usr/bin/python3 /viya4-deployment/.ansible/tmp/ansible-tmp-1656518196.948137-1155-227400422423620/AnsiballZ_stat.py && sleep 0'
<127.0.0.1> PUT /viya4-deployment/.ansible/tmp/ansible-local-1mfozsvqr/tmp1qsex_ql/user-values-osd-opensearch.yaml TO /viya4-deployment/.ansible/tmp/ansible-tmp-1656518196.948137-1155-227400422423620/source
<127.0.0.1> EXEC /bin/sh -c 'chmod u+x /viya4-deployment/.ansible/tmp/ansible-tmp-1656518196.948137-1155-227400422423620/ /viya4-deployment/.ansible/tmp/ansible-tmp-1656518196.948137-1155-227400422423620/source && sleep 0'
Using module file /usr/local/lib/python3.8/dist-packages/ansible/modules/copy.py
<127.0.0.1> PUT /viya4-deployment/.ansible/tmp/ansible-local-1mfozsvqr/tmp1r62tvl4 TO /viya4-deployment/.ansible/tmp/ansible-tmp-1656518196.948137-1155-227400422423620/AnsiballZ_copy.py
<127.0.0.1> EXEC /bin/sh -c 'chmod u+x /viya4-deployment/.ansible/tmp/ansible-tmp-1656518196.948137-1155-227400422423620/ /viya4-deployment/.ansible/tmp/ansible-tmp-1656518196.948137-1155-227400422423620/AnsiballZ_copy.py && sleep 0'
<127.0.0.1> EXEC /bin/sh -c '/usr/bin/python3 /viya4-deployment/.ansible/tmp/ansible-tmp-1656518196.948137-1155-227400422423620/AnsiballZ_copy.py && sleep 0'
<127.0.0.1> EXEC /bin/sh -c 'rm -f -r /viya4-deployment/.ansible/tmp/ansible-tmp-1656518196.948137-1155-227400422423620/ > /dev/null 2>&1 && sleep 0'

TASK [monitoring : cluster-logging - osd user values] **************************
task path: /viya4-deployment/roles/monitoring/tasks/cluster-logging.yaml:64
changed: [localhost] => changed=true 
  checksum: d87755381ce5cc3779c46e4e962bf7192eb4b2fb
  dest: /tmp/ansible.nvc9ptvd/logging/user-values-osd.yaml
  diff: []
  gid: 1000
  group: '1000'
  invocation:
    module_args:
      _original_basename: user-values-osd-opensearch.yaml
      attributes: null
      backup: false
      checksum: d87755381ce5cc3779c46e4e962bf7192eb4b2fb
      content: null
      dest: /tmp/ansible.nvc9ptvd/logging/user-values-osd.yaml
      directory_mode: null
      follow: false
      force: true
      group: null
      local_follow: null
      mode: '0660'
      owner: null
      remote_src: null
      selevel: null
      serole: null
      setype: null
      seuser: null
      src: /viya4-deployment/.ansible/tmp/ansible-tmp-1656518196.948137-1155-227400422423620/source
      unsafe_writes: false
      validate: null
  md5sum: 70f41701bc426706633bd567698e2f43
  mode: '0660'
  owner: viya4-deployment
  size: 413
  src: /viya4-deployment/.ansible/tmp/ansible-tmp-1656518196.948137-1155-227400422423620/source
  state: file
  uid: 1000
Wednesday 29 June 2022  15:56:37 +0000 (0:00:00.483)       0:02:35.787 ******** 
<127.0.0.1> ESTABLISH LOCAL CONNECTION FOR USER: viya4-deployment
<127.0.0.1> EXEC /bin/sh -c '( umask 77 && mkdir -p "` echo $HOME/.ansible/tmp `"&& mkdir "` echo $HOME/.ansible/tmp/ansible-tmp-1656518197.4333062-1191-220570103669179 `" && echo ansible-tmp-1656518197.4333062-1191-220570103669179="` echo $HOME/.ansible/tmp/ansible-tmp-1656518197.4333062-1191-220570103669179 `" ) && sleep 0'
Using module file /usr/local/lib/python3.8/dist-packages/ansible/modules/command.py
<127.0.0.1> PUT /viya4-deployment/.ansible/tmp/ansible-local-1mfozsvqr/tmp7yxpb3j8 TO /viya4-deployment/.ansible/tmp/ansible-tmp-1656518197.4333062-1191-220570103669179/AnsiballZ_command.py
<127.0.0.1> EXEC /bin/sh -c 'chmod u+x /viya4-deployment/.ansible/tmp/ansible-tmp-1656518197.4333062-1191-220570103669179/ /viya4-deployment/.ansible/tmp/ansible-tmp-1656518197.4333062-1191-220570103669179/AnsiballZ_command.py && sleep 0'
<127.0.0.1> EXEC /bin/sh -c 'USER_DIR=/tmp/ansible.nvc9ptvd TLS_ENABLE=true LOG_KB_TLS_ENABLE=true KUBECONFIG=/tmp/ansible.nvc9ptvd/.kube LOG_COLOR_ENABLE=False NODE_PLACEMENT_ENABLE=False ES_ADMIN_PASSWD='"'"'Myeye$0n1y##'"'"' ES_KIBANASERVER_PASSWD='"'"'Myeye$0n1y##'"'"' ES_LOGCOLLECTOR_PASSWD='"'"'Myeye$0n1y##'"'"' ES_METRICGETTER_PASSWD='"'"'Myeye$0n1y##'"'"' LOG_NS=logging KB_KNOWN_NODEPORT_ENABLE=False /usr/bin/python3 /viya4-deployment/.ansible/tmp/ansible-tmp-1656518197.4333062-1191-220570103669179/AnsiballZ_command.py && sleep 0'
<127.0.0.1> EXEC /bin/sh -c 'rm -f -r /viya4-deployment/.ansible/tmp/ansible-tmp-1656518197.4333062-1191-220570103669179/ > /dev/null 2>&1 && sleep 0'

TASK [monitoring : cluster-logging - deploy] ***********************************
task path: /viya4-deployment/roles/monitoring/tasks/cluster-logging.yaml:74
The full traceback is:
  File "/tmp/ansible_ansible.legacy.command_payload_g1cyymeu/ansible_ansible.legacy.command_payload.zip/ansible/module_utils/basic.py", line 2727, in run_command
    cmd = subprocess.Popen(args, **kwargs)
  File "/usr/lib/python3.8/subprocess.py", line 858, in __init__
    self._execute_child(args, executable, preexec_fn, close_fds,
  File "/usr/lib/python3.8/subprocess.py", line 1704, in _execute_child
    raise child_exception_type(errno_num, err_msg, err_filename)
fatal: [localhost]: FAILED! => changed=false 
  cmd: /tmp/ansible.nvc9ptvd/viya4-monitoring-kubernetes/logging/bin/deploy_logging.sh
  invocation:
    module_args:
      _raw_params: /tmp/ansible.nvc9ptvd/viya4-monitoring-kubernetes/logging/bin/deploy_logging.sh
      _uses_shell: false
      argv: null
      chdir: null
      creates: null
      executable: null
      removes: null
      stdin: null
      stdin_add_newline: true
      strip_empty_ends: true
      warn: true
  msg: '[Errno 2] No such file or directory: b''/tmp/ansible.nvc9ptvd/viya4-monitoring-kubernetes/logging/bin/deploy_logging.sh'''
  rc: 2

PLAY RECAP *********************************************************************
localhost                  : ok=40   changed=10   unreachable=0    failed=1    skipped=34   rescued=0    ignored=0   

Wednesday 29 June 2022  15:56:37 +0000 (0:00:00.277)       0:02:36.064 ******** 
=============================================================================== 
monitoring : cluster-monitoring - deploy ------------------------------ 144.03s
/viya4-deployment/roles/monitoring/tasks/cluster-monitoring.yaml:56 -----------
monitoring : v4m - download --------------------------------------------- 1.33s
/viya4-deployment/roles/monitoring/tasks/main.yaml:2 --------------------------
Gathering Facts --------------------------------------------------------- 1.23s
/viya4-deployment/playbooks/playbook.yaml:1 -----------------------------------
monitoring : cluster-monitoring - lookup existing credentials ----------- 1.05s
/viya4-deployment/roles/monitoring/tasks/cluster-monitoring.yaml:12 -----------
monitoring : v4m - add storageclass ------------------------------------- 1.05s
/viya4-deployment/roles/monitoring/tasks/main.yaml:12 -------------------------
monitoring : cluster-logging - lookup existing credentials -------------- 0.81s
/viya4-deployment/roles/monitoring/tasks/cluster-logging.yaml:12 --------------
common : tfstate - export kubeconfig ------------------------------------ 0.72s
/viya4-deployment/roles/common/tasks/main.yaml:46 -----------------------------
monitoring : cluster-logging - osd user values -------------------------- 0.48s
/viya4-deployment/roles/monitoring/tasks/cluster-logging.yaml:64 --------------
monitoring : cluster-logging - opensearch user values ------------------- 0.48s
/viya4-deployment/roles/monitoring/tasks/cluster-logging.yaml:54 --------------
monitoring : cluster-monitoring - user values --------------------------- 0.46s
/viya4-deployment/roles/monitoring/tasks/cluster-monitoring.yaml:46 -----------
monitoring : cluster-monitoring - create userdir ------------------------ 0.38s
/viya4-deployment/roles/monitoring/tasks/cluster-monitoring.yaml:2 ------------
global tmp dir ---------------------------------------------------------- 0.36s
/viya4-deployment/playbooks/playbook.yaml:3 -----------------------------------
monitoring : cluster-logging - deploy ----------------------------------- 0.28s
/viya4-deployment/roles/monitoring/tasks/cluster-logging.yaml:74 --------------
monitoring : cluster-logging - create userdir --------------------------- 0.27s
/viya4-deployment/roles/monitoring/tasks/cluster-logging.yaml:2 ---------------
common : Load config file ----------------------------------------------- 0.10s
/viya4-deployment/roles/common/tasks/main.yaml:1 ------------------------------
common : Parse tfstate -------------------------------------------------- 0.07s
/viya4-deployment/roles/common/tasks/main.yaml:19 -----------------------------
common : Add nat ip to LOADBALANCER_SOURCE_RANGES ----------------------- 0.07s
/viya4-deployment/roles/common/tasks/main.yaml:27 -----------------------------
monitoring role - cluster ----------------------------------------------- 0.06s
/viya4-deployment/playbooks/playbook.yaml:35 ----------------------------------
common role ------------------------------------------------------------- 0.06s
/viya4-deployment/playbooks/playbook.yaml:11 ----------------------------------
common : tfstate - postgres fqdn ---------------------------------------- 0.06s
/viya4-deployment/roles/common/tasks/main.yaml:165 ----------------------------
thpang commented 2 years ago

It seems like the shell script is not being copied into the tmp location for execution. How are you running the tooling: with Ansible or with Docker? Could you also provide the command you're running? Thanks.
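
If the container is still up, a quick check (a hedged sketch; CONTAINER is a placeholder for the id from docker ps) is to look at what the download step actually left in the shared tmp dir:

# Peek inside the running container: does the cloned repo contain the
# logging deploy script that the failing task tries to run?
docker exec "$CONTAINER" sh -c 'ls -R /tmp/ansible.*/viya4-monitoring-kubernetes/logging/bin'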

JimBrousseau commented 2 years ago

Using Docker.

#!/bin/bash
cd ${base_dir}/viya4-deployment
# monitor install
docker run --rm --group-add root --user $(id -u):$(id -g) \
  --volume ${base_dir}:/data \
  --volume ${base_dir}/viya4-deployment/ansible-vars-iac-azure.yaml:/config/config \
  --volume ${base_dir}/.ssh/id_rsa:/config/jump_svr_private_key \
  --volume ${base_dir}/viya4-iac-azure/terraform.tfstate:/config/tfstate \
  --volume $HOME/.gitconfig:/viya4-deployment/.gitconfig \
  viya4-deployment-sasfirdviya4 --tags "cluster-logging,cluster-monitoring,viya-monitoring,install" \
  -vvv > ${base_dir}/monitoring_run_latest-sasfirdviya4.log
JimBrousseau commented 2 years ago

Tried the stable branch. Same error.

JimBrousseau commented 2 years ago

Is the first step being skipped due to the when clause?
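
As a hedged aside, the guards on those tasks can be inspected directly in the container image (the entrypoint override is an assumption; the path comes from the play recap above):

# List every when: guard in the monitoring role's main task file.
docker run --rm --entrypoint /bin/sh viya4-deployment-sasfirdviya4 -c \
  'grep -n "when:" /viya4-deployment/roles/monitoring/tasks/main.yaml'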


thpang commented 2 years ago

I need more clarification on the when clause you're referring to.

thpang commented 2 years ago

I need to see the output from the main task under monitoring; it would be labeled:

v4m - download
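
As a hedged side check, you can also clone whichever tag that task pins (its version: field appears in the -vvv output) and confirm the logging deploy script exists there:

# V4M_TAG is a placeholder; use the version: value from the
# "v4m - download" module_args in your log.
git clone --depth 1 --branch "$V4M_TAG" \
  https://github.com/sassoftware/viya4-monitoring-kubernetes.git /tmp/v4m-check
ls -l /tmp/v4m-check/logging/bin/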

JimBrousseau commented 2 years ago

TASK [monitoring : v4m - download] *********************************************
task path: /viya4-deployment/roles/monitoring/tasks/main.yaml:2
changed: [localhost] => changed=true 
  after: a93f6ca5dfc029bfbf562fe67277246c6942f3da
  before: null
  invocation:
    module_args:
      accept_hostkey: false
      archive: null
      archive_prefix: null
      bare: false
      clone: true
      depth: null
      dest: /tmp/ansible.d9_dhwup/viya4-monitoring-kubernetes/
      executable: null
      force: false
      gpg_whitelist: []
      key_file: null
      recursive: true
      reference: null
      refspec: null
      remote: origin
      repo: https://github.com/sassoftware/viya4-monitoring-kubernetes.git
      separate_git_dir: null
      ssh_opts: null
      track_submodules: false
      umask: null
      update: true
      verify_commit: false
      version: 1.0.10
redirecting (type: action) community.kubernetes.k8s to community.kubernetes.k8s_info
Wednesday 29 June 2022  20:02:35 +0000 (0:00:01.973)       0:00:06.800 ******** 
redirecting (type: action) community.kubernetes.k8s to community.kubernetes.k8s_info
redirecting (type: action) community.kubernetes.k8s to community.kubernetes.k8s_info
<127.0.0.1> ESTABLISH LOCAL CONNECTION FOR USER: viya4-deployment
<127.0.0.1> EXEC /bin/sh -c '( umask 77 && mkdir -p "` echo $HOME/.ansible/tmp `"&& mkdir "` echo $HOME/.ansible/tmp/ansible-tmp-1656532955.0852308-257-200093522827246 `" && echo ansible-tmp-1656532955.0852308-257-200093522827246="` echo $HOME/.ansible/tmp/ansible-tmp-1656532955.0852308-257-200093522827246 `" ) && sleep 0'
Using module file /viya4-deployment/.ansible/collections/ansible_collections/community/kubernetes/plugins/modules/k8s.py
<127.0.0.1> PUT /viya4-deployment/.ansible/tmp/ansible-local-1u124t61g/tmpnhbv070t TO /viya4-deployment/.ansible/tmp/ansible-tmp-1656532955.0852308-257-200093522827246/AnsiballZ_k8s.py
<127.0.0.1> EXEC /bin/sh -c 'chmod u+x /viya4-deployment/.ansible/tmp/ansible-tmp-1656532955.0852308-257-200093522827246/ /viya4-deployment/.ansible/tmp/ansible-tmp-1656532955.0852308-257-200093522827246/AnsiballZ_k8s.py && sleep 0'
<127.0.0.1> EXEC /bin/sh -c '/usr/bin/python3 /viya4-deployment/.ansible/tmp/ansible-tmp-1656532955.0852308-257-200093522827246/AnsiballZ_k8s.py && sleep 0'
<127.0.0.1> EXEC /bin/sh -c 'rm -f -r /viya4-deployment/.ansible/tmp/ansible-tmp-1656532955.0852308-257-200093522827246/ > /dev/null 2>&1 && sleep 0'

TASK [monitoring : v4m - add storageclass] *************************************
task path: /viya4-deployment/roles/monitoring/tasks/main.yaml:12
ok: [localhost] => changed=false 
  diff: {}
  invocation:
    module_args:
      api_key: null
      api_version: v1
      append_hash: false
      apply: false
      ca_cert: null
      client_cert: null
      client_key: null
      context: null
      delete_options: null
      force: false
      host: null
      kind: null
      kubeconfig: /tmp/ansible.d9_dhwup/.kube
      merge_type: null
      name: null
      namespace: null
      password: null
      persist_config: null
      proxy: null
      resource_definition: null
      src: /viya4-deployment/roles/monitoring/files/azure-storageclass.yaml
      state: present
      template: null
      username: null
      validate: null
      validate_certs: null
      wait: false
      wait_condition: null
      wait_sleep: 5
      wait_timeout: 120
  method: patch
  result:
    allowVolumeExpansion: true
    apiVersion: storage.k8s.io/v1
    kind: StorageClass
    metadata:
      creationTimestamp: '2022-06-28T21:37:27Z'
      labels:
        addonmanager.kubernetes.io/mode: EnsureExists
        kubernetes.io/cluster-service: 'true'
      managedFields:
      - apiVersion: storage.k8s.io/v1
        fieldsType: FieldsV1
        fieldsV1:
          f:allowVolumeExpansion: {}
          f:metadata:
            f:labels:
              .: {}
              f:addonmanager.kubernetes.io/mode: {}
              f:kubernetes.io/cluster-service: {}
          f:parameters:
            .: {}
            f:skuName: {}
          f:provisioner: {}
          f:reclaimPolicy: {}
          f:volumeBindingMode: {}
        manager: OpenAPI-Generator
        operation: Update
        time: '2022-06-28T21:37:27Z'
      name: v4m
      resourceVersion: '64518808'
      uid: 57729f73-8aa4-457b-821e-f7734d79def2
    parameters:
      skuName: Standard_LRS
    provisioner: kubernetes.io/azure-disk
    reclaimPolicy: Delete
    volumeBindingMode: WaitForFirstConsumer
Wednesday 29 June 2022  20:02:36 +0000 (0:00:01.029)       0:00:07.830 ******** 
redirecting (type: action) community.kubernetes.k8s to community.kubernetes.k8s_info
redirecting (type: action) community.kubernetes.k8s to community.kubernetes.k8s_info
redirecting (type: action) community.kubernetes.k8s to community.kubernetes.k8s_info
redirecting (type: action) community.kubernetes.k8s to community.kubernetes.k8s_info
included: /viya4-deployment/roles/monitoring/tasks/cluster-monitoring.yaml for localhost
Wednesday 29 June 2022  20:02:36 +0000 (0:00:00.036)       0:00:07.867 ******** 
<127.0.0.1> ESTABLISH LOCAL CONNECTION FOR USER: viya4-deployment
<127.0.0.1> EXEC /bin/sh -c '( umask 77 && mkdir -p "` echo $HOME/.ansible/tmp `"&& mkdir "` echo $HOME/.ansible/tmp/ansible-tmp-1656532956.1423342-283-187642393217072 `" && echo ansible-tmp-1656532956.1423342-283-187642393217072="` echo $HOME/.ansible/tmp/ansible-tmp-1656532956.1423342-283-187642393217072 `" ) && sleep 0'
Using module file /usr/local/lib/python3.8/dist-packages/ansible/modules/file.py
<127.0.0.1> PUT /viya4-deployment/.ansible/tmp/ansible-local-1u124t61g/tmpxglr9uo4 TO /viya4-deployment/.ansible/tmp/ansible-tmp-1656532956.1423342-283-187642393217072/AnsiballZ_file.py
<127.0.0.1> EXEC /bin/sh -c 'chmod u+x /viya4-deployment/.ansible/tmp/ansible-tmp-1656532956.1423342-283-187642393217072/ /viya4-deployment/.ansible/tmp/ansible-tmp-1656532956.1423342-283-187642393217072/AnsiballZ_file.py && sleep 0'
<127.0.0.1> EXEC /bin/sh -c '/usr/bin/python3 /viya4-deployment/.ansible/tmp/ansible-tmp-1656532956.1423342-283-187642393217072/AnsiballZ_file.py && sleep 0'
<127.0.0.1> EXEC /bin/sh -c 'rm -f -r /viya4-deployment/.ansible/tmp/ansible-tmp-1656532956.1423342-283-187642393217072/ > /dev/null 2>&1 && sleep 0'

TASK [monitoring : cluster-monitoring - create userdir] ************************
task path: /viya4-deployment/roles/monitoring/tasks/cluster-monitoring.yaml:2
changed: [localhost] => changed=true 
  diff:
    after:
      mode: '0770'
      path: /tmp/ansible.d9_dhwup/monitoring/
      state: directory
    before:
      mode: '0755'
      path: /tmp/ansible.d9_dhwup/monitoring/
      state: absent
  gid: 1000
  group: '1000'
  invocation:
    module_args:
      _diff_peek: null
      _original_basename: null
      access_time: null
      access_time_format: '%Y%m%d%H%M.%S'
      attributes: null
      follow: true
      force: false
      group: null
      mode: '0770'
      modification_time: null
      modification_time_format: '%Y%m%d%H%M.%S'
      owner: null
      path: /tmp/ansible.d9_dhwup/monitoring/
      recurse: false
      selevel: null
      serole: null
      setype: null
      seuser: null
      src: null
      state: directory
      unsafe_writes: false
  mode: '0770'
  owner: viya4-deployment
  path: /tmp/ansible.d9_dhwup/monitoring/
  size: 6
  state: directory
  uid: 1000
Wednesday 29 June 2022  20:02:36 +0000 (0:00:00.350)       0:00:08.217 ******** 
<127.0.0.1> ESTABLISH LOCAL CONNECTION FOR USER: viya4-deployment
<127.0.0.1> EXEC /bin/sh -c '( umask 77 && mkdir -p "` echo $HOME/.ansible/tmp `"&& mkdir "` echo $HOME/.ansible/tmp/ansible-tmp-1656532956.4943066-306-51508623617238 `" && echo ansible-tmp-1656532956.4943066-306-51508623617238="` echo $HOME/.ansible/tmp/ansible-tmp-1656532956.4943066-306-51508623617238 `" ) && sleep 0'
Using module file /viya4-deployment/.ansible/collections/ansible_collections/community/kubernetes/plugins/modules/k8s_info.py
<127.0.0.1> PUT /viya4-deployment/.ansible/tmp/ansible-local-1u124t61g/tmpmvugys1a TO /viya4-deployment/.ansible/tmp/ansible-tmp-1656532956.4943066-306-51508623617238/AnsiballZ_k8s_info.py
<127.0.0.1> EXEC /bin/sh -c 'chmod u+x /viya4-deployment/.ansible/tmp/ansible-tmp-1656532956.4943066-306-51508623617238/ /viya4-deployment/.ansible/tmp/ansible-tmp-1656532956.4943066-306-51508623617238/AnsiballZ_k8s_info.py && sleep 0'
<127.0.0.1> EXEC /bin/sh -c '/usr/bin/python3 /viya4-deployment/.ansible/tmp/ansible-tmp-1656532956.4943066-306-51508623617238/AnsiballZ_k8s_info.py && sleep 0'
<127.0.0.1> EXEC /bin/sh -c 'rm -f -r /viya4-deployment/.ansible/tmp/ansible-tmp-1656532956.4943066-306-51508623617238/ > /dev/null 2>&1 && sleep 0'

TASK [monitoring : cluster-monitoring - lookup existing credentials] ***********
task path: /viya4-deployment/roles/monitoring/tasks/cluster-monitoring.yaml:12
ok: [localhost] => changed=false 
  api_found: true
  invocation:
    module_args:
      api_key: null
      api_version: v1
      ca_cert: null
      client_cert: null
      client_key: null
      context: null
      field_selectors: []
      host: null
      kind: Secret
      kubeconfig: /tmp/ansible.d9_dhwup/.kube
      label_selectors: []
      name: v4m-grafana
      namespace: monitoring
      password: null
      persist_config: null
      proxy: null
      username: null
      validate_certs: null
      wait: false
      wait_condition: null
      wait_sleep: 5
      wait_timeout: 120
  resources:
  - apiVersion: v1
    data:
      admin-password: azByZHhMUkhvRGlnVGVSUndxTzhaZFJ4anFWSEMyTnJCZDZjRU9WYQ==
      admin-user: YWRtaW4=
      ldap-toml: ''
    kind: Secret
    metadata:
      annotations:
        meta.helm.sh/release-name: v4m-prometheus-operator
        meta.helm.sh/release-namespace: monitoring
      creationTimestamp: '2022-06-28T22:12:49Z'
      labels:
        app.kubernetes.io/instance: v4m-prometheus-operator
        app.kubernetes.io/managed-by: Helm
        app.kubernetes.io/name: grafana
        app.kubernetes.io/version: 7.5.4
        helm.sh/chart: grafana-6.7.5
      managedFields:
      - apiVersion: v1
        fieldsType: FieldsV1
        fieldsV1:
          f:data:
            .: {}
            f:admin-password: {}
            f:admin-user: {}
            f:ldap-toml: {}
          f:metadata:
            f:annotations:
              .: {}
              f:meta.helm.sh/release-name: {}
              f:meta.helm.sh/release-namespace: {}
            f:labels:
              .: {}
              f:app.kubernetes.io/instance: {}
              f:app.kubernetes.io/managed-by: {}
              f:app.kubernetes.io/name: {}
              f:app.kubernetes.io/version: {}
              f:helm.sh/chart: {}
          f:type: {}
        manager: helm
        operation: Update
        time: '2022-06-28T22:12:49Z'
      name: v4m-grafana
      namespace: monitoring
      resourceVersion: '65230788'
      uid: 5c830b5f-abec-4d86-a51c-4fe87a7e04a9
    type: Opaque
Wednesday 29 June 2022  20:02:37 +0000 (0:00:00.934)       0:00:09.152 ******** 

TASK [monitoring : Set password fact] ******************************************
task path: /viya4-deployment/roles/monitoring/tasks/cluster-monitoring.yaml:24
ok: [localhost] => changed=false 
  ansible_facts:
    V4M_GRAFANA_PASSWORD: Myeye$0n1y##
Wednesday 29 June 2022  20:02:37 +0000 (0:00:00.044)       0:00:09.197 ******** 

TASK [monitoring : cluster-monitoring - save credentials] **********************
task path: /viya4-deployment/roles/monitoring/tasks/cluster-monitoring.yaml:30
ok: [localhost] => changed=false 
  ansible_facts:
    V4M_GRAFANA_PASSWORD: k0rdxLRHoDigTeRRwqO8ZdRxjqVHC2NrBd6cEOVa
Wednesday 29 June 2022  20:02:37 +0000 (0:00:00.056)       0:00:09.254 ******** 

TASK [monitoring : cluster-monitoring - output credentials] ********************
task path: /viya4-deployment/roles/monitoring/tasks/cluster-monitoring.yaml:39
ok: [localhost] => 
  msg:
  - 'Grafana - username: admin, password: k0rdxLRHoDigTeRRwqO8ZdRxjqVHC2NrBd6cEOVa'
Wednesday 29 June 2022  20:02:37 +0000 (0:00:00.044)       0:00:09.298 ******** 
<127.0.0.1> ESTABLISH LOCAL CONNECTION FOR USER: viya4-deployment
<127.0.0.1> EXEC /bin/sh -c '( umask 77 && mkdir -p "` echo $HOME/.ansible/tmp `"&& mkdir "` echo $HOME/.ansible/tmp/ansible-tmp-1656532957.5747337-336-87357379262124 `" && echo ansible-tmp-1656532957.5747337-336-87357379262124="` echo $HOME/.ansible/tmp/ansible-tmp-1656532957.5747337-336-87357379262124 `" ) && sleep 0'
Using module file /usr/local/lib/python3.8/dist-packages/ansible/modules/stat.py
<127.0.0.1> PUT /viya4-deployment/.ansible/tmp/ansible-local-1u124t61g/tmp5za3hnjv TO /viya4-deployment/.ansible/tmp/ansible-tmp-1656532957.5747337-336-87357379262124/AnsiballZ_stat.py
<127.0.0.1> EXEC /bin/sh -c 'chmod u+x /viya4-deployment/.ansible/tmp/ansible-tmp-1656532957.5747337-336-87357379262124/ /viya4-deployment/.ansible/tmp/ansible-tmp-1656532957.5747337-336-87357379262124/AnsiballZ_stat.py && sleep 0'
<127.0.0.1> EXEC /bin/sh -c '/usr/bin/python3 /viya4-deployment/.ansible/tmp/ansible-tmp-1656532957.5747337-336-87357379262124/AnsiballZ_stat.py && sleep 0'
<127.0.0.1> PUT /viya4-deployment/.ansible/tmp/ansible-local-1u124t61g/tmpm_wuk9z4/user-values-prom-operator.yaml TO /viya4-deployment/.ansible/tmp/ansible-tmp-1656532957.5747337-336-87357379262124/source
<127.0.0.1> EXEC /bin/sh -c 'chmod u+x /viya4-deployment/.ansible/tmp/ansible-tmp-1656532957.5747337-336-87357379262124/ /viya4-deployment/.ansible/tmp/ansible-tmp-1656532957.5747337-336-87357379262124/source && sleep 0'
Using module file /usr/local/lib/python3.8/dist-packages/ansible/modules/copy.py
<127.0.0.1> PUT /viya4-deployment/.ansible/tmp/ansible-local-1u124t61g/tmp5_ii90f2 TO /viya4-deployment/.ansible/tmp/ansible-tmp-1656532957.5747337-336-87357379262124/AnsiballZ_copy.py
<127.0.0.1> EXEC /bin/sh -c 'chmod u+x /viya4-deployment/.ansible/tmp/ansible-tmp-1656532957.5747337-336-87357379262124/ /viya4-deployment/.ansible/tmp/ansible-tmp-1656532957.5747337-336-87357379262124/AnsiballZ_copy.py && sleep 0'
<127.0.0.1> EXEC /bin/sh -c '/usr/bin/python3 /viya4-deployment/.ansible/tmp/ansible-tmp-1656532957.5747337-336-87357379262124/AnsiballZ_copy.py && sleep 0'
<127.0.0.1> EXEC /bin/sh -c 'rm -f -r /viya4-deployment/.ansible/tmp/ansible-tmp-1656532957.5747337-336-87357379262124/ > /dev/null 2>&1 && sleep 0'

TASK [monitoring : cluster-monitoring - user values] ***************************
task path: /viya4-deployment/roles/monitoring/tasks/cluster-monitoring.yaml:46
changed: [localhost] => changed=true 
  checksum: 78f2efc22b527d1973a0c9a720873940d1bd7b36
  dest: /tmp/ansible.d9_dhwup/monitoring/user-values-prom-operator.yaml
  diff: []
  gid: 1000
  group: '1000'
  invocation:
    module_args:
      _original_basename: user-values-prom-operator.yaml
      attributes: null
      backup: false
      checksum: 78f2efc22b527d1973a0c9a720873940d1bd7b36
      content: null
      dest: /tmp/ansible.d9_dhwup/monitoring/user-values-prom-operator.yaml
      directory_mode: null
      follow: false
      force: true
      group: null
      local_follow: null
      mode: '0660'
      owner: null
      remote_src: null
      selevel: null
      serole: null
      setype: null
      seuser: null
      src: /viya4-deployment/.ansible/tmp/ansible-tmp-1656532957.5747337-336-87357379262124/source
      unsafe_writes: false
      validate: null
  md5sum: 97278081f0e72eab35337a546c4db65d
  mode: '0660'
  owner: viya4-deployment
  size: 1979
  src: /viya4-deployment/.ansible/tmp/ansible-tmp-1656532957.5747337-336-87357379262124/source
  state: file
  uid: 1000
Wednesday 29 June 2022  20:02:37 +0000 (0:00:00.445)       0:00:09.744 ******** 
<127.0.0.1> ESTABLISH LOCAL CONNECTION FOR USER: viya4-deployment
<127.0.0.1> EXEC /bin/sh -c '( umask 77 && mkdir -p "` echo $HOME/.ansible/tmp `"&& mkdir "` echo $HOME/.ansible/tmp/ansible-tmp-1656532958.0244393-372-229989092429948 `" && echo ansible-tmp-1656532958.0244393-372-229989092429948="` echo $HOME/.ansible/tmp/ansible-tmp-1656532958.0244393-372-229989092429948 `" ) && sleep 0'
Using module file /usr/local/lib/python3.8/dist-packages/ansible/modules/command.py
<127.0.0.1> PUT /viya4-deployment/.ansible/tmp/ansible-local-1u124t61g/tmp2npr1skf TO /viya4-deployment/.ansible/tmp/ansible-tmp-1656532958.0244393-372-229989092429948/AnsiballZ_command.py
<127.0.0.1> EXEC /bin/sh -c 'chmod u+x /viya4-deployment/.ansible/tmp/ansible-tmp-1656532958.0244393-372-229989092429948/ /viya4-deployment/.ansible/tmp/ansible-tmp-1656532958.0244393-372-229989092429948/AnsiballZ_command.py && sleep 0'
<127.0.0.1> EXEC /bin/sh -c 'USER_DIR=/tmp/ansible.d9_dhwup TLS_ENABLE=true KUBECONFIG=/tmp/ansible.d9_dhwup/.kube LOG_COLOR_ENABLE=False NODE_PLACEMENT_ENABLE=False GRAFANA_ADMIN_PASSWORD=k0rdxLRHoDigTeRRwqO8ZdRxjqVHC2NrBd6cEOVa VIYA_NS=sasfirdviya4 MON_NS=monitoring /usr/bin/python3 /viya4-deployment/.ansible/tmp/ansible-tmp-1656532958.0244393-372-229989092429948/AnsiballZ_command.py && sleep 0'
<127.0.0.1> EXEC /bin/sh -c 'rm -f -r /viya4-deployment/.ansible/tmp/ansible-tmp-1656532958.0244393-372-229989092429948/ > /dev/null 2>&1 && sleep 0'

TASK [monitoring : cluster-monitoring - deploy] ********************************
task path: /viya4-deployment/roles/monitoring/tasks/cluster-monitoring.yaml:56
changed: [localhost] => changed=true 
  cmd:
  - /tmp/ansible.d9_dhwup/viya4-monitoring-kubernetes/monitoring/bin/deploy_monitoring_cluster.sh
  delta: '0:01:52.467472'
  end: '2022-06-29 20:04:30.795430'
  invocation:
    module_args:
      _raw_params: /tmp/ansible.d9_dhwup/viya4-monitoring-kubernetes/monitoring/bin/deploy_monitoring_cluster.sh
      _uses_shell: false
      argv: null
      chdir: null
      creates: null
      executable: null
      removes: null
      stdin: null
      stdin_add_newline: true
      strip_empty_ends: true
      warn: true
  msg: ''
  rc: 0
  start: '2022-06-29 20:02:38.327958'
  stderr: |-
    W0629 20:03:06.877409     600 warnings.go:70] policy/v1beta1 PodSecurityPolicy is deprecated in v1.21+, unavailable in v1.25+
    W0629 20:03:07.696630     600 warnings.go:70] policy/v1beta1 PodSecurityPolicy is deprecated in v1.21+, unavailable in v1.25+
    W0629 20:03:14.075036     600 warnings.go:70] policy/v1beta1 PodSecurityPolicy is deprecated in v1.21+, unavailable in v1.25+
    W0629 20:03:14.170941     600 warnings.go:70] policy/v1beta1 PodSecurityPolicy is deprecated in v1.21+, unavailable in v1.25+
    W0629 20:03:14.175483     600 warnings.go:70] policy/v1beta1 PodSecurityPolicy is deprecated in v1.21+, unavailable in v1.25+
    W0629 20:03:14.195273     600 warnings.go:70] policy/v1beta1 PodSecurityPolicy is deprecated in v1.21+, unavailable in v1.25+
    W0629 20:03:14.200198     600 warnings.go:70] policy/v1beta1 PodSecurityPolicy is deprecated in v1.21+, unavailable in v1.25+
    W0629 20:03:14.204712     600 warnings.go:70] policy/v1beta1 PodSecurityPolicy is deprecated in v1.21+, unavailable in v1.25+
    W0629 20:03:14.219802     600 warnings.go:70] policy/v1beta1 PodSecurityPolicy is deprecated in v1.21+, unavailable in v1.25+
    W0629 20:03:14.224977     600 warnings.go:70] policy/v1beta1 PodSecurityPolicy is deprecated in v1.21+, unavailable in v1.25+
    W0629 20:03:14.229387     600 warnings.go:70] policy/v1beta1 PodSecurityPolicy is deprecated in v1.21+, unavailable in v1.25+
    W0629 20:03:14.243384     600 warnings.go:70] policy/v1beta1 PodSecurityPolicy is deprecated in v1.21+, unavailable in v1.25+
    W0629 20:03:14.247854     600 warnings.go:70] policy/v1beta1 PodSecurityPolicy is deprecated in v1.21+, unavailable in v1.25+
    W0629 20:03:14.252144     600 warnings.go:70] policy/v1beta1 PodSecurityPolicy is deprecated in v1.21+, unavailable in v1.25+
    W0629 20:03:14.265963     600 warnings.go:70] policy/v1beta1 PodSecurityPolicy is deprecated in v1.21+, unavailable in v1.25+
    W0629 20:03:14.270915     600 warnings.go:70] policy/v1beta1 PodSecurityPolicy is deprecated in v1.21+, unavailable in v1.25+
    W0629 20:03:14.275456     600 warnings.go:70] policy/v1beta1 PodSecurityPolicy is deprecated in v1.21+, unavailable in v1.25+
    W0629 20:03:14.289116     600 warnings.go:70] policy/v1beta1 PodSecurityPolicy is deprecated in v1.21+, unavailable in v1.25+
    W0629 20:03:14.294055     600 warnings.go:70] policy/v1beta1 PodSecurityPolicy is deprecated in v1.21+, unavailable in v1.25+
    W0629 20:03:14.298388     600 warnings.go:70] policy/v1beta1 PodSecurityPolicy is deprecated in v1.21+, unavailable in v1.25+
    W0629 20:03:14.312322     600 warnings.go:70] policy/v1beta1 PodSecurityPolicy is deprecated in v1.21+, unavailable in v1.25+
    W0629 20:03:15.833210     600 warnings.go:70] networking.k8s.io/v1beta1 Ingress is deprecated in v1.19+, unavailable in v1.22+; use networking.k8s.io/v1 Ingress
    W0629 20:03:15.838656     600 warnings.go:70] networking.k8s.io/v1beta1 Ingress is deprecated in v1.19+, unavailable in v1.22+; use networking.k8s.io/v1 Ingress
    W0629 20:03:16.768583     600 warnings.go:70] networking.k8s.io/v1beta1 Ingress is deprecated in v1.19+, unavailable in v1.22+; use networking.k8s.io/v1 Ingress
    W0629 20:03:16.774077     600 warnings.go:70] networking.k8s.io/v1beta1 Ingress is deprecated in v1.19+, unavailable in v1.22+; use networking.k8s.io/v1 Ingress
    W0629 20:03:16.778692     600 warnings.go:70] networking.k8s.io/v1beta1 Ingress is deprecated in v1.19+, unavailable in v1.22+; use networking.k8s.io/v1 Ingress
    W0629 20:03:17.665002     600 warnings.go:70] networking.k8s.io/v1beta1 Ingress is deprecated in v1.19+, unavailable in v1.22+; use networking.k8s.io/v1 Ingress
    W0629 20:04:00.566680     600 warnings.go:70] policy/v1beta1 PodSecurityPolicy is deprecated in v1.21+, unavailable in v1.25+
    W0629 20:04:01.405096     600 warnings.go:70] policy/v1beta1 PodSecurityPolicy is deprecated in v1.21+, unavailable in v1.25+
    W0629 20:04:08.222180     600 warnings.go:70] policy/v1beta1 PodSecurityPolicy is deprecated in v1.21+, unavailable in v1.25+
  stderr_lines: <omitted>
  stdout: |-
    Helm client version: 3.8.1
    Kubernetes client version: v1.22.10
    Kubernetes server version: v1.21.9

    Deploying monitoring to the [monitoring] namespace...
    Adding [prometheus-community] helm repository
    "prometheus-community" has been added to your repositories
    Updating helm repositories...
    Hang tight while we grab the latest from your chart repositories...
    ...Successfully got an update from the "prometheus-community" chart repository
    Update Complete. ⎈Happy Helming!⎈
    Updating Prometheus Operator custom resource definitions
    customresourcedefinition.apiextensions.k8s.io/alertmanagerconfigs.monitoring.coreos.com configured
    customresourcedefinition.apiextensions.k8s.io/alertmanagers.monitoring.coreos.com configured
    customresourcedefinition.apiextensions.k8s.io/podmonitors.monitoring.coreos.com configured
    customresourcedefinition.apiextensions.k8s.io/prometheuses.monitoring.coreos.com configured
    customresourcedefinition.apiextensions.k8s.io/prometheusrules.monitoring.coreos.com configured
    customresourcedefinition.apiextensions.k8s.io/servicemonitors.monitoring.coreos.com configured
    customresourcedefinition.apiextensions.k8s.io/thanosrulers.monitoring.coreos.com configured
    customresourcedefinition.apiextensions.k8s.io/probes.monitoring.coreos.com configured
    Provisioning TLS-enabled Prometheus datasource for Grafana...
    configmap "grafana-datasource-prom-https" deleted
    configmap/grafana-datasource-prom-https created
    configmap/grafana-datasource-prom-https labeled
    Enabling Prometheus node-exporter for TLS...
    configmap "node-exporter-tls-web-config" deleted
    configmap/node-exporter-tls-web-config created
    configmap/node-exporter-tls-web-config labeled
    User response file: [/tmp/ansible.d9_dhwup/monitoring/user-values-prom-operator.yaml]
    Deploying the Kube Prometheus Stack. This may take a few minutes...
    Upgrading via Helm...(Wed Jun 29 20:02:54 UTC 2022 - timeout 20m)
    Release "v4m-prometheus-operator" has been upgraded. Happy Helming!
    NAME: v4m-prometheus-operator
    LAST DEPLOYED: Wed Jun 29 20:03:01 2022
    NAMESPACE: monitoring
    STATUS: deployed
    REVISION: 9
    TEST SUITE: None
    NOTES:
    kube-prometheus-stack has been installed. Check its status by running:
      kubectl --namespace monitoring get pods -l "release=v4m-prometheus-operator"

    Visit https://github.com/prometheus-operator/kube-prometheus for instructions on how to create & configure Alertmanager and Prometheus instances using the Operator.
    Patching Grafana ServiceMonitor for TLS...
    servicemonitor.monitoring.coreos.com/v4m-grafana patched (no change)
    Deploying cluster ServiceMonitors...
    NAME            STATUS   AGE
    ingress-nginx   Active   120d
    NGINX found. Deploying podMonitor to [ingress-nginx] namespace...
    podmonitor.monitoring.coreos.com/ingress-nginx unchanged
    podmonitor.monitoring.coreos.com/eventrouter unchanged
    servicemonitor.monitoring.coreos.com/elasticsearch unchanged
    servicemonitor.monitoring.coreos.com/fluent-bit unchanged
    servicemonitor.monitoring.coreos.com/fluent-bit-v2 unchanged
    Adding Prometheus recording rules...
    prometheusrule.monitoring.coreos.com/sas-launcher-job-rules unchanged

    Deploying SAS dashboards to the [monitoring] namespace...
    --------------------------------
    Deploying welcome dashboards...
    --------------------------------
    configmap/viya-welcome-dashboard configured
    configmap/viya-welcome-dashboard not labeled
    --------------------------------
    Deploying Kubernetes cluster dashboards...
    --------------------------------
    configmap/k8s-cluster-dashboard configured
    configmap/k8s-cluster-dashboard not labeled
    configmap/k8s-deployment-dashboard configured
    configmap/k8s-deployment-dashboard not labeled
    configmap/perf-k8s-container-util configured
    configmap/perf-k8s-container-util not labeled
    configmap/perf-k8s-headroom configured
    configmap/perf-k8s-headroom not labeled
    configmap/perf-k8s-node-util-detail configured
    configmap/perf-k8s-node-util-detail not labeled
    configmap/perf-k8s-node-util configured
    configmap/perf-k8s-node-util not labeled
    configmap/prometheus-alerts configured
    configmap/prometheus-alerts not labeled
    --------------------------------
    Deploying Logging dashboards...
    --------------------------------
    configmap/elasticsearch-dashboard configured
    configmap/elasticsearch-dashboard not labeled
    configmap/fluent-bit configured
    configmap/fluent-bit not labeled
    --------------------------------
    Deploying SAS Viya dashboards...
    --------------------------------
    configmap/cas-dashboard configured
    configmap/cas-dashboard not labeled
    configmap/go-service-dashboard configured
    configmap/go-service-dashboard not labeled
    configmap/java-service-dashboard configured
    configmap/java-service-dashboard not labeled
    configmap/postgres-dashboard configured
    configmap/postgres-dashboard not labeled
    configmap/sas-launched-jobs-node configured
    configmap/sas-launched-jobs-node not labeled
    configmap/sas-launched-jobs-users configured
    configmap/sas-launched-jobs-users not labeled
    --------------------------------
    Deploying Postgres dashboards...
    --------------------------------
    configmap/pg-details configured
    configmap/pg-details not labeled
    --------------------------------
    Deploying RabbitMQ dashboards...
    --------------------------------
    configmap/erlang-memory-allocators configured
    configmap/erlang-memory-allocators not labeled
    configmap/rabbitmq-overview configured
    configmap/rabbitmq-overview not labeled
    --------------------------------
    Deploying NGINX dashboards...
    --------------------------------
    configmap/nginx-dashboard configured
    configmap/nginx-dashboard not labeled
    --------------------------------
    Deployed dashboards to the [monitoring] namespace
    Updating version info...
    Release "v4m" has been upgraded. Happy Helming!
    NAME: v4m
    LAST DEPLOYED: Wed Jun 29 20:04:30 2022
    NAMESPACE: monitoring
    STATUS: deployed
    REVISION: 8
    TEST SUITE: None
    NOTES:
    Viya Monitoring for Kubernetes 1.0.10 is installed

    ================================================================================
    == Accessing the monitoring applications ==
    == ==
    == ***GRAFANA*** ==
    == You can access Grafana via the following URL: ==
    == https://grafana.sasfirdviya4.cases.unx.sas.com:443/ ==
    == ==
    == Note: These URLs may be incorrect if your ingress and/or other network ==
    == configuration includes options this script does not handle. ==
    ================================================================================

    Successfully deployed components to the [monitoring] namespace
  stdout_lines: <omitted>
redirecting (type: action) community.kubernetes.k8s to community.kubernetes.k8s_info
Wednesday 29 June 2022  20:04:30 +0000 (0:01:52.871)       0:02:02.615 ******** 
redirecting (type: action) community.kubernetes.k8s to community.kubernetes.k8s_info
Wednesday 29 June 2022  20:04:30 +0000 (0:00:00.051)       0:02:02.667 ******** 
redirecting (type: action) community.kubernetes.k8s to community.kubernetes.k8s_info
Wednesday 29 June 2022  20:04:30 +0000 (0:00:00.045)       0:02:02.713 ******** 
Wednesday 29 June 2022  20:04:30 +0000 (0:00:00.043)       0:02:02.756 ******** 
redirecting (type: action) community.kubernetes.k8s to community.kubernetes.k8s_info
redirecting (type: action) community.kubernetes.k8s to community.kubernetes.k8s_info
redirecting (type: action) community.kubernetes.k8s to community.kubernetes.k8s_info
included: /viya4-deployment/roles/monitoring/tasks/cluster-logging.yaml for localhost
Wednesday 29 June 2022  20:04:31 +0000 (0:00:00.057)       0:02:02.814 ******** 
<127.0.0.1> ESTABLISH LOCAL CONNECTION FOR USER: viya4-deployment
<127.0.0.1> EXEC /bin/sh -c '( umask 77 && mkdir -p "` echo $HOME/.ansible/tmp `"&& mkdir "` echo $HOME/.ansible/tmp/ansible-tmp-1656533071.088675-1063-127229629782703 `" && echo ansible-tmp-1656533071.088675-1063-127229629782703="` echo $HOME/.ansible/tmp/ansible-tmp-1656533071.088675-1063-127229629782703 `" ) && sleep 0'
Using module file /usr/local/lib/python3.8/dist-packages/ansible/modules/file.py
<127.0.0.1> PUT /viya4-deployment/.ansible/tmp/ansible-local-1u124t61g/tmpditkh0pq TO /viya4-deployment/.ansible/tmp/ansible-tmp-1656533071.088675-1063-127229629782703/AnsiballZ_file.py
<127.0.0.1> EXEC /bin/sh -c 'chmod u+x /viya4-deployment/.ansible/tmp/ansible-tmp-1656533071.088675-1063-127229629782703/ /viya4-deployment/.ansible/tmp/ansible-tmp-1656533071.088675-1063-127229629782703/AnsiballZ_file.py && sleep 0'
<127.0.0.1> EXEC /bin/sh -c '/usr/bin/python3 /viya4-deployment/.ansible/tmp/ansible-tmp-1656533071.088675-1063-127229629782703/AnsiballZ_file.py && sleep 0'
<127.0.0.1> EXEC /bin/sh -c 'rm -f -r /viya4-deployment/.ansible/tmp/ansible-tmp-1656533071.088675-1063-127229629782703/ > /dev/null 2>&1 && sleep 0'

TASK [monitoring : cluster-logging - create userdir] ***************************
task path: /viya4-deployment/roles/monitoring/tasks/cluster-logging.yaml:2
changed: [localhost] => changed=true 
  diff:
    after:
      mode: '0770'
      path: /tmp/ansible.d9_dhwup/logging/
      state: directory
    before:
      mode: '0755'
      path: /tmp/ansible.d9_dhwup/logging/
      state: absent
  gid: 1000
  group: '1000'
  invocation:
    module_args:
      _diff_peek: null
      _original_basename: null
      access_time: null
      access_time_format: '%Y%m%d%H%M.%S'
      attributes: null
      follow: true
      force: false
      group: null
      mode: '0770'
      modification_time: null
      modification_time_format: '%Y%m%d%H%M.%S'
      owner: null
      path: /tmp/ansible.d9_dhwup/logging/
      recurse: false
      selevel: null
      serole: null
      setype: null
      seuser: null
      src: null
      state: directory
      unsafe_writes: false
  mode: '0770'
  owner: viya4-deployment
  path: /tmp/ansible.d9_dhwup/logging/
  size: 6
  state: directory
  uid: 1000
Wednesday 29 June 2022  20:04:31 +0000 (0:00:00.347)       0:02:03.161 ******** 
<127.0.0.1> ESTABLISH LOCAL CONNECTION FOR USER: viya4-deployment
<127.0.0.1> EXEC /bin/sh -c '( umask 77 && mkdir -p "` echo $HOME/.ansible/tmp `"&& mkdir "` echo $HOME/.ansible/tmp/ansible-tmp-1656533071.436057-1086-40309658532186 `" && echo ansible-tmp-1656533071.436057-1086-40309658532186="` echo $HOME/.ansible/tmp/ansible-tmp-1656533071.436057-1086-40309658532186 `" ) && sleep 0'
Using module file /viya4-deployment/.ansible/collections/ansible_collections/community/kubernetes/plugins/modules/k8s_info.py
<127.0.0.1> PUT /viya4-deployment/.ansible/tmp/ansible-local-1u124t61g/tmpnqxyuwcf TO /viya4-deployment/.ansible/tmp/ansible-tmp-1656533071.436057-1086-40309658532186/AnsiballZ_k8s_info.py
<127.0.0.1> EXEC /bin/sh -c 'chmod u+x /viya4-deployment/.ansible/tmp/ansible-tmp-1656533071.436057-1086-40309658532186/ /viya4-deployment/.ansible/tmp/ansible-tmp-1656533071.436057-1086-40309658532186/AnsiballZ_k8s_info.py && sleep 0'
<127.0.0.1> EXEC /bin/sh -c '/usr/bin/python3 /viya4-deployment/.ansible/tmp/ansible-tmp-1656533071.436057-1086-40309658532186/AnsiballZ_k8s_info.py && sleep 0'
<127.0.0.1> EXEC /bin/sh -c 'rm -f -r /viya4-deployment/.ansible/tmp/ansible-tmp-1656533071.436057-1086-40309658532186/ > /dev/null 2>&1 && sleep 0'

TASK [monitoring : cluster-logging - lookup existing credentials] **************
task path: /viya4-deployment/roles/monitoring/tasks/cluster-logging.yaml:12
ok: [localhost] => changed=false 
  api_found: true
  invocation:
    module_args:
      api_key: null
      api_version: v1
      ca_cert: null
      client_cert: null
      client_key: null
      context: null
      field_selectors: []
      host: null
      kind: Secret
      kubeconfig: /tmp/ansible.d9_dhwup/.kube
      label_selectors:
      - managed-by = v4m-es-script
      name: null
      namespace: logging
      password: null
      persist_config: null
      proxy: null
      username: null
      validate_certs: null
      wait: false
      wait_condition: null
      wait_sleep: 5
      wait_timeout: 120
  resources: []
Wednesday 29 June 2022  20:04:32 +0000 (0:00:00.919)       0:02:04.080 ******** 

TASK [monitoring : Set password facts] *****************************************
task path: /viya4-deployment/roles/monitoring/tasks/cluster-logging.yaml:24
ok: [localhost] => changed=false 
  ansible_facts:
    V4M_KIBANASERVER_PASSWORD: Myeye$0n1y##
    V4M_KIBANA_PASSWORD: Myeye$0n1y##
    V4M_LOGCOLLECTOR_PASSWORD: Myeye$0n1y##
    V4M_METRICGETTER_PASSWORD: Myeye$0n1y##
Wednesday 29 June 2022  20:04:32 +0000 (0:00:00.052)       0:02:04.133 ******** 
Wednesday 29 June 2022  20:04:32 +0000 (0:00:00.043)       0:02:04.176 ******** 

TASK [monitoring : cluster-logging - output credentials] ***********************
task path: /viya4-deployment/roles/monitoring/tasks/cluster-logging.yaml:44
ok: [localhost] => 
  msg:
  - 'OpenSearch admin  - username: admin,                   password: Myeye$0n1y##'
  - 'OpenSearch Dashboards Server - username: kibanaserver, password: Myeye$0n1y##'
  - 'Log Collector - username: logcollector,                password: Myeye$0n1y##'
  - 'Metric Getter - username: metricgetter,                password: Myeye$0n1y##'
Wednesday 29 June 2022  20:04:32 +0000 (0:00:00.050)       0:02:04.227 ******** 
<127.0.0.1> ESTABLISH LOCAL CONNECTION FOR USER: viya4-deployment
<127.0.0.1> EXEC /bin/sh -c '( umask 77 && mkdir -p "` echo $HOME/.ansible/tmp `"&& mkdir "` echo $HOME/.ansible/tmp/ansible-tmp-1656533072.5045311-1116-151096846372402 `" && echo ansible-tmp-1656533072.5045311-1116-151096846372402="` echo $HOME/.ansible/tmp/ansible-tmp-1656533072.5045311-1116-151096846372402 `" ) && sleep 0'
Using module file /usr/local/lib/python3.8/dist-packages/ansible/modules/stat.py
<127.0.0.1> PUT /viya4-deployment/.ansible/tmp/ansible-local-1u124t61g/tmpyhr2811d TO /viya4-deployment/.ansible/tmp/ansible-tmp-1656533072.5045311-1116-151096846372402/AnsiballZ_stat.py
<127.0.0.1> EXEC /bin/sh -c 'chmod u+x /viya4-deployment/.ansible/tmp/ansible-tmp-1656533072.5045311-1116-151096846372402/ /viya4-deployment/.ansible/tmp/ansible-tmp-1656533072.5045311-1116-151096846372402/AnsiballZ_stat.py && sleep 0'
<127.0.0.1> EXEC /bin/sh -c '/usr/bin/python3 /viya4-deployment/.ansible/tmp/ansible-tmp-1656533072.5045311-1116-151096846372402/AnsiballZ_stat.py && sleep 0'
<127.0.0.1> PUT /viya4-deployment/.ansible/tmp/ansible-local-1u124t61g/tmpdt49f6i5/user-values-elasticsearch-opensearch.yaml TO /viya4-deployment/.ansible/tmp/ansible-tmp-1656533072.5045311-1116-151096846372402/source
<127.0.0.1> EXEC /bin/sh -c 'chmod u+x /viya4-deployment/.ansible/tmp/ansible-tmp-1656533072.5045311-1116-151096846372402/ /viya4-deployment/.ansible/tmp/ansible-tmp-1656533072.5045311-1116-151096846372402/source && sleep 0'
Using module file /usr/local/lib/python3.8/dist-packages/ansible/modules/copy.py
<127.0.0.1> PUT /viya4-deployment/.ansible/tmp/ansible-local-1u124t61g/tmp0wzjkid3 TO /viya4-deployment/.ansible/tmp/ansible-tmp-1656533072.5045311-1116-151096846372402/AnsiballZ_copy.py
<127.0.0.1> EXEC /bin/sh -c 'chmod u+x /viya4-deployment/.ansible/tmp/ansible-tmp-1656533072.5045311-1116-151096846372402/ /viya4-deployment/.ansible/tmp/ansible-tmp-1656533072.5045311-1116-151096846372402/AnsiballZ_copy.py && sleep 0'
<127.0.0.1> EXEC /bin/sh -c '/usr/bin/python3 /viya4-deployment/.ansible/tmp/ansible-tmp-1656533072.5045311-1116-151096846372402/AnsiballZ_copy.py && sleep 0'
<127.0.0.1> EXEC /bin/sh -c 'rm -f -r /viya4-deployment/.ansible/tmp/ansible-tmp-1656533072.5045311-1116-151096846372402/ > /dev/null 2>&1 && sleep 0'

TASK [monitoring : cluster-logging - opensearch user values] *******************
task path: /viya4-deployment/roles/monitoring/tasks/cluster-logging.yaml:54
changed: [localhost] => changed=true 
  checksum: dcae13a967390fb04b0f45d7d848d007650ed005
  dest: /tmp/ansible.d9_dhwup/logging/user-values-opensearch.yaml
  diff: []
  gid: 1000
  group: '1000'
  invocation:
    module_args:
      _original_basename: user-values-elasticsearch-opensearch.yaml
      attributes: null
      backup: false
      checksum: dcae13a967390fb04b0f45d7d848d007650ed005
      content: null
      dest: /tmp/ansible.d9_dhwup/logging/user-values-opensearch.yaml
      directory_mode: null
      follow: false
      force: true
      group: null
      local_follow: null
      mode: '0660'
      owner: null
      remote_src: null
      selevel: null
      serole: null
      setype: null
      seuser: null
      src: /viya4-deployment/.ansible/tmp/ansible-tmp-1656533072.5045311-1116-151096846372402/source
      unsafe_writes: false
      validate: null
  md5sum: 59ef9d7e85196eb3edcbbfef4ee53eed
  mode: '0660'
  owner: viya4-deployment
  size: 337
  src: /viya4-deployment/.ansible/tmp/ansible-tmp-1656533072.5045311-1116-151096846372402/source
  state: file
  uid: 1000
Wednesday 29 June 2022  20:04:32 +0000 (0:00:00.481)       0:02:04.709 ******** 
<127.0.0.1> ESTABLISH LOCAL CONNECTION FOR USER: viya4-deployment
<127.0.0.1> EXEC /bin/sh -c '( umask 77 && mkdir -p "` echo $HOME/.ansible/tmp `"&& mkdir "` echo $HOME/.ansible/tmp/ansible-tmp-1656533072.9876575-1152-278911838424313 `" && echo ansible-tmp-1656533072.9876575-1152-278911838424313="` echo $HOME/.ansible/tmp/ansible-tmp-1656533072.9876575-1152-278911838424313 `" ) && sleep 0'
Using module file /usr/local/lib/python3.8/dist-packages/ansible/modules/stat.py
<127.0.0.1> PUT /viya4-deployment/.ansible/tmp/ansible-local-1u124t61g/tmp7p4_zub8 TO /viya4-deployment/.ansible/tmp/ansible-tmp-1656533072.9876575-1152-278911838424313/AnsiballZ_stat.py
<127.0.0.1> EXEC /bin/sh -c 'chmod u+x /viya4-deployment/.ansible/tmp/ansible-tmp-1656533072.9876575-1152-278911838424313/ /viya4-deployment/.ansible/tmp/ansible-tmp-1656533072.9876575-1152-278911838424313/AnsiballZ_stat.py && sleep 0'
<127.0.0.1> EXEC /bin/sh -c '/usr/bin/python3 /viya4-deployment/.ansible/tmp/ansible-tmp-1656533072.9876575-1152-278911838424313/AnsiballZ_stat.py && sleep 0'
<127.0.0.1> PUT /viya4-deployment/.ansible/tmp/ansible-local-1u124t61g/tmpr7swk_d2/user-values-osd-opensearch.yaml TO /viya4-deployment/.ansible/tmp/ansible-tmp-1656533072.9876575-1152-278911838424313/source
<127.0.0.1> EXEC /bin/sh -c 'chmod u+x /viya4-deployment/.ansible/tmp/ansible-tmp-1656533072.9876575-1152-278911838424313/ /viya4-deployment/.ansible/tmp/ansible-tmp-1656533072.9876575-1152-278911838424313/source && sleep 0'
Using module file /usr/local/lib/python3.8/dist-packages/ansible/modules/copy.py
<127.0.0.1> PUT /viya4-deployment/.ansible/tmp/ansible-local-1u124t61g/tmpqp2vq466 TO /viya4-deployment/.ansible/tmp/ansible-tmp-1656533072.9876575-1152-278911838424313/AnsiballZ_copy.py
<127.0.0.1> EXEC /bin/sh -c 'chmod u+x /viya4-deployment/.ansible/tmp/ansible-tmp-1656533072.9876575-1152-278911838424313/ /viya4-deployment/.ansible/tmp/ansible-tmp-1656533072.9876575-1152-278911838424313/AnsiballZ_copy.py && sleep 0'
<127.0.0.1> EXEC /bin/sh -c '/usr/bin/python3 /viya4-deployment/.ansible/tmp/ansible-tmp-1656533072.9876575-1152-278911838424313/AnsiballZ_copy.py && sleep 0'
<127.0.0.1> EXEC /bin/sh -c 'rm -f -r /viya4-deployment/.ansible/tmp/ansible-tmp-1656533072.9876575-1152-278911838424313/ > /dev/null 2>&1 && sleep 0'

TASK [monitoring : cluster-logging - osd user values] **************************
task path: /viya4-deployment/roles/monitoring/tasks/cluster-logging.yaml:64
changed: [localhost] => changed=true 
  checksum: d87755381ce5cc3779c46e4e962bf7192eb4b2fb
  dest: /tmp/ansible.d9_dhwup/logging/user-values-osd.yaml
  diff: []
  gid: 1000
  group: '1000'
  invocation:
    module_args:
      _original_basename: user-values-osd-opensearch.yaml
      attributes: null
      backup: false
      checksum: d87755381ce5cc3779c46e4e962bf7192eb4b2fb
      content: null
      dest: /tmp/ansible.d9_dhwup/logging/user-values-osd.yaml
      directory_mode: null
      follow: false
      force: true
      group: null
      local_follow: null
      mode: '0660'
      owner: null
      remote_src: null
      selevel: null
      serole: null
      setype: null
      seuser: null
      src: /viya4-deployment/.ansible/tmp/ansible-tmp-1656533072.9876575-1152-278911838424313/source
      unsafe_writes: false
      validate: null
  md5sum: 70f41701bc426706633bd567698e2f43
  mode: '0660'
  owner: viya4-deployment
  size: 413
  src: /viya4-deployment/.ansible/tmp/ansible-tmp-1656533072.9876575-1152-278911838424313/source
  state: file
  uid: 1000
Wednesday 29 June 2022  20:04:33 +0000 (0:00:00.466)       0:02:05.175 ******** 
<127.0.0.1> ESTABLISH LOCAL CONNECTION FOR USER: viya4-deployment
<127.0.0.1> EXEC /bin/sh -c '( umask 77 && mkdir -p "` echo $HOME/.ansible/tmp `"&& mkdir "` echo $HOME/.ansible/tmp/ansible-tmp-1656533073.458043-1188-161113107769127 `" && echo ansible-tmp-1656533073.458043-1188-161113107769127="` echo $HOME/.ansible/tmp/ansible-tmp-1656533073.458043-1188-161113107769127 `" ) && sleep 0'
Using module file /usr/local/lib/python3.8/dist-packages/ansible/modules/command.py
<127.0.0.1> PUT /viya4-deployment/.ansible/tmp/ansible-local-1u124t61g/tmpwlt7d_e7 TO /viya4-deployment/.ansible/tmp/ansible-tmp-1656533073.458043-1188-161113107769127/AnsiballZ_command.py
<127.0.0.1> EXEC /bin/sh -c 'chmod u+x /viya4-deployment/.ansible/tmp/ansible-tmp-1656533073.458043-1188-161113107769127/ /viya4-deployment/.ansible/tmp/ansible-tmp-1656533073.458043-1188-161113107769127/AnsiballZ_command.py && sleep 0'
<127.0.0.1> EXEC /bin/sh -c 'USER_DIR=/tmp/ansible.d9_dhwup TLS_ENABLE=true LOG_KB_TLS_ENABLE=true KUBECONFIG=/tmp/ansible.d9_dhwup/.kube LOG_COLOR_ENABLE=False NODE_PLACEMENT_ENABLE=False ES_ADMIN_PASSWD='"'"'Myeye$0n1y##'"'"' ES_KIBANASERVER_PASSWD='"'"'Myeye$0n1y##'"'"' ES_LOGCOLLECTOR_PASSWD='"'"'Myeye$0n1y##'"'"' ES_METRICGETTER_PASSWD='"'"'Myeye$0n1y##'"'"' LOG_NS=logging KB_KNOWN_NODEPORT_ENABLE=False /usr/bin/python3 /viya4-deployment/.ansible/tmp/ansible-tmp-1656533073.458043-1188-161113107769127/AnsiballZ_command.py && sleep 0'
<127.0.0.1> EXEC /bin/sh -c 'rm -f -r /viya4-deployment/.ansible/tmp/ansible-tmp-1656533073.458043-1188-161113107769127/ > /dev/null 2>&1 && sleep 0'

TASK [monitoring : cluster-logging - deploy] ***********************************
task path: /viya4-deployment/roles/monitoring/tasks/cluster-logging.yaml:74
The full traceback is:
  File "/tmp/ansible_ansible.legacy.command_payload_77_cynv7/ansible_ansible.legacy.command_payload.zip/ansible/module_utils/basic.py", line 2727, in run_command
    cmd = subprocess.Popen(args, **kwargs)
  File "/usr/lib/python3.8/subprocess.py", line 858, in __init__
    self._execute_child(args, executable, preexec_fn, close_fds,
  File "/usr/lib/python3.8/subprocess.py", line 1704, in _execute_child
    raise child_exception_type(errno_num, err_msg, err_filename)
fatal: [localhost]: FAILED! => changed=false 
  cmd: /tmp/ansible.d9_dhwup/viya4-monitoring-kubernetes/logging/bin/deploy_logging.sh
  invocation:
    module_args:
      _raw_params: /tmp/ansible.d9_dhwup/viya4-monitoring-kubernetes/logging/bin/deploy_logging.sh
      _uses_shell: false
      argv: null
      chdir: null
      creates: null
      executable: null
      removes: null
      stdin: null
      stdin_add_newline: true
      strip_empty_ends: true
      warn: true
  msg: '[Errno 2] No such file or directory: b''/tmp/ansible.d9_dhwup/viya4-monitoring-kubernetes/logging/bin/deploy_logging.sh'''
  rc: 2

PLAY RECAP *********************************************************************
localhost                  : ok=40   changed=10   unreachable=0    failed=1    skipped=34   rescued=0    ignored=0   

Wednesday 29 June 2022  20:04:33 +0000 (0:00:00.258)       0:02:05.434 ******** 
=============================================================================== 
monitoring : cluster-monitoring - deploy ------------------------------ 112.87s
/viya4-deployment/roles/monitoring/tasks/cluster-monitoring.yaml:56 -----------
monitoring : v4m - download --------------------------------------------- 1.97s
/viya4-deployment/roles/monitoring/tasks/main.yaml:2 --------------------------
Gathering Facts --------------------------------------------------------- 1.11s
/viya4-deployment/playbooks/playbook.yaml:1 -----------------------------------
monitoring : v4m - add storageclass ------------------------------------- 1.03s
/viya4-deployment/roles/monitoring/tasks/main.yaml:12 -------------------------
monitoring : cluster-monitoring - lookup existing credentials ----------- 0.93s
/viya4-deployment/roles/monitoring/tasks/cluster-monitoring.yaml:12 -----------
monitoring : cluster-logging - lookup existing credentials -------------- 0.92s
/viya4-deployment/roles/monitoring/tasks/cluster-logging.yaml:12 --------------
common : tfstate - export kubeconfig ------------------------------------ 0.72s
/viya4-deployment/roles/common/tasks/main.yaml:46 -----------------------------
monitoring : cluster-logging - opensearch user values ------------------- 0.48s
/viya4-deployment/roles/monitoring/tasks/cluster-logging.yaml:54 --------------
monitoring : cluster-logging - osd user values -------------------------- 0.47s
/viya4-deployment/roles/monitoring/tasks/cluster-logging.yaml:64 --------------
monitoring : cluster-monitoring - user values --------------------------- 0.44s
/viya4-deployment/roles/monitoring/tasks/cluster-monitoring.yaml:46 -----------
global tmp dir ---------------------------------------------------------- 0.37s
/viya4-deployment/playbooks/playbook.yaml:3 -----------------------------------
monitoring : cluster-monitoring - create userdir ------------------------ 0.35s
/viya4-deployment/roles/monitoring/tasks/cluster-monitoring.yaml:2 ------------
monitoring : cluster-logging - create userdir --------------------------- 0.35s
/viya4-deployment/roles/monitoring/tasks/cluster-logging.yaml:2 ---------------
monitoring : cluster-logging - deploy ----------------------------------- 0.26s
/viya4-deployment/roles/monitoring/tasks/cluster-logging.yaml:74 --------------
common : Load config file ----------------------------------------------- 0.08s
/viya4-deployment/roles/common/tasks/main.yaml:1 ------------------------------
common : Add nat ip to LOADBALANCER_SOURCE_RANGES ----------------------- 0.07s
/viya4-deployment/roles/common/tasks/main.yaml:27 -----------------------------
monitoring role - cluster ----------------------------------------------- 0.07s
/viya4-deployment/playbooks/playbook.yaml:35 ----------------------------------
common : Parse tfstate -------------------------------------------------- 0.06s
/viya4-deployment/roles/common/tasks/main.yaml:19 -----------------------------
monitoring : v4m - cluster logging -------------------------------------- 0.06s
/viya4-deployment/roles/monitoring/tasks/main.yaml:31 -------------------------
common : set_fact ------------------------------------------------------- 0.06s
/viya4-deployment/roles/common/tasks/migrations.yaml:41 -----------------------
JimBrousseau commented 2 years ago

Thomas,

This was the issue: V4M_VERSION: 1.0.10

I had a specific version pinned in my ansible vars file, which caused an out-of-date version of the monitoring stack to be pulled. Once I commented that line out, it worked. You can close this issue.
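
For reference, a minimal sketch of the offending line in my ansible-vars.yaml (only the V4M_VERSION name and value come from my setup; the comments are just annotation):

```yaml
# ansible-vars.yaml (excerpt)
# The pin that caused the failure -- this release predates
# logging/bin/deploy_logging.sh, which DAC tries to run, hence the
# "No such file or directory" error in the log above:
# V4M_VERSION: 1.0.10
#
# With the line commented out, DAC falls back to its default V4M
# version, which does include the deploy script.
```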

dhoucgitter commented 2 years ago

Hi Jim, glad that was it. What method were you using to try the "master" vs. "stable" branch of V4M? I typically use V4M_VERSION: master|stable in ansible-vars.yaml when running DAC with the Docker container, so I'm mainly just curious. Given what you saw, the error makes sense, since that older version of V4M does not have the deploy script that DAC:5.0.0 is trying to invoke.
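
For anyone else who hits this, a hedged sketch of the setting I mean (the values come straight from this thread; treat it as illustrative rather than canonical):

```yaml
# ansible-vars.yaml (excerpt)
# Point V4M_VERSION at a moving branch rather than a pinned release so
# DAC checks out a viya4-monitoring-kubernetes tree that contains
# logging/bin/deploy_logging.sh:
V4M_VERSION: stable   # or "master" for the latest development branch
```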

JimBrousseau commented 2 years ago

When I left the version out, it worked, so whatever the default is.

dhoucgitter commented 2 years ago

Closing per the OP, since the issue has been resolved.