Tronde closed this issue 5 months ago.
I downgraded rhel-system-roles.noarch to version 1.22.0-2.el9 and ran the playbook again. This time the podman secrets were created.
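For reference, the playbook behind this run boils down to a single invocation of the podman role. The sketch below is reconstructed from the verbose output that follows: the secret names, the 8080/tcp firewall rule, and the quadlet file names all appear in the log, while the vault variable names, the downgrade command, and everything else are my assumptions, not a copy of the real playbook.

```yaml
# Minimal sketch of deploy_mytinytodo.yml, reconstructed from the log below.
# Assumed downgrade command run beforehand:
#   sudo dnf downgrade rhel-system-roles-1.22.0-2.el9
- name: myTinyTodo Demo Playbook
  hosts: localhost
  tasks:
    - name: Deploy myTinyTodo and MariaDB with Podman Quadlet
      ansible.builtin.include_role:
        name: rhel-system-roles.podman
      vars:
        # Port visible in the "Configure firewall" task below
        podman_firewall:
          - port: 8080/tcp
            state: enabled
        # Secret names taken from the "Handle secrets" task; the vault
        # variable names are placeholders (the real data is vault-encrypted)
        podman_secrets:
          - name: mysql-root-password-container
            state: present
            skip_existing: true
            data: "{{ vault_mysql_root_password }}"
          - name: mysql-user-password-container
            state: present
            skip_existing: true
            data: "{{ vault_mysql_user_password }}"
        # Quadlet files as listed in "Handle Quadlet specifications"
        podman_quadlet_specs:
          - file_src: ../quadlet/mytinytodo-demo.network
            activate_systemd_unit: true
          - file_src: ../quadlet/mariadb.volume
            activate_systemd_unit: false
          - file_src: ../quadlet/mytinytodo.volume
            activate_systemd_unit: false
          - file_src: ../quadlet/mariadb.container
            activate_systemd_unit: true
          - file_src: ../quadlet/mytinytodo.container
            activate_systemd_unit: true
```

In the output below, both "Manage each secret" tasks now report changed, i.e. the secrets were actually created with this role version.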
$ ansible-playbook -i inventory -K --ask-vault-password deploy_mytinytodo.yml -vv
ansible-playbook [core 2.14.14]
config file = /etc/ansible/ansible.cfg
configured module search path = ['/home/tronde/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
ansible python module location = /usr/lib/python3.9/site-packages/ansible
ansible collection location = /home/tronde/.ansible/collections:/usr/share/ansible/collections
executable location = /usr/bin/ansible-playbook
python version = 3.9.18 (main, Jan 24 2024, 00:00:00) [GCC 11.4.1 20231218 (Red Hat 11.4.1-3)] (/usr/bin/python3)
jinja version = 3.1.2
libyaml = True
Using /etc/ansible/ansible.cfg as config file
BECOME password:
Vault password:
Skipping callback 'default', as we already have a stdout callback.
Skipping callback 'minimal', as we already have a stdout callback.
Skipping callback 'oneline', as we already have a stdout callback.
PLAYBOOK: deploy_mytinytodo.yml *****************************************************************************
1 plays in deploy_mytinytodo.yml
PLAY [myTinyTodo Demo Playbook] *****************************************************************************
TASK [Gathering Facts] **************************************************************************************
task path: /home/tronde/podman-demo/ansible/deploy_mytinytodo.yml:2
ok: [localhost]
TASK [Deploy myTinyTodo and MariaDB with Podman Quadlet] ****************************************************
task path: /home/tronde/podman-demo/ansible/deploy_mytinytodo.yml:8
TASK [rhel-system-roles.podman : Set platform/version specific variables] ***********************************
task path: /usr/share/ansible/roles/rhel-system-roles.podman/tasks/main.yml:3
included: /usr/share/ansible/roles/rhel-system-roles.podman/tasks/set_vars.yml for localhost
TASK [rhel-system-roles.podman : Ensure ansible_facts used by role] *****************************************
task path: /usr/share/ansible/roles/rhel-system-roles.podman/tasks/set_vars.yml:3
skipping: [localhost] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [rhel-system-roles.podman : Set platform/version specific variables] ***********************************
task path: /usr/share/ansible/roles/rhel-system-roles.podman/tasks/set_vars.yml:9
ok: [localhost] => (item=RedHat.yml) => {"ansible_facts": {"__podman_packages": ["podman", "shadow-utils-subid"]}, "ansible_included_var_files": ["/usr/share/ansible/roles/rhel-system-roles.podman/vars/RedHat.yml"], "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml"}
ok: [localhost] => (item=RedHat.yml) => {"ansible_facts": {"__podman_packages": ["podman", "shadow-utils-subid"]}, "ansible_included_var_files": ["/usr/share/ansible/roles/rhel-system-roles.podman/vars/RedHat.yml"], "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml"}
skipping: [localhost] => (item=RedHat_9.yml) => {"ansible_loop_var": "item", "changed": false, "item": "RedHat_9.yml", "skip_reason": "Conditional result was False"}
skipping: [localhost] => (item=RedHat_9.4.yml) => {"ansible_loop_var": "item", "changed": false, "item": "RedHat_9.4.yml", "skip_reason": "Conditional result was False"}
TASK [rhel-system-roles.podman : Gather the package facts] **************************************************
task path: /usr/share/ansible/roles/rhel-system-roles.podman/tasks/main.yml:6
ok: [localhost] => {"censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false}
TASK [rhel-system-roles.podman : Enable copr if requested] **************************************************
task path: /usr/share/ansible/roles/rhel-system-roles.podman/tasks/main.yml:10
skipping: [localhost] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [rhel-system-roles.podman : Ensure required packages are installed] ************************************
task path: /usr/share/ansible/roles/rhel-system-roles.podman/tasks/main.yml:14
skipping: [localhost] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [rhel-system-roles.podman : Get podman version] ********************************************************
task path: /usr/share/ansible/roles/rhel-system-roles.podman/tasks/main.yml:21
ok: [localhost] => {"changed": false, "cmd": ["podman", "--version"], "delta": "0:00:00.038452", "end": "2024-05-10 16:57:19.198717", "msg": "", "rc": 0, "start": "2024-05-10 16:57:19.160265", "stderr": "", "stderr_lines": [], "stdout": "podman version 4.9.4-rhel", "stdout_lines": ["podman version 4.9.4-rhel"]}
TASK [rhel-system-roles.podman : Set podman version] ********************************************************
task path: /usr/share/ansible/roles/rhel-system-roles.podman/tasks/main.yml:26
ok: [localhost] => {"ansible_facts": {"podman_version": "4.9.4-rhel"}, "changed": false}
TASK [rhel-system-roles.podman : Podman package version must be 4.2 or later] *******************************
task path: /usr/share/ansible/roles/rhel-system-roles.podman/tasks/main.yml:30
skipping: [localhost] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [rhel-system-roles.podman : Podman package version must be 4.4 or later for quadlet, secrets] **********
task path: /usr/share/ansible/roles/rhel-system-roles.podman/tasks/main.yml:37
skipping: [localhost] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [rhel-system-roles.podman : Podman package version must be 4.4 or later for quadlet, secrets] **********
task path: /usr/share/ansible/roles/rhel-system-roles.podman/tasks/main.yml:47
META: end_host conditional evaluated to False, continuing execution for localhost
skipping: [localhost] => {"msg": "end_host conditional evaluated to false, continuing execution for localhost", "skip_reason": "end_host conditional evaluated to False, continuing execution for localhost"}
TASK [rhel-system-roles.podman : Check user and group information] ******************************************
task path: /usr/share/ansible/roles/rhel-system-roles.podman/tasks/main.yml:54
included: /usr/share/ansible/roles/rhel-system-roles.podman/tasks/handle_user_group.yml for localhost
TASK [rhel-system-roles.podman : Get user information] ******************************************************
task path: /usr/share/ansible/roles/rhel-system-roles.podman/tasks/handle_user_group.yml:2
ok: [localhost] => {"ansible_facts": {"getent_passwd": {"tronde": ["x", "1000", "1000", "lokaler Benutzeraccount mit sudo-Berechtigung", "/home/tronde", "/bin/bash"]}}, "changed": false}
TASK [rhel-system-roles.podman : Fail if user does not exist] ***********************************************
task path: /usr/share/ansible/roles/rhel-system-roles.podman/tasks/handle_user_group.yml:10
skipping: [localhost] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [rhel-system-roles.podman : Set group for podman user] *************************************************
task path: /usr/share/ansible/roles/rhel-system-roles.podman/tasks/handle_user_group.yml:17
ok: [localhost] => {"ansible_facts": {"__podman_group": "tronde"}, "changed": false}
TASK [rhel-system-roles.podman : Get group information] *****************************************************
task path: /usr/share/ansible/roles/rhel-system-roles.podman/tasks/handle_user_group.yml:28
ok: [localhost] => {"ansible_facts": {"getent_group": {"tronde": ["x", "1000", ""]}}, "changed": false}
TASK [rhel-system-roles.podman : Set group name] ************************************************************
task path: /usr/share/ansible/roles/rhel-system-roles.podman/tasks/handle_user_group.yml:36
ok: [localhost] => {"ansible_facts": {"__podman_group_name": "tronde"}, "changed": false}
TASK [rhel-system-roles.podman : See if getsubids exists] ***************************************************
task path: /usr/share/ansible/roles/rhel-system-roles.podman/tasks/handle_user_group.yml:41
ok: [localhost] => {"changed": false, "stat": {"atime": 1715349821.3887033, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 32, "charset": "binary", "checksum": "076f89863b474504550ab1b5c11216a77ed7123c", "ctime": 1700817235.2266545, "dev": 64768, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 16846320, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "application/x-pie-executable", "mode": "0755", "mtime": 1689166932.0, "nlink": 1, "path": "/usr/bin/getsubids", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 15496, "uid": 0, "version": "1087703794", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true}}
TASK [rhel-system-roles.podman : Check user with getsubids] *************************************************
task path: /usr/share/ansible/roles/rhel-system-roles.podman/tasks/handle_user_group.yml:52
ok: [localhost] => {"changed": false, "cmd": ["getsubids", "tronde"], "delta": "0:00:00.003549", "end": "2024-05-10 16:57:21.646833", "msg": "", "rc": 0, "start": "2024-05-10 16:57:21.643284", "stderr": "", "stderr_lines": [], "stdout": "0: tronde 100000 65536", "stdout_lines": ["0: tronde 100000 65536"]}
TASK [rhel-system-roles.podman : Check group with getsubids] ************************************************
task path: /usr/share/ansible/roles/rhel-system-roles.podman/tasks/handle_user_group.yml:56
ok: [localhost] => {"changed": false, "cmd": ["getsubids", "-g", "tronde"], "delta": "0:00:00.006447", "end": "2024-05-10 16:57:21.964445", "msg": "", "rc": 0, "start": "2024-05-10 16:57:21.957998", "stderr": "", "stderr_lines": [], "stdout": "0: tronde 100000 65536", "stdout_lines": ["0: tronde 100000 65536"]}
TASK [rhel-system-roles.podman : Check if user is in subuid file] *******************************************
task path: /usr/share/ansible/roles/rhel-system-roles.podman/tasks/handle_user_group.yml:66
skipping: [localhost] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [rhel-system-roles.podman : Fail if user not in subuid file] *******************************************
task path: /usr/share/ansible/roles/rhel-system-roles.podman/tasks/handle_user_group.yml:74
skipping: [localhost] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [rhel-system-roles.podman : Check if group is in subgid file] ******************************************
task path: /usr/share/ansible/roles/rhel-system-roles.podman/tasks/handle_user_group.yml:81
skipping: [localhost] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [rhel-system-roles.podman : Fail if group not in subgid file] ******************************************
task path: /usr/share/ansible/roles/rhel-system-roles.podman/tasks/handle_user_group.yml:89
skipping: [localhost] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [rhel-system-roles.podman : Set config file paths] *****************************************************
task path: /usr/share/ansible/roles/rhel-system-roles.podman/tasks/main.yml:60
ok: [localhost] => {"ansible_facts": {"__podman_container_conf_file": "/home/tronde/.config/containers/containers.conf.d/50-systemroles.conf", "__podman_policy_json_file": "/home/tronde/.config/containers/policy.json", "__podman_registries_conf_file": "/home/tronde/.config/containers/registries.conf.d/50-systemroles.conf", "__podman_storage_conf_file": "/home/tronde/.config/containers/storage.conf"}, "changed": false}
TASK [rhel-system-roles.podman : Handle container.conf.d] ***************************************************
task path: /usr/share/ansible/roles/rhel-system-roles.podman/tasks/main.yml:78
included: /usr/share/ansible/roles/rhel-system-roles.podman/tasks/handle_container_conf_d.yml for localhost
TASK [rhel-system-roles.podman : Ensure containers.d exists] ************************************************
task path: /usr/share/ansible/roles/rhel-system-roles.podman/tasks/handle_container_conf_d.yml:5
skipping: [localhost] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [rhel-system-roles.podman : Update container config file] **********************************************
task path: /usr/share/ansible/roles/rhel-system-roles.podman/tasks/handle_container_conf_d.yml:14
skipping: [localhost] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [rhel-system-roles.podman : Handle registries.conf.d] **************************************************
task path: /usr/share/ansible/roles/rhel-system-roles.podman/tasks/main.yml:81
included: /usr/share/ansible/roles/rhel-system-roles.podman/tasks/handle_registries_conf_d.yml for localhost
TASK [rhel-system-roles.podman : Ensure registries.d exists] ************************************************
task path: /usr/share/ansible/roles/rhel-system-roles.podman/tasks/handle_registries_conf_d.yml:5
skipping: [localhost] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [rhel-system-roles.podman : Update registries config file] *********************************************
task path: /usr/share/ansible/roles/rhel-system-roles.podman/tasks/handle_registries_conf_d.yml:14
skipping: [localhost] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [rhel-system-roles.podman : Handle storage.conf] *******************************************************
task path: /usr/share/ansible/roles/rhel-system-roles.podman/tasks/main.yml:84
included: /usr/share/ansible/roles/rhel-system-roles.podman/tasks/handle_storage_conf.yml for localhost
TASK [rhel-system-roles.podman : Ensure storage.conf parent dir exists] *************************************
task path: /usr/share/ansible/roles/rhel-system-roles.podman/tasks/handle_storage_conf.yml:5
skipping: [localhost] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [rhel-system-roles.podman : Update storage config file] ************************************************
task path: /usr/share/ansible/roles/rhel-system-roles.podman/tasks/handle_storage_conf.yml:14
skipping: [localhost] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [rhel-system-roles.podman : Handle policy.json] ********************************************************
task path: /usr/share/ansible/roles/rhel-system-roles.podman/tasks/main.yml:87
included: /usr/share/ansible/roles/rhel-system-roles.podman/tasks/handle_policy_json.yml for localhost
TASK [rhel-system-roles.podman : Ensure policy.json parent dir exists] **************************************
task path: /usr/share/ansible/roles/rhel-system-roles.podman/tasks/handle_policy_json.yml:6
skipping: [localhost] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [rhel-system-roles.podman : Stat the policy.json file] *************************************************
task path: /usr/share/ansible/roles/rhel-system-roles.podman/tasks/handle_policy_json.yml:15
skipping: [localhost] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [rhel-system-roles.podman : Get the existing policy.json] **********************************************
task path: /usr/share/ansible/roles/rhel-system-roles.podman/tasks/handle_policy_json.yml:20
skipping: [localhost] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [rhel-system-roles.podman : Write new policy.json file] ************************************************
task path: /usr/share/ansible/roles/rhel-system-roles.podman/tasks/handle_policy_json.yml:26
skipping: [localhost] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Manage firewall for specified ports] ******************************************************************
task path: /usr/share/ansible/roles/rhel-system-roles.podman/tasks/main.yml:93
TASK [redhat.rhel_system_roles.firewall : Setup firewalld] **************************************************
task path: /usr/share/ansible/collections/ansible_collections/redhat/rhel_system_roles/roles/firewall/tasks/main.yml:2
included: /usr/share/ansible/collections/ansible_collections/redhat/rhel_system_roles/roles/firewall/tasks/firewalld.yml for localhost
TASK [redhat.rhel_system_roles.firewall : Ensure ansible_facts used by role] ********************************
task path: /usr/share/ansible/collections/ansible_collections/redhat/rhel_system_roles/roles/firewall/tasks/firewalld.yml:2
skipping: [localhost] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [redhat.rhel_system_roles.firewall : Install firewalld] ************************************************
task path: /usr/share/ansible/collections/ansible_collections/redhat/rhel_system_roles/roles/firewall/tasks/firewalld.yml:7
ok: [localhost] => {"changed": false, "msg": "Nothing to do", "rc": 0, "results": []}
TASK [redhat.rhel_system_roles.firewall : Collect service facts] ********************************************
task path: /usr/share/ansible/collections/ansible_collections/redhat/rhel_system_roles/roles/firewall/tasks/main.yml:5
skipping: [localhost] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [redhat.rhel_system_roles.firewall : Attempt to stop and disable conflicting services] *****************
task path: /usr/share/ansible/collections/ansible_collections/redhat/rhel_system_roles/roles/firewall/tasks/main.yml:9
skipping: [localhost] => (item=nftables) => {"ansible_loop_var": "item", "changed": false, "item": "nftables", "skip_reason": "Conditional result was False"}
skipping: [localhost] => (item=iptables) => {"ansible_loop_var": "item", "changed": false, "item": "iptables", "skip_reason": "Conditional result was False"}
skipping: [localhost] => (item=ufw) => {"ansible_loop_var": "item", "changed": false, "item": "ufw", "skip_reason": "Conditional result was False"}
skipping: [localhost] => {"changed": false, "msg": "All items skipped"}
TASK [redhat.rhel_system_roles.firewall : Unmask firewalld service] *****************************************
task path: /usr/share/ansible/collections/ansible_collections/redhat/rhel_system_roles/roles/firewall/tasks/main.yml:22
ok: [localhost] => {"changed": false, "name": "firewalld", "status": {"AccessSELinuxContext": "system_u:object_r:firewalld_unit_file_t:s0", "ActiveEnterTimestamp": "Fri 2024-05-10 16:00:04 CEST", "ActiveEnterTimestampMonotonic": "4977140", "ActiveExitTimestampMonotonic": "0", "ActiveState": "active", "After": "sysinit.target basic.target system.slice dbus.socket polkit.service dbus-broker.service", "AllowIsolate": "no", "AssertResult": "yes", "AssertTimestamp": "Fri 2024-05-10 16:00:04 CEST", "AssertTimestampMonotonic": "4744078", "Before": "network-pre.target shutdown.target multi-user.target", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "BusName": "org.fedoraproject.FirewallD1", "CPUAccounting": "yes", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "752072000", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "yes", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf cap_checkpoint_restore", "CleanResult": "success", "CollectMode": "inactive", "ConditionResult": "yes", "ConditionTimestamp": "Fri 2024-05-10 16:00:04 CEST", "ConditionTimestampMonotonic": "4744073", "ConfigurationDirectoryMode": "0755", "Conflicts": "shutdown.target ip6tables.service ebtables.service ipset.service nftables.service iptables.service", "ControlGroup": "/system.slice/firewalld.service", "ControlGroupId": "3573", "ControlPID": "0", "CoredumpFilter": "0x33", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "firewalld - dynamic firewall daemon", "DevicePolicy": "auto", "Documentation": "\"man:firewalld(1)\"", "DynamicUser": "no", "EnvironmentFiles": "/etc/sysconfig/firewalld (ignore_errors=yes)", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "790", "ExecMainStartTimestamp": "Fri 2024-05-10 16:00:04 CEST", "ExecMainStartTimestampMonotonic": "4745391", "ExecMainStatus": "0", "ExecReload": "{ path=/bin/kill ; argv[]=/bin/kill -HUP $MAINPID ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/bin/kill ; argv[]=/bin/kill -HUP $MAINPID ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStart": "{ path=/usr/sbin/firewalld ; argv[]=/usr/sbin/firewalld --nofork --nopid $FIREWALLD_ARGS ; ignore_errors=no ; start_time=[Fri 2024-05-10 16:00:04 CEST] ; stop_time=[n/a] ; pid=790 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/firewalld ; argv[]=/usr/sbin/firewalld --nofork --nopid $FIREWALLD_ARGS ; flags= ; start_time=[Fri 2024-05-10 16:00:04 CEST] ; stop_time=[n/a] ; pid=790 ; code=(null) ; status=0/0 }", "ExitType": "main", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FinalKillSignal": 
"9", "FragmentPath": "/usr/lib/systemd/system/firewalld.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "IOWeight": "[not set]", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "IPAccounting": "no", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "Id": "firewalld.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestamp": "Fri 2024-05-10 16:00:04 CEST", "InactiveExitTimestampMonotonic": "4745572", "InvocationID": "195843547baa4c3c9ff1e7aa5f40bb8e", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "mixed", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "524288", "LimitNOFILESoft": "1024", "LimitNPROC": "30502", "LimitNPROCSoft": "30502", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "30502", "LimitSIGPENDINGSoft": "30502", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadState": "loaded", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "790", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "ManagedOOMSwap": "auto", "MemoryAccounting": "yes", "MemoryAvailable": "infinity", "MemoryCurrent": "48066560", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAPolicy": "n/a", "Names": "firewalld.service dbus-org.fedoraproject.FirewallD1.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMPolicy": "stop", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "OnSuccessJobMode": "fail", "Perpetual": "no", "PrivateDevices": "no", "PrivateIPC": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProcSubset": "all", "ProtectClock": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectHostname": "no", "ProtectKernelLogs": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectProc": "default", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "ReloadResult": "success", "ReloadSignal": "1", "RemainAfterExit": "no", "RemoveIPC": "no", "Requires": "dbus.socket system.slice sysinit.target", "Restart": "no", "RestartKillSignal": "15", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", 
"RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system.slice", "StandardError": "null", "StandardInput": "null", "StandardOutput": "null", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Fri 2024-05-10 16:00:04 CEST", "StateChangeTimestampMonotonic": "4977140", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "running", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "2147483646", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "2", "TasksMax": "48803", "TimeoutAbortUSec": "1min 30s", "TimeoutCleanUSec": "infinity", "TimeoutStartFailureMode": "terminate", "TimeoutStartUSec": "1min 30s", "TimeoutStopFailureMode": "terminate", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "Type": "dbus", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "enabled", "UnitFileState": "enabled", "UtmpMode": "init", "WantedBy": "multi-user.target", "Wants": "network-pre.target", "WatchdogSignal": "6", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0"}}
TASK [redhat.rhel_system_roles.firewall : Enable and start firewalld service] *******************************
task path: /usr/share/ansible/collections/ansible_collections/redhat/rhel_system_roles/roles/firewall/tasks/main.yml:28
ok: [localhost] => {"changed": false, "enabled": true, "name": "firewalld", "state": "started", "status": {"AccessSELinuxContext": "system_u:object_r:firewalld_unit_file_t:s0", "ActiveEnterTimestamp": "Fri 2024-05-10 16:00:04 CEST", "ActiveEnterTimestampMonotonic": "4977140", "ActiveExitTimestampMonotonic": "0", "ActiveState": "active", "After": "sysinit.target basic.target system.slice dbus.socket polkit.service dbus-broker.service", "AllowIsolate": "no", "AssertResult": "yes", "AssertTimestamp": "Fri 2024-05-10 16:00:04 CEST", "AssertTimestampMonotonic": "4744078", "Before": "network-pre.target shutdown.target multi-user.target", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "BusName": "org.fedoraproject.FirewallD1", "CPUAccounting": "yes", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "752072000", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "yes", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf cap_checkpoint_restore", "CleanResult": "success", "CollectMode": "inactive", "ConditionResult": "yes", "ConditionTimestamp": "Fri 2024-05-10 16:00:04 CEST", "ConditionTimestampMonotonic": "4744073", "ConfigurationDirectoryMode": "0755", "Conflicts": "shutdown.target ip6tables.service ebtables.service ipset.service nftables.service iptables.service", "ControlGroup": "/system.slice/firewalld.service", "ControlGroupId": "3573", "ControlPID": "0", "CoredumpFilter": "0x33", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "firewalld - dynamic firewall daemon", "DevicePolicy": "auto", "Documentation": "\"man:firewalld(1)\"", "DynamicUser": "no", "EnvironmentFiles": "/etc/sysconfig/firewalld (ignore_errors=yes)", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "790", "ExecMainStartTimestamp": "Fri 2024-05-10 16:00:04 CEST", "ExecMainStartTimestampMonotonic": "4745391", "ExecMainStatus": "0", "ExecReload": "{ path=/bin/kill ; argv[]=/bin/kill -HUP $MAINPID ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/bin/kill ; argv[]=/bin/kill -HUP $MAINPID ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStart": "{ path=/usr/sbin/firewalld ; argv[]=/usr/sbin/firewalld --nofork --nopid $FIREWALLD_ARGS ; ignore_errors=no ; start_time=[Fri 2024-05-10 16:00:04 CEST] ; stop_time=[n/a] ; pid=790 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/firewalld ; argv[]=/usr/sbin/firewalld --nofork --nopid $FIREWALLD_ARGS ; flags= ; start_time=[Fri 2024-05-10 16:00:04 CEST] ; stop_time=[n/a] ; pid=790 ; code=(null) ; status=0/0 }", "ExitType": "main", "FailureAction": "none", 
"FileDescriptorStoreMax": "0", "FinalKillSignal": "9", "FragmentPath": "/usr/lib/systemd/system/firewalld.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "IOWeight": "[not set]", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "IPAccounting": "no", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "Id": "firewalld.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestamp": "Fri 2024-05-10 16:00:04 CEST", "InactiveExitTimestampMonotonic": "4745572", "InvocationID": "195843547baa4c3c9ff1e7aa5f40bb8e", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "mixed", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "524288", "LimitNOFILESoft": "1024", "LimitNPROC": "30502", "LimitNPROCSoft": "30502", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "30502", "LimitSIGPENDINGSoft": "30502", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadState": "loaded", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "790", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "ManagedOOMSwap": "auto", "MemoryAccounting": "yes", "MemoryAvailable": "infinity", "MemoryCurrent": "48066560", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAPolicy": "n/a", "Names": "firewalld.service dbus-org.fedoraproject.FirewallD1.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMPolicy": "stop", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "OnSuccessJobMode": "fail", "Perpetual": "no", "PrivateDevices": "no", "PrivateIPC": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProcSubset": "all", "ProtectClock": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectHostname": "no", "ProtectKernelLogs": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectProc": "default", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "ReloadResult": "success", "ReloadSignal": "1", "RemainAfterExit": "no", "RemoveIPC": "no", "Requires": "dbus.socket system.slice sysinit.target", "Restart": "no", "RestartKillSignal": "15", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", 
"RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system.slice", "StandardError": "null", "StandardInput": "null", "StandardOutput": "null", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Fri 2024-05-10 16:00:04 CEST", "StateChangeTimestampMonotonic": "4977140", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "running", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "2147483646", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "2", "TasksMax": "48803", "TimeoutAbortUSec": "1min 30s", "TimeoutCleanUSec": "infinity", "TimeoutStartFailureMode": "terminate", "TimeoutStartUSec": "1min 30s", "TimeoutStopFailureMode": "terminate", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "Type": "dbus", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "enabled", "UnitFileState": "enabled", "UtmpMode": "init", "WantedBy": "multi-user.target", "Wants": "network-pre.target", "WatchdogSignal": "6", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0"}}
TASK [redhat.rhel_system_roles.firewall : Check if previous replaced is defined] ****************************
task path: /usr/share/ansible/collections/ansible_collections/redhat/rhel_system_roles/roles/firewall/tasks/main.yml:34
ok: [localhost] => {"ansible_facts": {"__firewall_previous_replaced": false, "__firewall_python_cmd": "/usr/bin/python3", "__firewall_report_changed": true}, "changed": false}
TASK [redhat.rhel_system_roles.firewall : Get config files, checksums before and remove] ********************
task path: /usr/share/ansible/collections/ansible_collections/redhat/rhel_system_roles/roles/firewall/tasks/main.yml:43
skipping: [localhost] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [redhat.rhel_system_roles.firewall : Tell firewall module it is able to report changed] ****************
task path: /usr/share/ansible/collections/ansible_collections/redhat/rhel_system_roles/roles/firewall/tasks/main.yml:55
skipping: [localhost] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [redhat.rhel_system_roles.firewall : Configure firewall] ***********************************************
task path: /usr/share/ansible/collections/ansible_collections/redhat/rhel_system_roles/roles/firewall/tasks/main.yml:71
ok: [localhost] => (item={'port': '8080/tcp', 'state': 'enabled'}) => {"__firewall_changed": false, "ansible_loop_var": "item", "changed": false, "item": {"port": "8080/tcp", "state": "enabled"}}
TASK [redhat.rhel_system_roles.firewall : Gather firewall config information] *******************************
task path: /usr/share/ansible/collections/ansible_collections/redhat/rhel_system_roles/roles/firewall/tasks/main.yml:120
skipping: [localhost] => (item={'port': '8080/tcp', 'state': 'enabled'}) => {"ansible_loop_var": "item", "changed": false, "item": {"port": "8080/tcp", "state": "enabled"}, "skip_reason": "Conditional result was False"}
skipping: [localhost] => {"changed": false, "msg": "All items skipped"}
TASK [redhat.rhel_system_roles.firewall : Update firewalld_config fact] *************************************
task path: /usr/share/ansible/collections/ansible_collections/redhat/rhel_system_roles/roles/firewall/tasks/main.yml:130
skipping: [localhost] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [redhat.rhel_system_roles.firewall : Gather firewall config if no arguments] ***************************
task path: /usr/share/ansible/collections/ansible_collections/redhat/rhel_system_roles/roles/firewall/tasks/main.yml:139
skipping: [localhost] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [redhat.rhel_system_roles.firewall : Update firewalld_config fact] *************************************
task path: /usr/share/ansible/collections/ansible_collections/redhat/rhel_system_roles/roles/firewall/tasks/main.yml:144
skipping: [localhost] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [redhat.rhel_system_roles.firewall : Get config files, checksums after] ********************************
task path: /usr/share/ansible/collections/ansible_collections/redhat/rhel_system_roles/roles/firewall/tasks/main.yml:153
skipping: [localhost] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [redhat.rhel_system_roles.firewall : Calculate what has changed] ***************************************
task path: /usr/share/ansible/collections/ansible_collections/redhat/rhel_system_roles/roles/firewall/tasks/main.yml:163
skipping: [localhost] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [redhat.rhel_system_roles.firewall : Show diffs] *******************************************************
task path: /usr/share/ansible/collections/ansible_collections/redhat/rhel_system_roles/roles/firewall/tasks/main.yml:169
skipping: [localhost] => {}
TASK [Manage selinux for specified ports] *******************************************************************
task path: /usr/share/ansible/roles/rhel-system-roles.podman/tasks/main.yml:100
skipping: [localhost] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [rhel-system-roles.podman : Handle secrets] ************************************************************
task path: /usr/share/ansible/roles/rhel-system-roles.podman/tasks/main.yml:107
included: /usr/share/ansible/roles/rhel-system-roles.podman/tasks/handle_secret.yml for localhost => (item={'name': 'mysql-root-password-container', 'state': 'present', 'skip_existing': True, 'data': 'password123'})
included: /usr/share/ansible/roles/rhel-system-roles.podman/tasks/handle_secret.yml for localhost => (item={'name': 'mysql-user-password-container', 'state': 'present', 'skip_existing': True, 'data': 'redhat'})
TASK [rhel-system-roles.podman : Set variables part 0] ******************************************************
task path: /usr/share/ansible/roles/rhel-system-roles.podman/tasks/handle_secret.yml:3
ok: [localhost] => {"ansible_facts": {"__podman_secret": {"data": "password123", "name": "mysql-root-password-container", "skip_existing": true, "state": "present"}}, "changed": false}
TASK [rhel-system-roles.podman : Set variables part 1] ******************************************************
task path: /usr/share/ansible/roles/rhel-system-roles.podman/tasks/handle_secret.yml:11
ok: [localhost] => {"ansible_facts": {"__podman_user": "tronde"}, "changed": false}
TASK [rhel-system-roles.podman : Set variables part 2] ******************************************************
task path: /usr/share/ansible/roles/rhel-system-roles.podman/tasks/handle_secret.yml:16
ok: [localhost] => {"ansible_facts": {"__podman_rootless": true, "__podman_xdg_runtime_dir": "/run/user/1000"}, "changed": false}
TASK [rhel-system-roles.podman : Manage each secret] ********************************************************
task path: /usr/share/ansible/roles/rhel-system-roles.podman/tasks/handle_secret.yml:22
changed: [localhost] => {"censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": true}
TASK [rhel-system-roles.podman : Set variables part 0] ******************************************************
task path: /usr/share/ansible/roles/rhel-system-roles.podman/tasks/handle_secret.yml:3
ok: [localhost] => {"ansible_facts": {"__podman_secret": {"data": "password", "name": "mysql-user-password-container", "skip_existing": true, "state": "present"}}, "changed": false}
TASK [rhel-system-roles.podman : Set variables part 1] ******************************************************
task path: /usr/share/ansible/roles/rhel-system-roles.podman/tasks/handle_secret.yml:11
ok: [localhost] => {"ansible_facts": {"__podman_user": "tronde"}, "changed": false}
TASK [rhel-system-roles.podman : Set variables part 2] ******************************************************
task path: /usr/share/ansible/roles/rhel-system-roles.podman/tasks/handle_secret.yml:16
ok: [localhost] => {"ansible_facts": {"__podman_rootless": true, "__podman_xdg_runtime_dir": "/run/user/1000"}, "changed": false}
TASK [rhel-system-roles.podman : Manage each secret] ********************************************************
task path: /usr/share/ansible/roles/rhel-system-roles.podman/tasks/handle_secret.yml:22
changed: [localhost] => {"censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": true}
TASK [rhel-system-roles.podman : Handle Kubernetes specifications] ******************************************
task path: /usr/share/ansible/roles/rhel-system-roles.podman/tasks/main.yml:113
skipping: [localhost] => {"changed": false, "skipped_reason": "No items in the list"}
TASK [rhel-system-roles.podman : Handle Quadlet specifications] *********************************************
task path: /usr/share/ansible/roles/rhel-system-roles.podman/tasks/main.yml:119
included: /usr/share/ansible/roles/rhel-system-roles.podman/tasks/handle_quadlet_spec.yml for localhost => (item={'file_src': '../quadlet/mytinytodo-demo.network', 'activate_systemd_unit': True})
included: /usr/share/ansible/roles/rhel-system-roles.podman/tasks/handle_quadlet_spec.yml for localhost => (item={'file_src': '../quadlet/mariadb.volume', 'activate_systemd_unit': False})
included: /usr/share/ansible/roles/rhel-system-roles.podman/tasks/handle_quadlet_spec.yml for localhost => (item={'file_src': '../quadlet/mytinytodo.volume', 'activate_systemd_unit': False})
included: /usr/share/ansible/roles/rhel-system-roles.podman/tasks/handle_quadlet_spec.yml for localhost => (item={'file_src': '../quadlet/mariadb.container', 'activate_systemd_unit': True})
included: /usr/share/ansible/roles/rhel-system-roles.podman/tasks/handle_quadlet_spec.yml for localhost => (item={'file_src': '../quadlet/mytinytodo.container', 'activate_systemd_unit': True})
TASK [rhel-system-roles.podman : Set per-container variables part 0] ****************************************
task path: /usr/share/ansible/roles/rhel-system-roles.podman/tasks/handle_quadlet_spec.yml:14
ok: [localhost] => {"ansible_facts": {"__podman_quadlet_file_src": "../quadlet/mytinytodo-demo.network", "__podman_quadlet_spec": {}, "__podman_quadlet_str": "[Network]", "__podman_quadlet_template_src": ""}, "changed": false}
TASK [rhel-system-roles.podman : Set per-container variables part 1] ****************************************
task path: /usr/share/ansible/roles/rhel-system-roles.podman/tasks/handle_quadlet_spec.yml:40
ok: [localhost] => {"ansible_facts": {"__podman_continue_if_pull_fails": true, "__podman_pull_image": true, "__podman_state": "created", "__podman_systemd_unit_scope": "", "__podman_user": "tronde"}, "changed": false}
TASK [rhel-system-roles.podman : Fail if no quadlet spec is given] ******************************************
task path: /usr/share/ansible/roles/rhel-system-roles.podman/tasks/handle_quadlet_spec.yml:57
skipping: [localhost] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [rhel-system-roles.podman : Set per-container variables part 2] ****************************************
task path: /usr/share/ansible/roles/rhel-system-roles.podman/tasks/handle_quadlet_spec.yml:70
ok: [localhost] => {"ansible_facts": {"__podman_quadlet_name": "mytinytodo-demo", "__podman_quadlet_type": "network", "__podman_rootless": true}, "changed": false}
TASK [rhel-system-roles.podman : Check user and group information] ******************************************
task path: /usr/share/ansible/roles/rhel-system-roles.podman/tasks/handle_quadlet_spec.yml:97
included: /usr/share/ansible/roles/rhel-system-roles.podman/tasks/handle_user_group.yml for localhost
TASK [rhel-system-roles.podman : Get user information] ******************************************************
task path: /usr/share/ansible/roles/rhel-system-roles.podman/tasks/handle_user_group.yml:2
skipping: [localhost] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [rhel-system-roles.podman : Fail if user does not exist] ***********************************************
task path: /usr/share/ansible/roles/rhel-system-roles.podman/tasks/handle_user_group.yml:10
skipping: [localhost] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [rhel-system-roles.podman : Set group for podman user] *************************************************
task path: /usr/share/ansible/roles/rhel-system-roles.podman/tasks/handle_user_group.yml:17
ok: [localhost] => {"ansible_facts": {"__podman_group": "tronde"}, "changed": false}
TASK [rhel-system-roles.podman : Get group information] *****************************************************
task path: /usr/share/ansible/roles/rhel-system-roles.podman/tasks/handle_user_group.yml:28
skipping: [localhost] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [rhel-system-roles.podman : Set group name] ************************************************************
task path: /usr/share/ansible/roles/rhel-system-roles.podman/tasks/handle_user_group.yml:36
ok: [localhost] => {"ansible_facts": {"__podman_group_name": "tronde"}, "changed": false}
TASK [rhel-system-roles.podman : See if getsubids exists] ***************************************************
task path: /usr/share/ansible/roles/rhel-system-roles.podman/tasks/handle_user_group.yml:41
ok: [localhost] => {"changed": false, "stat": {"atime": 1715349821.3887033, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 32, "charset": "binary", "checksum": "076f89863b474504550ab1b5c11216a77ed7123c", "ctime": 1700817235.2266545, "dev": 64768, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 16846320, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "application/x-pie-executable", "mode": "0755", "mtime": 1689166932.0, "nlink": 1, "path": "/usr/bin/getsubids", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 15496, "uid": 0, "version": "1087703794", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true}}
TASK [rhel-system-roles.podman : Check user with getsubids] *************************************************
task path: /usr/share/ansible/roles/rhel-system-roles.podman/tasks/handle_user_group.yml:52
ok: [localhost] => {"changed": false, "cmd": ["getsubids", "tronde"], "delta": "0:00:00.002888", "end": "2024-05-10 16:57:34.078900", "msg": "", "rc": 0, "start": "2024-05-10 16:57:34.076012", "stderr": "", "stderr_lines": [], "stdout": "0: tronde 100000 65536", "stdout_lines": ["0: tronde 100000 65536"]}
TASK [rhel-system-roles.podman : Check group with getsubids] ************************************************
task path: /usr/share/ansible/roles/rhel-system-roles.podman/tasks/handle_user_group.yml:56
ok: [localhost] => {"changed": false, "cmd": ["getsubids", "-g", "tronde"], "delta": "0:00:00.002964", "end": "2024-05-10 16:57:34.320204", "msg": "", "rc": 0, "start": "2024-05-10 16:57:34.317240", "stderr": "", "stderr_lines": [], "stdout": "0: tronde 100000 65536", "stdout_lines": ["0: tronde 100000 65536"]}
TASK [rhel-system-roles.podman : Check if user is in subuid file] *******************************************
task path: /usr/share/ansible/roles/rhel-system-roles.podman/tasks/handle_user_group.yml:66
skipping: [localhost] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [rhel-system-roles.podman : Fail if user not in subuid file] *******************************************
task path: /usr/share/ansible/roles/rhel-system-roles.podman/tasks/handle_user_group.yml:74
skipping: [localhost] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [rhel-system-roles.podman : Check if group is in subgid file] ******************************************
task path: /usr/share/ansible/roles/rhel-system-roles.podman/tasks/handle_user_group.yml:81
skipping: [localhost] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [rhel-system-roles.podman : Fail if group not in subgid file] ******************************************
task path: /usr/share/ansible/roles/rhel-system-roles.podman/tasks/handle_user_group.yml:89
skipping: [localhost] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [rhel-system-roles.podman : Set per-container variables part 3] ****************************************
task path: /usr/share/ansible/roles/rhel-system-roles.podman/tasks/handle_quadlet_spec.yml:102
ok: [localhost] => {"ansible_facts": {"__podman_activate_systemd_unit": true, "__podman_images_found": [], "__podman_kube_yamls_raw": "", "__podman_service_name": "", "__podman_systemd_scope": "user", "__podman_user_home_dir": "/home/tronde", "__podman_xdg_runtime_dir": "/run/user/1000"}, "changed": false}
TASK [rhel-system-roles.podman : Set per-container variables part 4] ****************************************
task path: /usr/share/ansible/roles/rhel-system-roles.podman/tasks/handle_quadlet_spec.yml:134
ok: [localhost] => {"ansible_facts": {"__podman_quadlet_path": "/home/tronde/.config/containers/systemd"}, "changed": false}
TASK [rhel-system-roles.podman : Get kube yaml contents] ****************************************************
task path: /usr/share/ansible/roles/rhel-system-roles.podman/tasks/handle_quadlet_spec.yml:140
skipping: [localhost] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [rhel-system-roles.podman : Set per-container variables part 5] ****************************************
task path: /usr/share/ansible/roles/rhel-system-roles.podman/tasks/handle_quadlet_spec.yml:151
ok: [localhost] => {"ansible_facts": {"__podman_images": [], "__podman_quadlet_file": "/home/tronde/.config/containers/systemd/mytinytodo-demo.network", "__podman_volumes": []}, "changed": false}
TASK [rhel-system-roles.podman : Cleanup quadlets] **********************************************************
task path: /usr/share/ansible/roles/rhel-system-roles.podman/tasks/handle_quadlet_spec.yml:205
skipping: [localhost] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [rhel-system-roles.podman : Create and update quadlets] ************************************************
task path: /usr/share/ansible/roles/rhel-system-roles.podman/tasks/handle_quadlet_spec.yml:209
included: /usr/share/ansible/roles/rhel-system-roles.podman/tasks/create_update_quadlet_spec.yml for localhost
TASK [rhel-system-roles.podman : Enable lingering if needed] ************************************************
task path: /usr/share/ansible/roles/rhel-system-roles.podman/tasks/create_update_quadlet_spec.yml:2
ok: [localhost] => {"changed": false, "cmd": ["loginctl", "enable-linger", "tronde"], "delta": null, "end": null, "msg": "Did not run command since '/var/lib/systemd/linger/tronde' exists", "rc": 0, "start": null, "stderr": "", "stderr_lines": [], "stdout": "skipped, since /var/lib/systemd/linger/tronde exists", "stdout_lines": ["skipped, since /var/lib/systemd/linger/tronde exists"]}
TASK [rhel-system-roles.podman : Create host directories] ***************************************************
task path: /usr/share/ansible/roles/rhel-system-roles.podman/tasks/create_update_quadlet_spec.yml:8
skipping: [localhost] => {"changed": false, "skipped_reason": "No items in the list"}
TASK [rhel-system-roles.podman : Ensure container images are present] ***************************************
task path: /usr/share/ansible/roles/rhel-system-roles.podman/tasks/create_update_quadlet_spec.yml:26
skipping: [localhost] => {"changed": false, "skipped_reason": "No items in the list"}
TASK [rhel-system-roles.podman : Ensure the quadlet directory is present] ***********************************
task path: /usr/share/ansible/roles/rhel-system-roles.podman/tasks/create_update_quadlet_spec.yml:43
ok: [localhost] => {"changed": false, "gid": 1000, "group": "tronde", "mode": "0755", "owner": "tronde", "path": "/home/tronde/.config/containers/systemd", "secontext": "unconfined_u:object_r:config_home_t:s0", "size": 137, "state": "directory", "uid": 1000}
TASK [rhel-system-roles.podman : Ensure quadlet file is copied] *********************************************
task path: /usr/share/ansible/roles/rhel-system-roles.podman/tasks/create_update_quadlet_spec.yml:52
ok: [localhost] => {"changed": false, "checksum": "10362d776d62f80968c4ecefd3851ae3342366d1", "dest": "/home/tronde/.config/containers/systemd/mytinytodo-demo.network", "gid": 1000, "group": "tronde", "mode": "0644", "owner": "tronde", "path": "/home/tronde/.config/containers/systemd/mytinytodo-demo.network", "secontext": "unconfined_u:object_r:config_home_t:s0", "size": 10, "state": "file", "uid": 1000}
TASK [rhel-system-roles.podman : Ensure quadlet file content is present] ************************************
task path: /usr/share/ansible/roles/rhel-system-roles.podman/tasks/create_update_quadlet_spec.yml:62
skipping: [localhost] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [rhel-system-roles.podman : Ensure quadlet file is present] ********************************************
task path: /usr/share/ansible/roles/rhel-system-roles.podman/tasks/create_update_quadlet_spec.yml:74
skipping: [localhost] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [rhel-system-roles.podman : Reload systemctl] **********************************************************
task path: /usr/share/ansible/roles/rhel-system-roles.podman/tasks/create_update_quadlet_spec.yml:86
skipping: [localhost] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [rhel-system-roles.podman : Start service] *************************************************************
task path: /usr/share/ansible/roles/rhel-system-roles.podman/tasks/create_update_quadlet_spec.yml:115
skipping: [localhost] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [rhel-system-roles.podman : Restart service] ***********************************************************
task path: /usr/share/ansible/roles/rhel-system-roles.podman/tasks/create_update_quadlet_spec.yml:131
skipping: [localhost] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [rhel-system-roles.podman : Set per-container variables part 0] ****************************************
task path: /usr/share/ansible/roles/rhel-system-roles.podman/tasks/handle_quadlet_spec.yml:14
ok: [localhost] => {"ansible_facts": {"__podman_quadlet_file_src": "../quadlet/mariadb.volume", "__podman_quadlet_spec": {}, "__podman_quadlet_str": "[Volume]\nDevice=../buildah/mytinytodo/mytinytodo", "__podman_quadlet_template_src": ""}, "changed": false}
TASK [rhel-system-roles.podman : Set per-container variables part 1] ****************************************
task path: /usr/share/ansible/roles/rhel-system-roles.podman/tasks/handle_quadlet_spec.yml:40
ok: [localhost] => {"ansible_facts": {"__podman_continue_if_pull_fails": true, "__podman_pull_image": true, "__podman_state": "created", "__podman_systemd_unit_scope": "", "__podman_user": "tronde"}, "changed": false}
TASK [rhel-system-roles.podman : Fail if no quadlet spec is given] ******************************************
task path: /usr/share/ansible/roles/rhel-system-roles.podman/tasks/handle_quadlet_spec.yml:57
skipping: [localhost] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [rhel-system-roles.podman : Set per-container variables part 2] ****************************************
task path: /usr/share/ansible/roles/rhel-system-roles.podman/tasks/handle_quadlet_spec.yml:70
ok: [localhost] => {"ansible_facts": {"__podman_quadlet_name": "mariadb", "__podman_quadlet_type": "volume", "__podman_rootless": true}, "changed": false}
TASK [rhel-system-roles.podman : Check user and group information] ******************************************
task path: /usr/share/ansible/roles/rhel-system-roles.podman/tasks/handle_quadlet_spec.yml:97
included: /usr/share/ansible/roles/rhel-system-roles.podman/tasks/handle_user_group.yml for localhost
TASK [rhel-system-roles.podman : Get user information] ******************************************************
task path: /usr/share/ansible/roles/rhel-system-roles.podman/tasks/handle_user_group.yml:2
skipping: [localhost] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [rhel-system-roles.podman : Fail if user does not exist] ***********************************************
task path: /usr/share/ansible/roles/rhel-system-roles.podman/tasks/handle_user_group.yml:10
skipping: [localhost] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [rhel-system-roles.podman : Set group for podman user] *************************************************
task path: /usr/share/ansible/roles/rhel-system-roles.podman/tasks/handle_user_group.yml:17
ok: [localhost] => {"ansible_facts": {"__podman_group": "tronde"}, "changed": false}
TASK [rhel-system-roles.podman : Get group information] *****************************************************
task path: /usr/share/ansible/roles/rhel-system-roles.podman/tasks/handle_user_group.yml:28
skipping: [localhost] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [rhel-system-roles.podman : Set group name] ************************************************************
task path: /usr/share/ansible/roles/rhel-system-roles.podman/tasks/handle_user_group.yml:36
ok: [localhost] => {"ansible_facts": {"__podman_group_name": "tronde"}, "changed": false}
TASK [rhel-system-roles.podman : See if getsubids exists] ***************************************************
task path: /usr/share/ansible/roles/rhel-system-roles.podman/tasks/handle_user_group.yml:41
ok: [localhost] => {"changed": false, "stat": {"atime": 1715349821.3887033, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 32, "charset": "binary", "checksum": "076f89863b474504550ab1b5c11216a77ed7123c", "ctime": 1700817235.2266545, "dev": 64768, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 16846320, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "application/x-pie-executable", "mode": "0755", "mtime": 1689166932.0, "nlink": 1, "path": "/usr/bin/getsubids", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 15496, "uid": 0, "version": "1087703794", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true}}
TASK [rhel-system-roles.podman : Check user with getsubids] *************************************************
task path: /usr/share/ansible/roles/rhel-system-roles.podman/tasks/handle_user_group.yml:52
ok: [localhost] => {"changed": false, "cmd": ["getsubids", "tronde"], "delta": "0:00:00.003552", "end": "2024-05-10 16:57:38.880634", "msg": "", "rc": 0, "start": "2024-05-10 16:57:38.877082", "stderr": "", "stderr_lines": [], "stdout": "0: tronde 100000 65536", "stdout_lines": ["0: tronde 100000 65536"]}
TASK [rhel-system-roles.podman : Check group with getsubids] ************************************************
task path: /usr/share/ansible/roles/rhel-system-roles.podman/tasks/handle_user_group.yml:56
ok: [localhost] => {"changed": false, "cmd": ["getsubids", "-g", "tronde"], "delta": "0:00:00.003093", "end": "2024-05-10 16:57:39.156785", "msg": "", "rc": 0, "start": "2024-05-10 16:57:39.153692", "stderr": "", "stderr_lines": [], "stdout": "0: tronde 100000 65536", "stdout_lines": ["0: tronde 100000 65536"]}
TASK [rhel-system-roles.podman : Check if user is in subuid file] *******************************************
task path: /usr/share/ansible/roles/rhel-system-roles.podman/tasks/handle_user_group.yml:66
skipping: [localhost] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [rhel-system-roles.podman : Fail if user not in subuid file] *******************************************
task path: /usr/share/ansible/roles/rhel-system-roles.podman/tasks/handle_user_group.yml:74
skipping: [localhost] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [rhel-system-roles.podman : Check if group is in subgid file] ******************************************
task path: /usr/share/ansible/roles/rhel-system-roles.podman/tasks/handle_user_group.yml:81
skipping: [localhost] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [rhel-system-roles.podman : Fail if group not in subgid file] ******************************************
task path: /usr/share/ansible/roles/rhel-system-roles.podman/tasks/handle_user_group.yml:89
skipping: [localhost] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [rhel-system-roles.podman : Set per-container variables part 3] ****************************************
task path: /usr/share/ansible/roles/rhel-system-roles.podman/tasks/handle_quadlet_spec.yml:102
ok: [localhost] => {"ansible_facts": {"__podman_activate_systemd_unit": false, "__podman_images_found": [], "__podman_kube_yamls_raw": "", "__podman_service_name": "mariadb.volume", "__podman_systemd_scope": "user", "__podman_user_home_dir": "/home/tronde", "__podman_xdg_runtime_dir": "/run/user/1000"}, "changed": false}
TASK [rhel-system-roles.podman : Set per-container variables part 4] ****************************************
task path: /usr/share/ansible/roles/rhel-system-roles.podman/tasks/handle_quadlet_spec.yml:134
ok: [localhost] => {"ansible_facts": {"__podman_quadlet_path": "/home/tronde/.config/containers/systemd"}, "changed": false}
TASK [rhel-system-roles.podman : Get kube yaml contents] ****************************************************
task path: /usr/share/ansible/roles/rhel-system-roles.podman/tasks/handle_quadlet_spec.yml:140
skipping: [localhost] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [rhel-system-roles.podman : Set per-container variables part 5] ****************************************
task path: /usr/share/ansible/roles/rhel-system-roles.podman/tasks/handle_quadlet_spec.yml:151
ok: [localhost] => {"ansible_facts": {"__podman_images": [], "__podman_quadlet_file": "/home/tronde/.config/containers/systemd/mariadb.volume", "__podman_volumes": []}, "changed": false}
TASK [rhel-system-roles.podman : Cleanup quadlets] **********************************************************
task path: /usr/share/ansible/roles/rhel-system-roles.podman/tasks/handle_quadlet_spec.yml:205
skipping: [localhost] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [rhel-system-roles.podman : Create and update quadlets] ************************************************
task path: /usr/share/ansible/roles/rhel-system-roles.podman/tasks/handle_quadlet_spec.yml:209
included: /usr/share/ansible/roles/rhel-system-roles.podman/tasks/create_update_quadlet_spec.yml for localhost
TASK [rhel-system-roles.podman : Enable lingering if needed] ************************************************
task path: /usr/share/ansible/roles/rhel-system-roles.podman/tasks/create_update_quadlet_spec.yml:2
ok: [localhost] => {"changed": false, "cmd": ["loginctl", "enable-linger", "tronde"], "delta": null, "end": null, "msg": "Did not run command since '/var/lib/systemd/linger/tronde' exists", "rc": 0, "start": null, "stderr": "", "stderr_lines": [], "stdout": "skipped, since /var/lib/systemd/linger/tronde exists", "stdout_lines": ["skipped, since /var/lib/systemd/linger/tronde exists"]}
TASK [rhel-system-roles.podman : Create host directories] ***************************************************
task path: /usr/share/ansible/roles/rhel-system-roles.podman/tasks/create_update_quadlet_spec.yml:8
skipping: [localhost] => {"changed": false, "skipped_reason": "No items in the list"}
TASK [rhel-system-roles.podman : Ensure container images are present] ***************************************
task path: /usr/share/ansible/roles/rhel-system-roles.podman/tasks/create_update_quadlet_spec.yml:26
skipping: [localhost] => {"changed": false, "skipped_reason": "No items in the list"}
TASK [rhel-system-roles.podman : Ensure the quadlet directory is present] ***********************************
task path: /usr/share/ansible/roles/rhel-system-roles.podman/tasks/create_update_quadlet_spec.yml:43
ok: [localhost] => {"changed": false, "gid": 1000, "group": "tronde", "mode": "0755", "owner": "tronde", "path": "/home/tronde/.config/containers/systemd", "secontext": "unconfined_u:object_r:config_home_t:s0", "size": 137, "state": "directory", "uid": 1000}
TASK [rhel-system-roles.podman : Ensure quadlet file is copied] *********************************************
task path: /usr/share/ansible/roles/rhel-system-roles.podman/tasks/create_update_quadlet_spec.yml:52
ok: [localhost] => {"changed": false, "checksum": "13d017cee420faaeefd04d6085cc2c670a6f24fa", "dest": "/home/tronde/.config/containers/systemd/mariadb.volume", "gid": 1000, "group": "tronde", "mode": "0644", "owner": "tronde", "path": "/home/tronde/.config/containers/systemd/mariadb.volume", "secontext": "unconfined_u:object_r:config_home_t:s0", "size": 49, "state": "file", "uid": 1000}
TASK [rhel-system-roles.podman : Ensure quadlet file content is present] ************************************
task path: /usr/share/ansible/roles/rhel-system-roles.podman/tasks/create_update_quadlet_spec.yml:62
skipping: [localhost] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [rhel-system-roles.podman : Ensure quadlet file is present] ********************************************
task path: /usr/share/ansible/roles/rhel-system-roles.podman/tasks/create_update_quadlet_spec.yml:74
skipping: [localhost] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [rhel-system-roles.podman : Reload systemctl] **********************************************************
task path: /usr/share/ansible/roles/rhel-system-roles.podman/tasks/create_update_quadlet_spec.yml:86
skipping: [localhost] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [rhel-system-roles.podman : Start service] *************************************************************
task path: /usr/share/ansible/roles/rhel-system-roles.podman/tasks/create_update_quadlet_spec.yml:115
skipping: [localhost] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [rhel-system-roles.podman : Restart service] ***********************************************************
task path: /usr/share/ansible/roles/rhel-system-roles.podman/tasks/create_update_quadlet_spec.yml:131
skipping: [localhost] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [rhel-system-roles.podman : Set per-container variables part 0] ****************************************
task path: /usr/share/ansible/roles/rhel-system-roles.podman/tasks/handle_quadlet_spec.yml:14
ok: [localhost] => {"ansible_facts": {"__podman_quadlet_file_src": "../quadlet/mytinytodo.volume", "__podman_quadlet_spec": {}, "__podman_quadlet_str": "[Volume]", "__podman_quadlet_template_src": ""}, "changed": false}
TASK [rhel-system-roles.podman : Set per-container variables part 1] ****************************************
task path: /usr/share/ansible/roles/rhel-system-roles.podman/tasks/handle_quadlet_spec.yml:40
ok: [localhost] => {"ansible_facts": {"__podman_continue_if_pull_fails": true, "__podman_pull_image": true, "__podman_state": "created", "__podman_systemd_unit_scope": "", "__podman_user": "tronde"}, "changed": false}
TASK [rhel-system-roles.podman : Fail if no quadlet spec is given] ******************************************
task path: /usr/share/ansible/roles/rhel-system-roles.podman/tasks/handle_quadlet_spec.yml:57
skipping: [localhost] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [rhel-system-roles.podman : Set per-container variables part 2] ****************************************
task path: /usr/share/ansible/roles/rhel-system-roles.podman/tasks/handle_quadlet_spec.yml:70
ok: [localhost] => {"ansible_facts": {"__podman_quadlet_name": "mytinytodo", "__podman_quadlet_type": "volume", "__podman_rootless": true}, "changed": false}
TASK [rhel-system-roles.podman : Check user and group information] ******************************************
task path: /usr/share/ansible/roles/rhel-system-roles.podman/tasks/handle_quadlet_spec.yml:97
included: /usr/share/ansible/roles/rhel-system-roles.podman/tasks/handle_user_group.yml for localhost
TASK [rhel-system-roles.podman : Get user information] ******************************************************
task path: /usr/share/ansible/roles/rhel-system-roles.podman/tasks/handle_user_group.yml:2
skipping: [localhost] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [rhel-system-roles.podman : Fail if user does not exist] ***********************************************
task path: /usr/share/ansible/roles/rhel-system-roles.podman/tasks/handle_user_group.yml:10
skipping: [localhost] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [rhel-system-roles.podman : Set group for podman user] *************************************************
task path: /usr/share/ansible/roles/rhel-system-roles.podman/tasks/handle_user_group.yml:17
ok: [localhost] => {"ansible_facts": {"__podman_group": "tronde"}, "changed": false}
TASK [rhel-system-roles.podman : Get group information] *****************************************************
task path: /usr/share/ansible/roles/rhel-system-roles.podman/tasks/handle_user_group.yml:28
skipping: [localhost] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [rhel-system-roles.podman : Set group name] ************************************************************
task path: /usr/share/ansible/roles/rhel-system-roles.podman/tasks/handle_user_group.yml:36
ok: [localhost] => {"ansible_facts": {"__podman_group_name": "tronde"}, "changed": false}
TASK [rhel-system-roles.podman : See if getsubids exists] ***************************************************
task path: /usr/share/ansible/roles/rhel-system-roles.podman/tasks/handle_user_group.yml:41
ok: [localhost] => {"changed": false, "stat": {"atime": 1715349821.3887033, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 32, "charset": "binary", "checksum": "076f89863b474504550ab1b5c11216a77ed7123c", "ctime": 1700817235.2266545, "dev": 64768, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 16846320, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "application/x-pie-executable", "mode": "0755", "mtime": 1689166932.0, "nlink": 1, "path": "/usr/bin/getsubids", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 15496, "uid": 0, "version": "1087703794", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true}}
TASK [rhel-system-roles.podman : Check user with getsubids] *************************************************
task path: /usr/share/ansible/roles/rhel-system-roles.podman/tasks/handle_user_group.yml:52
ok: [localhost] => {"changed": false, "cmd": ["getsubids", "tronde"], "delta": "0:00:00.003437", "end": "2024-05-10 16:57:43.576814", "msg": "", "rc": 0, "start": "2024-05-10 16:57:43.573377", "stderr": "", "stderr_lines": [], "stdout": "0: tronde 100000 65536", "stdout_lines": ["0: tronde 100000 65536"]}
TASK [rhel-system-roles.podman : Check group with getsubids] ************************************************
task path: /usr/share/ansible/roles/rhel-system-roles.podman/tasks/handle_user_group.yml:56
ok: [localhost] => {"changed": false, "cmd": ["getsubids", "-g", "tronde"], "delta": "0:00:00.004198", "end": "2024-05-10 16:57:43.905713", "msg": "", "rc": 0, "start": "2024-05-10 16:57:43.901515", "stderr": "", "stderr_lines": [], "stdout": "0: tronde 100000 65536", "stdout_lines": ["0: tronde 100000 65536"]}
TASK [rhel-system-roles.podman : Check if user is in subuid file] *******************************************
task path: /usr/share/ansible/roles/rhel-system-roles.podman/tasks/handle_user_group.yml:66
skipping: [localhost] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [rhel-system-roles.podman : Fail if user not in subuid file] *******************************************
task path: /usr/share/ansible/roles/rhel-system-roles.podman/tasks/handle_user_group.yml:74
skipping: [localhost] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [rhel-system-roles.podman : Check if group is in subgid file] ******************************************
task path: /usr/share/ansible/roles/rhel-system-roles.podman/tasks/handle_user_group.yml:81
skipping: [localhost] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [rhel-system-roles.podman : Fail if group not in subgid file] ******************************************
task path: /usr/share/ansible/roles/rhel-system-roles.podman/tasks/handle_user_group.yml:89
skipping: [localhost] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [rhel-system-roles.podman : Set per-container variables part 3] ****************************************
task path: /usr/share/ansible/roles/rhel-system-roles.podman/tasks/handle_quadlet_spec.yml:102
ok: [localhost] => {"ansible_facts": {"__podman_activate_systemd_unit": false, "__podman_images_found": [], "__podman_kube_yamls_raw": "", "__podman_service_name": "mytinytodo.volume", "__podman_systemd_scope": "user", "__podman_user_home_dir": "/home/tronde", "__podman_xdg_runtime_dir": "/run/user/1000"}, "changed": false}
TASK [rhel-system-roles.podman : Set per-container variables part 4] ****************************************
task path: /usr/share/ansible/roles/rhel-system-roles.podman/tasks/handle_quadlet_spec.yml:134
ok: [localhost] => {"ansible_facts": {"__podman_quadlet_path": "/home/tronde/.config/containers/systemd"}, "changed": false}
TASK [rhel-system-roles.podman : Get kube yaml contents] ****************************************************
task path: /usr/share/ansible/roles/rhel-system-roles.podman/tasks/handle_quadlet_spec.yml:140
skipping: [localhost] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [rhel-system-roles.podman : Set per-container variables part 5] ****************************************
task path: /usr/share/ansible/roles/rhel-system-roles.podman/tasks/handle_quadlet_spec.yml:151
ok: [localhost] => {"ansible_facts": {"__podman_images": [], "__podman_quadlet_file": "/home/tronde/.config/containers/systemd/mytinytodo.volume", "__podman_volumes": []}, "changed": false}
TASK [rhel-system-roles.podman : Cleanup quadlets] **********************************************************
task path: /usr/share/ansible/roles/rhel-system-roles.podman/tasks/handle_quadlet_spec.yml:205
skipping: [localhost] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [rhel-system-roles.podman : Create and update quadlets] ************************************************
task path: /usr/share/ansible/roles/rhel-system-roles.podman/tasks/handle_quadlet_spec.yml:209
included: /usr/share/ansible/roles/rhel-system-roles.podman/tasks/create_update_quadlet_spec.yml for localhost
TASK [rhel-system-roles.podman : Enable lingering if needed] ************************************************
task path: /usr/share/ansible/roles/rhel-system-roles.podman/tasks/create_update_quadlet_spec.yml:2
ok: [localhost] => {"changed": false, "cmd": ["loginctl", "enable-linger", "tronde"], "delta": null, "end": null, "msg": "Did not run command since '/var/lib/systemd/linger/tronde' exists", "rc": 0, "start": null, "stderr": "", "stderr_lines": [], "stdout": "skipped, since /var/lib/systemd/linger/tronde exists", "stdout_lines": ["skipped, since /var/lib/systemd/linger/tronde exists"]}
TASK [rhel-system-roles.podman : Create host directories] ***************************************************
task path: /usr/share/ansible/roles/rhel-system-roles.podman/tasks/create_update_quadlet_spec.yml:8
skipping: [localhost] => {"changed": false, "skipped_reason": "No items in the list"}
TASK [rhel-system-roles.podman : Ensure container images are present] ***************************************
task path: /usr/share/ansible/roles/rhel-system-roles.podman/tasks/create_update_quadlet_spec.yml:26
skipping: [localhost] => {"changed": false, "skipped_reason": "No items in the list"}
TASK [rhel-system-roles.podman : Ensure the quadlet directory is present] ***********************************
task path: /usr/share/ansible/roles/rhel-system-roles.podman/tasks/create_update_quadlet_spec.yml:43
ok: [localhost] => {"changed": false, "gid": 1000, "group": "tronde", "mode": "0755", "owner": "tronde", "path": "/home/tronde/.config/containers/systemd", "secontext": "unconfined_u:object_r:config_home_t:s0", "size": 137, "state": "directory", "uid": 1000}
TASK [rhel-system-roles.podman : Ensure quadlet file is copied] *********************************************
task path: /usr/share/ansible/roles/rhel-system-roles.podman/tasks/create_update_quadlet_spec.yml:52
ok: [localhost] => {"changed": false, "checksum": "585f8cbdf0ec73000f9227dcffbef71e9552ea4a", "dest": "/home/tronde/.config/containers/systemd/mytinytodo.volume", "gid": 1000, "group": "tronde", "mode": "0644", "owner": "tronde", "path": "/home/tronde/.config/containers/systemd/mytinytodo.volume", "secontext": "unconfined_u:object_r:config_home_t:s0", "size": 9, "state": "file", "uid": 1000}
TASK [rhel-system-roles.podman : Ensure quadlet file content is present] ************************************
task path: /usr/share/ansible/roles/rhel-system-roles.podman/tasks/create_update_quadlet_spec.yml:62
skipping: [localhost] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [rhel-system-roles.podman : Ensure quadlet file is present] ********************************************
task path: /usr/share/ansible/roles/rhel-system-roles.podman/tasks/create_update_quadlet_spec.yml:74
skipping: [localhost] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [rhel-system-roles.podman : Reload systemctl] **********************************************************
task path: /usr/share/ansible/roles/rhel-system-roles.podman/tasks/create_update_quadlet_spec.yml:86
skipping: [localhost] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [rhel-system-roles.podman : Start service] *************************************************************
task path: /usr/share/ansible/roles/rhel-system-roles.podman/tasks/create_update_quadlet_spec.yml:115
skipping: [localhost] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [rhel-system-roles.podman : Restart service] ***********************************************************
task path: /usr/share/ansible/roles/rhel-system-roles.podman/tasks/create_update_quadlet_spec.yml:131
skipping: [localhost] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [rhel-system-roles.podman : Set per-container variables part 0] ****************************************
task path: /usr/share/ansible/roles/rhel-system-roles.podman/tasks/handle_quadlet_spec.yml:14
ok: [localhost] => {"ansible_facts": {"__podman_quadlet_file_src": "../quadlet/mariadb.container", "__podman_quadlet_spec": {}, "__podman_quadlet_str": "[Unit]\nDescription=MariaDB Container for Demo\n\n[Container]\nImage=registry.redhat.io/rhel9/mariadb-105:1-177.1712857771\nContainerName=mariadb-demo\nNetwork=mytinytodo-demo.network\nVolume=mariadb:/var/lib/mysql:Z\nEnvironment=MYSQL_USER=mtt_user\nEnvironment=MYSQL_DATABASE=mtt_db\nSecret=mysql-root-password-container,type=env,target=MYSQL_ROOT_PASSWORD\nSecret=mysql-user-password-container,type=env,target=MYSQL_PASSWORD\n\n[Install]\n# Start by default on boot\nWantedBy=multi-user.target default.target", "__podman_quadlet_template_src": ""}, "changed": false}
TASK [rhel-system-roles.podman : Set per-container variables part 1] ****************************************
task path: /usr/share/ansible/roles/rhel-system-roles.podman/tasks/handle_quadlet_spec.yml:40
ok: [localhost] => {"ansible_facts": {"__podman_continue_if_pull_fails": true, "__podman_pull_image": true, "__podman_state": "created", "__podman_systemd_unit_scope": "", "__podman_user": "tronde"}, "changed": false}
TASK [rhel-system-roles.podman : Fail if no quadlet spec is given] ******************************************
task path: /usr/share/ansible/roles/rhel-system-roles.podman/tasks/handle_quadlet_spec.yml:57
skipping: [localhost] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [rhel-system-roles.podman : Set per-container variables part 2] ****************************************
task path: /usr/share/ansible/roles/rhel-system-roles.podman/tasks/handle_quadlet_spec.yml:70
ok: [localhost] => {"ansible_facts": {"__podman_quadlet_name": "mariadb", "__podman_quadlet_type": "container", "__podman_rootless": true}, "changed": false}
TASK [rhel-system-roles.podman : Check user and group information] ******************************************
task path: /usr/share/ansible/roles/rhel-system-roles.podman/tasks/handle_quadlet_spec.yml:97
included: /usr/share/ansible/roles/rhel-system-roles.podman/tasks/handle_user_group.yml for localhost
TASK [rhel-system-roles.podman : Get user information] ******************************************************
task path: /usr/share/ansible/roles/rhel-system-roles.podman/tasks/handle_user_group.yml:2
skipping: [localhost] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [rhel-system-roles.podman : Fail if user does not exist] ***********************************************
task path: /usr/share/ansible/roles/rhel-system-roles.podman/tasks/handle_user_group.yml:10
skipping: [localhost] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [rhel-system-roles.podman : Set group for podman user] *************************************************
task path: /usr/share/ansible/roles/rhel-system-roles.podman/tasks/handle_user_group.yml:17
ok: [localhost] => {"ansible_facts": {"__podman_group": "tronde"}, "changed": false}
TASK [rhel-system-roles.podman : Get group information] *****************************************************
task path: /usr/share/ansible/roles/rhel-system-roles.podman/tasks/handle_user_group.yml:28
skipping: [localhost] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [rhel-system-roles.podman : Set group name] ************************************************************
task path: /usr/share/ansible/roles/rhel-system-roles.podman/tasks/handle_user_group.yml:36
ok: [localhost] => {"ansible_facts": {"__podman_group_name": "tronde"}, "changed": false}
TASK [rhel-system-roles.podman : See if getsubids exists] ***************************************************
task path: /usr/share/ansible/roles/rhel-system-roles.podman/tasks/handle_user_group.yml:41
ok: [localhost] => {"changed": false, "stat": {"atime": 1715349821.3887033, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 32, "charset": "binary", "checksum": "076f89863b474504550ab1b5c11216a77ed7123c", "ctime": 1700817235.2266545, "dev": 64768, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 16846320, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "application/x-pie-executable", "mode": "0755", "mtime": 1689166932.0, "nlink": 1, "path": "/usr/bin/getsubids", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 15496, "uid": 0, "version": "1087703794", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true}}
TASK [rhel-system-roles.podman : Check user with getsubids] *************************************************
task path: /usr/share/ansible/roles/rhel-system-roles.podman/tasks/handle_user_group.yml:52
ok: [localhost] => {"changed": false, "cmd": ["getsubids", "tronde"], "delta": "0:00:00.003176", "end": "2024-05-10 16:57:48.262267", "msg": "", "rc": 0, "start": "2024-05-10 16:57:48.259091", "stderr": "", "stderr_lines": [], "stdout": "0: tronde 100000 65536", "stdout_lines": ["0: tronde 100000 65536"]}
TASK [rhel-system-roles.podman : Check group with getsubids] ************************************************
task path: /usr/share/ansible/roles/rhel-system-roles.podman/tasks/handle_user_group.yml:56
ok: [localhost] => {"changed": false, "cmd": ["getsubids", "-g", "tronde"], "delta": "0:00:00.003442", "end": "2024-05-10 16:57:48.522965", "msg": "", "rc": 0, "start": "2024-05-10 16:57:48.519523", "stderr": "", "stderr_lines": [], "stdout": "0: tronde 100000 65536", "stdout_lines": ["0: tronde 100000 65536"]}
TASK [rhel-system-roles.podman : Check if user is in subuid file] *******************************************
task path: /usr/share/ansible/roles/rhel-system-roles.podman/tasks/handle_user_group.yml:66
skipping: [localhost] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [rhel-system-roles.podman : Fail if user not in subuid file] *******************************************
task path: /usr/share/ansible/roles/rhel-system-roles.podman/tasks/handle_user_group.yml:74
skipping: [localhost] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [rhel-system-roles.podman : Check if group is in subgid file] ******************************************
task path: /usr/share/ansible/roles/rhel-system-roles.podman/tasks/handle_user_group.yml:81
skipping: [localhost] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [rhel-system-roles.podman : Fail if group not in subgid file] ******************************************
task path: /usr/share/ansible/roles/rhel-system-roles.podman/tasks/handle_user_group.yml:89
skipping: [localhost] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [rhel-system-roles.podman : Set per-container variables part 3] ****************************************
task path: /usr/share/ansible/roles/rhel-system-roles.podman/tasks/handle_quadlet_spec.yml:102
ok: [localhost] => {"ansible_facts": {"__podman_activate_systemd_unit": true, "__podman_images_found": ["registry.redhat.io/rhel9/mariadb-105:1-177.1712857771"], "__podman_kube_yamls_raw": "", "__podman_service_name": "mariadb.service", "__podman_systemd_scope": "user", "__podman_user_home_dir": "/home/tronde", "__podman_xdg_runtime_dir": "/run/user/1000"}, "changed": false}
TASK [rhel-system-roles.podman : Set per-container variables part 4] ****************************************
task path: /usr/share/ansible/roles/rhel-system-roles.podman/tasks/handle_quadlet_spec.yml:134
ok: [localhost] => {"ansible_facts": {"__podman_quadlet_path": "/home/tronde/.config/containers/systemd"}, "changed": false}
TASK [rhel-system-roles.podman : Get kube yaml contents] ****************************************************
task path: /usr/share/ansible/roles/rhel-system-roles.podman/tasks/handle_quadlet_spec.yml:140
skipping: [localhost] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [rhel-system-roles.podman : Set per-container variables part 5] ****************************************
task path: /usr/share/ansible/roles/rhel-system-roles.podman/tasks/handle_quadlet_spec.yml:151
ok: [localhost] => {"ansible_facts": {"__podman_images": ["registry.redhat.io/rhel9/mariadb-105:1-177.1712857771"], "__podman_quadlet_file": "/home/tronde/.config/containers/systemd/mariadb.container", "__podman_volumes": []}, "changed": false}
TASK [rhel-system-roles.podman : Cleanup quadlets] **********************************************************
task path: /usr/share/ansible/roles/rhel-system-roles.podman/tasks/handle_quadlet_spec.yml:205
skipping: [localhost] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [rhel-system-roles.podman : Create and update quadlets] ************************************************
task path: /usr/share/ansible/roles/rhel-system-roles.podman/tasks/handle_quadlet_spec.yml:209
included: /usr/share/ansible/roles/rhel-system-roles.podman/tasks/create_update_quadlet_spec.yml for localhost
TASK [rhel-system-roles.podman : Enable lingering if needed] ************************************************
task path: /usr/share/ansible/roles/rhel-system-roles.podman/tasks/create_update_quadlet_spec.yml:2
ok: [localhost] => {"changed": false, "cmd": ["loginctl", "enable-linger", "tronde"], "delta": null, "end": null, "msg": "Did not run command since '/var/lib/systemd/linger/tronde' exists", "rc": 0, "start": null, "stderr": "", "stderr_lines": [], "stdout": "skipped, since /var/lib/systemd/linger/tronde exists", "stdout_lines": ["skipped, since /var/lib/systemd/linger/tronde exists"]}
TASK [rhel-system-roles.podman : Create host directories] ***************************************************
task path: /usr/share/ansible/roles/rhel-system-roles.podman/tasks/create_update_quadlet_spec.yml:8
skipping: [localhost] => {"changed": false, "skipped_reason": "No items in the list"}
TASK [rhel-system-roles.podman : Ensure container images are present] ***************************************
task path: /usr/share/ansible/roles/rhel-system-roles.podman/tasks/create_update_quadlet_spec.yml:26
ok: [localhost] => (item=registry.redhat.io/rhel9/mariadb-105:1-177.1712857771) => {"ansible_loop_var": "item", "changed": false, "failed_when_result": false, "item": "registry.redhat.io/rhel9/mariadb-105:1-177.1712857771", "msg": "Failed to pull image registry.redhat.io/rhel9/mariadb-105:1-177.1712857771"}
TASK [rhel-system-roles.podman : Ensure the quadlet directory is present] ***********************************
task path: /usr/share/ansible/roles/rhel-system-roles.podman/tasks/create_update_quadlet_spec.yml:43
ok: [localhost] => {"changed": false, "gid": 1000, "group": "tronde", "mode": "0755", "owner": "tronde", "path": "/home/tronde/.config/containers/systemd", "secontext": "unconfined_u:object_r:config_home_t:s0", "size": 137, "state": "directory", "uid": 1000}
TASK [rhel-system-roles.podman : Ensure quadlet file is copied] *********************************************
task path: /usr/share/ansible/roles/rhel-system-roles.podman/tasks/create_update_quadlet_spec.yml:52
ok: [localhost] => {"changed": false, "checksum": "2f35cec12b6e1de8cf87407cc82d8cefed809587", "dest": "/home/tronde/.config/containers/systemd/mariadb.container", "gid": 1000, "group": "tronde", "mode": "0644", "owner": "tronde", "path": "/home/tronde/.config/containers/systemd/mariadb.container", "secontext": "unconfined_u:object_r:config_home_t:s0", "size": 497, "state": "file", "uid": 1000}
TASK [rhel-system-roles.podman : Ensure quadlet file content is present] ************************************
task path: /usr/share/ansible/roles/rhel-system-roles.podman/tasks/create_update_quadlet_spec.yml:62
skipping: [localhost] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [rhel-system-roles.podman : Ensure quadlet file is present] ********************************************
task path: /usr/share/ansible/roles/rhel-system-roles.podman/tasks/create_update_quadlet_spec.yml:74
skipping: [localhost] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [rhel-system-roles.podman : Reload systemctl] **********************************************************
task path: /usr/share/ansible/roles/rhel-system-roles.podman/tasks/create_update_quadlet_spec.yml:86
skipping: [localhost] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [rhel-system-roles.podman : Start service] *************************************************************
task path: /usr/share/ansible/roles/rhel-system-roles.podman/tasks/create_update_quadlet_spec.yml:115
skipping: [localhost] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [rhel-system-roles.podman : Restart service] ***********************************************************
task path: /usr/share/ansible/roles/rhel-system-roles.podman/tasks/create_update_quadlet_spec.yml:131
skipping: [localhost] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [rhel-system-roles.podman : Set per-container variables part 0] ****************************************
task path: /usr/share/ansible/roles/rhel-system-roles.podman/tasks/handle_quadlet_spec.yml:14
ok: [localhost] => {"ansible_facts": {"__podman_quadlet_file_src": "../quadlet/mytinytodo.container", "__podman_quadlet_spec": {}, "__podman_quadlet_str": "[Unit]\nDescription=Apache with PHP to run myTinyTodo\n\n[Container]\nImage=localhost/mytinytodo_image\nContainerName=mytinytodo-demo\nVolume=mytinytodo:/opt/app-root/src:Z\nNetwork=mytinytodo-demo.network\nPublishPort=8080:8080\n#AutoUpdate=registry\n\n[Install]\n# Start by default on boot\nWantedBy=multi-user.target default.target", "__podman_quadlet_template_src": ""}, "changed": false}
TASK [rhel-system-roles.podman : Set per-container variables part 1] ****************************************
task path: /usr/share/ansible/roles/rhel-system-roles.podman/tasks/handle_quadlet_spec.yml:40
ok: [localhost] => {"ansible_facts": {"__podman_continue_if_pull_fails": true, "__podman_pull_image": true, "__podman_state": "created", "__podman_systemd_unit_scope": "", "__podman_user": "tronde"}, "changed": false}
TASK [rhel-system-roles.podman : Fail if no quadlet spec is given] ******************************************
task path: /usr/share/ansible/roles/rhel-system-roles.podman/tasks/handle_quadlet_spec.yml:57
skipping: [localhost] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [rhel-system-roles.podman : Set per-container variables part 2] ****************************************
task path: /usr/share/ansible/roles/rhel-system-roles.podman/tasks/handle_quadlet_spec.yml:70
ok: [localhost] => {"ansible_facts": {"__podman_quadlet_name": "mytinytodo", "__podman_quadlet_type": "container", "__podman_rootless": true}, "changed": false}
TASK [rhel-system-roles.podman : Check user and group information] ******************************************
task path: /usr/share/ansible/roles/rhel-system-roles.podman/tasks/handle_quadlet_spec.yml:97
included: /usr/share/ansible/roles/rhel-system-roles.podman/tasks/handle_user_group.yml for localhost
TASK [rhel-system-roles.podman : Get user information] ******************************************************
task path: /usr/share/ansible/roles/rhel-system-roles.podman/tasks/handle_user_group.yml:2
skipping: [localhost] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [rhel-system-roles.podman : Fail if user does not exist] ***********************************************
task path: /usr/share/ansible/roles/rhel-system-roles.podman/tasks/handle_user_group.yml:10
skipping: [localhost] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [rhel-system-roles.podman : Set group for podman user] *************************************************
task path: /usr/share/ansible/roles/rhel-system-roles.podman/tasks/handle_user_group.yml:17
ok: [localhost] => {"ansible_facts": {"__podman_group": "tronde"}, "changed": false}
TASK [rhel-system-roles.podman : Get group information] *****************************************************
task path: /usr/share/ansible/roles/rhel-system-roles.podman/tasks/handle_user_group.yml:28
skipping: [localhost] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [rhel-system-roles.podman : Set group name] ************************************************************
task path: /usr/share/ansible/roles/rhel-system-roles.podman/tasks/handle_user_group.yml:36
ok: [localhost] => {"ansible_facts": {"__podman_group_name": "tronde"}, "changed": false}
TASK [rhel-system-roles.podman : See if getsubids exists] ***************************************************
task path: /usr/share/ansible/roles/rhel-system-roles.podman/tasks/handle_user_group.yml:41
ok: [localhost] => {"changed": false, "stat": {"atime": 1715349821.3887033, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 32, "charset": "binary", "checksum": "076f89863b474504550ab1b5c11216a77ed7123c", "ctime": 1700817235.2266545, "dev": 64768, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 16846320, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "application/x-pie-executable", "mode": "0755", "mtime": 1689166932.0, "nlink": 1, "path": "/usr/bin/getsubids", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 15496, "uid": 0, "version": "1087703794", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true}}
TASK [rhel-system-roles.podman : Check user with getsubids] *************************************************
task path: /usr/share/ansible/roles/rhel-system-roles.podman/tasks/handle_user_group.yml:52
ok: [localhost] => {"changed": false, "cmd": ["getsubids", "tronde"], "delta": "0:00:00.003150", "end": "2024-05-10 16:57:54.183470", "msg": "", "rc": 0, "start": "2024-05-10 16:57:54.180320", "stderr": "", "stderr_lines": [], "stdout": "0: tronde 100000 65536", "stdout_lines": ["0: tronde 100000 65536"]}
TASK [rhel-system-roles.podman : Check group with getsubids] ************************************************
task path: /usr/share/ansible/roles/rhel-system-roles.podman/tasks/handle_user_group.yml:56
ok: [localhost] => {"changed": false, "cmd": ["getsubids", "-g", "tronde"], "delta": "0:00:00.006617", "end": "2024-05-10 16:57:54.438912", "msg": "", "rc": 0, "start": "2024-05-10 16:57:54.432295", "stderr": "", "stderr_lines": [], "stdout": "0: tronde 100000 65536", "stdout_lines": ["0: tronde 100000 65536"]}
TASK [rhel-system-roles.podman : Check if user is in subuid file] *******************************************
task path: /usr/share/ansible/roles/rhel-system-roles.podman/tasks/handle_user_group.yml:66
skipping: [localhost] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [rhel-system-roles.podman : Fail if user not in subuid file] *******************************************
task path: /usr/share/ansible/roles/rhel-system-roles.podman/tasks/handle_user_group.yml:74
skipping: [localhost] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [rhel-system-roles.podman : Check if group is in subgid file] ******************************************
task path: /usr/share/ansible/roles/rhel-system-roles.podman/tasks/handle_user_group.yml:81
skipping: [localhost] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [rhel-system-roles.podman : Fail if group not in subgid file] ******************************************
task path: /usr/share/ansible/roles/rhel-system-roles.podman/tasks/handle_user_group.yml:89
skipping: [localhost] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [rhel-system-roles.podman : Set per-container variables part 3] ****************************************
task path: /usr/share/ansible/roles/rhel-system-roles.podman/tasks/handle_quadlet_spec.yml:102
ok: [localhost] => {"ansible_facts": {"__podman_activate_systemd_unit": true, "__podman_images_found": ["localhost/mytinytodo_image"], "__podman_kube_yamls_raw": "", "__podman_service_name": "mytinytodo.service", "__podman_systemd_scope": "user", "__podman_user_home_dir": "/home/tronde", "__podman_xdg_runtime_dir": "/run/user/1000"}, "changed": false}
TASK [rhel-system-roles.podman : Set per-container variables part 4] ****************************************
task path: /usr/share/ansible/roles/rhel-system-roles.podman/tasks/handle_quadlet_spec.yml:134
ok: [localhost] => {"ansible_facts": {"__podman_quadlet_path": "/home/tronde/.config/containers/systemd"}, "changed": false}
TASK [rhel-system-roles.podman : Get kube yaml contents] ****************************************************
task path: /usr/share/ansible/roles/rhel-system-roles.podman/tasks/handle_quadlet_spec.yml:140
skipping: [localhost] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [rhel-system-roles.podman : Set per-container variables part 5] ****************************************
task path: /usr/share/ansible/roles/rhel-system-roles.podman/tasks/handle_quadlet_spec.yml:151
ok: [localhost] => {"ansible_facts": {"__podman_images": ["localhost/mytinytodo_image"], "__podman_quadlet_file": "/home/tronde/.config/containers/systemd/mytinytodo.container", "__podman_volumes": []}, "changed": false}
TASK [rhel-system-roles.podman : Cleanup quadlets] **********************************************************
task path: /usr/share/ansible/roles/rhel-system-roles.podman/tasks/handle_quadlet_spec.yml:205
skipping: [localhost] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [rhel-system-roles.podman : Create and update quadlets] ************************************************
task path: /usr/share/ansible/roles/rhel-system-roles.podman/tasks/handle_quadlet_spec.yml:209
included: /usr/share/ansible/roles/rhel-system-roles.podman/tasks/create_update_quadlet_spec.yml for localhost
TASK [rhel-system-roles.podman : Enable lingering if needed] ************************************************
task path: /usr/share/ansible/roles/rhel-system-roles.podman/tasks/create_update_quadlet_spec.yml:2
ok: [localhost] => {"changed": false, "cmd": ["loginctl", "enable-linger", "tronde"], "delta": null, "end": null, "msg": "Did not run command since '/var/lib/systemd/linger/tronde' exists", "rc": 0, "start": null, "stderr": "", "stderr_lines": [], "stdout": "skipped, since /var/lib/systemd/linger/tronde exists", "stdout_lines": ["skipped, since /var/lib/systemd/linger/tronde exists"]}
TASK [rhel-system-roles.podman : Create host directories] ***************************************************
task path: /usr/share/ansible/roles/rhel-system-roles.podman/tasks/create_update_quadlet_spec.yml:8
skipping: [localhost] => {"changed": false, "skipped_reason": "No items in the list"}
TASK [rhel-system-roles.podman : Ensure container images are present] ***************************************
task path: /usr/share/ansible/roles/rhel-system-roles.podman/tasks/create_update_quadlet_spec.yml:26
ok: [localhost] => (item=localhost/mytinytodo_image) => {"ansible_loop_var": "item", "changed": false, "failed_when_result": false, "item": "localhost/mytinytodo_image", "msg": "Failed to pull image localhost/mytinytodo_image:latest"}
TASK [rhel-system-roles.podman : Ensure the quadlet directory is present] ***********************************
task path: /usr/share/ansible/roles/rhel-system-roles.podman/tasks/create_update_quadlet_spec.yml:43
ok: [localhost] => {"changed": false, "gid": 1000, "group": "tronde", "mode": "0755", "owner": "tronde", "path": "/home/tronde/.config/containers/systemd", "secontext": "unconfined_u:object_r:config_home_t:s0", "size": 137, "state": "directory", "uid": 1000}
TASK [rhel-system-roles.podman : Ensure quadlet file is copied] *********************************************
task path: /usr/share/ansible/roles/rhel-system-roles.podman/tasks/create_update_quadlet_spec.yml:52
ok: [localhost] => {"changed": false, "checksum": "98873bf5bad8d245ddbb23cfeb4342e8bcad747e", "dest": "/home/tronde/.config/containers/systemd/mytinytodo.container", "gid": 1000, "group": "tronde", "mode": "0644", "owner": "tronde", "path": "/home/tronde/.config/containers/systemd/mytinytodo.container", "secontext": "unconfined_u:object_r:config_home_t:s0", "size": 322, "state": "file", "uid": 1000}
TASK [rhel-system-roles.podman : Ensure quadlet file content is present] ************************************
task path: /usr/share/ansible/roles/rhel-system-roles.podman/tasks/create_update_quadlet_spec.yml:62
skipping: [localhost] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [rhel-system-roles.podman : Ensure quadlet file is present] ********************************************
task path: /usr/share/ansible/roles/rhel-system-roles.podman/tasks/create_update_quadlet_spec.yml:74
skipping: [localhost] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [rhel-system-roles.podman : Reload systemctl] **********************************************************
task path: /usr/share/ansible/roles/rhel-system-roles.podman/tasks/create_update_quadlet_spec.yml:86
skipping: [localhost] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [rhel-system-roles.podman : Start service] *************************************************************
task path: /usr/share/ansible/roles/rhel-system-roles.podman/tasks/create_update_quadlet_spec.yml:115
skipping: [localhost] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [rhel-system-roles.podman : Restart service] ***********************************************************
task path: /usr/share/ansible/roles/rhel-system-roles.podman/tasks/create_update_quadlet_spec.yml:131
skipping: [localhost] => {"changed": false, "skip_reason": "Conditional result was False"}
PLAY RECAP **************************************************************************************************
localhost : ok=122 changed=2 unreachable=0 failed=0 skipped=117 rescued=0 ignored=0
]$ podman secret list
ID NAME DRIVER CREATED UPDATED
0bb4b5ec6e0c1f2a8e48ddc9f mysql-root-password-container file 36 seconds ago 36 seconds ago
0ebc0fe40a9e7de6d5055631a mysql-user-password-container file 35 seconds ago 35 seconds ago
Just a heads-up before you spend too much time in the code blocks: with rhel-system-roles version 1.23.0-2.21.el9 the podman secrets were not created. After downgrading rhel-system-roles to version 1.22.0-2.el9, they were created.
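For anyone reproducing this comparison, a minimal sketch of the downgrade step with dnf, assuming the older build is still available in the enabled repositories:
# Check the currently installed build, then roll back to the known-good one
]$ rpm -q rhel-system-roles
]$ sudo dnf downgrade rhel-system-roles-1.22.0-2.el9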
If I recall correctly, in 1.22.0 lingering was not configured correctly for secrets, but somehow it would "just work". I tried to fix that in 1.23.0, and I believe I inadvertently broke it in some other way. I think I finally got it right with https://github.com/linux-system-roles/podman/pull/138 and https://github.com/linux-system-roles/podman/pull/140, among other changes. Please try the latest upstream code to verify.
OK, here is what I did to test with current upstream code:
]$ git clone https://github.com/linux-system-roles/podman.git
]$ sudo mv /usr/share/ansible/roles/linux-system-roles.podman /usr/share/ansible/roles/linux-system-roles.podman.bak
]$ sudo mv ~/podman /usr/share/ansible/roles/linux-system-roles.podman
]$ ansible-galaxy collection install containers.podman
]$ sudo reboot NOW
]$ sudo loginctl show-user tronde | grep Linger
Linger=yes
I set no_log: false in /usr/share/ansible/roles/linux-system-roles.podman/tasks/handle_secret.yml and in /usr/share/ansible/roles/linux-system-roles.podman/tasks/main.yml to show output.
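For reference, a sed one-liner per file is one way to flip those flags in place; a sketch, assuming both task files contain literal no_log: true lines:
# Switch no_log off in the two task files named above so the secret handling output becomes visible
]$ sudo sed -i 's/no_log: true/no_log: false/' /usr/share/ansible/roles/linux-system-roles.podman/tasks/handle_secret.yml
]$ sudo sed -i 's/no_log: true/no_log: false/' /usr/share/ansible/roles/linux-system-roles.podman/tasks/main.yml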
Then I ran the playbook again with the -vv option. You can find the output attached in troubleshooting_information.txt
TL;DR: The podman secrets were not created and therefore the container cannot start.
Edit: I tried it on my F39 workstation with the same result. The following code block shows how I integrated the upstream code into the linux_system_roles collection directory.
cd src/ && git clone git@github.com:linux-system-roles/podman.git
sudo mv podman /usr/share/ansible/collections/ansible_collections/fedora/linux_system_roles/roles/podman
The result looks the same as on RHEL 9.4 with the upstream code.
What else can I do to track this down?
So we do have tests for this: https://github.com/linux-system-roles/podman/blob/main/tests/tests_quadlet_basic.yml#L174
Here is the latest weekly test run: https://dl.fedoraproject.org/pub/alt/linuxsystemroles/logs/lsr-citool_podman-103-0afd954_CentOS-Stream-9_20240504-165350/artifacts/summary.html
This test uses ansible-core 2.16 for the control node, and the managed node is a latest CentOS Stream 9 VM. Here is the log of the test run that includes rootless secrets: https://dl.fedoraproject.org/pub/alt/linuxsystemroles/logs/lsr-citool_podman-103-0afd954_CentOS-Stream-9_20240504-165350/artifacts/tests_quadlet_basic-PASSED.log
Here is what the test output looks like when it successfully adds a secret:
TASK [fedora.linux_system_roles.podman : Manage each secret] *******************
task path: /WORKDIR/git-weekly-cifktrzocl/.collection/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_secret.yml:34
Saturday 04 May 2024 16:42:50 +0000 (0:00:00.224) 0:01:13.557 **********
changed: [sut] => {
"censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
"changed": true
}
...
TASK [fedora.linux_system_roles.podman : Manage each secret] *******************
task path: /WORKDIR/git-weekly-cifktrzocl/.collection/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_secret.yml:34
Saturday 04 May 2024 16:42:51 +0000 (0:00:00.205) 0:01:15.119 **********
changed: [sut] => {
"censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
"changed": true
}
note the "changed": true
in your output, using ansible-core 2.14, I see something different:
TASK [linux-system-roles.podman : Manage each secret] *********************************************************************************
task path: /usr/share/ansible/roles/linux-system-roles.podman/tasks/handle_secret.yml:35
[WARNING]: Using a variable for a task's 'args' is unsafe in some situations (see
https://docs.ansible.com/ansible/devel/reference_appendices/faq.html#argsplat-unsafe)
ok: [localhost] => {"changed": false}
So there are two differences: 2.14 shows the argsplat warning, and it reports "changed": false.
Another difference: we don't do localhost testing in CI, but we do downstream, and this test passed RHEL 9.4 localhost testing.
I simplified my test environment and am now using the following playbook:
---
- name: Troubleshoot Podman Secrets
  hosts: rhel9
  tasks:
    - name: Deploy Podman Secrets
      ansible.builtin.include_role:
        name: linux-system-roles.podman
      vars:
        podman_run_as_user: tronde
        podman_run_as_group: tronde
        podman_secrets:
          - name: mysql-root-password-container
            state: present
            skip_existing: true
            data: "mysql_root_password"
          - name: mysql-user-password-container
            state: present
            skip_existing: true
            data: "mysql_user_password"
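As a cross-check outside of the role, the same two secrets could also be created directly with the containers.podman.podman_secret module; if that works while the role does not, it narrows the problem down to the role rather than the collection module. A sketch reusing the names from the playbook above:

```yaml
---
# Sketch: create the same secrets directly via containers.podman, bypassing the role.
# For real secret values, add no_log: true to these tasks.
- name: Create podman secrets directly
  hosts: rhel9
  become: true
  become_user: tronde
  tasks:
    - name: Create mysql-root-password-container
      containers.podman.podman_secret:
        name: mysql-root-password-container
        data: "mysql_root_password"
        state: present
        skip_existing: true

    - name: Create mysql-user-password-container
      containers.podman.podman_secret:
        name: mysql-user-password-container
        data: "mysql_user_password"
        state: present
        skip_existing: true
```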
]$ cat /etc/fedora-release && ansible --version && echo "Check changlog of linux-system-role.podman" && egrep '#138' /usr/share/ansible/collections/ansible_collections/fedora/linux_system_roles/roles/podman/CHANGELOG.md && egrep '#140' /usr/share/ansible/collections/ansible_collections/fedora/linux_system_roles/roles/podman/CHANGELOG.md
Fedora release 39 (Thirty Nine)
ansible [core 2.16.5]
config file = /etc/ansible/ansible.cfg
configured module search path = ['/home/tronde/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
ansible python module location = /usr/lib/python3.12/site-packages/ansible
ansible collection location = /home/tronde/.ansible/collections:/usr/share/ansible/collections
executable location = /usr/bin/ansible
python version = 3.12.3 (main, Apr 17 2024, 00:00:00) [GCC 13.2.1 20240316 (Red Hat 13.2.1-7)] (/usr/bin/python3)
jinja version = 3.1.3
libyaml = True
Check changlog of linux-system-role.podman
- fix: use correct user for cancel linger file name (#138)
- test: do not check for root linger (#140)
]$ cat /etc/redhat-release && python3 --version && podman --version && sudo loginctl show-user tronde | grep Linger
Red Hat Enterprise Linux release 9.4 (Plow)
Python 3.9.18
podman version 4.9.4-rhel
Linger=yes
Complete output: troubleshooting_information_minimal.txt
I cannot get rid of the warning, and my tasks keep reporting changed: false:
TASK [linux-system-roles.podman : Manage each secret] ************************************************************************************************
task path: /usr/share/ansible/roles/linux-system-roles.podman/tasks/handle_secret.yml:35
[WARNING]: Using a variable for a task's 'args' is unsafe in some situations (see
https://docs.ansible.com/ansible/devel/reference_appendices/faq.html#argsplat-unsafe)
ok: [rhel9] => {"censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false}
…
TASK [linux-system-roles.podman : Manage each secret] ************************************************************************************************
task path: /usr/share/ansible/roles/linux-system-roles.podman/tasks/handle_secret.yml:35
ok: [rhel9] => {"censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false}
I understand that your tests pass, especially the RHEL 9.4 localhost testing. But I cannot reproduce that result, even though I assume it should work the way I am trying it. I don't know what else to try. :-/ I guess I have to stick with rhel-system-roles version 1.22.0-2.el9 until we can sort this out.
I believe this is a bug in containers.podman that was fixed by https://github.com/containers/ansible-podman-collections/pull/733 and released in version 1.14.0 (https://galaxy.ansible.com/ui/repo/published/containers/podman).
@Tronde please try to reproduce with containers.podman 1.14.0
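For example, the fixed release can be pinned in a collection requirements file and installed with ansible-galaxy collection install -r requirements.yml, or as a one-off with ansible-galaxy collection install containers.podman:1.14.0 (the file name below is arbitrary):

```yaml
# requirements.yml (sketch): pull in the containers.podman release that contains the fix
collections:
  - name: containers.podman
    version: "1.14.0"
```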
@richm
Using the following playbook to deploy an example application from my podman demo/workshop, the first run fails, but the second run succeeds without any changes to the playbook or the other files involved.
I would expect the playbook either to fail on the second run as well, for the same reason, or to complete successfully on the first run.
Since the debugging output has too many characters for this form, I attached a file containing the complete information: troubleshooting_information.txt
I did three playbook runs, setting no_log: false for the task [rhel-system-roles.podman : Handle secrets] on the third run. Somehow the podman secrets are not being created.

Environment information
The playbook
The mariadb.container file
To be honest, I don't know whether it's a bug or just me holding it wrong. Any help on this is much appreciated.
Best regards,
Joerg