k3s-io / k3s-ansible

Apache License 2.0
1.89k stars 780 forks

Adding new worker node to an existing cluster failed #291

Closed admun closed 5 months ago

admun commented 5 months ago

I am trying to add a new worker node, but it failed.

Running `playbook/upgrade.yml`:

TASK [k3s_upgrade : Save current K3s service] ***********************************************************************
changed: [192.168.1.x]
fatal: [192.168.1.y]: FAILED! => {"changed": true, "cmd": "cp /etc/systemd/system/k3s*.service /tmp/", "delta": "0:00:00.012402", "end": "2024-01-15 12:00:13.128083", "msg": "non-zero return code", "rc": 1, "start": "2024-01-15 12:00:13.115681", "stderr": "cp: cannot stat '/etc/systemd/system/k3s*.service': No such file or directory", "stderr_lines": ["cp: cannot stat '/etc/systemd/system/k3s*.service': No such file or directory"], "stdout": "", "stdout_lines": []}
changed: [192.168.1.z]

192.168.1.y is the new node.

I think saving the current service should not be a fatal error?

Not sure if this is related to #264.
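A minimal sketch of how that task could be made non-fatal (this is an illustrative Ansible snippet, not the playbook's actual task definition; the task name is taken from the log above, and the `failed_when` condition is an assumption): treat a missing `k3s*.service` file as acceptable on a node that has never had K3s installed.

```yaml
# Hypothetical variant of the "Save current K3s service" task:
# only fail when cp errors for a reason other than the service
# file not existing yet (e.g. on a fresh node).
- name: Save current K3s service
  ansible.builtin.shell: cp /etc/systemd/system/k3s*.service /tmp/
  register: save_service
  changed_when: save_service.rc == 0
  failed_when:
    - save_service.rc != 0
    - "'No such file or directory' not in save_service.stderr"
```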

admun commented 5 months ago

nm... adding a node should be done by running `playbook/site.yml` instead.
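For reference, a typical invocation for this repo might look like the following (the inventory path is illustrative; use whatever inventory file your cluster was set up with, with the new node added to the agent/worker group):

```shell
# Add the new worker to your inventory first, then re-run the
# provisioning playbook rather than the upgrade playbook.
ansible-playbook playbook/site.yml -i inventory/my-cluster/hosts.ini
```

`upgrade.yml` assumes K3s is already installed on every target (hence the attempt to back up `/etc/systemd/system/k3s*.service`), while `site.yml` performs the initial install, which is why it is the right entry point for a brand-new node.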