Once the pod has started up and systemd is considered started, the tool tries to locate the node the pod is running on.
However, it searches by the literal pod name 'uyuni', while the actual pod was named uyuni-684b544d99-dnqhk by the deployment.
This causes it to fail, since no pod with that exact name exists.
Debug output:
2:05PM DBG shared/utils/exec.go:67 > Running: kubectl get -o jsonpath={.items[?(@.metadata.name=="uyuni")].status.readyReplicas} deploy -n uyuni
⠋ kubectl get pod -lapp=uyuni -A -o=jsonpath={.items[0].metadata.name}
⠋ kubectl exec uyuni-684b544d99-dnqhk -- systemctl is-active -q multi-user.target
⠋ kubectl get pod uyuni -o jsonpath={.items[*].spec.nodeName}
2:05PM DBG shared/utils/exec.go:67 > Running: kubectl get pod uyuni -o jsonpath={.items[*].spec.nodeName}
⠋ kubectl get pod uyuni -o jsonpath={.items[*].spec.nodeName}
⠋ kubectl get pod uyuni -o jsonpath={.items[*].spec.nodeName}
⠋ kubectl get pod uyuni -o jsonpath={.items[*].spec.nodeName}
[repeats]
Error: cannot find node running uyuni: cannot find node name matching filter uyuni
Manually ran:
> kubectl get pod uyuni -o jsonpath={.items[*].spec.nodeName}
Error from server (NotFound): pods "uyuni" not found
If I switch to a label selector manually, it works properly:
> kubectl get pod -l app=uyuni -o jsonpath={.items[*].spec.nodeName}
sesian-worker-2726fd44-rjwlf
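A likely fix on the tool side is to build the node lookup from a label selector (as the earlier `kubectl get pod -lapp=uyuni` call already does) rather than the literal app name. A minimal Go sketch of the idea — the function name and shape are hypothetical, not the actual mgradm code:

```go
package main

import "fmt"

// nodeNameArgs builds kubectl arguments that resolve the node a pod runs on.
// Selecting with -l app=<name> matches pods created by a Deployment, whose
// names carry a generated suffix (e.g. uyuni-684b544d99-dnqhk), whereas
// "kubectl get pod <name>" only matches a pod literally named <name>.
func nodeNameArgs(appLabel string) []string {
	return []string{
		"get", "pod",
		"-l", "app=" + appLabel,
		"-o", "jsonpath={.items[*].spec.nodeName}",
	}
}

func main() {
	// Would be passed to kubectl, e.g. via os/exec.
	fmt.Println(nodeNameArgs("uyuni"))
}
```

With this, the failing `kubectl get pod uyuni ...` call would become the same label-based query that succeeds when run by hand.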
Environment:
Harvester cluster with 3 nodes using longhorn
RKE2 cluster on top of openSUSE MicroOS
Manually created volumes to have some use different storage classes
Migration command:
mgradm migrate kubernetes uyuni.i.hackerguild.com --helm-uyuni-namespace uyuni --helm-uyuni-values values.yaml --logLevel debug
values.yaml