JamesTurland / JimsGarage

Homelab Goodies

Dangerous command in k3s.sh #62

Open mas-kon opened 9 months ago

mas-kon commented 9 months ago

The command echo "StrictHostKeyChecking no" > ~/.ssh/config in the script destroys your existing SSH config file. :(

Please change it to sed -i '1s/^/StrictHostKeyChecking no\n/' ~/.ssh/config
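To make the difference concrete, here is a minimal repro against a throwaway file (hypothetical /tmp path, not the real ~/.ssh/config): the single-> redirect truncates the file before writing, while the sed form prepends and keeps the existing contents.

```shell
# Set up a fake existing SSH config.
printf 'Host example\n  User demo\n' > /tmp/demo_config

# The script's original '>' redirect replaces the file entirely:
echo "StrictHostKeyChecking no" > /tmp/demo_config
cat /tmp/demo_config   # only the one echoed line survives

# Starting over, the sed form prepends and preserves the Host block:
printf 'Host example\n  User demo\n' > /tmp/demo_config
sed -i '1s/^/StrictHostKeyChecking no\n/' /tmp/demo_config
cat /tmp/demo_config
```

(The -i in-place flag behaves as shown with GNU sed; BSD/macOS sed needs `sed -i ''`.)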

zatricky commented 8 months ago

Luckily I searched existing issues before suggesting the same change. Of note is that this same line is actually in multiple scripts, sometimes twice (local machine - and then sometimes run on the remote machine too):

./Docker-Swarm/swarm-3-nodes.sh
./Docker-Swarm/swarm.sh
./Kubernetes/K3S-Deploy/k3s.sh
./Kubernetes/Kubernetes-Lite/k3s.sh
./Kubernetes/RKE2-Cilium/rke2.sh
./Kubernetes/RKE2/rke2.sh

JamesTurland commented 8 months ago

Good spot, I'll amend that now. Thanks

zatricky commented 8 months ago

I see there were some updates. I realised that technically there is still a potential problem introduced by the change. If the file exists, the old behaviour would wipe it. If the file doesn't exist, the new behaviour results in an error. 🫣
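A sketch of a more defensive variant that handles both cases (my suggestion, not code from the repo): create the directory and file if missing, and only prepend the option when it is absent, so re-running the script stays idempotent. Note that sed's '1s/.../' silently does nothing on an empty file (there is no line 1 to match), which is why this uses a temp file instead.

```shell
mkdir -p ~/.ssh
touch ~/.ssh/config
chmod 600 ~/.ssh/config

# Prepend the option only if it isn't already there.
if ! grep -qx 'StrictHostKeyChecking no' ~/.ssh/config; then
  tmp=$(mktemp)
  { echo 'StrictHostKeyChecking no'; cat ~/.ssh/config; } > "$tmp"
  mv "$tmp" ~/.ssh/config
  chmod 600 ~/.ssh/config
fi
```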

I don't know how complex or edge-case-proof you want the scripts to be - but perhaps putting these security-reducing lines into the ssh config is actually a bad idea compared to just importing the keys:

mkdir -p ~/.ssh
for node in "${all[@]}"; do
  ssh-keyscan "$node" >> ~/.ssh/known_hosts
done
perl -i -ne 'print if ! $x{$_}++' ~/.ssh/known_hosts

The first line makes sure the .ssh folder exists; it produces no output or errors unless the folder didn't exist and couldn't be created. The line inside the for loop connects to each remote host and saves that host's keys into known_hosts. The last line assumes perl is installed and is only there to remove duplicated entries from the known_hosts file in case the script happens to run a second time. There are many other ways to do the same thing, however: https://stackoverflow.com/questions/11532157/remove-duplicate-lines-without-sorting https://stackoverflow.com/questions/1444406/how-to-delete-duplicate-lines-in-a-file-without-sorting-it-in-unix
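For reference, the perl one-liner above keeps only the first occurrence of each line without sorting; an awk equivalent (handy on minimal systems without perl) does the same thing:

```shell
# Keep the first occurrence of each line, preserving original order.
# awk has no in-place flag, so write to a temp file and move it back.
awk '!seen[$0]++' ~/.ssh/known_hosts > ~/.ssh/known_hosts.tmp &&
  mv ~/.ssh/known_hosts.tmp ~/.ssh/known_hosts
```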