aquasecurity / kube-hunter

Hunt for security weaknesses in Kubernetes clusters
Apache License 2.0

"Unrecognized K8s API" and having zero vulnerabilities in old K8s version #480

Closed moritzluedtke closed 2 years ago

moritzluedtke commented 2 years ago

I'm just trying to figure out if kube-hunter is working correctly here.

We have a k8s-cluster using v1.18.9. I ran kube-hunter --k8s-auto-discover-nodes --kubeconfig ~/.kube/config locally on my machine which has network access to the cluster.

The output is the following:

2021-10-11 17:36:28,236 INFO kube_hunter.modules.report.collector Started hunting
2021-10-11 17:36:28,236 INFO kube_hunter.modules.report.collector Discovering Open Kubernetes Services
2021-10-11 17:36:28,593 INFO kube_hunter.modules.discovery.kubernetes_client Listed 3 nodes in the cluster
2021-10-11 17:36:36,175 INFO kube_hunter.modules.report.collector Found open service "Unrecognized K8s API" at xx.xxx.xx.141:6443
2021-10-11 17:36:36,175 INFO kube_hunter.modules.report.collector Found open service "Unrecognized K8s API" at xx.xxx.xx.142:6443
2021-10-11 17:36:36,176 INFO kube_hunter.modules.report.collector Found open service "Unrecognized K8s API" at xx.xxx.xx.143:6443
2021-10-11 17:36:36,176 INFO kube_hunter.modules.report.collector Found open service "Unrecognized K8s API" at our-own-internal-dns-1:6443
2021-10-11 17:36:36,177 INFO kube_hunter.modules.report.collector Found open service "Unrecognized K8s API" at our-own-internal-dns-2:6443
2021-10-11 17:36:36,178 INFO kube_hunter.modules.report.collector Found open service "Unrecognized K8s API" at our-own-internal-dns-3:6443

Nodes
+-------------+----------------------+
| TYPE        | LOCATION             |
+-------------+----------------------+
| Node/Master | our-own-             |
|             | internal-dns-3       |
+-------------+----------------------+
| Node/Master | our-own-             |
|             | internal-dns-2       |
+-------------+----------------------+
| Node/Master | our-own-             |
|             | internal-dns-1       |
+-------------+----------------------+
| Node/Master | xx.xxx.xx.143        |
+-------------+----------------------+
| Node/Master | xx.xxx.xx.142        |
+-------------+----------------------+
| Node/Master | xx.xxx.xx.141        |
+-------------+----------------------+

Detected Services
+----------------------+----------------------+----------------------+
| SERVICE              | LOCATION             | DESCRIPTION          |
+----------------------+----------------------+----------------------+
| Unrecognized K8s API | our-own-internal     | A Kubernetes API     |
|                      | -dns-3:6443          | service              |
+----------------------+----------------------+----------------------+
| Unrecognized K8s API | our-own-internal     | A Kubernetes API     |
|                      | -dns-2:6443          | service              |
+----------------------+----------------------+----------------------+
| Unrecognized K8s API | our-own-internal     | A Kubernetes API     |
|                      | -dns-1:6443          | service              |
+----------------------+----------------------+----------------------+
| Unrecognized K8s API | xx.xxx.xx.143:6443   | A Kubernetes API     |
|                      |                      | service              |
+----------------------+----------------------+----------------------+
| Unrecognized K8s API | xx.xxx.xx.142:6443   | A Kubernetes API     |
|                      |                      | service              |
+----------------------+----------------------+----------------------+
| Unrecognized K8s API | xx.xxx.xx.141:6443   | A Kubernetes API     |
|                      |                      | service              |
+----------------------+----------------------+----------------------+

No vulnerabilities were found

Questions

  1. Why is the K8s API unrecognized? Is that something to be concerned about?
  2. I'm a little bit sceptical about our cluster having no vulnerabilities at all. Is this normal? I found CVE-2021-25741, which is stated to affect any version "up to and including" 1.19.14. Or does kube-hunter only look for vulnerabilities from outside the cluster, i.e. from the view of an outside attacker rather than an inside user who wants to do harm?
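Whether a CVE applies is essentially a version-range check: CVE-2021-25741 is reported as affecting versions up to and including 1.19.14, and this cluster runs v1.18.9, which falls inside that range. A minimal sketch of such a check using GNU `sort -V` (versions taken from this thread; this illustrates the comparison only, not how kube-hunter implements it):

```shell
# Is the cluster version within the affected range (<= last affected version)?
cluster="1.18.9"          # version reported for this cluster
last_affected="1.19.14"   # CVE-2021-25741: affects up to and including 1.19.14

# sort -V orders version strings numerically; if the cluster version sorts
# first (or equal), it is <= the last affected version, i.e. in range.
if [ "$(printf '%s\n' "$cluster" "$last_affected" | sort -V | head -n1)" = "$cluster" ]; then
    echo "version $cluster is within the affected range"
fi
```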
danielsagi commented 2 years ago

Hi @moritzluedtke, answering your questions:

  1. The way kube-hunter works is by first categorizing K8s APIs generically and then verifying what they truly are; it seems like it couldn't do that here. (If you could attach a run with the --log debug flag, I could help you figure out why this happened.)
  2. Starting from #482, we made CVE hunting optional. This means that if you want to see CVE results, you need to pass the flag --enable-cve-hunting.
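Putting both suggestions together, a rerun might look like this (flags taken from this thread and the original command; the kubeconfig path is the one from the question):

```shell
# Re-run with debug logging (to diagnose the "Unrecognized K8s API" result)
# and with CVE hunting enabled (off by default since #482).
kube-hunter --k8s-auto-discover-nodes \
    --kubeconfig ~/.kube/config \
    --log debug \
    --enable-cve-hunting
```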