aquasecurity / kube-hunter

Hunt for security weaknesses in Kubernetes clusters
Apache License 2.0

Not able to scan individual IP of node #404

Closed: ghost closed this issue 3 years ago

ghost commented 3 years ago

I have installed kube-hunter on a GKE cluster, but when I run the kube-hunter command and select option 1 (remote scanning), entering the node IP I get from kubectl get nodes -o wide, it reports that it couldn't find a cluster. If I create a pod and check the kube-hunter logs, that method works fine. When the cluster is deployed on-premises everything works; only on cloud (GCP, AWS) do I get the above error.
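For reference, a minimal sketch of the flow described above, run from the remote machine. The IP is the example one that appears later in this thread, and --remote is kube-hunter's non-interactive equivalent of the remote-scanning prompt option:

```sh
# List the nodes; the INTERNAL-IP and EXTERNAL-IP columns hold the addresses being scanned
kubectl get nodes -o wide

# Scan a single node IP directly instead of going through the interactive menu
kube-hunter --remote 35.185.215.135
```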

github-actions[bot] commented 3 years ago

Hola! @varshabhijeet 🥳 , You've just created an Issue!🌟 Thanks for making the Project Better

yashsaurabh commented 3 years ago

Yes, I am facing the same issue as well.

danielsagi commented 3 years ago

Hi @varshabhijeet, can you provide more details about the machine you are running kube-hunter from? I see you are using the internal IP of the node; can you reach that IP from your machine? Have you tried using the external IP?

ghost commented 3 years ago

Yes, I have tried both IPs but got the same result.

ghost commented 3 years ago

In the case of AWS, I tried from the Ubuntu machine from which I connect to the Kubernetes cluster.

danielsagi commented 3 years ago

@varshabhijeet can you try running kube-hunter as a job inside the cluster?
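For reference, a minimal Job sketch along the lines of the job.yaml shipped in the kube-hunter repository; the image reference and names here are assumptions to adapt as needed:

```yaml
apiVersion: batch/v1
kind: Job
metadata:
  name: kube-hunter
spec:
  backoffLimit: 0
  template:
    spec:
      restartPolicy: Never
      containers:
        - name: kube-hunter
          # Published kube-hunter image on Docker Hub; pin a specific tag in practice.
          image: aquasec/kube-hunter
          command: ["kube-hunter"]
          # --pod tells kube-hunter it is running from inside the cluster.
          args: ["--pod"]
```

Apply it with kubectl apply -f and read the results with kubectl logs job/kube-hunter once the job completes.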

danielsagi commented 3 years ago

Also, please run with --log debug; this will help us a lot with solving this problem.
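For example, assuming the same node IP as above:

```sh
# Re-run the remote scan with debug logging enabled
kube-hunter --remote 35.185.215.135 --log debug
```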

ghost commented 3 years ago

> @varshabhijeet can you try running kube-hunter as a job inside the cluster?

Yes, the kube-hunter pod works inside the cluster, but scanning the cluster from a remote machine is not possible.

ghost commented 3 years ago

> Also, please run with --log debug; this will help us a lot with solving this problem.

This is the output:

Remotes (separated by a ','): 35.185.215.135
2020-11-10 13:54:37,802 DEBUG kube_hunter.core.events.handler Event <class 'kube_hunter.core.events.types.HuntStarted'> got published with <kube_hunter.core.events.types.HuntStarted object at 0x7fd7104309b0>
2020-11-10 13:54:37,802 DEBUG kube_hunter.core.events.handler Event <class 'kube_hunter.core.events.types.HuntStarted'> got published with <kube_hunter.core.events.types.HuntStarted object at 0x7fd7104309b0>
2020-11-10 13:54:37,802 DEBUG kube_hunter.core.events.handler Event <class 'kube_hunter.modules.discovery.hosts.HostScanEvent'> got published with <kube_hunter.modules.discovery.hosts.HostScanEvent object at 0x7fd7104309e8>
2020-11-10 13:54:37,802 DEBUG kube_hunter.core.events.handler Executing <class 'kube_hunter.modules.report.collector.StartedInfo'> with {'previous': None, 'hunter': None}
2020-11-10 13:54:37,803 INFO kube_hunter.modules.report.collector Started hunting
2020-11-10 13:54:37,803 INFO kube_hunter.modules.report.collector Discovering Open Kubernetes Services
2020-11-10 13:54:37,802 DEBUG kube_hunter.core.events.handler Executing <class 'kube_hunter.modules.discovery.kubectl.KubectlClientDiscovery'> with {'previous': None, 'hunter': None}
2020-11-10 13:54:37,803 DEBUG kube_hunter.modules.discovery.kubectl Attempting to discover a local kubectl client
2020-11-10 13:54:37,803 DEBUG kube_hunter.core.events.handler Executing <class 'kube_hunter.modules.discovery.hosts.HostDiscovery'> with {'active': False, 'predefined_hosts': []}
2020-11-10 13:54:37,809 DEBUG kube_hunter.core.events.handler Event <class 'kube_hunter.core.events.types.NewHostEvent'> got published with 35.185.215.135
2020-11-10 13:54:37,810 DEBUG kube_hunter.core.events.handler Executing <class 'kube_hunter.modules.discovery.ports.PortDiscovery'> with {'host': '35.185.215.135', 'cloud_type': None, 'event_id': 0, 'previous': <kube_hunter.modules.discovery.hosts.HostScanEvent object at 0x7fd7104309e8>, 'hunter': <class 'kube_hunter.modules.discovery.hosts.HostDiscovery'>}
2020-11-10 13:54:37,810 DEBUG kube_hunter.modules.discovery.ports host 35.185.215.135 try ports: [8001, 8080, 10250, 10255, 30000, 443, 6443, 2379]
2020-11-10 13:54:37,810 DEBUG kube_hunter.modules.discovery.ports Scanning 35.185.215.135:8001
2020-11-10 13:54:37,815 DEBUG kube_hunter.modules.discovery.kubectl Could not find kubectl client
2020-11-10 13:54:39,312 DEBUG kube_hunter.modules.discovery.ports Scanning 35.185.215.135:8080
2020-11-10 13:54:40,814 DEBUG kube_hunter.modules.discovery.ports Scanning 35.185.215.135:10250
2020-11-10 13:54:42,316 DEBUG kube_hunter.modules.discovery.ports Scanning 35.185.215.135:10255
2020-11-10 13:54:43,817 DEBUG kube_hunter.modules.discovery.ports Scanning 35.185.215.135:30000
2020-11-10 13:54:45,319 DEBUG kube_hunter.modules.discovery.ports Scanning 35.185.215.135:443
2020-11-10 13:54:45,481 DEBUG kube_hunter.modules.discovery.ports Scanning 35.185.215.135:6443
2020-11-10 13:54:46,983 DEBUG kube_hunter.modules.discovery.ports Scanning 35.185.215.135:2379
2020-11-10 13:54:48,485 DEBUG kube_hunter.core.events.handler Event <class 'kube_hunter.core.events.types.HuntFinished'> got published with <kube_hunter.core.events.types.HuntFinished object at 0x7fd710430a20>
2020-11-10 13:54:48,485 DEBUG kube_hunter.core.events.handler Executing <class 'kube_hunter.modules.report.collector.SendFullReport'> with {'previous': None, 'hunter': None}
2020-11-10 13:54:48,485 DEBUG kube_hunter.modules.report.dispatchers Dispatching report via stdout

Kube Hunter couldn't find any clusters

2020-11-10 13:54:48,485 DEBUG kube_hunter.main Cleaned Queue

danielsagi commented 3 years ago

@varshabhijeet This looks like normal behavior. It is not a problem in kube-hunter; the node IP is most likely not reachable from the machine where you are running kube-hunter. I can help you further to figure out why that is, but I would recommend simply running kube-hunter as a pod inside your cluster.
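One way to confirm that from the scanning machine is a quick reachability check against the same port list kube-hunter probed in the debug log above (a sketch assuming nc is available):

```sh
# If every port times out or is refused, the node IP is filtered
# (e.g. by the GCP/AWS firewall) from this machine's point of view.
for port in 8001 8080 10250 10255 30000 443 6443 2379; do
  nc -zv -w 3 35.185.215.135 "$port"
done
```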