hakluke / hakrawler

Simple, fast web crawler designed for easy, quick discovery of endpoints and assets within a web application
https://hakluke.com
GNU General Public License v3.0

High memory consumption: SSH losing connection. #138

Closed GiuBravo closed 2 years ago

GiuBravo commented 2 years ago


Command:

hakrawler -d 10 -t 100 -subs -u

The target list was generated from httpx and has fewer than 600 targets.

Today was the fifth time that hakrawler used all of the machine's memory and I lost the SSH connection.

hakluke commented 2 years ago

This command would probably produce a lot of results. To lower memory consumption, you could try dropping the -u option, as it is not very memory-efficient. Instead, just pipe the results to a file, then run sort -u on the file after the crawl completes.
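A minimal sketch of that approach, assuming the httpx output lives in a hypothetical targets.txt file (hakrawler reads its target URLs from stdin; the filenames here are illustrative):

```shell
# Crawl without -u: per hakluke's note, in-crawl deduplication is not
# memory-efficient, so stream raw results straight to disk instead.
cat targets.txt | hakrawler -d 10 -t 100 -subs > results_raw.txt

# Deduplicate on disk once the crawl finishes; sort -u uses an external
# merge sort, so it stays within bounded memory even for large files.
sort -u results_raw.txt > results.txt
```

The trade-off is a larger intermediate file on disk in exchange for a flat memory profile during the crawl.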

GiuBravo commented 2 years ago

Thanks a lot hakluke! I love your hakrawler! I'll try it again.

hakluke commented 2 years ago

Another way to decrease memory usage, of course, is fewer threads! Let me know how you go!