hakluke / hakrawler

Simple, fast web crawler designed for easy, quick discovery of endpoints and assets within a web application
https://hakluke.com
GNU General Public License v3.0

Terminated by signal SIGKILL (Force quit) #143

Closed (vhgbao01 closed 1 year ago)

vhgbao01 commented 1 year ago

This started happening to me two weeks ago; it ran fine before that. I'm currently running hakrawler inside tmux on Kali on an AWS EC2 instance.

Command: cat subdomains_probed.txt | hakrawler -u -subs -insecure -d 6 -t 1 > hakrawler.txt

Result: fish: Process 12578, 'hakrawler' from job 1, 'cat subdomains_probed.txt…' terminated by signal SIGKILL (Forced quit)

Output is still written to hakrawler.txt, but the last URL in the file is cut off mid-line rather than complete.

hakluke commented 1 year ago

Potentially a lack of memory?

vhgbao01 commented 1 year ago

Hi, I can confirm that it ran fine on my PC. I'm wondering why it would run out of memory when I used only one thread.

hakluke commented 1 year ago

Not sure, how many lines were in subdomains_probed.txt?

Given that it's a SIGKILL, it may have just been a Ctrl+C on the terminal?
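For what it's worth, Ctrl+C sends SIGINT rather than SIGKILL, so with nobody at the terminal an OOM kill by the kernel is the more likely explanation. If that's what happened, it usually leaves a trace in the kernel log; on a Linux host with dmesg access, something like this should surface it:

    sudo dmesg | grep -iE "out of memory|oom-killer|killed process"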

vhgbao01 commented 1 year ago

About 310 lines; the list was generated by httprobe. I don't think I killed the process with Ctrl+C: I simply detached the tmux session with Ctrl+B D after running the command and left it there for a few days. My previous runs completed fine; only this one hit the error.

garlic0x1 commented 1 year ago

If you were using the unique flag and running the program for multiple days, you probably filled up your memory with the sm sync.Map (the map behind the unique filter).
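For context, the pattern being described looks roughly like the sketch below. This is a minimal illustration of a process-wide sync.Map dedup filter, not hakrawler's exact code: with the unique flag on, every distinct URL the crawl emits is retained in the map for the lifetime of the process, so a multi-day run at -d 6 can keep growing until the kernel's OOM killer steps in with SIGKILL.

    package main

    import (
        "fmt"
        "sync"
    )

    // sm lives for the whole process; entries are never evicted, so memory
    // grows with every distinct URL the crawl discovers.
    var sm sync.Map

    // isUnique records url and reports whether this is its first occurrence.
    func isUnique(url string) bool {
        _, loaded := sm.LoadOrStore(url, struct{}{})
        return !loaded
    }

    func main() {
        for _, u := range []string{"https://a.example/", "https://b.example/", "https://a.example/"} {
            if isUnique(u) {
                fmt.Println(u) // prints each URL once; duplicates are dropped
            }
        }
    }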

vhgbao01 commented 1 year ago

Interesting. I guess piping the output to sort -u, as mentioned in #138, is the solution. Thanks for the information, guys.
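For reference, applying the #138 workaround to the command above would mean dropping -u and deduplicating outside the process, along these lines (a sketch; GNU sort spills to temporary files on disk rather than holding the whole set in RAM):

    cat subdomains_probed.txt | hakrawler -subs -insecure -d 6 -t 1 | sort -u > hakrawler.txt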