R0X4R / Garud

An automation tool that enumerates sub-domains, checks for sub-domain takeover, filters out XSS, SSTI, SSRF, and other injection-point parameters, and automatically scans for some low-hanging vulnerabilities.
MIT License

crawling issue #47

Closed. Aksh095 closed this issue 2 years ago.

Aksh095 commented 2 years ago

"domains/endpoints.txt\" file not found or doesn't contain anything"

jehant001 commented 2 years ago

Same issue, please help.

patrickhener commented 2 years ago

Same here. Please clarify.

A bit of debugging shows that the command which is supposed to create that endpoints.txt in the first place is:

cat .tmp/gospider.txt .tmp/gauplus.txt .tmp/waybackurls.txt 2> /dev/null | sed '/\[/d' | grep $domain | sort -u | uro | anew -q domains/endpoints.txt

Turns out that I was missing uro: https://github.com/s0md3v/uro.
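
If uro really is the missing piece, a quick pre-flight check of the tools that pipeline relies on can confirm it. A minimal sketch (tool names taken from the command above; the pip install hint for uro is an assumption based on its README):

```bash
# Minimal sketch: check that every tool in the endpoints pipeline is on PATH.
for tool in gospider gauplus waybackurls uro anew; do
  if ! command -v "$tool" >/dev/null 2>&1; then
    echo "[!] $tool not found in PATH"
  fi
done

# uro is a Python package (https://github.com/s0md3v/uro), so typically:
# pip3 install uro
```

If any tool in the middle of that chain is missing, the pipeline produces no output and domains/endpoints.txt ends up empty or absent, which is exactly the error reported above.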

d0rksh commented 2 years ago

Same error:

Starting subdomain enumeration of example.com

-- SUBDOMAINS: 0

Crawling subdomains of example.com
[!] - "domains/endpoints.txt" file not found or doesn't contain anything
DhiralGit commented 2 years ago

Same error!