Usage:
  ./katana [flags]

Flags:
INPUT:
   -u, -list string[]               target url / list to crawl (single / comma separated / file input)

CONFIGURATIONS:
   -config string                   cli flag configuration file
   -d, -depth int                   maximum depth to crawl (default 1)
   -ct, -crawl-duration int         maximum duration to crawl the target for
   -mrs, -max-response-size int     maximum response size to read (default 10 MB)
   -timeout int                     time to wait in seconds before timeout (default 5)
   -p, -proxy string[]              http/socks5 proxy list to use (single / comma separated / file input)
   -H, -header string[]             custom header/cookie to include in request (single / file input)

SCOPE:
   -cs, -crawl-scope string[]       in-scope target to be followed by the crawler (single / comma separated / file input) # regex input
   -cos, -crawl-out-scope string[]  out-of-scope target to be excluded by the crawler (single / comma separated / file input) # regex input
   -is, -include-sub                include subdomains in crawl scope (false)

RATE-LIMIT:
   -c, -concurrency int             number of concurrent fetchers to use (default 300)
   -rd, -delay int                  request delay between each request in seconds (default -1)
   -rl, -rate-limit int             maximum requests to send per second (default 150)
   -rlm, -rate-limit-minute int     maximum number of requests to send per minute

OUTPUT:
   -o, -output string               file to write output to
   -json                            write output in JSONL(ines) format (false)
   -nc, -no-color                   disable output content coloring (ANSI escape codes) (false)
   -silent                          display output only (false)
   -v, -verbose                     display verbose output (false)
   -version                         display project version
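
For illustration, a few invocations combining the flags above; `example.com`, `targets.txt`, and the output filename are placeholders, not part of the original help text:

```shell
# Crawl a single target two levels deep and write JSONL output to a file
./katana -u https://example.com -d 2 -json -o results.jsonl

# Crawl a list of targets with a custom header, capped at 100 requests per second
./katana -list targets.txt -H "Cookie: session=placeholder" -rl 100

# Restrict the crawl to in-scope hosts (regex) and include subdomains
./katana -u https://example.com -cs ".*\.example\.com" -is
```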
Reference:
   https://github.com/projectdiscovery/gocrawl
   https://github.com/projectdiscovery/katana/tree/backup/pkg/engine/standard (improved)