Closed mossharris closed 7 years ago
No more links to crawl? The scan immediately returns HTTP code 403. It may be a bug.
```
./bin/arachni --scope-page-limit 2000 --scope-include-subdomains --scope-exclude-file-extensions jpg,png,js,css,ico --checks xss* https://www.uber.com

Arachni - Web Application Security Scanner Framework v2.0dev
   Author: Tasos "Zapotek" Laskos <tasos.laskos@arachni-scanner.com>

           (With the support of the community and the Arachni Team.)

   Website:       http://arachni-scanner.com
   Documentation: http://arachni-scanner.com/wiki

 [~] No element audit options were specified, will audit links, forms, cookies, UI inputs, UI forms, JSONs and XMLs.
 [*] Initializing...
 [*] Preparing plugins...
 [*] ... done.
 [*] BrowserCluster: Initializing 6 browsers...
 [*] BrowserCluster: Spawned #1 with PID 31562 [lifeline at PID 31559].
 [*] BrowserCluster: Spawned #2 with PID 31585 [lifeline at PID 31582].
 [*] BrowserCluster: Spawned #3 with PID 31607 [lifeline at PID 31604].
 [*] BrowserCluster: Spawned #4 with PID 31629 [lifeline at PID 31626].
 [*] BrowserCluster: Spawned #5 with PID 31651 [lifeline at PID 31648].
 [*] BrowserCluster: Spawned #6 with PID 31673 [lifeline at PID 31670].
 [*] BrowserCluster: Initialization completed with 6 browsers in the pool.
 [*] [HTTP: 403] https://www.uber.com/
 [~] Analysis resulted in 0 usable paths.
 [~] DOM depth: 0 (Limit: 5)
 [*] XSS in path: Checking for: https://www.uber.com/<my_tag_080a7612baa2121929d5e616ffb593d0/>
 [*] XSS in path: Checking for: https://www.uber.com/>"'><my_tag_080a7612baa2121929d5e616ffb593d0/>
 [*] XSS in path: Checking for: https://www.uber.com/
 [*] XSS in path: Checking for: https://www.uber.com/
 [*] XSS in path: Checking for: https://www.uber.com/
 [*] XSS in path: Checking for: https://www.uber.com/
 [*] Harvesting HTTP responses...
 [~] Depending on server responsiveness and network conditions this may take a while.
```
```
 ================================================================================
 [+] Web Application Security Report - Arachni Framework
 [~] Report generated on: 2016-08-13 06:56:06 +0000
 [~] Report false positives at: http://github.com/Arachni/arachni/issues

 [+] System settings:
 [~] ---------------
 [~] Version:  2.0dev
 [~] Seed:     080a7612baa2121929d5e616ffb593d0

 [~] Audit started on:  2016-08-13 06:56:00 +0000
 [~] Audit finished on: 2016-08-13 06:56:06 +0000
 [~] Runtime:           00:00:05

 [~] URL:        https://www.uber.com/
 [~] User agent: Arachni/v2.0dev

 [*] Audited elements:
 [~] * Links
 [~] * Forms
 [~] * Cookies
 [~] * XMLs
 [~] * JSONs
 [~] * UI inputs
 [~] * UI forms

 [*] Checks: xss_event, xss_tag, xss_path, xss_dom_script_context, xss_dom, xss_script_context, xss

 [~] ===========================
 [+] 0 issues were detected.

 [+] Plugin data:
 [~] ---------------

 [*] Health map
 [~] ~~~~~~~~~~~~~~
 [~] Description: Generates a simple list of safe/unsafe URLs.

 [~] Legend:
 [+] No issues
 [-] Has issues

 [+] https://www.uber.com/

 [~] Total: 1
 [+] Without issues: 1
 [-] With issues:    0 ( 0% )

 [~] Audited 1 page snapshots.
 [~] Audit limited to a max of 2000 pages.
 [~] Duration: 00:00:05
 [~] Processed 8/8 HTTP requests.
 [~] -- 4.552 requests/second.
 [~] Processed 0/0 browser jobs.
 [~] -- 0.0 second/job.
 [~] Currently auditing  https://www.uber.com/
 [~] Burst response time sum     3.285 seconds
 [~] Burst response count        6
 [~] Burst average response time 0.548 seconds
 [~] Burst average               4.785 requests/second
 [~] Timed-out requests          0
 [~] Original max concurrency    20
 [~] Throttled max concurrency   20
```
I don't know why, but the target returned a 403 response, as the output clearly shows:

```
[*] [HTTP: 403] https://www.uber.com/
```
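One way to narrow down a 403 like this is to fetch the page outside of Arachni and compare the status code you get with the scanner's User-Agent (`Arachni/v2.0dev`, per the report above) against a browser-like one, since some WAFs block scanner User-Agents outright. A minimal sketch using only the Python standard library (the URL and User-Agent strings are taken from the log; the `diagnose` helper is just illustrative, not part of Arachni):

```python
import urllib.error
import urllib.request


def fetch_status(url, user_agent):
    """Return the HTTP status code for url, requested with the given User-Agent."""
    req = urllib.request.Request(url, headers={"User-Agent": user_agent})
    try:
        with urllib.request.urlopen(req, timeout=10) as resp:
            return resp.status
    except urllib.error.HTTPError as err:
        # 4xx/5xx responses still carry a status code we can inspect.
        return err.code


def diagnose(scanner_status, browser_status):
    """Interpret the pair of status codes (scanner UA vs. browser-like UA)."""
    if scanner_status == 403 and browser_status == 200:
        return "scanner User-Agent is being blocked (likely a WAF)"
    if scanner_status == 403 and browser_status == 403:
        return "the server forbids this path regardless of client"
    return "no User-Agent-based blocking detected"


if __name__ == "__main__":
    url = "https://www.uber.com/"  # target from the log above
    scanner = fetch_status(url, "Arachni/v2.0dev")
    browser = fetch_status(url, "Mozilla/5.0")
    print(scanner, browser, diagnose(scanner, browser))
```

If both requests fail at the connection level rather than with an HTTP error, that points to a network problem instead, which matches the eventual resolution of this issue.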
I have found the reason: it was a network problem. Thank you!