fcavallarin / htcap

htcap is a web application scanner able to crawl single page application (SPA) recursively by intercepting ajax calls and DOM changes.
GNU General Public License v2.0

Only errors displayed on the report #79

Open gauravnarwani97 opened 3 years ago

gauravnarwani97 commented 3 years ago

Only the URLs with errors are displayed in the HTML report; no other URLs are visible. I tried scanning https://htcap.org/scanme/ but got the same output.

Errors: 3

probe_killed
probe_failure
HTTP Error 400: Bad Request

Command: ./htcap.py crawl https://htcap.org htcap.db -v
Initializing . . . done
Database htcap-2.db initialized, crawl started with 10 threads (^C to pause or change verbosity)
[==================        ] 5 of 9 pages processed in 0 minutes
^C
Crawler is paused.
   r   resume
   v   verbose mode
   p   show progress bar
   q   quiet mode
Hit ctrl-c again to exit

v
Crawler is running
crawl result for: redirect GET https://htcap.org/scanme/ng/
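htcap writes crawl results into the SQLite database passed on the command line (htcap.db above). When the HTML report shows only errors, a quick way to check whether any results were actually recorded is to inspect the database file directly. The sketch below assumes nothing about htcap's schema; it just lists every table sqlite reports along with its row count.

```python
import sqlite3


def dump_crawl_db(db_path):
    """Return a {table_name: row_count} map for a SQLite database.

    Handy for checking whether a crawl stored anything even when the
    HTML report looks empty. No htcap-specific table names are
    assumed; sqlite_master is queried directly.
    """
    conn = sqlite3.connect(db_path)
    try:
        cur = conn.execute(
            "SELECT name FROM sqlite_master WHERE type = 'table'"
        )
        counts = {}
        for (table,) in cur.fetchall():
            # Quote the table name since it comes from the database itself.
            n = conn.execute(f'SELECT COUNT(*) FROM "{table}"').fetchone()[0]
            counts[table] = n
        return counts
    finally:
        conn.close()
```

Usage would be something like `dump_crawl_db("htcap.db")`; a crawl that only produced errors should show near-empty result tables.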

fcavallarin commented 3 years ago

It seems that you are missing some Node.js dependencies. Try "cd core/nodejs" and run "npm i".

gauravnarwani97 commented 3 years ago

asd@asd nodejs % npm i
npm WARN htcap-chrome-probe@1.0.0 No description
npm WARN htcap-chrome-probe@1.0.0 No repository field.

audited 45 packages in 1.334s

1 package is looking for funding
  run `npm fund` for details

found 0 vulnerabilities

asd@asd nodejs % npm fund
htcap-chrome-probe@1.0.0
└─┬ https://github.com/sponsors/isaacs
  └── glob@7.1.6
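Since "npm i" reports the packages as installed yet the probe still fails, one thing worth ruling out is a partially populated node_modules directory. The sketch below cross-checks the "dependencies" declared in a package.json against what actually exists under node_modules (e.g. in core/nodejs of the htcap checkout); it is a rough sanity check, not a replacement for "npm ls".

```python
import json
import os


def missing_node_deps(pkg_dir):
    """Return declared dependencies with no directory under node_modules.

    pkg_dir is a directory containing package.json. Only the top-level
    "dependencies" section is checked; transitive deps are ignored.
    """
    with open(os.path.join(pkg_dir, "package.json")) as f:
        pkg = json.load(f)
    deps = pkg.get("dependencies", {})
    modules = os.path.join(pkg_dir, "node_modules")
    return sorted(
        name for name in deps
        if not os.path.isdir(os.path.join(modules, name))
    )
```

For example, `missing_node_deps("core/nodejs")` returning a non-empty list would point to an incomplete install; an empty list means the problem is likely elsewhere.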

fcavallarin commented 3 years ago

Please try to clone the whole project from scratch and run htcap.py crawl again.

BilalLatif commented 3 years ago

I am getting the same error while crawling a website hosted locally.

crawl result for: link GET http://localhost:3000/home

Crawl command: python3.3 htcap.py crawl -vwl localhost:3000/home target.db

The npm dependencies are up to date.