Arachni / arachni

Web Application Security Scanner Framework
http://www.arachni-scanner.com

Not getting an Arachni report, but pages are discovered and progress stays at 0% #337

Closed ramijunnisa closed 11 years ago

ramijunnisa commented 11 years ago

Hi,

After starting the Arachni server, a URL is passed to scan, but no progress is shown and no report is generated.

Zapotek commented 11 years ago

Could you send me the URL you scanned? Also, did the status move past the crawling phase?

ramijunnisa commented 11 years ago

No status is shown for the crawling phase and nothing seems to be happening there. Progress is 0% and Pages Discovered is 21. The URL is like this: http://portno:8080/ApplicationName

Zapotek commented 11 years ago

I'm not sure exactly what you mean, could I see a screenshot please?

ramijunnisa commented 11 years ago

Sorry for the confusion. After passing the URL to scan, the status does not update during the crawling phase. Even after six hours there is still no status update for the crawling phase, and progress still shows 0%. Can you please tell me why this happens?

Zapotek commented 11 years ago

As long as it's crawling, the progress will stay at 0%. If the statistics get updated and you see more requests being made and more pages being discovered, then everything is fine. I'm assuming you're using the web interface, right?

However, without having access to the webapp so that I can reproduce the issue I don't think I'll be able to be of much help.

ramijunnisa commented 11 years ago

Yes, I am using the web interface.

ramijunnisa commented 11 years ago

Sorry, I cannot give the URL. The scan does not end even after many hours (at least 10), and there is no sign of a report. Can you please tell me how to troubleshoot this issue?

Zapotek commented 11 years ago

You could start a Dispatcher like so:

./bin/arachni_rpcd --reroute-to-logfile --debug

Add it to the web interface and then use it to perform the scan. Once the scan starts to exhibit the weird behavior you mentioned, you can stop it and send me (via e-mail) the files under:

system/gems/gems/arachni-0.4.2/logs/
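
(For reference, a minimal sketch of the whole sequence, assuming the self-contained v0.4.2 package layout; the tar command and archive name below are just one possible way to bundle the logs for e-mail:)

./bin/arachni_rpcd --reroute-to-logfile --debug
# add the Dispatcher to the web interface, run the scan until it misbehaves, then stop it
tar czf arachni-debug-logs.tar.gz system/gems/gems/arachni-0.4.2/logs/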

ramijunnisa commented 11 years ago

For the small modules in my application, which have fewer pages (6), the report is generated and everything is fine. But for the other modules, which have more pages, it takes too much time. I observed that the server went into a non-responsive mode, and I am not able to see the crawling status for these large modules. Can you tell me why this happens?

ramijunnisa commented 11 years ago

The log files show no errors while running in debug mode.

Zapotek commented 11 years ago

A few questions: what do you mean by "small modules" and "other modules"? Also, if the server dies, it probably means that it can't take the load of the scan, so this could be your problem.

I'd suggest you try setting the HTTP request limit option to 1 in the profile you're using, although Arachni already does that automatically if it detects that the server is struggling.

So long story short, if the server doesn't have enough resources to withstand the scan then there's nothing that can be done from Arachni's side.
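
(For illustration only, a minimal sketch of the same idea from the command line, assuming the 0.4.x self-contained package and that the option was still named --http-req-limit in that series; in the web interface, the equivalent is the HTTP request limit field of the scan profile:)

# throttle the scan to a single concurrent HTTP request so a struggling server is not overwhelmed
./bin/arachni http://portno:8080/ApplicationName --http-req-limit=1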

Zapotek commented 11 years ago

Any news on this?

Zapotek commented 11 years ago

I couldn't reproduce this and no-one else has reported anything similar so I'm afraid I'll have to close this issue.