Closed: ramijunnisa closed this issue 11 years ago.
Could you send me the URL you scanned? Also, did the status move past the crawling phase?
No status is shown in the crawling phase; nothing appears to be happening. Progress is 0%, pages discovered: 21. The URL is of the form http://portno:8080/ApplicationName
I'm not sure exactly what you mean, could I see a screenshot please?
Sorry for the confusion. After passing the URL to scan, the status does not update in the crawling phase. Even after six hours there is no status update, and progress still shows 0%. Can you please tell me why this happens?
As long as it is crawling, the progress will stay at 0%; if the statistics get updated and you see more requests being made and more pages discovered, then everything is fine. I'm assuming you're using the web interface, right?
However, without having access to the webapp so that I can reproduce the issue I don't think I'll be able to be of much help.
Yes, I am using the web interface.
Sorry, I cannot give you the URL. The scan never finishes, even after many (at least 10) hours, so there is no hope of seeing the report. Can you please tell me how to troubleshoot this issue?
You could start a Dispatcher like so:
./bin/arachni_rpcd --reroute-to-logfile --debug
Add it to the web interface and then use it to perform the scan. Once the scan starts to exhibit the weird behavior you mentioned, you can stop it and send me (via e-mail) the files under:
system/gems/gems/arachni-0.4.2/logs/
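End to end, the steps above might look like the following sketch. The Dispatcher port (7331) is an assumption based on Arachni's usual default, and the `tar` packaging step is just one convenient way to bundle the logs for e-mail; adjust paths to your installation:

```shell
# Start a Dispatcher with debug output rerouted to a logfile
# (it listens on port 7331 by default, as far as I recall; check
# the startup output for the actual host:port).
./bin/arachni_rpcd --reroute-to-logfile --debug

# In the web interface, add the Dispatcher by its host:port and run
# the scan through it. When the scan hangs, stop it, then bundle the
# logs to send along (path from the comment above):
tar czf arachni-logs.tar.gz system/gems/gems/arachni-0.4.2/logs/
```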
For small modules in my application, which have few pages (6), a report is generated and everything is fine. But for the other modules, which have more pages, it takes too much time. I observed that the server went into a non-responsive state, and I am not able to see the crawling status for these large modules. Can you tell me why this happens?
The log file shows no errors while running in debug mode.
A few questions: what do you mean by "small modules" and "other modules"? Also, if the server dies, then it probably can't take the load of the scan, and that could be your problem.
I'd suggest you try setting the HTTP request limit option to 1 in the profile you're using, although Arachni already does that automatically if it detects that the server is struggling.
So long story short, if the server doesn't have enough resources to withstand the scan then there's nothing that can be done from Arachni's side.
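For reference, the same setting can be applied from the command line as well as from a web-interface profile. This is a sketch assuming the Arachni 0.4.x CLI, where the relevant flag is, to the best of my knowledge, `--http-req-limit`; the target URL is the placeholder from earlier in the thread:

```shell
# Limit Arachni to a single concurrent HTTP request so the target
# server is not overwhelmed (assumed 0.4.x flag name; the web
# interface exposes the same option under the profile's HTTP settings).
./bin/arachni --http-req-limit=1 http://portno:8080/ApplicationName
```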
Any news on this?
I couldn't reproduce this and no one else has reported anything similar, so I'm afraid I'll have to close this issue.
Hi,
After starting the Arachni server, a URL was passed in to scan, but no progress is shown and no report is generated.