I would have marked this as an enhancement, but I wasn't sure how. Thanks again for looking into this.
Original comment by chre...@gmail.com on 4 Jun 2010 at 6:16
Hum... 1000 requests is almost certainly insufficient (definitely if you are using dictionary-based discovery). Issuing 1000 requests should take under 10 seconds, so I am guessing that you are also running into some underlying performance problems; I recommend having a look at:
http://code.google.com/p/skipfish/wiki/KnownIssues

Otherwise, to execute a time-bound scan, simply schedule skipfish to run at time T, and schedule 'killall -INT skipfish' to run at time T+x, where x is the limit you want to enforce. Since scan time is a very poor predictor of coverage, I am inclined not to integrate this into the application, though.
Original comment by lcam...@gmail.com on 4 Jun 2010 at 8:02
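For reference, a minimal sketch of the cron-based scheduling described above; the schedule, user, binary path, output directory, and target URL are illustrative assumptions, not values taken from this thread:

    # /etc/cron.d/skipfish-timebound -- illustrative only; adjust user, paths, and times
    # Start the scan at 01:00 on Saturday, then send SIGINT one hour later so
    # skipfish stops crawling and still writes out its report.
    0 1 * * 6  scanuser  /usr/local/bin/skipfish -o /tmp/sf-output http://target.example/
    0 2 * * 6  scanuser  killall -INT skipfish

The SIGINT (the same signal as Ctrl-C) is what makes this work as a time limit: skipfish treats it as a request to finish up and generate the report rather than as a hard kill.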
Hey lcamtuf,
Thanks for the quick reply and thorough answer :) I'll take your advice and see
if the scan finishes in a weekend so I can get the most comprehensive results.
Cool idea for implementing time-bound processes via cron too, I'll add that to
my list of useful stuff.
Thanks again lcamtuf,
Chrelad
Original comment by chre...@gmail.com on 8 Jun 2010 at 4:28
Original issue reported on code.google.com by chre...@gmail.com on 4 Jun 2010 at 6:14