king155 / skipfish

Automatically exported from code.google.com/p/skipfish
Apache License 2.0

Maximum duration/time limit would be nice #67

Closed by GoogleCodeExporter 8 years ago

GoogleCodeExporter commented 8 years ago
Just a suggestion, but I think it would be nice to have a time limit/duration parameter we could pass in. Once the duration is reached, the scan stops and the report is written as normal.

This would be great when running cron jobs for sure. If there is a way to do this outside of the program, could someone post how to do it? Right now, I am setting a limit of 1000 requests before it will automatically stop and write the report.

Thanks for the great bit of software!

Original issue reported on code.google.com by chre...@gmail.com on 4 Jun 2010 at 6:14
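For illustration, the request-cap workaround mentioned above might look like the sketch below. It assumes skipfish's -r (maximum total requests) and -o (report output directory) options; the flag names should be checked against 'skipfish -h', and the target URL and output path are placeholders.

    # Sketch: cap the scan at ~1000 requests; skipfish then stops and writes its report.
    # The -r and -o flags are assumed here; confirm with 'skipfish -h' on your build.
    skipfish -r 1000 -o /tmp/skipfish-report http://example.com/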

GoogleCodeExporter commented 8 years ago
I would have marked this as an enhancement, but I wasn't sure how. Thanks again for looking into this.

Original comment by chre...@gmail.com on 4 Jun 2010 at 6:16

GoogleCodeExporter commented 8 years ago
Hum... 1000 requests is almost certainly insufficient (definitely if you are using dictionary-based discovery). Issuing 1000 requests should take under 10 seconds, so I am guessing that you are also running into some underlying performance problems; I recommend having a look at:

http://code.google.com/p/skipfish/wiki/KnownIssues

Otherwise, to execute a time-bound scan, simply schedule skipfish to run at time T, and schedule 'killall -INT skipfish' to run at time T+x, where x is the limit you want to enforce. Since scan time is a very poor predictor of coverage, I am inclined not to integrate this into the application, though.

Original comment by lcam...@gmail.com on 4 Jun 2010 at 8:02
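For reference, a minimal crontab sketch of the workaround described above, assuming a one-hour limit; the schedule, output path, and target URL are placeholders, and the skipfish options should be verified against 'skipfish -h'.

    # m h dom mon dow  command
    # Start the scan at 01:00 (time T).
    0 1 * * *  skipfish -o /var/tmp/skipfish-out http://example.com/
    # One hour later (T+x), send SIGINT so skipfish wraps up and writes its report.
    0 2 * * *  killall -INT skipfish

On systems with GNU coreutils, a single entry using 'timeout -s INT 3600 skipfish ...' would achieve the same effect.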

GoogleCodeExporter commented 8 years ago
Hey lcamtuf,

Thanks for the quick reply and thorough answer :) I'll take your advice and see if the scan finishes in a weekend so I can get the most comprehensive results. Cool idea for implementing time-bound processes via cron too; I'll add that to my list of useful stuff.

Thanks again, lcamtuf,

Chrelad

Original comment by chre...@gmail.com on 8 Jun 2010 at 4:28