Need to check if this really improves the checks or not, i.e. does it help detect more WAFs?
Original comment by sandrogauc
on 9 May 2009 at 10:19
Hi Sandro,
I believe there are two things we need to think about here:
1 - We should not remove the checks we have today, because we can use them to see differences between products and do more accurate fingerprinting. Doing a request with the payload in the query_string as you suggested is a good idea; I know some WAFs look at it, but some (especially the ones with learning mode) will not deal with it if the app does not use a query_string. So it's nice to add, but the best would be to use a valid URI with a valid parameter. I believe that by following HTTP redirects, doing some parsing, and maybe making one to three more requests, we can get one in most cases and consequently inject at the best point. Anyway, I believe all three techniques should be in Wafw00f, since in the near future it will be very useful for us to compare products based on basic behavior and widen our range of detection (a rough sketch of the probes follows below).
2 - Another point is that we need a more robust test case, I mean a big set of offensive payloads to send and test. Today our test set is very small, reduced to maybe 4-7 basic tests; we should improve it, since it reflects directly on the quality of detection. On the other hand, we must be careful, because it will directly impact the number of requests and the time to reply.
I believe it will enhance detection, but more for the refined products, not the small crappy trial ones...hehehe :)
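A minimal sketch of the two probes (the payloads, parameter names, and the urllib-based HTTP handling are illustrative assumptions, not wafw00f's actual code):

```python
import urllib.error
import urllib.parse
import urllib.request

# A deliberately small set of offensive payloads (point 2 above);
# purely illustrative, not wafw00f's real test strings.
ATTACK_STRINGS = [
    "<script>alert(1)</script>",
    "' OR '1'='1",
    "../../../../etc/passwd",
]

def probe(base_url, param, payload):
    """Send one request with `payload` in query-string parameter `param`
    and return the HTTP status code (blocked requests typically come back
    as 403/406/5xx)."""
    query = urllib.parse.urlencode({param: payload})
    target = base_url.rstrip("/") + "/?" + query
    try:
        with urllib.request.urlopen(target, timeout=10) as resp:
            return resp.getcode()
    except urllib.error.HTTPError as err:
        return err.code

def compare(base_url, valid_param="validparam"):
    """Probe a bogus parameter and a known-valid one with the same payloads,
    so differences in behaviour between the two cases become visible."""
    for payload in ATTACK_STRINGS:
        bogus = probe(base_url, "x", payload)         # fake parameter
        real = probe(base_url, valid_param, payload)  # real app parameter
        print(f"{payload!r}: fake-param={bogus}, valid-param={real}")
```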
Regards,
Original comment by wsguglie...@gmail.com
on 9 May 2009 at 3:34
1. Understood. Regarding following HTTP redirects etc., sure, that's ok. However, probably the best way would be to crawl the website for a valid URI with a valid parameter.
2. Yep. I noticed Imperva has the following behavior:
a. http://iii/?x=<script> does not get blocked
b. http://iii/?validparam=<script> gets blocked
My concern is that adding more checks can make the process noisier, so I was thinking of choosing the best checks rather than adding more indiscriminately.
Also - adding a crawler feature might be overkill. Others have done it before, multiple times. Maybe make it optional. And maybe stop at the first one or two parameters that it can use.
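A minimal sketch of that optional "stop at the first one or two parameters" crawl (the helper name and the single-page, urllib-based fetch are illustrative assumptions, not existing wafw00f code):

```python
import urllib.parse
import urllib.request
from html.parser import HTMLParser

class LinkCollector(HTMLParser):
    """Collect href values from <a> tags on a single page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def first_parameterised_urls(start_url, limit=2):
    """Fetch one page and return up to `limit` links that already carry a
    query-string parameter we could inject attack strings into."""
    with urllib.request.urlopen(start_url, timeout=10) as resp:
        html = resp.read().decode("utf-8", errors="replace")
    collector = LinkCollector()
    collector.feed(html)
    found = []
    for href in collector.links:
        absolute = urllib.parse.urljoin(start_url, href)
        if urllib.parse.urlparse(absolute).query:
            found.append(absolute)
            if len(found) >= limit:
                break
    return found
```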
Original comment by sandrogauc
on 9 May 2009 at 3:37
OK, regarding your no. 2, we're on the same wavelength.
Original comment by sandrogauc
on 9 May 2009 at 3:40
Great, we have the same idea, I think it's fine. What we can do to minimize the crawling and the whole detection problem is (a rough sketch follows the list):
- First, do all the checks as we do today.
- If nothing is detected, check the previous HTTP redirect that was followed and inject a fake parameter with our hostile, well-chosen attack strings and see if we detect anything.
- If nothing is detected, do a very basic crawl to find the first URI with a valid parameter, inject our hostile, well-chosen attack strings, and check if we can detect it (this will certainly work with Imperva). I agree with you that we can add this last check as optional. ;)
Do you agree?
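A rough sketch of that escalation order (all helper names here are trivial placeholders standing in for wafw00f's real logic, not existing functions):

```python
def run_existing_checks(url):
    """Step 1 placeholder: all the checks wafw00f performs today."""
    return None

def probe_redirect_with_fake_param(url):
    """Step 2 placeholder: follow the previous HTTP redirect and inject a
    fake parameter carrying the chosen attack strings."""
    return None

def crawl_and_probe_valid_param(url):
    """Step 3 placeholder (optional): basic crawl to the first URI with a
    valid parameter, then inject the attack strings there."""
    return None

def identify_waf(url, crawl=False):
    result = run_existing_checks(url)                 # 1. current behaviour
    if result is None:
        result = probe_redirect_with_fake_param(url)  # 2. fake parameter
    if result is None and crawl:
        result = crawl_and_probe_valid_param(url)     # 3. optional crawl
    return result
```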
Original comment by wsguglie...@gmail.com
on 9 May 2009 at 8:59
Original issue reported on code.google.com by
sandrogauc
on 9 May 2009 at 10:19