Closed sakalajuraj closed 10 years ago
This is a real problem... I think we can present the top 20 hostnames to the user, but still allow typing any hostname and checking via AJAX whether it exists in the database. The filter would then only be valid when an existing hostname is entered.
I expect to check this for the next release...
Klaubert
On Thu, Jun 12, 2014 at 7:32 PM, Juraj Sakala notifications@github.com wrote:
When an attacker tries bad things against your web server and makes more than about 2000 requests, each with a unique Host header, he can break the filter page, which can be categorized as a DoS attack against waf-fle. Imagine the attacker sends ten thousand requests, each with a unique Host value. The filter page is generated by the filter.php script, which writes the list of unique hostnames and their ids into the page. In that situation the generated page will have well over 10000 lines. Firefox and Chrome, for example, won't process such a huge page (for security reasons they truncate it after roughly the 2000th line with an HTML comment), so you will only see part of the filter page (up to the Sensor field). So some improvements are needed. Changing the way hostnames are pulled from the database would be desirable. For example, it would be better to let the user type the hostname instead of choosing it from a list, but I have no idea right now how to do that, since the filter transforms the hostname into an id.
Reply to this email directly or view it on GitHub https://github.com/klaubert/waf-fle/issues/15.
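The approach above (a limited top-N list plus a server-side existence check for free-typed hostnames) could be sketched roughly like this. This is only an illustration, not waf-fle's actual code: the table and column names are assumptions, and an in-memory SQLite database stands in for the real one.

```python
import sqlite3

# Stand-in schema; waf-fle's real table and column names may differ.
db = sqlite3.connect(":memory:")
db.executescript("""
    CREATE TABLE hosts  (host_id INTEGER PRIMARY KEY, hostname TEXT UNIQUE);
    CREATE TABLE events (event_id INTEGER PRIMARY KEY, host_id INTEGER);
""")
db.executemany("INSERT INTO hosts (hostname) VALUES (?)",
               [("www.example.com",), ("api.example.com",), ("junk-a8f3kz",)])
db.executemany("INSERT INTO events (host_id) VALUES (?)", [(1,), (1,), (2,)])

def top_hostnames(conn, limit=20):
    """Offer only the N busiest hostnames instead of dumping the full list."""
    return conn.execute(
        "SELECT h.hostname, COUNT(*) AS n "
        "FROM hosts h JOIN events e ON e.host_id = h.host_id "
        "GROUP BY h.host_id ORDER BY n DESC LIMIT ?", (limit,)).fetchall()

def hostname_to_id(conn, hostname):
    """AJAX-style check: map a typed hostname to its id, or None if unknown."""
    row = conn.execute("SELECT host_id FROM hosts WHERE hostname = ?",
                       (hostname,)).fetchone()
    return row[0] if row else None
```

The filter would then accept a free-typed hostname only when `hostname_to_id` returns an id, so fake Host headers never inflate the generated page.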
In my experience the first couple of thousand hostnames are fake and unusable (random strings, SQLi strings etc.), so a simple input box might be the better choice. You are right that the query must be limited; that was the first thing I did so I could use the filter at all. The second problem is that these unusable hostnames are stored in the database permanently, even after you delete the related events. I think (but I am not sure) they are never removed unless done manually, and manual cleanup means the user intervening directly in the database. That is a potential problem too if an inexperienced user does it (database corruption etc.). So some cleanup tool would be a good idea.
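Such a cleanup tool could boil down to a single DELETE of every hostname with no remaining events. A minimal sketch, again using SQLite and assumed table names in place of waf-fle's real schema:

```python
import sqlite3

# Stand-in schema; the real waf-fle tables are assumptions here.
db = sqlite3.connect(":memory:")
db.executescript("""
    CREATE TABLE hosts  (host_id INTEGER PRIMARY KEY, hostname TEXT);
    CREATE TABLE events (event_id INTEGER PRIMARY KEY, host_id INTEGER);
""")
db.executemany("INSERT INTO hosts (hostname) VALUES (?)",
               [("real.example.com",), ("' OR 1=1 --",), ("rnd-x9k2q",)])
db.execute("INSERT INTO events (host_id) VALUES (1)")  # only host 1 still has an event

def purge_orphan_hostnames(conn):
    """Remove hostnames that no event references any more."""
    cur = conn.execute(
        "DELETE FROM hosts "
        "WHERE host_id NOT IN (SELECT DISTINCT host_id FROM events)")
    conn.commit()
    return cur.rowcount  # how many rubbish entries were dropped

removed = purge_orphan_hostnames(db)
```

Run after event deletion (or from cron), something like this would keep the junk Host headers from accumulating forever, without the user touching the database by hand.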
+1 to that.
There is a lot of rubbish in the hostname list. A tool to automatically remove hostnames with no events associated with them would be nice. A tool to remove old events too =)
Best regards,
Marcus Semblano
It's on the way... :)
Klaubert
I have pushed code that fixes this issue.