JefferyHus / es6-crawler-detect

:spider: This is an ES6 adaptation of the original PHP library CrawlerDetect. This library will help you detect bots/crawlers/spiders via the user agent.

Potential DDOS vulnerability? #25

Closed exx8 closed 3 years ago

exx8 commented 3 years ago

Describe the bug: I see no code in the package that limits the user agent string length, and regex is used heavily in a way that might be vulnerable to ReDoS (https://en.wikipedia.org/wiki/ReDoS). Please consider adding a length limitation.

To Reproduce: Send a user agent of roughly 2 MB of characters.
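
A rough reproduction sketch (assuming the package exposes a Crawler class with an isCrawler(userAgent) method; the names may differ from the actual API):

```js
// Build an oversized user agent and time how long the check takes.
// The import and isCrawler() call are assumptions about the package's API.
const { Crawler } = require('es6-crawler-detect');

const crawler = new Crawler();
const hugeUserAgent = 'Mozilla/5.0 ' + 'a'.repeat(2 * 1024 * 1024); // ~2 MB string

console.time('isCrawler');
crawler.isCrawler(hugeUserAgent);
console.timeEnd('isCrawler');
```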

Expected behavior: Any user agent above 4 KB in size should be rejected without running the regex matching.

Additional context: Some Node.js mechanisms will usually defend against such a vulnerability, but under some configurations the package might still be vulnerable: https://stackoverflow.com/questions/24167656/nodejs-max-header-size-in-http-request
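
For illustration, a minimal sketch of the length guard suggested above (MAX_UA_LENGTH and checkUserAgent are made-up names, and the detector.isCrawler call is an assumption about the package's API, not its current code):

```js
// Cap the user agent length before any regex work is done.
const MAX_UA_LENGTH = 4096; // ~4 KB, matching the expected behaviour above

function checkUserAgent(userAgent, detector) {
  if (typeof userAgent !== 'string' || userAgent.length > MAX_UA_LENGTH) {
    // Reject oversized input outright instead of feeding it to the regexes.
    return false;
  }
  return detector.isCrawler(userAgent);
}
```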

JefferyHus commented 3 years ago

Thanks for this issue. It makes total sense that this is a vulnerability.

exx8 commented 3 years ago

Can I offer my help?

JefferyHus commented 3 years ago

Yes, of course.

exx8 commented 3 years ago

#26

exx8 commented 3 years ago

I thought about this, and I believe the best approach would be to limit the execution time, since detecting malicious input might be challenging: https://nodejs.org/api/vm.html#vm_script_runincontext_contextifiedobject_options
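
As an illustration only, a sketch of that idea using Node's vm module. safeTest and the variable names are made up, and whether the timeout can actually interrupt a long-running native regex depends on the Node.js/V8 version:

```js
const vm = require('vm');

// Run the regex match inside a vm context with a timeout so a single
// pathological (ReDoS) input cannot monopolize the process indefinitely.
function safeTest(pattern, ua, timeoutMs = 100) {
  const context = vm.createContext({ pattern, ua });
  const script = new vm.Script('pattern.test(ua)');
  try {
    return script.runInContext(context, { timeout: timeoutMs });
  } catch (err) {
    // A timeout (or any other error) is treated as "not a crawler"
    // rather than letting the request hang.
    return false;
  }
}
```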