Closed LeMoussel closed 7 years ago
When parsing a large robots.txt file, the process can take several minutes to finish, even with 100% of CPU power dedicated to it. On a small server, this slows the server down and can even cause it to hang or crash.
Example URL: http://www.testmateriel.com/robots.txt (20,997 lines, 1,122,590 characters)
One solution is to limit the maximum number of bytes to parse (see Issue #2). But it may also be possible to optimize the parsing itself.
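The byte-limit workaround could be sketched as follows. This is a minimal illustration, not code from this project: the cap value and the function name `truncate_robots` are hypothetical, and cutting at the last complete line is just one reasonable choice so the parser never sees a half rule.

```python
MAX_ROBOTS_BYTES = 512 * 1024  # hypothetical cap (~500 KiB)

def truncate_robots(content: bytes, limit: int = MAX_ROBOTS_BYTES) -> bytes:
    """Keep at most `limit` bytes, cutting at the last complete line."""
    if len(content) <= limit:
        return content
    truncated = content[:limit]
    # Drop the possibly partial final line so only whole rules are parsed.
    newline = truncated.rfind(b"\n")
    return truncated if newline == -1 else truncated[:newline + 1]
```

Applied before parsing, this bounds the work to a constant regardless of how large the fetched robots.txt is.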
I don't think it can be optimized in its current form. It needs a complete rewrite :)
<1 s, really fast now. Earlier it took at least several minutes.