Open let4be opened 3 years ago
Good point. This crate is a straight port of Google's original library to Rust, so we kept the same logic: parse -> emit
for each request. Indeed, this could be optimized to a single parse shared across multiple requests. (P.S. I don't know why Google never did this.) Of course, contributions are always welcome.
I was looking for a decent robots.txt library written in Rust to integrate into my broad web crawler (an open-source toy project), and so far this one seems like the best bet because of the Google lineage and the test suite...
But I don't like the parse-for-each-request approach; it seems unnecessary and bad for performance. From a quick look at the source code, I think the change would go somewhere around here: https://github.com/Folyd/robotstxt/blob/d46c028d63f15c52ec5ebd321db7782b7c033e81/src/matcher.rs#L350
If I get some time in the next couple of weeks I might go for it :)
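For comparison, a parse-once design could look roughly like the sketch below. This is hypothetical and self-contained, not the robotstxt crate's actual API or internals: real robots.txt matching also handles `*`/`$` wildcards and percent-encoding, while this only does prefix rules for a single user-agent group, with the standard longest-match precedence between Allow and Disallow.

```rust
/// Hypothetical parse-once matcher (simplified sketch, not the crate's API).
/// Rules are parsed a single time; `is_allowed` can then be called for
/// many URLs without touching the robots.txt text again.
struct CachedRobots {
    rules: Vec<(bool, String)>, // (is_allow, path_prefix)
}

impl CachedRobots {
    /// Parse the robots.txt body once for the given user agent.
    fn parse(robots_body: &str, user_agent: &str) -> Self {
        let mut rules = Vec::new();
        let mut in_group = false;
        for line in robots_body.lines() {
            // Strip comments and whitespace.
            let line = line.split('#').next().unwrap_or("").trim();
            let Some((key, value)) = line.split_once(':') else { continue };
            let (key, value) = (key.trim().to_ascii_lowercase(), value.trim());
            match key.as_str() {
                "user-agent" => {
                    in_group = value == "*" || value.eq_ignore_ascii_case(user_agent);
                }
                "allow" if in_group => rules.push((true, value.to_string())),
                "disallow" if in_group => rules.push((false, value.to_string())),
                _ => {}
            }
        }
        CachedRobots { rules }
    }

    /// Check a path without re-parsing: the longest matching prefix wins.
    fn is_allowed(&self, path: &str) -> bool {
        self.rules
            .iter()
            .filter(|(_, prefix)| !prefix.is_empty() && path.starts_with(prefix.as_str()))
            .max_by_key(|(_, prefix)| prefix.len())
            .map(|(allow, _)| *allow)
            .unwrap_or(true) // no matching rule => allowed
    }
}

fn main() {
    let body = "User-agent: FooBot\nDisallow: /private/\nAllow: /private/public/\n";
    let robots = CachedRobots::parse(body, "FooBot"); // parse exactly once
    assert!(robots.is_allowed("/index.html"));
    assert!(!robots.is_allowed("/private/data"));
    assert!(robots.is_allowed("/private/public/page"));
}
```

The point of the shape, not the details: the parsed rules live in an owned struct, so a crawler fetching thousands of URLs from one host pays the parse cost once instead of once per request.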
This API and example are really confusing... Why can't we simply parse once and then call methods to check whether a URL is allowed?...