Folyd / robotstxt

A native Rust port of Google's robots.txt parser and matcher C++ library.
https://crates.io/crates/robotstxt
Apache License 2.0

Why do allowed_by_robots and one_agent_allowed_by_robots parse robots.txt for each request? #3

Open let4be opened 3 years ago

let4be commented 3 years ago

This API and the example are really confusing... Why can't we simply parse once and then call methods to check whether a URL is allowed?...

Folyd commented 3 years ago

Good point. This crate is a straightforward port of Google's original library to Rust, so we kept the same logic: parse -> emit for each request. Indeed, this could be optimized to a single parse serving multiple requests. (P.S. I don't know why Google never did this.) Of course, contributions are always welcome.
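
For context, the per-request pattern being discussed looks roughly like this (adapted from the crate's README example; treat the exact type and method signature as an assumption): every call receives the raw robots.txt body, so it is re-parsed on each check.

```rust
use robotstxt::DefaultMatcher;

fn main() {
    let robots_body = "user-agent: FooBot\ndisallow: /private/\n";

    // Each call hands over the raw robots.txt body, which gets re-parsed
    // before the URL is matched.
    let mut matcher = DefaultMatcher::default();
    let home = matcher.one_agent_allowed_by_robots(robots_body, "FooBot", "https://example.com/");
    let private = matcher.one_agent_allowed_by_robots(robots_body, "FooBot", "https://example.com/private/x");
    println!("home: {home}, private: {private}");
}
```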

let4be commented 3 years ago

I was looking for a decent robots.txt library written in Rust to integrate into my Broad Web Crawler (an open-source toy project), and so far this one seems like the best bet because of the "Google" lineage and the tests...

But I don't like the "parse for each request" approach; it seems harmful and unnecessary from a performance standpoint. From a quick look at the source code, I think the change would go somewhere around here: https://github.com/Folyd/robotstxt/blob/d46c028d63f15c52ec5ebd321db7782b7c033e81/src/matcher.rs#L350
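
To make the idea concrete, here is a rough sketch of the "parse once, match many" shape I have in mind. Everything below is illustrative only: the types do not exist in the crate, and the prefix-only matching is a simplified stand-in for Google's real matching semantics (no wildcards, no `$` anchors).

```rust
use std::collections::HashMap;

/// Simplified "parse once, match many" sketch. Not the crate's actual API.
struct ParsedRobots {
    /// Lowercased user-agent token -> list of (allow, path-prefix) rules.
    groups: HashMap<String, Vec<(bool, String)>>,
}

impl ParsedRobots {
    /// Parse the robots.txt body a single time and keep the rules in memory.
    fn parse(body: &str) -> Self {
        let mut groups: HashMap<String, Vec<(bool, String)>> = HashMap::new();
        let mut current: Vec<String> = Vec::new();
        let mut last_was_rule = false;
        for line in body.lines() {
            let line = line.split('#').next().unwrap_or("").trim();
            let Some((key, value)) = line.split_once(':') else { continue };
            let (key, value) = (key.trim().to_ascii_lowercase(), value.trim());
            match key.as_str() {
                "user-agent" => {
                    // A user-agent line after a rule starts a new group.
                    if last_was_rule {
                        current.clear();
                        last_was_rule = false;
                    }
                    let agent = value.to_ascii_lowercase();
                    groups.entry(agent.clone()).or_default();
                    current.push(agent);
                }
                "allow" | "disallow" => {
                    last_was_rule = true;
                    for agent in &current {
                        groups
                            .entry(agent.clone())
                            .or_default()
                            .push((key == "allow", value.to_string()));
                    }
                }
                _ => {}
            }
        }
        ParsedRobots { groups }
    }

    /// Answer a query against the cached rules; no re-parsing involved.
    /// Longest matching prefix wins; no matching rule means allowed.
    fn one_agent_allowed(&self, user_agent: &str, path: &str) -> bool {
        let rules = self
            .groups
            .get(&user_agent.to_ascii_lowercase())
            .or_else(|| self.groups.get("*"));
        let Some(rules) = rules else { return true };
        rules
            .iter()
            .filter(|(_, prefix)| !prefix.is_empty() && path.starts_with(prefix.as_str()))
            .max_by_key(|(_, prefix)| prefix.len())
            .map(|(allow, _)| *allow)
            .unwrap_or(true)
    }
}

fn main() {
    let robots = "user-agent: FooBot\ndisallow: /private/\nallow: /private/open/\n";
    let parsed = ParsedRobots::parse(robots); // parse once...
    // ...then run as many checks as needed against the cached rules.
    assert!(parsed.one_agent_allowed("FooBot", "/index.html"));
    assert!(!parsed.one_agent_allowed("FooBot", "/private/data"));
    assert!(parsed.one_agent_allowed("FooBot", "/private/open/page"));
    println!("ok");
}
```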

If I get some time in the next couple of weeks I might go for it :)