wummel / linkchecker

check links in web documents or full websites
http://wummel.github.io/linkchecker/
GNU General Public License v2.0

add --no-robots commandline flag #655

Closed · anarcat closed this 7 years ago

anarcat commented 8 years ago

While this flag can be abused, there is a legitimate use case: you want to check a fairly small document for mistakes, and it references a website whose robots.txt denies all robots. It turns out that most websites do not add a permission for LinkChecker to crawl their site, and some sites, like the Debian BTS for example, are very hostile toward bots in general.
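For context, a deny-all robots.txt is only two lines. This is a minimal sketch of the refusal using Python's standard `urllib.robotparser` (the user agent string and URL are illustrative, and this is not necessarily how LinkChecker itself performs the check):

```python
from urllib.robotparser import RobotFileParser

# A deny-all robots.txt: every user agent is barred from every path.
DENY_ALL = """\
User-agent: *
Disallow: /
"""

rp = RobotFileParser()
rp.parse(DENY_ALL.splitlines())

# Any crawler name is refused, however politely it behaves.
print(rp.can_fetch("LinkChecker", "https://example.org/some/page"))  # False
```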

Between me using linkchecker and me using my web browser to check those links one by one, there is not much difference. In fact, using linkchecker may be kinder to the website, because it issues HEAD requests instead of GETs and does not fetch page assets (JavaScript, images, etc.), which can be fairly large.
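To illustrate the difference: a HEAD request transfers only the status line and headers, while a GET (what a browser does) transfers the entire body. A rough sketch with the standard library, using example.org as a stand-in:

```python
import urllib.request

url = "https://example.org/"

# HEAD: ask only for the status line and headers, no body.
req = urllib.request.Request(url, method="HEAD")
with urllib.request.urlopen(req) as resp:
    print(resp.status, resp.headers.get("Content-Length"))

# GET: same URL, but the whole document is transferred.
with urllib.request.urlopen(url) as resp:
    body = resp.read()
    print(resp.status, len(body), "bytes")
```

And a browser would follow up by fetching every script, stylesheet, and image the page references, which a link checker skips entirely.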

Besides, hostile users will simply patch the software themselves: it took me only a few minutes to disable the check, and a few more to turn that into a proper patch.

By enforcing robots.txt with no way to opt out, we hurt our good users without keeping hostile users from doing harm.
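The change itself can stay small. The sketch below shows the general shape of gating the robots lookup behind an option that defaults to on; the function and parameter names are hypothetical, not the actual LinkChecker internals:

```python
from urllib.parse import urlsplit
from urllib.robotparser import RobotFileParser

def allowed_by_robots(url, user_agent, obey_robots=True):
    """Return True if url may be fetched; obey_robots=False skips the check."""
    if not obey_robots:
        # --no-robots was given: the user explicitly takes responsibility.
        return True
    parts = urlsplit(url)
    rp = RobotFileParser(f"{parts.scheme}://{parts.netloc}/robots.txt")
    rp.read()
    return rp.can_fetch(user_agent, url)
```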

The patch works but is still incomplete: it lacks documentation and unit tests.

Closes: #508

anarcat commented 7 years ago

This patch was merged into master in the new linkcheck organization.