Hi all,
This should be read as a feature request rather than an issue.
I was wondering whether it is possible to optionally ignore the robots exclusion protocol, so that broken links can be checked without modifying the configuration of the website itself.
I assume this could be implemented as a new command-line or configuration-file option that temporarily disables the current behaviour; a rough sketch of the idea follows.
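For illustration only, here is a minimal sketch of what I have in mind. It is not taken from the project's code, and the flag name `--ignore-robots` and the function names are purely hypothetical; it just shows the robots.txt check being gated behind an opt-out switch.

```python
# Hypothetical sketch: gate robots.txt compliance behind an opt-out flag.
import argparse
import urllib.robotparser
from urllib.parse import urlparse


def is_fetch_allowed(url: str, user_agent: str, ignore_robots: bool) -> bool:
    """Return True if the URL may be fetched under the current policy."""
    if ignore_robots:
        # Opt-out requested: skip the robots exclusion protocol entirely.
        return True
    parsed = urlparse(url)
    robots_url = f"{parsed.scheme}://{parsed.netloc}/robots.txt"
    rp = urllib.robotparser.RobotFileParser()
    rp.set_url(robots_url)
    rp.read()
    return rp.can_fetch(user_agent, url)


if __name__ == "__main__":
    parser = argparse.ArgumentParser(description="Hypothetical link-check sketch")
    parser.add_argument("url")
    parser.add_argument("--ignore-robots", action="store_true",
                        help="Temporarily disable robots.txt handling (hypothetical flag)")
    args = parser.parse_args()
    print(is_fetch_allowed(args.url, "MyLinkChecker/1.0", args.ignore_robots))
```

The same switch could equally live in the configuration file; the point is only that the default (respecting robots.txt) stays unchanged unless the user explicitly opts out.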
Many thanks in advance for your support.