Closed Ronald-Diemicke closed 5 months ago
Thinking about this a bit more: you could also add a 'noindex, nofollow' meta tag on the login page... that would cover the case where the domain requires auth, while the robots.txt would still cover pages that DIDN'T require a login...
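For reference, the meta tag approach would look something like this (a sketch; it goes in the `<head>` of the login page, and only affects crawlers that honor robots directives):

```html
<!-- Ask compliant crawlers not to index this page or follow its links -->
<meta name="robots" content="noindex, nofollow">
```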
This could be considered a security risk: if people can discover your setup, they can start trying to attack it...
This is a great suggestion, thanks
added in 0.15
Feature Description
As a user, it would make sense to have a way to upload a robots.txt file, so that if my split proxy is exposed to the internet I could disallow search engines from indexing the domain. It would also make sense for subdomains either to have their own separate robots.txt files or to share a generic one.
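For the simplest case described above, a generic robots.txt that blocks all compliant crawlers from the entire host could be (a sketch):

```txt
# Disallow all compliant crawlers from the whole host
User-agent: *
Disallow: /
```

Note that robots.txt is resolved per host, so each subdomain would need its own copy at its root unless the proxy serves one generic file at `/robots.txt` on every hostname.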