dennisboris opened 4 years ago
Definitely need this. Any workarounds until someone gives an update?
Do we have to write every single proxy_pass for every subdomain?
Looking for this option too 🙂
Would also love to have this
would agree it would be great to have this
I was able to get this working by adding the snippet below to the Advanced section of a host you don't want indexed:
location = /robots.txt {
    add_header Content-Type text/plain;
    return 200 "User-agent: *\nDisallow: /\n";
}
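As a quick offline sanity check that the body served by the location block above really blocks all crawlers, here is a minimal sketch using Python's standard `urllib.robotparser` (the `example.com` URL and the `blocks_everything` helper are just illustrative, not part of NPM):

```python
from urllib.robotparser import RobotFileParser

def blocks_everything(robots_txt: str, url: str = "https://example.com/") -> bool:
    """Return True if the given robots.txt body disallows all user agents."""
    parser = RobotFileParser()
    parser.parse(robots_txt.splitlines())
    # can_fetch("*", ...) asks whether a generic crawler may fetch the URL
    return not parser.can_fetch("*", url)

# The body returned by the location block:
body = "User-agent: *\nDisallow: /\n"
print(blocks_everything(body))  # → True
```

This only validates the robots.txt text itself; it does not confirm that your proxy host is actually serving it.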
Any updates on this?
Start a bounty. I'll pitch in for this feature.
Issue is now considered stale. If you want to keep it open, please comment :+1:
Is your feature request related to a problem? Please describe. I want to prevent some sites from being indexed by search engines.
Describe the solution you'd like A checkbox to prevent a site being indexed or the option to add custom headers like:
add_header X-Robots-Tag "noindex, nofollow, nosnippet, noarchive";
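Until a checkbox exists, a sketch of a workaround: this header can be set today via the proxy host's Advanced / custom configuration section (assuming the snippet ends up in the server context, as NPM's Advanced field does). The `always` parameter makes nginx send it on error responses too:

```nginx
# Sent with every response from this proxy host, telling crawlers
# not to index, follow links, snippet, or archive any page.
add_header X-Robots-Tag "noindex, nofollow, nosnippet, noarchive" always;
```

Note one nginx caveat: `add_header` directives are not inherited into a location block that declares its own `add_header`, so if your custom config also adds headers inside a `location`, repeat the X-Robots-Tag line there.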
Describe alternatives you've considered Adding the header via a location block in the custom configuration, but that requires writing my own location block, which I don't fully understand; some sites fail to load afterwards.
Additional context I hope it is possible to add this feature, and I think a lot of people are interested in it.