NginxProxyManager / nginx-proxy-manager

Docker container for managing Nginx proxy hosts with a simple, powerful interface
https://nginxproxymanager.com
MIT License
20.79k stars 2.41k forks

Prevent sites being indexed by search engines #245

Open dennisboris opened 4 years ago

dennisboris commented 4 years ago

Is your feature request related to a problem? Please describe. I want to prevent some sites from being indexed by search engines.

Describe the solution you'd like A checkbox to prevent a site from being indexed, or the option to add custom headers like: add_header X-Robots-Tag "noindex, nofollow, nosnippet, noarchive";

Describe alternatives you've considered Adding the header via a location block in the custom configuration, but that requires writing my own location block, which I don't fully understand; some sites no longer load afterwards.

location / {
    proxy_pass http://192.168.2.10:8123;
    add_header  X-Robots-Tag "noindex, nofollow, nosnippet, noarchive";
}
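
A likely reason sites stop loading with a bare location block like the one above is that a custom location / replaces the one Nginx Proxy Manager generates, dropping the proxy headers (Host, websocket upgrade, etc.) that the generated config normally sets. A sketch of a fuller block, reusing the upstream address from the example; all directives are standard nginx, but whether each header is needed depends on the proxied app:

    location / {
        # Forward the original host and client address to the upstream app
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
        proxy_set_header X-Forwarded-Proto $scheme;

        # Needed for apps that use websockets (e.g. Home Assistant on :8123)
        proxy_http_version 1.1;
        proxy_set_header Upgrade $http_upgrade;
        proxy_set_header Connection "upgrade";

        proxy_pass http://192.168.2.10:8123;

        # Ask crawlers not to index anything served from this host
        add_header X-Robots-Tag "noindex, nofollow, nosnippet, noarchive";
    }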

Additional context I hope it is possible to add this feature, and I think a lot of people are interested in it.

dl-lim commented 3 years ago

Definitely need this. Any workarounds until someone gives an update?

Do we have to write every single proxy_pass for every subdomain?

schmurtzm commented 3 years ago

Looking for this option too 🙂

tristanXme commented 3 years ago

Would also love to have this

Aceriz commented 3 years ago

would agree it would be great to have this

HidYn commented 3 years ago

I was able to get this working by adding the below to the advanced section of a host you don't want indexed.

location = /robots.txt {
  add_header  Content-Type  text/plain;
  return 200 "User-agent: *\nDisallow: /\n";
}
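
Worth noting that robots.txt is only advisory; pairing it with the X-Robots-Tag header also covers crawlers that ignore robots.txt. A sketch for the same Advanced tab, assuming (as appears to be the case) that Nginx Proxy Manager injects the custom config at the server level, so the add_header applies to every location that does not declare its own:

    # Serve a deny-all robots.txt
    location = /robots.txt {
      add_header  Content-Type  text/plain;
      return 200 "User-agent: *\nDisallow: /\n";
    }

    # Also tell crawlers not to index pages they have already fetched.
    # Note: an add_header inside a location overrides server-level
    # add_header directives, so the robots.txt location above will not
    # inherit this header (which is fine for robots.txt itself).
    add_header X-Robots-Tag "noindex, nofollow, nosnippet, noarchive";
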

LDprg commented 1 year ago

Any updates on this?

wanderling commented 11 months ago

Start a bounty. I'll pitch in for this feature.

github-actions[bot] commented 3 months ago

Issue is now considered stale. If you want to keep it open, please comment :+1: