This ticket is to have Tor2web serve full reverse-proxied pages to search engines but not to all other users, who should in turn receive a 302 redirect to the original website.
The goal is to implement the de-robots concept from #316, which re-indexes existing websites on search engines: serve the full pages to search engine crawlers (identified by a configurable list of User-Agent strings), but do not serve the reverse-proxied web content to general users.
That way, any general user who clicks a Google-indexed reverse-proxied link will receive a 302 redirect to the original resource.
That way, the platform running the software will not serve any potentially privacy-sensitive content to the general public, but will only provide the indexing.
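The branching described above could be sketched roughly as follows. This is a minimal illustration, not Tor2web's actual request-handling code: the function and variable names are hypothetical, and the UA list here is just an example of what the "list of configured UA" might contain.

```python
# Hypothetical sketch of the proposed behavior: a configurable list of
# crawler User-Agent substrings decides whether a request gets the full
# reverse-proxied page or a 302 redirect to the original resource.

SEARCH_ENGINE_UA_SUBSTRINGS = ["Googlebot", "bingbot", "DuckDuckBot"]  # example config

def is_search_engine(user_agent):
    """Return True if the request's User-Agent matches a configured crawler."""
    ua = (user_agent or "").lower()
    return any(s.lower() in ua for s in SEARCH_ENGINE_UA_SUBSTRINGS)

def handle_request(user_agent, original_url):
    """Decide between serving proxied content and issuing a 302 redirect.

    Returns a (status_code, action_or_location) tuple for illustration.
    """
    if is_search_engine(user_agent):
        # Crawler: serve the full reverse-proxied page so it can be indexed.
        return (200, "serve_proxied_content")
    # Everyone else: 302 redirect to the original resource.
    return (302, original_url)
```

Note that User-Agent strings are trivially spoofable, so a real implementation might additionally verify crawlers (e.g. via reverse DNS) rather than trust the header alone.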