Add a /robots.txt route to all Sites, which defaults to disallowing bots from crawling and indexing ALL routes.
Add an "Enable SEO" property to Sites, defaulting to false. If true, the /robots.txt route would query for all routes accessible to the Site guest user profile for that Site's bundle, and would allow ONLY those route PREFIXES to be crawled. Any route with path parameters would need the parameters stripped out in order to allow crawling (e.g. /product/{productid} would need to be converted to Allow: /product/), and the resulting prefixes would need to be deduplicated.
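A minimal sketch of the generation logic described above, in Python. The function name, the shape of the route list, and the `seo_enabled` flag are assumptions for illustration; the real implementation would pull the routes from the Site's guest user profile. It also assumes crawlers honor the most-specific-path rule (as Googlebot does), so `Allow: /product/` overrides the trailing `Disallow: /`.

```python
import re


def build_robots_txt(routes: list[str], seo_enabled: bool) -> str:
    """Build a robots.txt body for a Site (hypothetical helper).

    routes: paths visible to the Site's guest user profile,
            possibly containing path parameters like {productid}.
    """
    if not seo_enabled:
        # Default behaviour: block all crawling and indexing.
        return "User-agent: *\nDisallow: /\n"

    allowed: list[str] = []
    seen: set[str] = set()
    for route in routes:
        # Strip everything from the first path parameter onward,
        # e.g. /product/{productid} -> /product/
        prefix = re.split(r"\{", route, maxsplit=1)[0]
        if prefix not in seen:  # deduplicate shared prefixes
            seen.add(prefix)
            allowed.append(prefix)

    lines = ["User-agent: *"]
    lines += [f"Allow: {p}" for p in sorted(allowed)]
    # Anything not explicitly allowed above stays blocked.
    lines.append("Disallow: /")
    return "\n".join(lines) + "\n"
```

For example, `["/product/{productid}", "/product/{productid}/reviews", "/about"]` collapses to a single `Allow: /product/` plus `Allow: /about`, since both product routes share the same stripped prefix.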
Acceptance Criteria
See "Add robots.txt and sitemap.txt routes to all sites" in the SEO TRD
https://docs.google.com/document/d/1HiokABLO__GErYr3zjWVsVo7S4_qVR6pGg_i_ut3eu4/edit#heading=h.qp0ns3jqljh7
For this task, a parameterized route such as /product/{productid} should produce:
Allow: /product/
with duplicate prefixes emitted only once.