Changed
Updated robots.txt generation so it's now part of ripple-nuxt-tide. This is opt-in, so existing sites can continue to manage their own robots.txt generation.
Updated the list of excludedPaths to include preview URLs, OAuth URLs and the share URL.
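As a rough sketch of the three accepted config shapes (the `tide.robots` key name and rule-object keys here are assumptions for illustration, not necessarily the exact ripple-nuxt-tide API):

```javascript
// Hypothetical nuxt.config.js showing the three shapes the option can take.
export default {
  tide: {
    // Boolean: enable robots.txt generation with the default rules.
    robots: true

    // Object: a single rule object.
    // robots: { UserAgent: '*', Disallow: '/preview' }

    // Array: multiple rule objects.
    // robots: [
    //   { UserAgent: '*', Disallow: '/preview' },
    //   { UserAgent: 'BadBot', Disallow: '/' }
    // ]
  }
}
```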
Screenshots
Here are three screenshots showing robots.txt working with the three config options, i.e. boolean, object and array.
They all have Disallow: / because the environment is not production.
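The non-production behaviour in the screenshots can be sketched as below; the function and parameter names are hypothetical, not the actual implementation:

```javascript
// Sketch: build a robots.txt body from environment + excluded paths.
// buildRobotsTxt and its parameters are assumptions for illustration.
function buildRobotsTxt ({ isProduction, excludedPaths = [] }) {
  const lines = ['User-agent: *']
  if (!isProduction) {
    // Non-production environments disallow everything,
    // matching the Disallow: / seen in the screenshots.
    lines.push('Disallow: /')
  } else {
    // Production disallows only the configured excluded paths.
    excludedPaths.forEach(path => lines.push(`Disallow: ${path}`))
  }
  return lines.join('\n')
}

// A staging build blocks the whole site:
buildRobotsTxt({ isProduction: false })
// A production build blocks only the excluded paths:
buildRobotsTxt({
  isProduction: true,
  excludedPaths: ['/preview', '/oauth', '/share']
})
```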
How Has This Been Tested?
This has been tested locally in the browser; it still needs testing on staging. Does this change need tests written?
Types of changes
[ ] Bug fix (non-breaking change which fixes an issue)
[x] New feature (non-breaking change which adds functionality)
[ ] Breaking change (fix or feature that would cause existing functionality to not work as expected)
[ ] Improvement/refactoring (non-breaking change that doesn't add any features but makes things better)
Checklist
[x] I've added relevant changes to the documentation. (💁‍♂️ I've updated the README and the reference site; let me know if I need to update anything else.)
[ ] I have added tests to cover my changes
[x] My change requires a template update for create-ripple-app. (💁‍♂️ I've enabled this config option by default. Is there anything else I need to do around versioning of create-ripple-app, or is that automated?)
[ ] I have added template update script for next release.
Motivation and Context
As a content editor, I don't want my CMS preview pages indexed by search engines, so that the public can't access non-published pages.
JIRA issue: https://digital-vic.atlassian.net/browse/SDPAP-5916