Open bhpayne opened 11 months ago
Currently both https://derivationmap.net/robots.txt and https://allofphysics.com/robots.txt resolve to the same file, which contains

```
User-agent: *
Allow: /
Sitemap: https://derivationmap.net/sitemap.txt
```
I may need two separate robots.txt files, one per domain. The other robots.txt would contain

```
User-agent: *
Allow: /
Sitemap: https://allofphysics.com/sitemap.txt
```
but then I'd also need a separate sitemap.txt that references allofphysics.com as the domain.
Two different nginx approaches:
https://www.digitalocean.com/community/questions/nginx-same-root-folder-for-multiple-website-different-robots-txt
https://stackoverflow.com/questions/26308779/nginx-different-robots-txt-for-alternate-domain
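A minimal sketch of the nginx approach from those links, using a `map` on `$host` to pick a per-domain robots file. This assumes both domains are served by the same server block; the file names `robots-derivationmap.txt` and `robots-allofphysics.txt` and the root path are hypothetical placeholders:

```nginx
# http context: map the requested host to a robots file (file names are hypothetical)
map $host $robots_file {
    default                /robots-derivationmap.txt;
    allofphysics.com       /robots-allofphysics.txt;
    www.allofphysics.com   /robots-allofphysics.txt;
}

server {
    listen 80;
    server_name derivationmap.net www.derivationmap.net allofphysics.com www.allofphysics.com;
    root /var/www/site;  # hypothetical shared root

    # Serve the domain-specific robots.txt selected above
    location = /robots.txt {
        try_files $robots_file =404;
    }
}
```

The same `map`/`location` pattern could be repeated for sitemap.txt, so each domain's robots.txt points at a sitemap that uses its own domain.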
For now I'm going to hold off on fixing what might not actually be an issue.
If the Google index doesn't include allofphysics.com then a separate robots.txt might be relevant.