Closed jbmoelker closed 7 months ago
Latest commit: f7ab6be
Status: ✅ Deploy successful!
Preview URL: https://2df41e43.head-start.pages.dev
Branch Preview URL: https://feat-robots-txt-and-sitemap.head-start.pages.dev
Seems like `robots.txt.ts` is engineered only to be able to reuse the site URL. You could make that part of the README ("make sure you configure a robots.txt file"); I'd say it's not important for a dev environment to have a working sitemap (i.e. having the localhost:2343 URL in there). Alternatively, I would recommend an Astro integration that writes a `robots.txt` (post-build, perhaps, but it shouldn't matter much). I think an integration is a more logical/expected approach and is easier to extend.
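Such an integration could look roughly like the sketch below. This is a minimal illustration, not Head Start's implementation: the integration name, the robots.txt contents, and the option handling are placeholders, and Astro's `AstroIntegration` type is omitted so the sketch stays dependency-free. It captures the resolved `site` URL in the `astro:config:done` hook and writes the file in `astro:build:done`:

```typescript
import { writeFileSync } from 'node:fs';
import { fileURLToPath } from 'node:url';

// Sketch of an Astro integration that writes robots.txt after the build.
// The integration name and the file contents are placeholders.
export function robotsTxtIntegration() {
  let site: string | undefined;
  return {
    name: 'robots-txt',
    hooks: {
      // Runs once the Astro config is resolved; captures the site URL.
      'astro:config:done'({ config }: { config: { site?: string } }) {
        site = config.site;
      },
      // Runs after the build; writes robots.txt into the output directory.
      'astro:build:done'({ dir }: { dir: URL }) {
        const lines = ['User-agent: *', 'Allow: /'];
        if (site) {
          lines.push(`Sitemap: ${new URL('sitemap-index.xml', site).href}`);
        }
        writeFileSync(
          fileURLToPath(new URL('robots.txt', dir)),
          lines.join('\n') + '\n'
        );
      },
    },
  };
}
```

The factory would then be listed in `integrations: [...]` in the Astro config, which is what makes this approach easy to extend or swap out per project.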
Thanks @phortuin!

Yes, I was in doubt about which route to take. Had a different approach before. The annoying thing about hardcoding things is that if Head Start is used as a template for another project and that project changes hardcoded values, it makes it harder to sync with the upstream repository.

And I think we'll need this `somefile.ext.ts` trick more often, for instance for an `opensearch.xml.ts` with more dynamic values.
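For reference, the `somefile.ext.ts` trick is Astro's static file endpoint pattern: a module at `src/pages/robots.txt.ts` exports a `GET` handler whose response becomes the file. The sketch below is only illustrative, not this PR's actual file: the rules are placeholders, and Astro's `APIRoute` typing is replaced by an inline type so the snippet is self-contained.

```typescript
// src/pages/robots.txt.ts (sketch). Astro calls GET with an endpoint
// context; only the `site` property (from the Astro config) is used here.
export const GET = ({ site }: { site?: URL }) => {
  const lines = ['User-agent: *', 'Allow: /'];
  if (site) {
    // Reuse the configured site URL so nothing is hardcoded per project.
    lines.push(`Sitemap: ${new URL('sitemap-index.xml', site).href}`);
  }
  return new Response(lines.join('\n') + '\n', {
    headers: { 'Content-Type': 'text/plain; charset=utf-8' },
  });
};
```

Because the endpoint reads `site` at build time rather than baking in a literal URL, downstream projects that change their site URL don't diverge from the upstream template.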
Changes

- Generates the sitemap via `getStaticPaths()` and using the official `@astrojs/sitemap` package.
- Adds a `robots.txt` file.
- Generates `robots.txt` with the deployment's site URL.

Associated issue
Resolves #5
How to test

- Open `/sitemap-index.xml` (and `/sitemap-0.xml`).
- Open `/robots.txt`.
- Verify `link[rel="sitemap"]` is correctly pointing to the sitemap index.

Checklist
- [ ] I have added/updated tests to prove that my feature works (if not possible, please explain why)
- [ ] I have made changes to the README if the change affects the project setup (npm commands changed, new service added, environmental variable added)
- [ ] I have added a decision log entry if the change affects the architecture or changes a significant technology