Closed drydenwilliams closed 2 years ago
@chrismclarke please could you just check my proposed implementation of this? Also, it was mentioned that this might be a nice story for @alromh87?
We will definitely want to dynamically generate an xml file. My 2 main concerns would be:
1. How long do most searchbots wait to receive the sitemap? If we interrogate the data on the fly, it might take a few seconds to run all the required operations.
2. How often do searchbots scrape? If it is frequent, or if there were a malicious 3rd party, it could be quite resource-intensive to go through the entire DB on every request.
It might be better to somehow schedule this as part of a cron job (perhaps at the same time as the backup) instead of generating it on request, and host it as a static XML file in a public folder. The tricky part here, however, would be how to upload just an updated sitemap to Firebase hosting... for that I'm not sure.
**Is your feature request related to a problem? Please describe.**

We need to have a `sitemap.xml` to list all the pages we have, for nice Google Bots.

**Describe the solution you'd like**

A solution I've done in the past is to do this on the API, by adding a route for `/sitemap.xml` and dynamically generating the `.xml` file.

Here is an example of a `sitemap.xml`; you can see how it structures things, which is very uniform. How I did this before was (this is using express):
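For reference, a minimal `sitemap.xml` in the standard sitemaps.org format looks like the following (the URL and date are illustrative):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://community.preciousplastic.com/how-to/run-a-workshop-on-an-event</loc>
    <lastmod>2020-01-01</lastmod>
  </url>
</urlset>
```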
Then in my case it was using the routes I had, looping through all the ski resorts that were returned from the DB. I took this list of ski resorts and mapped them onto the same route structure on my page. E.g. a how-to called `run-a-workshop-on-an-event` would become:

https://community.preciousplastic.com/how-to/run-a-workshop-on-an-event

And here was my code, for inspiration:
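The original snippet didn't come through in this thread, but a sketch of the idea might look like this. The `buildSitemapXml` helper, the slug list, and the DB call in the comment are all hypothetical, not the actual code:

```javascript
// Build the <urlset> document from a list of how-to slugs, mapping each
// slug onto the site's /how-to/<slug> route structure.
function buildSitemapXml(slugs, baseUrl) {
  const urls = slugs
    .map((slug) => `  <url>\n    <loc>${baseUrl}/how-to/${slug}</loc>\n  </url>`)
    .join('\n');
  return [
    '<?xml version="1.0" encoding="UTF-8"?>',
    '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">',
    urls,
    '</urlset>',
  ].join('\n');
}

// With express, the route would look roughly like this (db.getAllHowtoSlugs
// is a hypothetical DB helper):
//
// app.get('/sitemap.xml', async (req, res) => {
//   const slugs = await db.getAllHowtoSlugs();
//   res
//     .type('application/xml')
//     .send(buildSitemapXml(slugs, 'https://community.preciousplastic.com'));
// });

console.log(
  buildSitemapXml(['run-a-workshop-on-an-event'], 'https://community.preciousplastic.com')
);
```

Keeping the XML-building in a pure function like this also makes it easy to reuse the same logic from a scheduled job if we go the static-file route instead.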