Open · haja-ran opened this issue 2 years ago
@haja-ran Did you ever figure out a way to do this?
@mathiasrando I ended up creating a cron job that copies the resulting xml files from the node server to Amazon S3 every 6 hours. Then I updated the robots.txt file so that the sitemap points to the static files in S3.
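For anyone looking for a concrete starting point, here is a minimal sketch of that upload step, assuming the generated XML files sit in a local `sitemaps/` folder and a bucket named `my-sitemap-bucket` (both placeholder names), run from cron every 6 hours:

```ts
// upload-sitemaps.ts — invoked by cron, e.g. every 6 hours
// Bucket name and local directory are placeholders; adjust to your setup.
import { readdir, readFile } from 'node:fs/promises'
import { join } from 'node:path'
import { S3Client, PutObjectCommand } from '@aws-sdk/client-s3'

const BUCKET = 'my-sitemap-bucket' // hypothetical bucket
const LOCAL_DIR = './sitemaps'     // where the node server writes the xml files

const s3 = new S3Client({})        // region/credentials come from the environment

async function uploadSitemaps() {
  const files = await readdir(LOCAL_DIR)
  for (const file of files.filter((f) => f.endsWith('.xml'))) {
    await s3.send(
      new PutObjectCommand({
        Bucket: BUCKET,
        Key: file,
        Body: await readFile(join(LOCAL_DIR, file)),
        ContentType: 'application/xml',
      })
    )
    console.log(`uploaded ${file}`)
  }
}

uploadSitemaps().catch((err) => {
  console.error(err)
  process.exit(1)
})
```

robots.txt then only needs a `Sitemap:` line pointing at the public S3 URL of the sitemap index.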
@haja-ran Thanks for the very swift reply. I haven't been able to find anyone who's managed to regenerate it without either restarting Nuxt or reducing the cache time to 1 ms and bombarding the sitemap endpoint to keep it "fresh".
That sounds like a reasonable solution. For our specific setup I think we'll go with writing new files whenever a specific request hits the middleware in Nuxt, and then trigger that request whenever a change in the CMS should result in updated sitemap(s).
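A rough sketch of that idea, assuming Nuxt 2 `serverMiddleware` with a hypothetical `/_regenerate-sitemap` path and a hypothetical `regenerateSitemaps()` helper that rewrites the XML files:

```ts
// server-middleware/regenerate-sitemap.ts
// Registered in nuxt.config via: serverMiddleware: ['~/server-middleware/regenerate-sitemap']
// The endpoint path, the secret header and regenerateSitemaps() are placeholders for this sketch.
import type { IncomingMessage, ServerResponse } from 'node:http'
import { regenerateSitemaps } from '../lib/sitemaps' // hypothetical helper that rewrites the xml files

export default async function (
  req: IncomingMessage,
  res: ServerResponse,
  next: (err?: unknown) => void
) {
  if (req.url !== '/_regenerate-sitemap') return next()

  // Only the CMS webhook should be able to trigger a regeneration.
  if (req.headers['x-webhook-secret'] !== process.env.SITEMAP_WEBHOOK_SECRET) {
    res.statusCode = 401
    return res.end('unauthorized')
  }

  try {
    await regenerateSitemaps()
    res.statusCode = 200
    res.end('sitemaps regenerated')
  } catch (err) {
    next(err)
  }
}
```

The CMS webhook then hits that endpoint whenever content changes, and the sitemap files get rewritten without touching the Nuxt process itself.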
First of all, thanks for the great work!
I’m working on a large project with lots of dynamic routes. The API requests to fetch the dynamic routes are quite slow (sometimes more than 15s) because of the sheer number of URLs.
Is there a way to regenerate the cache manually (instead of waiting for robots to visit the sitemap URL)?
I would like to set a longer cache time so that I can use something like a cron job on the server side to ping a private URL that would regenerate the cache before it becomes invalid.
Is there a way I can achieve that, or something similar?
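Roughly, the setup I have in mind looks like this (the 24h value, the API endpoint and the private URL are placeholders, not a working configuration):

```ts
// nuxt.config.js (sketch) — cacheTime and routes are @nuxtjs/sitemap options,
// the API endpoint and the 24h duration are just placeholders for this example.
export default {
  modules: ['@nuxtjs/sitemap'],
  sitemap: {
    // Keep the generated sitemap cached for 24h instead of the default 15min.
    cacheTime: 1000 * 60 * 60 * 24,
    routes: async () => {
      // This is the slow part: fetching every dynamic route can take 15s+.
      // Placeholder endpoint, assumed to return an array of paths; assumes Node 18+ global fetch.
      const res = await fetch('https://api.example.com/all-urls')
      return res.json()
    },
  },
}
```

The missing piece is the private URL: a cron job could ping it shortly before the cache expires, so that visitors and crawlers never hit the slow regeneration path.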