Closed: TheRealFlyingCoder closed this 1 year ago
@TheRealFlyingCoder Hi! You should be able to send the headers in the config, like:

```javascript
createSitemapGenerator({
  siteUrl: 'https://example.com',
  headers: {
    'Cache-Control': 'max-age=3600'
  }
})
```
Oh, awesome, I'll have a look at that today... I should have thought to check there before opening the ticket 🙄
It might pay to add that to the docs and suggest it be used by default, especially if you have dynamic routes. I don't see many use cases where you'd want it regenerated on every visit.
Love the project btw!
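For dynamic routes, a default along these lines could go in the docs (the exact header value here is just a suggestion, not something the library prescribes):

```javascript
createSitemapGenerator({
  siteUrl: 'https://example.com',
  headers: {
    // serve the cached sitemap for an hour, then allow a stale copy
    // to be served while it revalidates in the background
    'Cache-Control': 'public, max-age=3600, stale-while-revalidate=86400'
  }
})
```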
Yeah, I forgot to add the headers option to the config documentation; it should be there.
In addition, I'm thinking about another API to cache the whole sitemap using another solution, like Redis. Maybe something like this:
```javascript
// Assumes ioredis, whose set() accepts the ('EX', seconds) expiry arguments
const Redis = require('ioredis');
const redis = new Redis();

createSitemapGenerator({
  siteUrl: 'https://example.com',
  cache: {
    get: async () => {
      const value = await redis.get('sitemap');
      return value ? JSON.parse(value) : null;
    },
    set: async (sitemap) => {
      // expire the cached sitemap after an hour
      await redis.set('sitemap', JSON.stringify(sitemap), 'EX', 3600);
    }
  }
});
```
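Internally, the generator would presumably consult those hooks before regenerating. A minimal sketch of that flow, using an in-memory `Map` standing in for Redis (`getSitemapWithCache` and the generator callback are illustrative names, not the library's real API):

```javascript
// Check the cache first; only regenerate on a miss, then store the result.
async function getSitemapWithCache(cache, generateSitemap) {
  const cached = await cache.get();
  if (cached) return cached;               // cache hit: skip regeneration
  const sitemap = await generateSitemap(); // cache miss: build fresh
  await cache.set(sitemap);
  return sitemap;
}

// Usage with an in-memory store standing in for Redis:
const store = new Map();
const cache = {
  get: async () => store.get('sitemap') ?? null,
  set: async (sitemap) => { store.set('sitemap', sitemap); },
};

getSitemapWithCache(cache, async () => '<urlset/>').then((xml) => {
  console.log(xml); // first call generates; later calls return the cached copy
});
```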
What do you think? Also, thanks ❤️
Yeah, opening up getter and setter hooks for caching responses is a great idea; I'd definitely use it 👀
Just a quick one: we should be able to pass a configuration option to set cache headers on the sitemap / robots responses.
If I have 10K dynamically loaded pages and never plan to add more, I should be able to set the sitemap to cache indefinitely.
If I might only add a page once a week, I would want the sitemap to update accordingly.
The most common case would be hourly, or every 12 hours, depending on the activity.
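Translated into `Cache-Control` values, those frequencies could look like this (the policy names and exact numbers are illustrations, not part of the library):

```javascript
// max-age is in seconds; pick the policy that matches how often pages change
const cachePolicies = {
  indefinite: 'public, max-age=31536000, immutable', // pages never change
  weekly: 'public, max-age=604800',                  // 7 days
  twelveHourly: 'public, max-age=43200',             // 12 hours
  hourly: 'public, max-age=3600',                    // 1 hour
};

console.log(cachePolicies.hourly);
```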