Closed: hisnameisjimmy closed this issue 6 years ago
So I'm not a huge fan of cache warming. From what I've found, it's often best to just let the cache re-populate itself naturally.
The real problem is that FastCGI Cache Bust busts the entire cache... so really it'd have to re-warm everything for it to be helpful, and I've found that in trying to do this helpful thing, you can actually bring a webserver to its knees.
Sure, you can do things like re-warm the URLs on a delay to avoid swamping the webserver, but I haven't found a huge gain from doing this vs. letting the cache re-populate organically. Yes, someone will get a slow(er) page load, but we can optimize our non-cache hits too.
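Re-warming on a delay is simple to sketch. This is a minimal illustration (not any particular plugin's implementation): `fetch` is injected so it can be any HTTP client, and the `delay_seconds` value is an arbitrary example.

```python
import time

def warm_cache(urls, fetch, delay_seconds=1.0, sleep=time.sleep):
    """Fetch each URL in order, pausing between requests so the
    warming pass doesn't swamp the origin server.

    Returns (url, result) pairs in the order fetched.
    """
    results = []
    for i, url in enumerate(urls):
        if i > 0:
            sleep(delay_seconds)  # throttle between requests
        results.append((url, fetch(url)))
    return results
```

In real use, `fetch` would be something like `urllib.request.urlopen`; passing it in as a parameter also makes the throttling logic trivial to test with a stub.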
I view caching as solving concurrency / scalability, and not necessarily as a way of masking performance.
What would be really cool is if there was a way to individually bust a cache URL in a fine-grained way. Then warming just that URL again would make a whole lot of sense.
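For what it's worth, busting a single URL is possible even without a purge module, because nginx names each cache file after the MD5 of the cache key, with subdirectories taken from the tail of the hash according to the `levels=` setting of `fastcgi_cache_path`. A sketch of that idea, where the cache directory and the key format (mirroring a `fastcgi_cache_key "$scheme$request_method$host$request_uri";` directive) are assumptions you'd match to your own config:

```python
import hashlib
import os

def cache_file_path(cache_key, cache_dir="/var/cache/nginx", levels=(1, 2)):
    """Return the on-disk path nginx uses for this cache key.

    Mirrors `fastcgi_cache_path ... levels=1:2`: subdirectory names are
    taken from the end of the MD5 hex digest of the key.
    """
    digest = hashlib.md5(cache_key.encode()).hexdigest()
    parts, end = [], len(digest)
    for width in levels:
        parts.append(digest[end - width:end])
        end -= width
    return os.path.join(cache_dir, *parts, digest)

def bust_url(scheme, method, host, uri, **kwargs):
    """Delete the cache entry for a single URL, assuming the key format
    "$scheme$request_method$host$request_uri". Returns True if a cached
    file was removed, False if nothing was cached for this URL."""
    path = cache_file_path(f"{scheme}{method}{host}{uri}", **kwargs)
    try:
        os.unlink(path)
        return True
    except FileNotFoundError:
        return False
```

Deleting the file makes the very next request for that URL a cache miss, so re-warming just that one URL afterwards would do exactly what's described above.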
Those are good points.
For me, the reason I want to warm every URL is for performance and SEO. Even if initial page load is decent, it's very rarely going to be anywhere close to FastCGI Cache levels.
Further, if you have a large site (thousands of pages) and either Bing or Google hits your sitemap and crawls everything, you can run into a huge performance hit that can potentially take your site down. If everything was cached, you'd most likely weather the storm without issue.
Cache warming, even if done without delay, is usually fine on a CPU-optimized machine from Digital Ocean. Further, you could potentially schedule it for off hours.
Yep, all true.
While your initial pageload will never approach FastCGI Cache levels, does it need to? If we assume that the frequency of breaking the cache is reasonable (maybe once per day) the more traffic you get, the more quickly your cache will be organically warmed.
And then you'll also be optimizing pages naturally that are highest trafficked.
I think the real issue with FastCGI Cache Bust as implemented is that when it breaks the cache, it does it for the entire site. In most cases, this is okay... but in some cases, finer-grained cache busting would be much more useful.
Yeah, in our case we have about 8k pages and around 11 languages.
There are many pages that won't be hit regularly, or are targeted around SEO long tail type stuff. Initial page load causing a bounce in those cases can be a bummer.
I agree with finely grained cache busting, but ultimately I think my dream is an 'all-in-one' tool that could handle the use-cases I originally outlined. Maybe I'll just build it!
If you're using fastCGI, the huge benefit is that secondary page loads are really fast. However, those first page loads can be slow.
It would be sick if there were these options available for fastcgicachebust:
This would make it so you could do a deployment, curl a URL to bust the cache, then curl another to warm it at the end of the deployment. It would also mean that when your content writers are doing their own thing saving entries or making adjustments, they don't have to think about something like keeping the cache warm.