Hi!
When using 'fastcgi_cache_use_stale updating' we can rely on serving an expired cache entry while the upstream builds a fresh copy. But if we delete the cache entry, there is no expired resource to fall back on... there is just nothing.
Imagine a very active site, think a news site, that suddenly experiences a huge traffic spike. As long as we are serving fastcgi-cached pages, everything goes well. But if editors update the content, the cache gets purged (deleted), and every subsequent request goes straight to the upstream (if you use php5-fpm you can see an army of new processes firing up). If you are lucky, everything normalizes after a while... If you aren't, it all goes to hell.
But if we had the ability to flag a cached entry as EXPIRED instead of deleting it, we could leverage 'fastcgi_cache_use_stale updating' to serve the old copy while the new one is being built.
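For reference, this is the kind of setup I have in mind (the zone name, socket path, cache key and timings below are just placeholders, adjust to taste):

    fastcgi_cache_path /var/cache/nginx levels=1:2 keys_zone=NEWS:100m inactive=60m;

    server {
        listen 80;
        server_name example.com;
        root /var/www/html;

        location ~ \.php$ {
            include fastcgi_params;
            fastcgi_pass unix:/var/run/php5-fpm.sock;

            fastcgi_cache NEWS;
            fastcgi_cache_key "$scheme$request_method$host$request_uri";
            fastcgi_cache_valid 200 301 302 10m;

            # while one request refreshes an expired entry, everyone else
            # keeps getting the stale copy instead of hitting php5-fpm
            fastcgi_cache_use_stale updating error timeout;
        }
    }

This works nicely as long as the entry merely expires; once it is purged there is nothing stale left to serve, which is the whole problem.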
What do you guys think?