gnjack opened 1 week ago
A possible workaround is to implement a custom `IOutputCachePolicy` which disables `AllowCacheStorage` when the request is cancelled:
```csharp
public ValueTask ServeResponseAsync(OutputCacheContext context, CancellationToken cancellationToken)
{
    if (cancellationToken.IsCancellationRequested)
    {
        // The client aborted: don't store whatever partial output was buffered.
        context.AllowCacheStorage = false;
        return ValueTask.CompletedTask;
    }
    // ...
    return ValueTask.CompletedTask;
}
```
Should this perhaps be part of the default policy?
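For reference, a sketch of wiring such a policy into an app (the type name `CancellationAwarePolicy` is hypothetical, standing in for an `IOutputCachePolicy` implementation like the one above):

```csharp
// Sketch: register the workaround policy in Program.cs.
// CancellationAwarePolicy is a hypothetical name for the custom IOutputCachePolicy.
var builder = WebApplication.CreateBuilder(args);

builder.Services.AddOutputCache(options =>
{
    // Layer the custom policy on top of the built-in defaults for all endpoints.
    options.AddBasePolicy(new CancellationAwarePolicy());
});

var app = builder.Build();
app.UseOutputCache();
app.Run();
```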
However, when combined with resource locking this can cause nasty cache stampedes. Resource locking ensures that all requests for a given response wait for the first request to populate the cache and is the default cache behaviour.
Currently, if you disable `AllowCacheStorage` in `ServeResponseAsync` while using the default `LockingPolicy.Enabled`, all queued requests waiting behind the single running request are released at once by the caching middleware, resulting in many concurrent non-cached requests for the same resource.
If the application is running slowly due to load and a request is aborted due to timeout, this leads to undesirable behaviour: a struggling application must now serve many concurrent uncached requests for the resource, instead of only one at a time.
We've instead opted to call `cancellationToken.ThrowIfCancellationRequested();` in `ServeResponseAsync`. The thrown exception ensures empty responses from cancelled requests are not cached. Any pending requests that were waiting on the cancelled request to populate the cache receive a 500 error. One client's cancelled request causing other clients to receive an error is not ideal, but it lets those clients retry while still allowing only a single concurrent request to attempt to populate the cache on retry - preventing a cache stampede.
Ideally the caching middleware should only allow one request at a time to attempt to populate the cache when locking is enabled, even if the first request failed to populate the cache, to prevent failures from causing cache stampedes.
Is there an existing issue for this?
Describe the bug
The output caching middleware returns an empty response body for subsequent requests when an in-progress request for the same resource is cancelled by the client.
I believe this is because the request cancellation also cancels writing the JSON stream to the response buffer; however, the caching middleware still considers that response buffer to be valid cacheable output and returns the incomplete buffer to all other pending / subsequent requests until the cache expires.
Expected Behavior
Empty / partial output from a cancelled request should not be cached.
Steps To Reproduce
Here is a minimal API that reproduces the issue:
The API takes 5 seconds to return some JSON with the current time, simulating a slow response. This should be cached for 10 seconds.
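A minimal endpoint along those lines (a sketch matching the description - `/time` path, 5-second delay, 10-second cache - not the exact original code) could look like:

```csharp
var builder = WebApplication.CreateBuilder(args);
builder.Services.AddOutputCache();
var app = builder.Build();

app.UseOutputCache();

// Simulates a slow response: 5 seconds to produce JSON with the current time,
// cached by the output caching middleware for 10 seconds.
app.MapGet("/time", async () =>
{
    await Task.Delay(TimeSpan.FromSeconds(5));
    return Results.Json(new { now = DateTimeOffset.UtcNow });
}).CacheOutput(policy => policy.Expire(TimeSpan.FromSeconds(10)));

app.Run();
```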
Loading `/time` in the browser and hitting refresh a few times during the 5 seconds while the page loads will cause an empty response to be returned instead of the JSON. Subsequent requests immediately return the invalid empty cached response. If you wait 10 seconds for the cache to expire and load the page a single time, a valid JSON response is returned and cached for subsequent requests.
I was unable to reproduce the issue using `TestServer` in-memory integration tests - I think this might be because the test `HttpClient` doesn't properly reproduce request cancellation. This console app sends a request to the above API every second, but cancels the first request.
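A sketch of such a client (the base address is an assumption - point it at wherever the API is hosted):

```csharp
// Sketch of the repro client: one request per second, with the first request
// cancelled mid-flight to trigger the bad cache entry.
using var http = new HttpClient { BaseAddress = new Uri("https://localhost:5001") };

for (var i = 0; ; i++)
{
    using var cts = new CancellationTokenSource();
    if (i == 0)
    {
        // Abort the first request while the server is still producing the response.
        cts.CancelAfter(TimeSpan.FromSeconds(1));
    }
    try
    {
        var body = await http.GetStringAsync("/time", cts.Token);
        Console.WriteLine($"{DateTimeOffset.Now:T} -> '{body}'");
    }
    catch (OperationCanceledException)
    {
        Console.WriteLine($"{DateTimeOffset.Now:T} -> cancelled");
    }
    await Task.Delay(TimeSpan.FromSeconds(1));
}
```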
The invalid empty cache response can be seen in the logs, which resolves after the cache expires:
.NET Version
8.0.200