umbraco / Umbraco.StorageProviders


Frequent batches of "System.OutOfMemoryException" tracing back to Umbraco.StorageProviders.AzureBlob #62

Closed: davidzweben closed this issue 8 months ago

davidzweben commented 1 year ago

Which Umbraco version are you using? (Please write the exact version, example: 10.1.0)

12.3.1

Bug summary

Roughly 2-5 times a day, in batches of perhaps 15-30 errors each (fractions of a second apart), we are getting the following error message on our Live and Staging sites: System.OutOfMemoryException: Exception of type 'System.OutOfMemoryException' was thrown. The stack trace leads back to Umbraco.StorageProviders.AzureBlob, which we use to store Umbraco Media on Azure Blob Storage. Much further back, it also traces through SixLabors.ImageSharp.Web.Middleware.ImageSharpMiddleware.
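
For reference, we register the provider in the standard way described in this repository's README. The snippet below is a generic sketch of that registration rather than our exact Program.cs; the builder chain shown is illustrative:

    // Program.cs (sketch): standard registration of Azure Blob Storage as the
    // Umbraco media file system via Umbraco.StorageProviders.AzureBlob.
    // AddAzureBlobMediaFileSystem() reads its connection string and container
    // name from the "Umbraco:Storage:AzureBlob:Media" configuration section.
    var builder = WebApplication.CreateBuilder(args);

    builder.CreateUmbracoBuilder()
        .AddBackOffice()
        .AddWebsite()
        .AddAzureBlobMediaFileSystem() // extension method provided by this package
        .AddComposers()
        .Build();

    // (rest of the standard Umbraco pipeline setup omitted)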

We are also having issues with the site crashing and restarting periodically (about twice a day). I believe that is more directly correlated with a different error we are observing (The process cannot access the file 'C:\local\Temp\UmbracoData\0fb4791a2d6da2484000f508079587952035cfb7\NuCache\NuCache.Content.db' because it is being used by another process.), which is related to our use of Azure deployment slot swapping, but this memory issue seems like it may be a contributing factor, and it certainly isn't good.

The log entries also capture the media URLs associated with each error, but these just seem to be commonly accessed images chosen at random; I don't see any other pattern to them, nor to the overall timing of the batches of errors.

Specifics

Roughly 2-5 times a day, in batches of perhaps 15-30 errors each (fractions of a second apart), we are getting the following error message on our Live and Staging sites:

System.OutOfMemoryException: Exception of type 'System.OutOfMemoryException' was thrown.
   at System.GC.AllocateNewArray(IntPtr typeHandle, Int32 length, GC_ALLOC_FLAGS flags)
   at System.GC.<AllocateUninitializedArray>g__AllocateNewUninitializedArray|66_0[T](Int32 length, Boolean pinned)
   at System.Buffers.TlsOverPerCoreLockedStacksArrayPool`1.Rent(Int32 minimumLength)
   at Azure.Storage.LazyLoadingReadOnlyStream`1..ctor(DownloadInternalAsync downloadInternalFunc, GetPropertiesAsync getPropertiesFunc, DownloadTransferValidationOptions transferValidation, Boolean allowModifications, Int64 initialLenght, Int64 position, Nullable`1 bufferSize, PredictEncryptedRangeAdjustment rangePredictionFunc)
   at Azure.Storage.Blobs.Specialized.BlobBaseClient.OpenReadInternal(Int64 position, Nullable`1 bufferSize, BlobRequestConditions conditions, Boolean allowModifications, DownloadTransferValidationOptions transferValidationOverride, Boolean async, CancellationToken cancellationToken)
   at Azure.Storage.Blobs.Specialized.BlobBaseClient.OpenRead(Int64 position, Nullable`1 bufferSize, BlobRequestConditions conditions, CancellationToken cancellationToken)
   at Umbraco.StorageProviders.AzureBlob.AzureBlobItemInfo.CreateReadStream()
   at Microsoft.AspNetCore.Http.SendFileResponseExtensions.SendFileAsyncCore(HttpResponse response, IFileInfo file, Int64 offset, Nullable`1 count, CancellationToken cancellationToken)
   at Microsoft.AspNetCore.StaticFiles.StaticFileContext.SendAsync()
   at Microsoft.AspNetCore.StaticFiles.StaticFileContext.ServeStaticFile(HttpContext context, RequestDelegate next)
   at StackExchange.Profiling.MiniProfilerMiddleware.Invoke(HttpContext context) in C:\projects\dotnet\src\MiniProfiler.AspNetCore\MiniProfilerMiddleware.cs:line 114
   at Umbraco.Cms.Web.Common.Middleware.UmbracoRequestMiddleware.InvokeAsync(HttpContext context, RequestDelegate next)
   at Microsoft.AspNetCore.Builder.UseMiddlewareExtensions.<>c__DisplayClass6_1.<<UseMiddlewareInterface>b__1>d.MoveNext()
--- End of stack trace from previous location ---
   at Umbraco.Cms.Web.Common.Middleware.PreviewAuthenticationMiddleware.InvokeAsync(HttpContext context, RequestDelegate next)
   at Microsoft.AspNetCore.Builder.UseMiddlewareExtensions.<>c__DisplayClass6_1.<<UseMiddlewareInterface>b__1>d.MoveNext()
--- End of stack trace from previous location ---
   at Umbraco.Cms.Web.Common.Middleware.UmbracoRequestLoggingMiddleware.InvokeAsync(HttpContext context, RequestDelegate next)
   at Microsoft.AspNetCore.Builder.UseMiddlewareExtensions.<>c__DisplayClass6_1.<<UseMiddlewareInterface>b__1>d.MoveNext()
--- End of stack trace from previous location ---
   at SixLabors.ImageSharp.Web.Middleware.ImageSharpMiddleware.Invoke(HttpContext httpContext, Boolean retry)
   at Microsoft.AspNetCore.Diagnostics.ExceptionHandlerMiddlewareImpl.<Invoke>g__Awaited|8_0(ExceptionHandlerMiddlewareImpl middleware, HttpContext context, Task task)

The call stack is the same every time. The image URLs seem to be a random sampling of commonly accessed media URLs on the site: PNGs, GIFs, JPGs, even an MP4. I don't see any pattern to them, and don't see any way to narrow it down further. The errors arrive at a seemingly random time in a big batch, then nothing for hours, then another big batch milliseconds apart, presumably tied to a single web page request from some random visitor.

Steps to reproduce

Expected result / actual result

No recurring System.OutOfMemoryException errors.

github-actions[bot] commented 1 year ago

Hi there @davidzweben!

Firstly, a big thank you for raising this issue. Every piece of feedback we receive helps us to make Umbraco better.

We really appreciate your patience while we wait for our team to have a look at this but we wanted to let you know that we see this and share with you the plan for what comes next.

We wish we could work with everyone directly and assess your issue immediately but we're in the fortunate position of having lots of contributions to work with and only a few humans who are able to do it. We are making progress though and in the meantime, we will keep you in the loop and let you know when we have any questions.

Thanks, from your friendly Umbraco GitHub bot 🤖 🙂

Zeegaan commented 1 year ago

Heyo! 👋 Sounds like an annoying problem, sorry to hear that!

Could you maybe come up with some steps to reproduce this? I would really love to help, but can't really do anything without being able to reproduce it myself 🙏

ronaldbarendse commented 8 months ago

The OutOfMemoryException clearly indicates your application requires more memory than is available. This can be caused by a memory leak, but seeing as you're the only one reporting this, I'm pretty confident the cause isn't within either the CMS or this Storage Providers package.

Reading files from Azure Blob Storage uses streams, so it doesn't load each file completely into memory, but do note that the default read buffer is 4 MB (at least on the latest version). If your application gets lots of concurrent file requests, it may therefore need considerably more memory to stream the responses... To avoid this, I recommend adding a CDN in front of your application to reduce that pressure.
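
To illustrate the buffer point: the stack trace above ends in the Azure SDK's OpenRead call, which rents a pooled buffer per open stream, so many concurrent media downloads multiply that memory. The snippet below is a sketch against the Azure SDK directly, not this package's code; the blob names and the 1 MB value are only illustrative:

    // Sketch using Azure.Storage.Blobs directly (not this package's code).
    // Each stream returned by OpenRead rents a buffer of BufferSize bytes
    // (4 MB by default), so N concurrent downloads need roughly N x BufferSize.
    using Azure.Storage.Blobs;
    using Azure.Storage.Blobs.Models;

    var blob = new BlobClient("<connection-string>", "media", "<blob-name>");

    var options = new BlobOpenReadOptions(allowModifications: false)
    {
        BufferSize = 1 * 1024 * 1024 // smaller buffer per stream, e.g. 1 MB
    };

    await using var stream = await blob.OpenReadAsync(options);
    // ...copy the stream to the HTTP response and dispose it promptly so the
    // pooled buffer is returned to the array pool.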