PierreGSQ opened this issue 7 years ago
Hi @PierreGSQ
2GB is a large file. What bit depth are you working in, 32-bit or 64-bit? You'll need to be in 64-bit to handle that much contiguous memory.
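For anyone else hitting this: targeting x64 is a project-level setting. A minimal csproj fragment (assuming the classic .NET Framework project format; `Prefer32Bit` only matters for AnyCPU builds but is shown for completeness):

```xml
<PropertyGroup>
  <!-- Force the process to run as 64-bit so large allocations are possible -->
  <PlatformTarget>x64</PlatformTarget>
  <Prefer32Bit>false</Prefer32Bit>
</PropertyGroup>
```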
Cheers
Thanks for your answer!
Yes, I already changed my platform target to work only with x64. It works, but the point is that I had to scale up my platform (on Azure) to a RAM capacity that can fit my large file. I don't think that's an optimal way to do it, because the load can exceed my memory capacity.
Is there a way to avoid loading the entire file into memory after the media upload?
I don't suppose you ever found a way around this? We are dealing with some large files that upload fine, but out-of-memory exceptions occur on the front end when we download them (500MB zip file).
The problem is this line.
We should return `CloudBlockBlob.OpenRead`
and let Umbraco deal with the returned stream. Any large file handling should be abstracted away there.
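To illustrate the difference, a rough sketch of the provider's `OpenFile` against the legacy WindowsAzure.Storage client (the `GetBlobReference` helper and method shape are illustrative, not the actual provider code):

```csharp
// Sketch only — based on the legacy WindowsAzure.Storage CloudBlockBlob API.
public Stream OpenFile(string path)
{
    CloudBlockBlob blob = GetBlobReference(path); // hypothetical helper

    // Problematic: buffers the whole blob into managed memory,
    // so a 2GB file needs ~2GB of contiguous RAM.
    // var buffer = new MemoryStream();
    // blob.DownloadToStream(buffer);
    // buffer.Position = 0;
    // return buffer;

    // Better: hand back a read-on-demand stream; the caller (Umbraco)
    // can then copy it in chunks without holding the full file in memory.
    return blob.OpenRead();
}
```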
Ahhh I see, thanks for the heads up 👍
I only just noticed three years later that @PierreGSQ uses Upload in the title and description but is describing download API calls.
Hi!
I get an OutOfMemoryException when I try to upload a large file (for example 2GB).
After investigating a little, it seems that when the media is uploaded, the media's file size property is set through the `GetSize` method of the `FileSystemExtensions` class, which calls the Azure file system provider method `OpenFile` of the `AzureFileSystem` class. That method uses `DownloadToStream` to open the file and load it into a `MemoryStream`, which loads the entire content into memory and throws the out-of-memory exception.

Am I misusing the plugin? Is there another way to use it that avoids this RAM consumption?
Thanks for your work ;)