Closed Gonyoda closed 1 year ago
Thanks for reporting the issue. We will look into it.
In the meantime, as a workaround, you could use PUT with
AUTO_COMPRESS = FALSE
and SOURCE_COMPRESSION = <your_compression_method>
in the PUT options.
The issue has been addressed in PR #734 and will be available with the next driver release.
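A minimal sketch of the PUT workaround described above — the file path, stage name, and compression method are placeholders; the idea is to compress the file yourself before upload so the driver does not buffer it for compression:

```sql
-- Sketch of the workaround: pre-compress the file (e.g. with gzip),
-- then tell PUT not to compress it again and declare the source compression.
-- @my_stage and the file path are placeholders.
PUT file:///tmp/data.csv.gz @my_stage
    AUTO_COMPRESS = FALSE
    SOURCE_COMPRESSION = GZIP;
```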
The fix is now out with release 2.1.2.
I ran the benchmarks from my pull request #622 against your changes and here are the results:
| Method | Mean | Error | StdDev | Allocated |
|----------------- |---------:|---------:|---------:|-------------:|
| RunSmallMemory | 744.0 ms | 14.10 ms | 32.96 ms | 19.4 KB |
| RunDefaultMemory | 208.3 ms | 2.38 ms | 4.04 ms | 2062.27 KB |
| RunLargeMemory | 197.1 ms | 2.09 ms | 1.86 ms | 859397.59 KB |
- RunSmallMemory: MaxBytesInMemory = 1024 bytes
- RunDefaultMemory: MaxBytesInMemory = 1 MB
- RunLargeMemory: MaxBytesInMemory = 220 MB
So: very nice improvement! Thank you!
Issue description
When uploading or downloading files, the EncryptionProvider and related code hold the entire file contents in byte arrays. For large files, this can cause OutOfMemory exceptions and/or crash the process when it hits Docker memory limits.
Instead, use Streams, which can be backed by files rather than memory, to accomplish the same goal.
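To illustrate the suggestion (this is a sketch of the general technique, not the driver's actual encryption code): chaining a `FileStream` into a `CryptoStream` encrypts a file of arbitrary size while buffering only small chunks, instead of materializing the whole file as a `byte[]`.

```csharp
// Sketch: stream-based AES encryption with constant memory usage.
// The method name and parameters are illustrative, not from the driver.
using System.IO;
using System.Security.Cryptography;

static class StreamingEncryptionSketch
{
    // Encrypts inputPath to outputPath; only small chunks are ever in memory.
    public static void EncryptFile(string inputPath, string outputPath,
                                   byte[] key, byte[] iv)
    {
        using var aes = Aes.Create();
        aes.Key = key;
        aes.IV = iv;

        using var input = File.OpenRead(inputPath);
        using var output = File.Create(outputPath);
        using var crypto = new CryptoStream(output, aes.CreateEncryptor(),
                                            CryptoStreamMode.Write);
        // CopyTo streams the data in fixed-size chunks, so peak memory
        // stays constant regardless of the file size.
        input.CopyTo(crypto);
    }
}
```

The same pattern works for decryption on download (`CryptoStream` over the input with `aes.CreateDecryptor()`), which is what makes a stream-based API attractive for both directions.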
Example code
Configuration
Driver version: Snowflake.Data 2.0.21
Dotnet framework and version: .NET Core 7.0.201
Server version: 7.8.1
Client OS: Windows and/or Ubuntu