Closed: kornelski closed this issue 7 years ago
I agree with you: Guetzli uses too much memory. My Linux machine has only 4 GB of RAM and becomes very laggy.
> Will it be reliable enough if it's done heuristically based on the number of pixels in the image?
The 300 MB/MPix estimate is reasonably conservative. There is no mechanism that can cause memory usage to grow superlinearly with the size of an image.
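Concretely, 300 MB/MPix works out to roughly 300 bytes per pixel, so a pre-flight check could be as simple as the sketch below. The constant and function names are hypothetical, not part of Guetzli's API:

```cpp
#include <cstddef>

// ~300 MB per megapixel is roughly 300 bytes per pixel.
constexpr std::size_t kBytesPerPixel = 300;

// Estimated peak memory Guetzli would need for a width x height image.
std::size_t EstimatePeakMemory(std::size_t width, std::size_t height) {
  return width * height * kBytesPerPixel;
}

// Refuse to start if the estimate exceeds a caller-supplied budget.
bool FitsInBudget(std::size_t width, std::size_t height,
                  std::size_t budget_bytes) {
  return EstimatePeakMemory(width, height) <= budget_bytes;
}
```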
I second the memory limit request. This becomes even more important if you run the algorithm in parallel on multiple images at once (to help with the slow performance when batch-processing images).
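For batch runs, the same per-image estimate could also bound how many workers run at once. A rough sketch, with all names hypothetical:

```cpp
#include <algorithm>
#include <cstddef>

// How many images can safely be processed concurrently under a memory
// budget, given a per-image peak estimate (e.g. ~300 bytes per pixel
// times the pixel count).
std::size_t MaxConcurrentImages(std::size_t budget_bytes,
                                std::size_t per_image_bytes) {
  if (per_image_bytes == 0) return 1;
  return std::max<std::size_t>(1, budget_bytes / per_image_bytes);
}
```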
I've also found it easy to bring my Mac to a screeching halt. Those with fewer than 16GB of RAM will be especially vulnerable.
I've tried Guetzli on a large batch of images, and my macOS machine ended up locking up hard. I presume that's because macOS doesn't fail `malloc`, but instead switches to a combination of compressed RAM and swap, which under such a large memory demand brings performance to a halt. I'm interested in limiting Guetzli's memory use to a percentage of the machine's RAM size (e.g. no more than half of all RAM).
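For the "percentage of RAM" part, total physical memory can be queried at startup. A minimal sketch using `sysctlbyname` on macOS and `sysconf` on Linux (the helper name is mine, not Guetzli's; `_SC_PHYS_PAGES` is a glibc extension rather than strict POSIX):

```cpp
#include <cstddef>
#include <cstdint>
#if defined(__APPLE__)
#include <sys/sysctl.h>
#else
#include <unistd.h>
#endif

// Returns total physical RAM in bytes, or 0 if it cannot be determined.
uint64_t TotalPhysicalRam() {
#if defined(__APPLE__)
  uint64_t mem = 0;
  size_t len = sizeof(mem);
  if (sysctlbyname("hw.memsize", &mem, &len, nullptr, 0) != 0) return 0;
  return mem;
#else
  long pages = sysconf(_SC_PHYS_PAGES);
  long page_size = sysconf(_SC_PAGE_SIZE);
  if (pages <= 0 || page_size <= 0) return 0;
  return static_cast<uint64_t>(pages) * static_cast<uint64_t>(page_size);
#endif
}

// "No more than half of all RAM" then becomes:
//   uint64_t budget = TotalPhysicalRam() / 2;
```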
Will it be reliable enough if it's done heuristically based on the number of pixels in the image?
Is `CacheAligned::Allocate` used for the majority of allocations? Would it be better to put a limit there?
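If most allocations do flow through one choke point, the cap could live there. Below is a rough sketch of such a guard, assuming a `CacheAligned::Allocate`-style entry point; the actual function in Guetzli may have a different signature and alignment:

```cpp
#include <atomic>
#include <cstddef>
#include <cstdint>
#include <new>

// Hypothetical global budget and running total, set once at startup
// (e.g. g_budget = TotalPhysicalRam() / 2).
static std::atomic<std::size_t> g_allocated{0};
static std::size_t g_budget = SIZE_MAX;

// Guarded aligned allocation: fail fast instead of letting the OS
// dig itself into compressed RAM and swap.
void* GuardedAllocate(std::size_t bytes) {
  std::size_t prev = g_allocated.fetch_add(bytes, std::memory_order_relaxed);
  if (prev + bytes > g_budget) {
    g_allocated.fetch_sub(bytes, std::memory_order_relaxed);
    return nullptr;  // caller can report a clear out-of-budget error
  }
  // 64 bytes as a stand-in for cache-line alignment (C++17 aligned new).
  void* p = ::operator new(bytes, std::align_val_t(64), std::nothrow);
  if (p == nullptr) g_allocated.fetch_sub(bytes, std::memory_order_relaxed);
  return p;
}

// A matching free would need to use the aligned operator delete and
// decrement the counter; omitted here for brevity.
```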