multiSnow / mcomix3


Archives with Images of High resolution #142

Closed sakkamade closed 3 years ago

sakkamade commented 3 years ago

What is the problem: Upon attempting to open a .zip archive containing images over 10000 pixels wide and around 6000 pixels tall, the application is utterly unable to handle it and hangs the whole system until it is terminated by the task manager (thankfully).

First, I would like to understand whether this is an issue specific to my particular system. If not, I have one suggestion: inasmuch as this probably won't or cannot be fixed, restrict the program from even trying to load such archives.
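As a rough illustration of that suggestion (not anything mcomix3 actually does), a viewer could sum the uncompressed sizes declared in the zip's central directory before extracting anything, and refuse or warn past some threshold. The limit below is made up for the example:

```python
import zipfile

# Hypothetical limit for the example; mcomix3 has no such setting.
MAX_UNCOMPRESSED_BYTES = 512 * 1024 * 1024  # 512 MiB

def total_uncompressed_size(path):
    """Sum the uncompressed sizes recorded in the zip's central directory."""
    with zipfile.ZipFile(path) as zf:
        return sum(info.file_size for info in zf.infolist())

def should_refuse(path):
    """True if the archive would expand past the limit once decompressed."""
    return total_uncompressed_size(path) > MAX_UNCOMPRESSED_BYTES
```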

I experience this issue using the package from the Arch Linux repository.

Thank you.

sakkamade commented 3 years ago

What is curious is that the archive is only 30MB in size, yet nearly 6 of 8GB of RAM and 4 of 9GB of swap were released upon termination (SIGKILL, to be exact; SIGTERM did not work).

multiSnow commented 3 years ago
  1. Check the total uncompressed size of the images. If, unfortunately, an image is in bmp format, even one image will be over 170MB. It takes time to write the decompressed data to disk (or huge amounts of RAM if the temp directory is in tmpfs);

  2. No matter the format, one image of 10000x6000 pixels will cost at least about 460MB of RAM (10000*6000*8, as the pixbuf is always in RGB colorspace), so only eighteen pages will eat all 8GB of RAM; see the sketch after this list. You may adjust 'Preference -> Advanced -> Maximum number of pages to store in the cache' to reduce the RAM usage of the cache;

  3. It is really difficult to control RAM usage in an application at all times and in every situation. That is why we need the OOM sysrq.
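For reference, here is the arithmetic from point 2 spelled out as a small Python sketch, using the 8-bytes-per-pixel figure quoted above and the image dimensions and page count discussed in this thread. These numbers are only estimates, not measurements of mcomix3 itself:

```python
BYTES_PER_PIXEL = 8  # figure used in point 2 above

def page_bytes(width, height):
    # Memory for one decoded page held in the cache.
    return width * height * BYTES_PER_PIXEL

def cache_bytes(width, height, cached_pages):
    # Memory for all pages kept in the cache at once.
    return page_bytes(width, height) * cached_pages

print(f"one page: {page_bytes(10000, 6000) / 2**20:.0f} MiB")        # ~458 MiB
print(f"18 pages: {cache_bytes(10000, 6000, 18) / 2**30:.1f} GiB")   # ~8.0 GiB
```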

sakkamade commented 3 years ago

Thank you for quick response!

My apologies for providing you with incorrect and insufficient data:


  1. The uncompressed size is 146MiB; the pictures are in .jpg format; the average size is 8MiB; 18 files.
  2. Indeed! I had overlooked this option!
    • Setting the value to 2, I was able to view the archive with comparatively no issues (with 3, the freezes still happen, though not as severe, of course; and with 1, the pictures never seem to load, only the thumbnails);
    • I didn't notice any great improvement upon changing the 'Maximum value of concurrent extraction threads' to 0.

For further information: I am able to preview all images from the compressed archive without problems using a program such as Ark (where each of them is presumably held in RAM or on disk simultaneously), and, among viewers, using QuickViewer (in which case the files are certainly stored on disk).

sakkamade commented 3 years ago

> Setting the value to 2, I was able to view the archive with comparatively no issues

Having set it, however, I have been deprived of the Previous Page shortcuts in Double page mode, i.e. the Pg Up, Up Arrow, and Left Arrow keys.

(Clicking on thumbnails still works.)

multiSnow commented 3 years ago

https://github.com/multiSnow/mcomix3/commit/9c641061904d4c07759a803794dec3c7b1209592 tries to fix the issue with the '1' and '2' values of the cache option.

The 'Maximum value of concurrent extraction threads' option is almost only useful with non-solid 7z and rar archives. Zip archive support is provided by the zipfile Python module, so it has no multi-threading support.
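For illustration, this is roughly what extraction through the stdlib zipfile module looks like: members are decompressed one after another, which is why a thread-count setting cannot speed it up. This is a simplified sketch, not mcomix3's actual extractor code, and the paths are placeholders:

```python
import zipfile

def extract_zip(archive_path, dest_dir):
    """Extract every member of a zip archive sequentially."""
    with zipfile.ZipFile(archive_path) as zf:
        for name in zf.namelist():
            # Members are decompressed one at a time through a single
            # ZipFile handle; extra extraction threads would not help here.
            zf.extract(name, path=dest_dir)

# Example usage with placeholder paths:
# extract_zip("archive.zip", "/tmp/extracted")
```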

sakkamade commented 3 years ago

Upon trying the current master:

sakkamade commented 3 years ago

Thank you!