multiSnow / mcomix3


Bad performance with high-resolution images #159

Closed: sakkamade closed this issue 3 years ago

sakkamade commented 3 years ago

Description

The problem I have noticed is that mcomix3 appears unable to handle very high-resolution image files properly. Loading the next/previous image may take up to ~15 seconds. As of now, it is even slightly worse with zipped (no compression) archives of the same images.

Both zathura (which cannot really act as an image viewer and has no numbering sort*, although it has proven to have some of the best performance) and Geeqie (which cannot open archives) display the same image files very smoothly.

Problems start to be noticeable from about ×7000 px and become annoying after about ×12000 px.

The file size itself does not seem to be a problem here at all.

Questions

  1. Is there a way to improve this somehow?
  2. Could the problem be that the images are cached in RAM? It often takes almost the entire RAM and sometimes even part of the swap (8 GB each).
  3. Why is the RAM cache twice, sometimes three times, as large as the total size of all the images?
  4. Why are they even cached there? (This can be seen in both cases, when viewing archived files and when not.)

Version

I use the master branch at the latest commit, aa630d1478acfaab8ec08a918e4f6faab117d43a.

Extra

As far as I remember, the best performance I could get is with Maximum number of pages in cache set to 2. Increasing this value slightly improves plain image viewing but significantly degrades viewing of archives.

I already opened a very similar issue some time ago, https://github.com/multiSnow/mcomix3/issues/142, but back then I could not even load the image archives at all, while now it is only a comparatively minor performance issue.


*files are sorted as follows: file_10, file_11, file_1, file_2, file_3, etc. (see the natural-sort sketch below for the order I would expect)
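
By "numbering sort" I mean natural ordering of the numeric parts, roughly like this sketch (hypothetical code, just to show the expected order; not code from either viewer):

```python
import re

def natural_key(name):
    # Compare digit runs as integers so file_2 sorts before file_10.
    return [int(part) if part.isdigit() else part.lower()
            for part in re.split(r'(\d+)', name)]

files = ['file_10', 'file_11', 'file_1', 'file_2', 'file_3']
print(sorted(files))                   # file_1, file_10, file_11, file_2, ...
print(sorted(files, key=natural_key))  # file_1, file_2, file_3, file_10, ...
```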

multiSnow commented 3 years ago

1. Neither PIL nor GdkPixbuf provides 'in-place' operations, and both are 'external' Python objects, which means Python may be lazy about collecting their memory.

2. Some operations, such as enhance, automatic background color and high-quality scaling, are only supported by PIL, while mcomix and mcomix3 currently use GdkPixbuf as the image object. This means an image may be converted between PIL and GdkPixbuf again and again (the sketch below shows what such a round trip looks like).
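
A round trip of that kind looks roughly like this (only a sketch, not the actual mcomix3 code):

```python
import gi
gi.require_version('GdkPixbuf', '2.0')
from gi.repository import GdkPixbuf, GLib
from PIL import Image

def pixbuf_to_pil(pixbuf):
    # Copy the raw pixels out of the GdkPixbuf into a brand-new PIL image.
    mode = 'RGBA' if pixbuf.get_has_alpha() else 'RGB'
    return Image.frombytes(
        mode, (pixbuf.get_width(), pixbuf.get_height()),
        pixbuf.get_pixels(), 'raw', mode, pixbuf.get_rowstride())

def pil_to_pixbuf(im):
    # And back again: another full copy of the pixel data.
    im = im.convert('RGBA')
    return GdkPixbuf.Pixbuf.new_from_bytes(
        GLib.Bytes.new(im.tobytes()), GdkPixbuf.Colorspace.RGB,
        True, 8, im.width, im.height, im.width * 4)
```

Each direction copies the whole pixel buffer, which is why a ×12000 px page becomes expensive as soon as any PIL-only operation is involved.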

3. Historically, comix extracted all the contents of an archive to disk. Even though temporary files are used now, this still costs space in disk/RAM/swap (the sketch below contrasts this with reading a page straight from the archive).
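
For comparison, a single page can also be decoded straight from the archive without extracting anything to disk (only a sketch with a made-up file name, not how the mcomix3 extractor works); the decoded image still costs RAM either way:

```python
import io
import zipfile

from PIL import Image

# Sketch: decode one page directly from a (hypothetical) zip archive,
# skipping the temporary directory entirely.
with zipfile.ZipFile('pages.zip') as zf:
    first_page = sorted(zf.namelist())[0]
    with zf.open(first_page) as member:
        image = Image.open(io.BytesIO(member.read()))
        image.load()  # force decoding while the data is still in hand
print(image.size)
```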

4. Neither PIL nor GdkPixbuf seems to be designed for high-resolution, let alone ultra-resolution, images. PIL has a hard-coded limit on the number of pixels in a decoded image, while GdkPixbuf simply raises an error and refuses to process an image beyond a certain resolution.

The 'mage' branch will try to do something about these problems, and maybe a 'low spec' preference will also be added if it is really necessary. But whatever happens, please be aware that the hardware limitation always exists, and neither GdkPixbuf nor mcomix3 is trying to be another AAA professional image and photo processing suite.

sakkamade commented 3 years ago

Sorry, I will reply to every point later, but as for this one:

Historically, comix extracted all the contents of an archive to disk. Even though temporary files are used now, this still costs space in disk/RAM/swap.

Yes, but apparently, even if I set "Temporary directory" to a folder on disk, the images are still cached in RAM, since that directory is only used to extract archives, right? Is it possible to change the "cache directory/place" as well? And even if it is possible, would it change anything?

I am not sure whether this behaviour can be reproduced with any images, but, for example, when I open a folder with such high-res images, mcomix takes three times as much RAM as the folder's size, e.g. the folder is 300 MB and the RAM taken is >1 GB.

Right after I open such a folder, mcomix takes only ~300 MB of RAM, but as I scroll through the images, RAM usage keeps increasing with each image change. (I also tried with the default settings, and I can plainly see it.)

sakkamade commented 3 years ago

1 ...

As I understand it, this only concerns the slow initial loading of images, yet I see image loading being just as slow even when I viewed the image only a moment ago.

2 ...

So this is the likely culprit of the behaviour I described above.

I don't use "auto background color," but by "high quality scale," do you mean scaling a >2000px image down to view it on my HD (~1700px) screen?

3 ...

https://github.com/multiSnow/mcomix3/issues/159#issuecomment-907573227

4 ...

What are the exact numbers? Searching for them superficially, I could not come across them.

The 'mage' branch will try to do something about these problems,

Is it ready to use/test? If so, I will test it tomorrow.

pirate486743186 commented 2 years ago

@multiSnow hm, reopen the issue?

There's a "new" comic book format out there: the whole thing is a single image with normal width and very large height (for example 28000 px). I don't know how widespread that "format" really is, but it exists, so properly supporting very large images is not just a gimmick.

In this case, it actually crashes when maximizing or going fullscreen with "fit width", probably hitting limits in the libraries (they probably didn't like the height).

It would be nice if, at the very least, it didn't crash on fullscreen/fit-width. Technically it should be able to handle it: the libraries probably lazily abort if either dimension exceeds a limit, but here the image actually has a normal width and is not nearly as large as the height alone would imply. I'm not saying to remove the limits; I would not be happy if I accidentally opened a 28000x28000 image. The limit should consider the actual size of the image (something like the sketch below).
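
What I have in mind is a guard on the total pixel count rather than on either dimension alone (just a sketch with a made-up cap, not mcomix3 code); Pillow's own decompression-bomb check, `Image.MAX_IMAGE_PIXELS`, already counts pixels this way:

```python
from PIL import Image

MAX_PIXELS = 100_000_000  # hypothetical cap, about 100 megapixels

def within_limit(path):
    # Image.open only parses the header here; no pixel data is decoded.
    with Image.open(path) as im:
        width, height = im.size
    return width * height <= MAX_PIXELS

# 800 x 28000   = 22.4 megapixels -> accepted, despite the huge height
# 28000 x 28000 = 784 megapixels  -> rejected
```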

@sakkamade The latest version of zathura can view image folders.

sakkamade commented 2 years ago

@pirate486743186, please report your issues to https://sourceforge.net/projects/mcomix/ instead; it looks like the original project has been revived.

Read https://github.com/multiSnow/mcomix3

please use the original mcomix (https://sourceforge.net/projects/mcomix/)


The latest version of zathura can view image folders.

Thanks!