dannyedel / dspdfviewer

Dual-Screen PDF Viewer for latex-beamer
http://dspdfviewer.danny-edel.de
GNU General Public License v2.0
218 stars 27 forks

bad allocation error #140

Closed ghost closed 8 years ago

ghost commented 8 years ago

I'm trying to use the binaries for the latest available version, v1.14 (I've tried older ones and get the same problem), on a Windows 10 64-bit PC. I get a 'bad allocation' error.

Any idea how to solve this? Thanks!

[screenshot: 'bad allocation' error dialog]

dannyedel commented 8 years ago

I will look into it, but I'm CCing @projekter on this since he's the windows expert.

In the meantime, can you try downloading other versions from https://github.com/projekter/dspdfviewer/releases and check if one of them works? This may be a regression.

I see in your screenshot that this is the version v1.13-55-g3af1b8c which is still qt4 based.

In the meantime, we upgraded to qt5 which has better windows compatibility, so it may be worth trying one of those too.

dannyedel commented 8 years ago

Sorry, I overlooked that you already checked that all versions are affected. I'm trying to reproduce this now in a VM.

dannyedel commented 8 years ago

@raissel according to Google, "bad allocation" normally indicates some kind of out-of-memory condition on Windows.

Could you check whether you can correctly open the very small lipsum PDF? If that works, but your large presentation does not, please report how much memory your system has free, how many pages your presentation has, and what screen resolutions you are using.

For example, on a dual-full-HD computer, an estimate of the memory per cached page (the real amount will be higher due to overhead) is:

1920 px × 1080 px × 24 bit × 2 screens = 99,532,800 bit ≈ 11.9 MiB

Additionally, dspdfviewer by default loads the entire PDF file into RAM. If that is not what you want, pass the --cache-to-memory false option or specify it in the configuration file.

Also play around with --prerender-previous-pages 1 --prerender-next-pages 1 in case you're low on memory and want to avoid serious pre-loading.

(Sorry for the generic answer; I'm still downloading the VM image, which takes a while on a 600 kB/s connection...)

ghost commented 8 years ago

It is definitely a memory problem.

I have 8 GB of RAM, 4 of them free when launching the application. My presentation has 98 slides, the resolution is 1920x1080, and the PDF file weighs 135 MB.

I tried with just a few slides and it works perfectly.

Thanks @dannyedel for answering so fast!!

projekter commented 8 years ago

I built and tested the application on Windows 10, so that by itself should not be the issue. I will try to figure out whether I can build a 64-bit version of the application, which could then make use of more than 2 GB of RAM. The presentation that I use twice a week has 380 slides and 3 MB; dspdfviewer loaded with it consumes about 100 MB of RAM at the beginning and slowly increases as more slides are loaded. But this does not lead to an allocation failure at the beginning.

Can you perhaps try to figure out how many pages (i.e., which file size) is the limit by splitting the file into halves? Perhaps the problem is related to some specific page that contains data poppler considers corrupted.

Edit: I looked into building a 64-bit version. This requires recompiling all the external libraries (at the least; some of them may even need modifications if they make extensive use of the underlying architecture), so I cannot quickly offer such a version. It will definitely take some time (which I don't have at the moment).

dannyedel commented 8 years ago

On 28/01/16 14:54, Benjamin Desef wrote:

I will try to figure out whether I can build a 64 bit version of the application, which can then make use of more than 2 GB of RAM.

And in the meantime, on Linux I will try and see whether I can improve the "out of memory" handling. Just reporting bad alloc to the user and dying is certainly not very useful, especially if caused by slides in the cache, which could be safely deleted and re-created.

dannyedel commented 8 years ago

I can now reliably reproduce the issue.

With the following commands, it is currently guaranteed to die somewhere with bad_alloc:

cd build
bash  # spawn a new shell
ulimit -v 786432  # limit to 768M virtual address space - this includes the memory used by qt etc.
./dspdfviewer something-with-many-pages.pdf --prerender-next-pages=100

As a workaround, I currently recommend setting --prerender-next=1 --prerender-previous=1 to minimize the impact of the render threads. The current implementation turns out to be a massive memory hog, since it will start up to (prev+next) renderers at the same time, regardless of how many CPUs you actually have.

I will do my best to improve the render infrastructure. Thanks for pointing me to that bottleneck :)