mpheyse opened this issue 8 years ago
The readme.md says Firefox is a pain when handling files of 350 MB+, even when they're under the 800 MB file size limit. You could try Chrome for this one gallery, or try the torrent. I don't know whether the script could construct the Blob if the file is not compressed. There is a fork that downloads the files into a folder using the same approach: https://github.com/8qwe24657913/E-Hentai-Downloader-NW.js
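For what it's worth, JSZip (at least 3.x) can build the archive without compressing at all; here's a minimal sketch, not the script's actual code, where `imageData1`/`imageData2` stand in for downloaded image buffers:

```js
// Build an uncompressed ("STORE") zip: no deflate CPU cost, but the whole
// archive still has to exist in memory as a single Blob at the end.
const zip = new JSZip();
zip.file('001.jpg', imageData1); // ArrayBuffer or Uint8Array of image bytes
zip.file('002.jpg', imageData2);
zip.generateAsync({ type: 'blob', compression: 'STORE' })
    .then(blob => console.log('archive built, ' + blob.size + ' bytes'));
```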
This is a duplicate of #18 in some ways; you can also see our discussion here.
I understand you; all the problems you mentioned are shortcomings of the project.
It's not a good way (maybe even a foolish one) to zip them in the browser, but for a userscript it's probably the best option. JavaScript in the browser has no access to write files to the HDD; even on Chrome, the File System API is a separate, sandboxed environment, so we can't store files on the HDD directly. And if we asked users to save the images one by one, it wouldn't be friendly for those who haven't set a default download path, since they'd see a lot of "Save As" dialogs.
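For context, this is roughly the only save path a userscript has; a minimal sketch (the helper name is mine, not the script's):

```js
// Hand an in-memory Blob to the browser's download machinery via an object URL.
// Without a default download path configured, each call shows a "Save As" dialog.
function saveBlob(blob, filename) {
    const url = URL.createObjectURL(blob);
    const a = document.createElement('a');
    a.href = url;
    a.download = filename;     // download instead of navigating
    document.body.appendChild(a);
    a.click();
    a.remove();
    URL.revokeObjectURL(url);  // release the URL so the Blob can be collected
}
```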
In fact, I'm not sure the maximum Blob storage size in Firefox is 800 MB; I think it all depends on the free RAM. This script is modified from another script of mine, Minus Downloader (written in 2014 and never published), which downloaded images from minus.com (though that site is down now). Back then I had 4 GB of RAM and usually got an out-of-memory error even though the gallery was less than 200 MB. Now I have 8 GB, and in my latest test the largest single Blob object I could construct was 800 MB - 1 B (799,999,999 bytes; I could still construct smaller Blob objects after that), while the maximum total size of all Blob objects was about 2 GB (mentioned in #18). So I think the amount of RAM is also a limit, though I'm still confused about how Firefox handles Blob and ArrayBuffer.
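Anyone curious can probe their own ceiling from the browser console; a rough sketch, not part of the script, and results will vary with browser and free RAM:

```js
// Try to construct progressively larger Blobs until allocation fails.
// Each iteration's buffer becomes garbage before the next attempt.
function probeBlobLimit(stepMB, maxMB) {
    for (let size = stepMB; size <= maxMB; size += stepMB) {
        try {
            const blob = new Blob([new Uint8Array(size * 1000 * 1000)]);
            console.log(size + ' MB OK (blob.size = ' + blob.size + ')');
        } catch (e) {
            console.log('failed at ' + size + ' MB: ' + e.name);
            return;
        }
    }
}
probeBlobLimit(100, 2000); // step by 100 MB up to 2 GB
```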
Browsers don't run garbage collection immediately, so sometimes the downloaded parts aren't removed from RAM. Here is a way to reduce memory usage, though it doesn't always work: https://github.com/ccloli/E-Hentai-Downloader/wiki/Can't-make-Zip-file-successfully
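The gist is to drop references as early as possible so the collector can reclaim them; a sketch of the idea, with `zip` and `images` as hypothetical stand-ins for the script's internal state:

```js
// Hypothetical cleanup after zipping (zip/images are stand-ins, not the
// script's real variable names).
function releaseAfterZip(state) {
    state.zip = null;        // drop the JSZip instance and its internal buffers
    state.images.length = 0; // drop references to downloaded image ArrayBuffers
    // The browser still frees memory on its own schedule; in Firefox,
    // about:memory -> "Minimize memory usage" forces a collection to verify.
}
```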
If you run into this problem often, try torrents or other tools.
So I'm trying to download an archive that's way too big... I understand that.
So I downloaded just 30 images. That worked (101 MB).
Then I tried several times to get 100 images. That dies with a failed-to-zip error, and I was running out of RAM.
I restarted with a clean run of FF and tried to get 50 images. FF used about 300 MB of RAM and fails at the start of the zip process with an "NS_Out of Memory" error. FF isn't using any more RAM past downloading the last pic, the system isn't out of RAM, and FF is only using 770 MB overall.
I tried both 100 and 50, four times each.
Going back to 30 images per zip, it instantly makes a zip; grabbing the next 30, it instantly gets 20 and starts downloading the 50th. That also works.
Those two archives are only 172 MB total for 60 images; even at double RAM usage that's only 344 MB, which should be well under the 800 MB FF limit.
The four archives (30, 30, 30, 10) that make up the 100 images I originally tried total just 304 MB, which at double RAM is 608 MB, still well under the 800 MB FF limit.
It's downloading all the images fine; it's just JSZip that's dying. I would much rather have the script download and save all 600 images, then zip (or RAR/7z/ACE) them by hand, as downloading twenty little 30-image zip files is annoying.
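A script could in principle do exactly that by saving each image as it arrives instead of zipping; a rough sketch assuming Tampermonkey's GM_download (not what E-Hentai-Downloader currently does, and `imageUrls` is a hypothetical list of resolved image URLs):

```js
// Save each image straight to the default download directory, no JSZip involved.
// With "always ask where to save" disabled, this shows no dialogs.
imageUrls.forEach((url, i) => {
    GM_download({
        url: url,
        name: String(i + 1).padStart(3, '0') + '.jpg', // 001.jpg, 002.jpg, ...
        onerror: err => console.log('download failed for ' + url, err),
    });
});
```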