Closed: sniperct closed this issue 6 months ago
Hmm, yeah, running
webtoon-dl --max-ep=1 "https://www.webtoons.com/en/comedy/mage-and-demon-queen/list?title_no=1438"
gives a 26-page PDF that is 3.3MB. 3.3MB / 26 pages * 3742 pages (how far you got) = ~475MB, much less than your stated limits.
I'll start running this command from my machine and see if I hit similar limits, then investigate further.
An intermediate workaround for you would be to use the --min-ep and --max-ep flags to download multiple PDFs.
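For what it's worth, the back-of-the-envelope size estimate above is easy to double-check with a one-liner (pure arithmetic, no downloads involved):

```shell
# Scale the measured 3.3MB / 26 pages up to the 3742 pages the failed run reached.
awk 'BEGIN { printf "~%.0f MB\n", 3.3 / 26 * 3742 }'
# → ~475 MB
```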
Thanks for looking at it! I'll try that workaround!
@sniperct I'll also add a --eps-per-file option soon to save the hassle of manually running a bunch of commands like
webtoon-dl --min-ep=1 --max-ep=10 "https://www.webtoons.com/en/comedy/mage-and-demon-queen/list?title_no=1438"
webtoon-dl --min-ep=11 --max-ep=20 "https://www.webtoons.com/en/comedy/mage-and-demon-queen/list?title_no=1438"
...
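Until that flag lands, a small shell loop can generate the whole batch of commands. This is just a sketch: LAST_EP is a placeholder for the series' actual episode count, and the loop only echoes the commands rather than running them.

```shell
# Emit chunked webtoon-dl commands in batches of 10 episodes.
# LAST_EP is a placeholder; set it to the real episode count of the series.
LAST_EP=260
URL='https://www.webtoons.com/en/comedy/mage-and-demon-queen/list?title_no=1438'
for start in $(seq 1 10 "$LAST_EP"); do
  end=$((start + 9))
  echo webtoon-dl --min-ep="$start" --max-ep="$end" "$URL"
done
```

Drop the `echo` to actually run each chunk in sequence.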
@sniperct weird, I got up to page 10433/11606 before getting the following error, I think due to poor wifi connection/computer sleeping:
added page 10433/11606
read tcp 192.168.1.7:53267->23.204.115.216:443: read: connection reset by peer
It took 1h51min to get there :( wish I could have checkpointed it as it went! I guess that's what --eps-per-file N will do: write out PDFs incrementally every N episodes, so that even if the whole run fails most of the way through, you can restart at the last saved episode using --min-ep
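The restart described there is just arithmetic on the last complete batch. A sketch, where the episode numbers are placeholders for illustration rather than output from a real run:

```shell
# If the run died after the last complete batch ended at episode 3700,
# resume one episode later. Both numbers below are assumed for illustration.
eps_per_file=10
last_saved_ep=3700
next_min=$((last_saved_ep + 1))
echo "webtoon-dl --min-ep=$next_min --eps-per-file=$eps_per_file '<series url>'"
```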
Anyway, weird that I didn't get the OOM problem. I monitored memory usage periodically and the program was never using more than ~5% of RAM, even past 3700 pages in. Were you possibly running in a resource constrained environment like a docker container or something?
Doing it in pieces seems to have helped, and after checking the initial one I did, breaking it up into smaller chunks is probably better for reading anyway.
I was just running it from a command prompt (both elevated and not), nothing special. I had about 34GB of RAM free (Firefox and WoW taking up the bulk of the used 29GB).
Sweet. I've now released v0.0.5, which defaults to saving a PDF every 10 episodes (--eps-per-file=10), but that number is configurable. The updated README has the latest usage examples.
I'm going to close this issue now, as that should get around the memory issues for the majority of users, but feel free to reopen it or open a new issue if you see the same or new problems with v0.0.5.
Getting the following error while it's adding pages:
runtime: out of memory: cannot allocate 4194304-byte block (1853358080 in use)
fatal error: out of memory
I've got 300GB of free disk space and 64GB of RAM, so I don't think that's actually the issue. I pasted in more of the error in case that helps.
error.txt