Bionus / imgbrd-grabber

Very customizable imageboard/booru downloader with powerful filenaming features.
https://www.bionus.org/imgbrd-grabber/
Apache License 2.0

Download to pause then resume when wifi cuts out then reconnects #2445

Open SvyraCosmix opened 3 years ago

SvyraCosmix commented 3 years ago

Just as the title says: when the internet disconnects, the download stops and (see image) cancels all the remaining files. Wouldn't it make more sense for the download to pause, then resume once the connection is re-established?

Screenshot 2021-08-12 065140

Bionus commented 3 years ago

wouldn't it make more sense for the downloading to pause

Yes, I guess it's possible to detect common internet-loss errors and pause the download rather than just keep failing.

then when the connection is reestablished the downloading would resume

That's tougher. All the Qt classes in 5.15 that provide this kind of information are deprecated and don't seem to work very well. Qt 6 has QNetworkInformation, but Grabber still uses Qt 5.
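The first half of this (detecting "common internet-loss errors") could be done by classifying the error code a failed request reports. A minimal std-only sketch of that idea, with error names mirroring Qt's `QNetworkReply::NetworkError` but with illustrative numeric values and a hypothetical `isConnectivityLoss` helper (this is not Grabber's actual code):

```cpp
#include <cassert>
#include <set>

// Error names mirror Qt's QNetworkReply::NetworkError; the numeric
// values here are illustrative only.
enum NetworkError {
    NoError,
    ConnectionRefusedError,
    RemoteHostClosedError,
    HostNotFoundError,
    TimeoutError,
    TemporaryNetworkFailureError,
    NetworkSessionFailedError,
    ContentNotFoundError, // e.g. a 404: the file is gone, retrying won't help
};

// Returns true when the error suggests the whole connection is down,
// so the downloader should pause instead of cancelling remaining files.
bool isConnectivityLoss(NetworkError e) {
    static const std::set<int> transient = {
        ConnectionRefusedError, RemoteHostClosedError, HostNotFoundError,
        TimeoutError, TemporaryNetworkFailureError, NetworkSessionFailedError,
    };
    return transient.count(e) > 0;
}
```

The key design point is the split: transient transport-level errors pause the whole queue, while content-level errors (like a missing file) only fail that one item.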

yami-no-tusbas commented 3 years ago

Well, couldn't Grabber just "retry" every half-hour or so in that case? Like: "network error detected", put the job on pause, start a 30-minute timer, then try to fetch just one page from the waiting job; if it works, restart, otherwise pause for another round?
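The pause-probe-resume loop described above can be sketched in a few lines of plain C++ (all names here are hypothetical, not Grabber's API). The probe is injected as a callback so the logic can be exercised without a real network; a real build would fetch one small page and sleep ~30 minutes between attempts:

```cpp
#include <functional>

// Sketch of the retry idea: on a network error, pause the job and
// periodically probe until a lightweight request succeeds, then resume.
struct RetryLoop {
    std::function<bool()> probe;   // returns true once the network is back
    int maxAttempts;               // give up after this many probes

    // Returns the number of probes it took to reconnect, or -1 on give-up.
    // A real implementation would sleep ~30 minutes between probes.
    int waitForNetwork() {
        for (int attempt = 1; attempt <= maxAttempts; ++attempt) {
            if (probe()) return attempt;
        }
        return -1;
    }
};
```

For example, a probe that only succeeds on its third call makes `waitForNetwork()` return 3, i.e. the job resumes after the third retry window.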

Bionus commented 3 years ago

I guess it could, but that feels very hacky 🤔

SvyraCosmix commented 3 years ago

I guess it could, but that feels very hacky 🤔

Good thing you didn't make the program to hack people's computers, lmao. In any case, once I get another router it won't be a problem. It's not a huge problem, just a suggestion.

yami-no-tusbas commented 3 years ago

Hacky or not, couldn't this function work? It seems to exist for older Qt: https://doc.qt.io/qt-5/qhostinfo.html If the name lookup fails, we can assume the connection has a problem, especially if it was working before. (Note: I'm not a Qt or C++ dev, I just read the doc and it seems to do what we need.) It looks up a route to a website; if that fails but the website was reachable before, then the connection is down, so pause and retry later. This would solve #1706 too, it's basically the same thing.
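The important subtlety in this heuristic is the "especially if it was working before" part: a lookup failure on a host that never resolved is more likely a bad hostname than an outage. A tiny std-only sketch of that decision (the `ConnectivityGuess` name and `shouldPause` helper are hypothetical; in Qt 5 the actual lookup would go through `QHostInfo::lookupHost` and this logic would sit in its callback):

```cpp
// Sketch of the heuristic: a DNS lookup failure only signals a dropped
// connection if the same host resolved earlier in the session;
// otherwise it may simply be a bad URL and pausing would be wrong.
struct ConnectivityGuess {
    bool resolvedBefore = false;

    // Called after each lookup; returns true when the downloader
    // should pause the queue and retry later.
    bool shouldPause(bool lookupSucceeded) {
        if (lookupSucceeded) {
            resolvedBefore = true;  // remember that this host works
            return false;
        }
        // Failure after an earlier success => likely offline.
        return resolvedBefore;
    }
};
```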

yami-no-tusbas commented 3 years ago

OK, I'm testing it with a download list, disconnecting the internet while the job runs.

  1. First try: the app crashes. Here are the logs: main.log
  2. Retry: the app stopped with an error; without reconnecting the cable, I just pressed resume, and it crashed: main.log

So for the moment 6fb284b4 seems to crash when an error happens.

Now with monitors: I used "start now" to force a refresh, then cut the internet while the list was running and... well, it skips them:

[13:41:45.681][Info] Monitoring new images for 'precure' on 'Gelbooru'
[13:41:45.692][Info] [gelbooru.com][Xml] Loading page `https://gelbooru.com/index.php?page=dapi&s=post&q=index&limit=20&pid=0&tags=precure`
[13:41:45.730][Info] [gelbooru.com][Xml] Receiving page `https://gelbooru.com/index.php?page=dapi&s=post&q=index&limit=20&pid=0&tags=precure`
[13:41:45.730][Error] [gelbooru.com][Xml] Loading error: Network unreachable (99)
[13:41:45.731][Warning] [gelbooru.com] Loading using Xml failed. Retry using Html.
[13:41:45.731][Info] [gelbooru.com][Html] Loading page `https://gelbooru.com/index.php?page=post&s=list&tags=precure&pid=0`
[13:41:45.761][Info] [gelbooru.com][Html] Receiving page `https://gelbooru.com/index.php?page=post&s=list&tags=precure&pid=0`
[13:41:45.761][Error] [gelbooru.com][Html] Loading error: Network unreachable (99)
[13:41:45.762][Warning] [gelbooru.com] No valid source of the site returned result.
[13:41:45.762][Warning] No results for monitor 'precure' on site 'Gelbooru'

But at least it doesn't freeze or crash. And that's fine for monitors, since the next time the timer fires it'll try again and download what was missed. So it's more about the crash on download lists (.igl).

Bionus commented 3 years ago

Strange, I did the very same test and it just kept giving me an error each time I clicked "resume" while offline, with no crash. Maybe it's something about simultaneous downloads?

EDIT: same thing (no crash) even with 5 simultaneous downloads. 🤔

Well it skip them

That's expected, since monitors shouldn't block anything. As long as the "last state" is correct it should be fine I believe.

yami-no-tusbas commented 3 years ago

So, my steps: use Gelbooru as the source, download the first page (20 images), and open the network interface window, ready to deactivate the network adapter. I set simultaneous downloads to one for the experiment. Launch the download; an error happens; I can resume or cancel, and I choose resume. The error loops and it can't continue (which is fine, the network is down at that point). Reconnect the network, press resume, and it's fine. Hum 🤔

Now trying with 10 simultaneous downloads (I changed the limit to 200 images because 10 at a time goes fast), same procedure. Stopping it in the middle of a file: crash! I couldn't even try to resume; the program hung and hard-crashed. Here is the log, but I see nothing wrong in it: main.log