Closed mananas77 closed 3 years ago
I believe I'm narrowing in on this issue. If you wouldn't mind running a follow-up test for me, I can verify that I'm on the right path. Can you run this again exactly the way you described, but go into settings under "core" and change the multi-part download threshold to 10 GB? Let me know if it runs correctly after doing this.
Yes, you are right: with the multi-part download threshold set to 10 GB, it went all the way to completion. I ran two tests in the same manner with the new threshold.
First test:
ERROR: Failed to extract due to: Unsupported domain
ERROR: Failed to extract due to: Unknown error occurred
ERROR: Failed to extract due to: Unsuccessful response from server
Second test:
1. Here I wanted to end the download midway, but I was not allowed to stop or terminate it. It just kept going.
2. I shut down the software and started it up again.
3. I tried to run the unfinished downloads again to get the remaining 500 posts, but only got this output:
I just did a third, longer test with more subreddits, and it was a real pain in the butt.
After about 22 minutes the progress bar went missing.
When FFmpeg started to do its thing, the computer was unusable because of all the popups. There were certainly more than a thousand, and every couple of minutes I had to make manual decisions like this one:
Describe the bug
The software always becomes stuck in a loop and never completes when I try to download all the posts of a subreddit.
It seems the software encounters some sort of download error that it can't recover from and then gets stuck in a loop. I've waited more than 18 hours without any progress at all.
At this point I'm unable to perform any action related to "Stop Download" and "Terminate Download".
When I finally exit the software, it's still running in the background and I have to terminate it in Task Manager.
If I start again and try to download or "Run unfinished", it finishes in 3 seconds without downloading any new files. At this point I'm locked out of downloading any remaining posts and have to delete the database to get another chance.
It also seems that the URL extraction part works while my download is running, and something gets dumped into the database. But the database doesn't seem to track whether a URL has actually been downloaded; it just tells the software "if the URL is in the database, don't download it".
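My suspicion about the dedup logic above could be sketched roughly like this. All table, column, and function names here are hypothetical; I have not read the app's actual source or schema, this is just an illustration of the difference between "URL exists in the database" and "URL was actually downloaded":

```python
import sqlite3

# Hypothetical schema: URLs are inserted during extraction, and a flag
# should record whether the download actually completed.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE urls (url TEXT PRIMARY KEY, downloaded INTEGER DEFAULT 0)")

def record_extracted(url):
    # Extraction phase: the URL is inserted as soon as it is discovered,
    # before any download has happened.
    conn.execute("INSERT OR IGNORE INTO urls (url) VALUES (?)", (url,))

def should_download_buggy(url):
    # Suspected behavior: mere presence in the table blocks a retry,
    # even if the file was never saved (e.g. the download failed).
    row = conn.execute("SELECT 1 FROM urls WHERE url = ?", (url,)).fetchone()
    return row is None

def should_download_fixed(url):
    # Safer check: only skip URLs whose download actually completed.
    row = conn.execute(
        "SELECT 1 FROM urls WHERE url = ? AND downloaded = 1", (url,)
    ).fetchone()
    return row is None

# A URL that was extracted but whose download failed midway:
record_extracted("https://example.com/a.jpg")
print(should_download_buggy("https://example.com/a.jpg"))  # False: skipped forever
print(should_download_fixed("https://example.com/a.jpg"))  # True: retried next run
```

If the real logic resembles the "buggy" check, it would explain why "Run unfinished" finishes in seconds without downloading anything: every failed URL is already present in the database and gets skipped.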
Environment Information
Steps to reproduce my latest encounter with the behavior:
1. Start with a fresh build of the latest software version (remember to remove your old config and database).
2. Go to settings and add Imgur credentials (or else you get more errors).
3. Add a new subreddit like "aww", or try a couple that have many images and videos.
4. Use these settings on the subreddit and leave everything else at the default.
For all but the most trivial of issues, please attach the latest log file.
DownloaderForReddit.log