Closed: Eugeoter closed this issue 8 months ago.
waifuc does NOT have any exit like this.
So the only explanation is that your OS killed the process.
Just check what is going on in your OS.
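A minimal sketch of one way to check this, assuming the crawler can be launched as a standalone script (`crawl.py` here is hypothetical): run it as a subprocess and inspect the exit status, since a process killed by the OS (e.g. the Linux OOM killer sending SIGKILL) dies from a signal that no try-except inside the program can catch.

```python
import signal
import subprocess

# Run the crawler as a child process so we can see how it dies.
# "crawl.py" is a hypothetical script wrapping your waifuc pipeline.
proc = subprocess.run(["python", "crawl.py"])

if proc.returncode < 0:
    # On Linux, a negative return code means the child died from a signal;
    # -9 (SIGKILL) is the classic signature of the kernel OOM killer.
    # `dmesg | grep -i "killed process"` can confirm an OOM kill.
    sig = signal.Signals(-proc.returncode)
    print(f"process was killed by signal {sig.name}")
else:
    print(f"process exited normally with code {proc.returncode}")
```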
@Eugeoter wait, another possibility.
You said you started another download task; can you describe what that is exactly?
The other possibility is: if this "another download task" is danbooru, and too fast, the danbooru pagination API may go down by returning empty data, and the waifuc process will stop (certainly without any error or exception, because an empty pagination is quite normal when all the images have run out).
Can you find `DanbooruSource`'s code and add one print of what the pagination API returns?
NOTE: do NOT try to make danbooru unhappy in any form if you don't want your IP to get banned forever 🤣
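For reference, a minimal sketch of the kind of debug print meant here, assuming the pagination boils down to a GET against Danbooru's `posts.json` endpoint (`fetch_page` is a hypothetical helper, not waifuc's actual code):

```python
import requests

def fetch_page(tags: str, page: int):
    # Hypothetical stand-in for the pagination call inside DanbooruSource;
    # Danbooru's public pagination endpoint is /posts.json.
    resp = requests.get(
        'https://danbooru.donmai.us/posts.json',
        params={'tags': tags, 'page': page, 'limit': 200},
    )
    data = resp.json()
    # The suggested debug print: an empty list here looks exactly like
    # "all images ran out", so the crawl stops without any error.
    print(f'page={page} status={resp.status_code} items={len(data)}')
    return data
```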
Thank you for your early reply! This 'another download task' is just an ordinary task like downloading a game (not from danbooru). So I think your former answer sounds more plausible, though I can't figure out why my OS kills the program. Anyway, thank you for your amazing project, and I hope it becomes better and better~
@Eugeoter you are welcome. If you have any further findings or clues, feel free to reopen this issue. 😄
The program usually terminates abnormally without ANY error message (it looks hidden?); even a try-except block cannot capture it. This occurs while downloading, especially when another download task is started during crawling (unstable network). Every time this happens, I have to restart crawling and request the already-downloaded sources again...
Here is the error message.
I'd appreciate it if you could help me~