evilhero / mylar

An automated Comic Book downloader (cbr/cbz) for use with SABnzbd, NZBGet and torrents
GNU General Public License v3.0

error preventing anymore downloads after restart #2340

Closed: phairplay closed this issue 3 years ago

phairplay commented 5 years ago

Hi, every time I restart, my wanted list is checked and the first issue is sent for downloading; then the following error appears, which seems to stop any further comics from being searched.

2019-08-30 06:42:43 ERROR   Uncaught exception: Traceback (most recent call last):
File "C:\Mylar\mylar\logger.py", line 337, in new_run
old_run(*args, **kwargs)
File "C:\Python27\lib\threading.py", line 763, in run
self.__target(*self.__args, **self.__kwargs)
File "C:\Mylar\mylar\helpers.py", line 3147, in search_queue
ss_queue = mylar.search.searchforissue(item['issueid'])
File "C:\Mylar\mylar\search.py", line 1855, in searchforissue
foundNZB, prov = search_init(ComicName, IssueNumber, str(IssueYear), SeriesYear, Publisher, IssueDate, StoreDate, actissueid, AlternateSearch, UseFuzzy, ComicVersion, SARC=SARC, IssueArcID=IssueArcID, mode=mode, rsscheck=rsscheck, ComicID=ComicID, filesafe=Comicname_filesafe, allow_packs=allow_packs, oneoff=oneoff, manual=manual, torrentid_32p=TorrentID_32p, digitaldate=DigitalDate, booktype=booktype)
File "C:\Mylar\mylar\search.py", line 359, in search_init
findit = NZB_SEARCH(ComicName, IssueNumber, ComicYear, SeriesYear, Publisher, IssueDate, StoreDate, searchprov, send_prov_count, IssDateFix, IssueID, UseFuzzy, newznab_host, ComicVersion=ComicVersion, SARC=SARC, IssueArcID=IssueArcID, RSS="no", ComicID=ComicID, issuetitle=issuetitle, unaltered_ComicName=unaltered_ComicName, allow_packs=allow_packs, oneoff=oneoff, cmloopit=cmloopit, manual=manual, torznab_host=torznab_host, torrentid_32p=torrentid_32p, digitaldate=digitaldate, booktype=booktype)
File "C:\Mylar\mylar\search.py", line 1113, in NZB_SEARCH
parsed_comic = p_comic.listFiles()
File "C:\Mylar\mylar\filechecker.py", line 118, in listFiles
runresults = self.parseit(self.dir, self.file)
File "C:\Mylar\mylar\filechecker.py", line 1062, in parseit
elif 'XCV' in alt_issue:
TypeError: argument of type 'NoneType' is not iterable
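
For what it's worth, the line the traceback points at (filechecker.py line 1062) does a membership test on alt_issue, which is evidently None here. A minimal sketch of the kind of guard that would avoid the crash (a hypothetical helper, not Mylar's actual code):

def has_xcv_marker(alt_issue):
    # alt_issue may be None when filename parsing finds no alternate issue,
    # so check for that before the 'in' test to avoid
    # "TypeError: argument of type 'NoneType' is not iterable".
    return alt_issue is not None and 'XCV' in alt_issue

print(has_xcv_marker(None))        # False instead of a TypeError
print(has_xcv_marker('XCV 2019'))  # True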

Here is the full log:

Timestamp   Level   Message
2019-08-30 06:42:46 INFO    [POST-PROCESSING] Post-Processing completed for: House of X #3
2019-08-30 06:42:46 INFO    [POST-PROCESSING][NOTIFIER] PushOver notifications sent.
2019-08-30 06:42:46 INFO    [POST-PROCESSING][UPDATER] Updating Status (Downloaded) now complete for House of X issue: 3
2019-08-30 06:42:46 INFO    [POST-PROCESSING][UPDATER] Setting status to Downloaded in history.
2019-08-30 06:42:45 INFO    [POST-PROCESSING][DIRECTORY-CHECK] Found comic directory: \\phair-server\Comics\Mylar\House of X (2019)
2019-08-30 06:42:45 INFO    [POST-PROCESSING] [1/1] Starting Post-Processing for House of X issue: 3
2019-08-30 06:42:45 INFO    [DUPECHECK] Duplication detection returned no hits. This is not a duplicate of anything that I have scanned in as of yet.
2019-08-30 06:42:45 INFO    [DUPECHECK] Duplicate check for C:\Users\User\Downloads\Comics\House.of.X.03.of.06.2019.Digital.Zone-Empire
2019-08-30 06:42:45 INFO    [POST-PROCESSING] issuenzb found.
2019-08-30 06:42:45 INFO    [PPINFO-POST-PROCESSING-ATTEMPT] {'publisher': None, 'comicname': u'House of X', 'issueid': u'717519', 'comiclocation': None, 'sarc': None, 'issuenumber': u'3', 'oneoff': None, 'comicid': u'120309'}
2019-08-30 06:42:45 INFO    Starting postprocessing for : House.of.X.03.of.06.2019.Digital.Zone-Empire
2019-08-30 06:42:45 INFO    ComicRN.py version: 1.01 -- autoProcessComics.py version: 2.04
2019-08-30 06:42:45 INFO    [API] Api Call from ComicRN detected - initiating script post-processing.
2019-08-30 06:42:43 ERROR   Uncaught exception: Traceback (most recent call last):
File "C:\Mylar\mylar\logger.py", line 337, in new_run
old_run(*args, **kwargs)
File "C:\Python27\lib\threading.py", line 763, in run
self.__target(*self.__args, **self.__kwargs)
File "C:\Mylar\mylar\helpers.py", line 3147, in search_queue
ss_queue = mylar.search.searchforissue(item['issueid'])
File "C:\Mylar\mylar\search.py", line 1855, in searchforissue
foundNZB, prov = search_init(ComicName, IssueNumber, str(IssueYear), SeriesYear, Publisher, IssueDate, StoreDate, actissueid, AlternateSearch, UseFuzzy, ComicVersion, SARC=SARC, IssueArcID=IssueArcID, mode=mode, rsscheck=rsscheck, ComicID=ComicID, filesafe=Comicname_filesafe, allow_packs=allow_packs, oneoff=oneoff, manual=manual, torrentid_32p=TorrentID_32p, digitaldate=DigitalDate, booktype=booktype)
File "C:\Mylar\mylar\search.py", line 359, in search_init
findit = NZB_SEARCH(ComicName, IssueNumber, ComicYear, SeriesYear, Publisher, IssueDate, StoreDate, searchprov, send_prov_count, IssDateFix, IssueID, UseFuzzy, newznab_host, ComicVersion=ComicVersion, SARC=SARC, IssueArcID=IssueArcID, RSS="no", ComicID=ComicID, issuetitle=issuetitle, unaltered_ComicName=unaltered_ComicName, allow_packs=allow_packs, oneoff=oneoff, cmloopit=cmloopit, manual=manual, torznab_host=torznab_host, torrentid_32p=torrentid_32p, digitaldate=digitaldate, booktype=booktype)
File "C:\Mylar\mylar\search.py", line 1113, in NZB_SEARCH
parsed_comic = p_comic.listFiles()
File "C:\Mylar\mylar\filechecker.py", line 118, in listFiles
runresults = self.parseit(self.dir, self.file)
File "C:\Mylar\mylar\filechecker.py", line 1062, in parseit
elif 'XCV' in alt_issue:
TypeError: argument of type 'NoneType' is not iterable
2019-08-30 06:42:43 INFO    no errors on data retrieval...proceeding
2019-08-30 06:42:42 INFO    Pausing for 60 seconds before continuing to avoid hammering
2019-08-30 06:42:42 INFO    Shhh be very quiet...I'm looking for Justice League issue: 24 (2019) using NZBHydra (newznab).
2019-08-30 06:42:42 INFO    [SEARCH-QUEUE] Now loading item from search queue: {'comicname': u'Justice League', 'issueid': u'709167', 'seriesyear': u'2018', 'booktype': u'Print', 'issuenumber': u'24', 'comicid': u'111428'}
2019-08-30 06:42:37 INFO    [UPDATER] Updated the status (Snatched) complete for House of X Issue: 3
2019-08-30 06:42:37 INFO    [UPDATER] Updating status to snatched
2019-08-30 06:42:37 INFO    setting the alternate nzbname for this download grabbed by NZBHydra (newznab) in the nzblog to : HouseofX03of062019DigitalZone-Empire
2019-08-30 06:42:37 INFO    setting the nzbid for this download grabbed by NZBHydra (newznab) in the nzblog to : -1422323859855797251
2019-08-30 06:42:37 INFO    mylar.COMICINFO: [{'IssueID': u'717519', 'pack_numbers': None, 'IssueNumber': u'3', 'nzbtitle': u'House of X 03 (of 06) (2019) (Digital) (Zone-Empire)', 'nzbid': u'-1422323859855797251', 'size': '45.5 MB', 'newznab': (u'NZBHydra', u'https://myownip.com/nzbhydra', u'1', u'ab00y7qye6u84lx4eqhwd0yh1wp423', u'7030', u'1'), 'ComicName': u'House of X', 'provider': u'NZBHydra', 'ComicID': u'120309', 'ComicVolume': None, 'modcomicname': u'House of X', 'link': u'http://myownip.com/nzbhydra/getnzb/api/-1422323859855797251?apikey=ab00y7qye6u84lx4eqhwd0yh1wp423', 'IssueDate': u'2019-10-01', 'kind': 'usenet', 'torznab': None, 'SARC': None, 'oneoff': False, 'pack_issuelist': None, 'IssueArcID': None, 'comyear': '2019', 'nzbprov': 'newznab', 'tmpprov': u'NZBHydra (newznab)', 'pack': False}]
2019-08-30 06:42:37 INFO    Successfully sent nzb to NZBGet!
2019-08-30 06:42:35 INFO    filen: HouseofX03of062019DigitalZone-Empire -- nzbname: House.of.X.03.of.06.2019.Digital.Zone-Empire are not identical. Storing extra value as : HouseofX03of062019DigitalZone-Empire
2019-08-30 06:42:35 INFO    [FAILED_DOWNLOAD_CHECKER] Successfully marked this download as Good for downloadable content
2019-08-30 06:42:35 INFO    prov : NZBHydra (newznab)[-1422323859855797251]
2019-08-30 06:42:35 INFO    oneoff: False
2019-08-30 06:42:35 INFO    IssueID: 717519
2019-08-30 06:42:35 INFO    nzbid: -1422323859855797251
2019-08-30 06:42:35 INFO    Found House of X (2019) #3 using NZBHydra (newznab)
2019-08-30 06:42:35 INFO    no errors on data retrieval...proceeding
2019-08-30 06:42:22 INFO    Pausing for 60 seconds before continuing to avoid hammering
2019-08-30 06:42:22 INFO    Shhh be very quiet...I'm looking for House of X issue: 3 (2019) using NZBHydra (newznab).
2019-08-30 06:41:52 INFO    Could not find Issue 3 of House of X (2019) using NZBHydra [api]
2019-08-30 06:41:52 INFO    no errors on data retrieval...proceeding
2019-08-30 06:41:50 INFO    Pausing for 60 seconds before continuing to avoid hammering
2019-08-30 06:41:50 INFO    Shhh be very quiet...I'm looking for House of X issue: 3 (2019) using NZBHydra (newznab).
2019-08-30 06:41:50 INFO    [SEARCH-QUEUE] Now loading item from search queue: {'comicname': u'House of X', 'issueid': u'717519', 'seriesyear': u'2019', 'booktype': u'Print', 'issuenumber': u'3', 'comicid': u'120309'}
2019-08-30 06:41:45 INFO    Completed Queueing API Search scan
2019-08-30 06:41:45 INFO    adding: ComicID:118536 IssueiD: 717533
2019-08-30 06:41:45 INFO    adding: ComicID:111428 IssueiD: 699376
2019-08-30 06:41:45 INFO    adding: ComicID:111428 IssueiD: 707492
2019-08-30 06:41:45 INFO    adding: ComicID:111428 IssueiD: 709167
2019-08-30 06:41:45 INFO    adding: ComicID:120309 IssueiD: 717519
2019-08-30 06:41:45 INFO    Initiating check to add Wanted items to Search Queue....
2019-08-30 06:41:45 INFO    [SEARCH] Running Search for Wanted.
2019-08-30 06:41:45 INFO    Background Schedulers successfully started...
2019-08-30 06:41:45 INFO    Firing up the Background Schedulers now....
2019-08-30 06:41:45 INFO    [WEEKLY] Checking for existance of Weekly Comic listing...
2019-08-30 06:41:45 INFO    [POST-PROCESS-QUEUE] Succesfully started Post-Processing Queuer....
2019-08-30 06:41:45 INFO    [POST-PROCESS-QUEUE] Post Process queue enabled & monitoring for api requests....
2019-08-30 06:41:45 INFO    [SEARCH-QUEUE] Successfully started the Search Queuer...
2019-08-30 06:41:45 INFO    [SEARCH-QUEUE] Attempting to background load the search queue....
2019-08-30 06:41:45 INFO    DB Updater sccheduled to fire every 5 minutes
2019-08-30 06:41:45 INFO    Mylar is up to date
2019-08-30 06:41:44 INFO    Version information: development [9fb9ab4d732fad1c181df54e419975e5ec823d2a]
2019-08-30 06:41:44 ERROR   ['C:\Program' is not recognized as an internal or external command, 
operable program or batch file. 
] Unable to find git with command: C:\Program Files\Git\bin\git.exe rev-parse HEAD
2019-08-30 06:41:43 INFO    Starting Mylar on http://0.0.0.0:8090/mylar/
2019-08-30 06:41:43 INFO    [DIRECTORY-CHECK] Found comic directory: C:\Mylar
2019-08-30 06:41:42 INFO    Sucessfully ordered 75 series in your watchlist.
2019-08-30 06:41:42 INFO    Remapping the sorting to allow for new additions.
2019-08-30 06:41:42 INFO    Successfully discovered local IP and locking it in as : 192.168.1.141
2019-08-30 06:41:42 INFO    Correcting Null entries that make the main page break on startup.
2019-08-30 06:41:42 INFO    Ensuring DB integrity - Removing all Erroneous Comics (ie. named None)
2019-08-30 06:41:42 INFO    Populating Custom Exception listings into Mylar....
2019-08-30 06:41:42 INFO    Populating Base Exception listings into Mylar....
2019-08-30 06:41:42 INFO    Checking to see if the database has all tables....
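
As an aside, the ERROR near the bottom of the log ("'C:\Program' is not recognized as an internal or external command") is the usual Windows problem of running a path containing a space through the shell as a single string; passing the executable and its arguments as a list avoids the split. A small sketch, reusing the git path shown in the log (illustrative only, not Mylar's version-check code):

import subprocess

# Passing the command as a list means no shell splitting happens on the
# space in "Program Files".
git_exe = r'C:\Program Files\Git\bin\git.exe'
commit = subprocess.check_output([git_exe, 'rev-parse', 'HEAD'])
print(commit.decode().strip())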
barbequesauce commented 5 years ago

What OS are you running, and which version of Windows? What branch and commit of Mylar are you on? Can you recreate this error with a debug log rather than normal logging?

Unfortunately, there is not enough information here to help you... that's why we have a new issue template.

leaderdog commented 5 years ago

I think the biggest problem here is that Mylar needs an ignore-error-and-resume option. Any error cripples Mylar, which is unfortunate since it is such a great program: one small error and the whole program stops.

Mylar needs a place to collect specific errors so the user knows what the problem is, and to put the offending item on an ignore list so it can keep working. Failing that, it could force a restart when it stops responding, but if it keeps choking on the same file or issue, Mylar just runs and does nothing for days until the user realizes something is wrong.

Something like Sonarr, which puts an issue banner at the top to let you know something isn't set up correctly or that a file didn't finish the way it expected. You know exactly which file didn't work, so you can go check it, delete it, or rename and move it yourself.

It would also be a lot less work for you devs, because there would be fewer issues like this thread: we would know what we need to look at to fix the problem.
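
A rough sketch of the ignore-and-continue behaviour described above, assuming a worker that drains a queue of wanted issues (the names process_queue and search_one_issue are hypothetical; this is not Mylar's actual queue code):

import logging
import queue

logger = logging.getLogger('search_queue_sketch')

def process_queue(search_queue, search_one_issue):
    # Drain the queue, isolating failures per item so one bad entry
    # cannot stall the whole run.
    failed = []
    while True:
        try:
            item = search_queue.get_nowait()
        except queue.Empty:
            break
        try:
            search_one_issue(item)
        except Exception:
            # Log the problem item and carry on instead of letting the
            # exception kill the worker thread.
            logger.exception('Search failed for %s; skipping it', item)
            failed.append(item)
    return failed

The returned list of failed items is the sort of thing that could feed a "something needs your attention" banner like Sonarr's.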

I've had Mylar run for days with no clue anything was wrong; I reboot it and it immediately downloads some files, then stops and does nothing. It shows it is trying to do things, but it doesn't download anything, even issues that are definitely updated in Mylar/ComicVine and available on several of my NZB indexers.

This has been happening for pretty much a year or more.

Windows 7, all experimental builds for the past year, and I'm pretty sure the latest version of Python, since I kept updating it hoping that was the problem.

Out of curiosity, I was thinking of moving Mylar to a different location and removing and reinstalling everything related to it, in the hope that things would work correctly, but I'm fairly sure it's the random errors that kill the program without us knowing there's a problem.

Inpacchi commented 5 years ago

I share your sentiment that any error encountered during a search, a download, or post-processing cripples Mylar, which is why I've been working on the program for the past couple of days. I've made a few commits on my fork and have a pull request open, which you can check out to see if it works for you.

I've had quite a few comics cripple Mylar for me, but since I've been making these changes everything has been golden. Today I added a bunch of new series and my wanted list was well over 1,500 issues. Normally I'd come back and it would still be at 1,500 because some error had locked the search queue, and I'd have to keep restarting the application until it hit the error again. I'm happy to say there are only 500 issues remaining on my wanted list, which is phenomenal: it means 1,000 issues processed without a hitch.

So, like I said, give my changes a try and see if they work for you.