Closed Teknogrebo closed 3 years ago
Usenet crawler is redirecting your request due to it being invalid. Have you tried using 'https://usenet-crawler.com' instead of what you have?
You also need to make sure you're running at least Python 2.7.9, and that your API key is entered correctly.
I have tried the url you suggest, and tried http, rather than https. I get too many redirects no matter which one I choose. I use usenet crawler with couchpotato and sonarr without problem, so I'm not sure why it's an issue for mylar? Is there a way I can set the redirect limit for an experiment?
Usenet-crawler is https only now, so that's the only viable option. There also isn't an issue with mylar and Usenet-crawler, so it seems for some reason there's a problem with something on your system.
Is your Python greater than 2.7.9? Can you also switch Mylar to the development branch? It's the most up to date, and although nothing has changed recently with regard to newznab entries, it doesn't hurt.
Okay, that's interesting. My python version is 2.7.6 which is the latest for my ubuntu version. I also have python3, but mylar really doesn't play well with that :) I will try and install a later version of python2 and see if that helps.
I tried the development branch first, but it was having problems even adding a new series - I found a couple of places where exceptions weren't handled due to my setup being incorrect (possibly related to what is causing me to have issues with master branch)
Yeah the problems adding a series in the development branch was due to not using python 2.7.9+ as well.
If you remember where those unhandled exceptions occurred due to your setup, I'd like to know if possible, so that I can add in the exception handling and trap them accordingly for future use.
I've upgraded python to 2.7.10 and I'm still getting the error. Having reverted to the dev branch I now get this message on startup which I suspect has something to do with my issue:
.mylar/lib/requests/packages/urllib3/util/ssl_.py:90: InsecurePlatformWarning: A true SSLContext object is not available. This prevents urllib3 from configuring SSL appropriately and may cause certain SSL connections to fail. For more information, see https://urllib3.readthedocs.org/en/latest/security.html#insecureplatformwarning. InsecurePlatformWarning
I think I upgraded ndg-httpsclient as per the instructions on the page shown, but I still get the error. Any ideas?
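For what it's worth, a quick way to check whether a given interpreter has the modern SSL stack that urllib3 wants (the thing InsecurePlatformWarning complains about) is to look for SSLContext and SNI support. This is a generic sketch, not Mylar code:

```python
import ssl

# Builds with a modern SSL stack (Python 2.7.9+ or any 3.x) expose
# SSLContext and SNI support; older builds are what trigger
# InsecurePlatformWarning in requests/urllib3.
print("SSLContext available:", hasattr(ssl, "SSLContext"))
print("SNI support:", getattr(ssl, "HAS_SNI", False))
```

Run it with the exact interpreter the daemon uses; if either prints False, that interpreter is the likely culprit.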
As for unhandled exceptions - here is the first one:

Uncaught exception: Traceback (most recent call last):
  File ".mylar/mylar/logger.py", line 158, in new_run
    old_run(*args, **kwargs)
  File "/usr/lib/python2.7/threading.py", line 763, in run
    self.__target(*self.__args, **self.__kwargs)
  File "/home/tom/.mylar/mylar/importer.py", line 456, in addComictoDB
    os.remove(coverfile)
OSError: [Errno 2] No such file or directory: '/.mylar/cache/18927.jpg'
The code doesn't check whether the coverfile exists at all - I fixed this the first time by checking that the file size is > 0.
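A defensive version of that removal, along the lines of the size check described, might look like this (illustrative sketch only - the function name is made up, and this isn't Mylar's actual code):

```python
import os

def remove_cover_safely(coverfile):
    # Remove a cached cover image only if it actually exists and is
    # non-empty; a missing or zero-byte file (e.g. from a failed add)
    # is skipped instead of raising OSError as in the traceback above.
    if os.path.isfile(coverfile) and os.path.getsize(coverfile) > 0:
        os.remove(coverfile)
        return True
    return False
```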
For your newznab entry for usenet-crawler in the configuration, make sure to enable verify ssl. Save the config and restart and you should be good to go after that point. There's no need to upgrade ndg-httpsclient if you're running python > 2.7.7 ;)
Also make sure that you're running mylar against 2.7.10 and not your other version. If you go into the configuration of mylar on the very first tab it will tell you which version of python is running against mylar.
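One way to see which interpreter a given invocation actually resolves to (a generic sketch, not Mylar-specific) is to ask Python itself; run this with the same command your init script uses to start Mylar:

```python
import sys

# Print the binary and version this process is actually running under;
# if the daemon reports 2.7.6 while your shell reports 2.7.10, the two
# invocations are resolving to different binaries.
print(sys.executable)
print(".".join(str(n) for n in sys.version_info[:3]))
```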
Almost every series on CV has an image, is this one that actually doesn't exist or is it because of a failed attempt at an add series (which would have failed if your python wasn't up to spec for mylar). That being said, I'll still look at the cover file error since it does the files size check during the add series in the latest development build.
verify ssl is selected, but you are correct. it is showing that I am running python 2.7.6. If I do "python --version" it says 2.7.10 so what should I do to persuade mylar to use the correct version? I've tried setting PYTHON_BIN in /etc/default/mylar to /usr/bin/python2.7 (which verifies as version 2.7.10), but it's convinced it's 2.7.6 still.
The image does exist as I could retrieve it using the url it was using - it also works when I add the series when using the master branch.
This might sound stupid, but in my experience, rebooting the machine will cause the running/reported Python to update; there seems to be something that changes at boot time, although I know not what. Simply restarting a Python process (such as Mylar) won't trigger the update and will continue to allow the previous version to linger. Ymmv.
Haha. I can't believe you just told me to turn it off and on again :)
As I'm at the "I'll try anything" phase - I tried! Didn't work though.
How are you starting up mylar - directly from the cli or as a daemon/init.d script?
If you start mylar from the cli (in the mylar directory) and instead of typing python type in the full path to the new python binary it should work (ie. /usr/bin/python2.7 mylar.py -v)
I know, I couldn't believe I was giving that advice, but as stupid as it sounds, it seems to work for Python for me...
I am using the init.d script at the moment. I get the feeling that I may have messed something up with python as if I run python2.7 mylar.py -v I get messages about the sqlite module being missing :/
That is weird, sqlite3 comes with Python by default. Are you running it as '/usr/bin/python2.7 Mylar.py -v'? (It is case-sensitive.)
If you try just typing 'whereis python' it will tell you the locations it exists at. Then entering the actual command (ie. /usr/bin/python2.7) will get you into a Python console; from there, if you type 'import sqlite3' it will tell you if it has problems loading - if it doesn't give an error then it's installed fine.
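The same check can be taken one step further from that console; a minimal sketch that exercises sqlite3 end to end rather than just importing it:

```python
import sqlite3

# If this runs without an ImportError, the interpreter's sqlite3
# bindings (the _sqlite3 C module) were built and installed correctly.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE t (x INTEGER)")
conn.execute("INSERT INTO t VALUES (1)")
print(conn.execute("SELECT x FROM t").fetchone()[0])  # prints 1
conn.close()
```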
If you're running it from the init.d then you need to change the python location path in the init.d to your new python location.
whereis python:
python: /usr/bin/python2.7 /usr/bin/python3.4 /usr/bin/python3.4m /usr/bin/python /etc/python2.7 /etc/python3.4 /etc/python /usr/lib/python2.7 /usr/lib/python3.4 /usr/bin/X11/python2.7 /usr/bin/X11/python3.4 /usr/bin/X11/python3.4m /usr/bin/X11/python /usr/local/bin/python2.7 /usr/local/bin/python2.7-config /usr/local/bin/python /usr/local/lib/python2.7 /usr/local/lib/python3.4 /usr/share/python /usr/share/man/man1/python.1.gz
ls -l /usr/bin/python lrwxrwxrwx 1 root root 18 Mar 15 12:07 /usr/bin/python -> /usr/bin/python2.7
>>> import sqlite3
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/usr/local/lib/python2.7/sqlite3/__init__.py", line 24, in <module>
    from dbapi2 import *
  File "/usr/local/lib/python2.7/sqlite3/dbapi2.py", line 28, in <module>
    from _sqlite3 import *
ImportError: No module named _sqlite3
I installed 2.7.10 by building from source and doing "make install" - I don't know if that has anything to do with it. Is it possible to purge Python completely and start again? I've definitely messed things up, as CouchPotato won't start now either.
OK, did you do the typical 'configure / make / make install'?
You can't really delete the Python that came with your OS, as it's pretty integral to the OS - removing it is a recipe for disaster. You should be able to remove the Python you just installed though (from within the source directory, if you still have it: sudo make uninstall). Or if you only did the make install, do the configure && make && make install to complete the installation.
I did do configure/make/install etc. I've done another build after using "configure --enable-loadable-sqlite-extensions" (thank you stackoverflow - how did we survive without you?). Now if I run with "python Mylar.py" it runs and reports 2.7.10. If I do "service start mylar" it reports 2.7.6 still. So we have progress and a fix of sorts. I would prefer running all these things as services though.
I've tried adding a series again and have this in the logs now:

Uncaught exception: Traceback (most recent call last):
  File "/home/tom/.mylar/mylar/logger.py", line 158, in new_run
    old_run(*args, **kwargs)
  File "/usr/local/lib/python2.7/threading.py", line 763, in run
    self.__target(*self.__args, **self.__kwargs)
  File "/home/tom/.mylar/mylar/importer.py", line 549, in addComictoDB
    issuedata = updateissuedata(comicid, comic['ComicName'], issued, comicIssues, calledfrom, SeriesYear=SeriesYear, latestissueinfo=latestissueinfo)
  File "/home/tom/.mylar/mylar/importer.py", line 1195, in updateissuedata
    weeklyissue_check = annual_check(comicname, SeriesYear, comicid, issuetype, issuechk, weeklyissue_check)
  File "/home/tom/.mylar/mylar/importer.py", line 1597, in annual_check
    sresults, explicit = mb.findComic(annComicName, mode, issue=None)  #, explicit=True)
  File "/home/tom/.mylar/mylar/mb.py", line 132, in findComic
    searched = pullsearch(comicapi, comicquery, 0, explicit, type)
  File "/home/tom/.mylar/mylar/mb.py", line 79, in pullsearch
    dom = parseString(r.content)  #(data)
  File "/usr/local/lib/python2.7/xml/dom/minidom.py", line 1928, in parseString
    return expatbuilder.parseString(string)
  File "/usr/local/lib/python2.7/xml/dom/expatbuilder.py", line 940, in parseString
    return builder.parseString(string)
  File "/usr/local/lib/python2.7/xml/dom/expatbuilder.py", line 223, in parseString
    parser.Parse(string, True)
ExpatError: syntax error: line 1, column 0
Uncaught exception: Traceback (most recent call last):
  File "/home/tom/.mylar/mylar/logger.py", line 158, in new_run
    old_run(*args, **kwargs)
  File "/usr/local/lib/python2.7/threading.py", line 763, in run
    self.__target(*self.__args, **self.__kwargs)
  File "/home/tom/.mylar/mylar/updater.py", line 270, in dbUpdate
    cchk = mylar.importer.addComictoDB(ComicID, mismatch, annload=annload)
  File "/home/tom/.mylar/mylar/importer.py", line 541, in addComictoDB
    issued = cv.getComic(comicid, 'issue')
  File "/home/tom/.mylar/mylar/cv.py", line 130, in getComic
    searched = pulldetails(id, 'issue', None, 0, islist)
  File "/home/tom/.mylar/mylar/cv.py", line 103, in pulldetails
    dom = parseString(r.content)  #(data)
  File "/usr/local/lib/python2.7/xml/dom/minidom.py", line 1928, in parseString
    return expatbuilder.parseString(string)
  File "/usr/local/lib/python2.7/xml/dom/expatbuilder.py", line 940, in parseString
    return builder.parseString(string)
  File "/usr/local/lib/python2.7/xml/dom/expatbuilder.py", line 223, in parseString
    parser.Parse(string, True)
ExpatError: syntax error: line 1, column 0
You just need to update the python_path location in your /etc/init.d/mylar file to the location of your python now.
The expat error is usually due to either hitting your api limit at comicvine (if you didn't specify one, you need to get your own - it's free), or you're running mylar still against the old version of python (was the log taken when you ran it as a service)
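Those ExpatError tracebacks happen when parseString is handed a body that isn't XML at all (e.g. an HTML error page after the API limit is hit). A hedged sketch of guarding the parse - the function name is hypothetical, not Mylar's actual code:

```python
from xml.dom.minidom import parseString
from xml.parsers.expat import ExpatError

def parse_api_response(content):
    # Return a parsed DOM, or None when the response body is not
    # well-formed XML (such as a plain-text or HTML rate-limit page),
    # instead of letting the ExpatError kill the worker thread.
    try:
        return parseString(content)
    except ExpatError:
        return None
```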
/usr/bin/python (the default path, which I haven't changed) is 2.7.10 - it has to be said, Mylar seems to be grabbing a version of Python from the ether when running as a service :) The log was taken when it was reporting 2.7.10 though, and ComicVine reports:
Current API Usage
You have used 2 requests in the last hour for API Path '/search' (reset in 51 minutes)
You have used 2 requests in the last hour for API Path '/volume' (reset in 51 minutes)
You have used 1 requests in the last hour for API Path '/issues' (reset in 51 minutes)
Your request rate is fine
Thanks for answering all my questions though, I appreciate it - I'm a developer myself and if somebody at work hammered me all day with problems in the way that I am with you, I would probably start ignoring them :)
I think I managed to corrupt the database or something, as I removed my single series and tried re-adding it, and after a couple of attempts I succeeded. It's now searching, and not getting the too-many-redirects message; however, I did get this after it found an entry:
Uncaught exception: Traceback (most recent call last):
  File "/home/tom/.mylar/mylar/logger.py", line 158, in new_run
    old_run(*args, **kwargs)
  File "/usr/local/lib/python2.7/threading.py", line 763, in run
    self.__target(*self.__args, **self.__kwargs)
  File "/home/tom/.mylar/mylar/search.py", line 1580, in searchforissue
    foundNZB, prov = search_init(comic['ComicName'], result['Issue_Number'], str(ComicYear), comic['ComicYear'], Publisher, IssueDate, StoreDate, result['IssueID'], AlternateSearch, UseFuzzy, ComicVersion, SARC=None, IssueArcID=None, mode=mode, rsscheck=rsscheck, ComicID=result['ComicID'], filesafe=comic['ComicName_Filesafe'])
  File "/home/tom/.mylar/mylar/search.py", line 254, in search_init
    findit = NZB_SEARCH(ComicName, IssueNumber, ComicYear, SeriesYear, Publisher, IssueDate, StoreDate, searchprov, send_prov_count, IssDateFix, IssueID, UseFuzzy, newznab_host, ComicVersion=ComicVersion, SARC=SARC, IssueArcID=IssueArcID, ComicID=ComicID, issuetitle=issuetitle, unaltered_ComicName=unaltered_ComicName)
  File "/home/tom/.mylar/mylar/search.py", line 1455, in NZB_SEARCH
    searchresult = searcher(nzbprov, nzbname, comicinfo, entry['link'], IssueID, ComicID, tmpprov, newznab=newznab_host)
  File "/home/tom/.mylar/mylar/search.py", line 2067, in searcher
    send_to_nzbget = server.append(nzbpath, str(mylar.NZBGET_CATEGORY), int(nzbgetpriority), True, nzbcontent64)
  File "/usr/local/lib/python2.7/xmlrpclib.py", line 1240, in __call__
    return self.__send(self.__name, args)
  File "/usr/local/lib/python2.7/xmlrpclib.py", line 1599, in __request
    verbose=self.__verbose
  File "/usr/local/lib/python2.7/xmlrpclib.py", line 1280, in request
    return self.single_request(host, handler, request_body, verbose)
  File "/usr/local/lib/python2.7/xmlrpclib.py", line 1328, in single_request
    response.msg,
ProtocolError:
Can you run mylar in verbose mode (or enable verbose in log Gui), and try doing the search again?
I need to see the lead up to the error to understand what it's trying to send to nzbget, or if something is malformed in the request.
Also, don't worry about the questions - I don't mind at all. I also kinda knew what I was getting into when I started this :)
Can you paste in your init.d script? Mylar should only take the one version of Python every time, so if it's taking different versions at different times then it's probably some config setting.
Uncaught exception: Traceback (most recent call last):
  File "/home/tom/.mylar/mylar/logger.py", line 158, in new_run
    old_run(*args, **kwargs)
  File "/usr/local/lib/python2.7/threading.py", line 763, in run
    self.__target(*self.__args, **self.__kwargs)
  File "/home/tom/.mylar/mylar/webserve.py", line 1313, in queueissue
    foundcom, prov = search.search_init(ComicName, ComicIssue, ComicYear, SeriesYear, Publisher, issues['IssueDate'], storedate, IssueID, AlternateSearch, UseAFuzzy, ComicVersion, mode=mode, ComicID=ComicID, manualsearch=manualsearch, filesafe=ComicName_Filesafe)
  File "/home/tom/.mylar/mylar/search.py", line 254, in search_init
    findit = NZB_SEARCH(ComicName, IssueNumber, ComicYear, SeriesYear, Publisher, IssueDate, StoreDate, searchprov, send_prov_count, IssDateFix, IssueID, UseFuzzy, newznab_host, ComicVersion=ComicVersion, SARC=SARC, IssueArcID=IssueArcID, ComicID=ComicID, issuetitle=issuetitle, unaltered_ComicName=unaltered_ComicName)
  File "/home/tom/.mylar/mylar/search.py", line 1455, in NZB_SEARCH
    searchresult = searcher(nzbprov, nzbname, comicinfo, entry['link'], IssueID, ComicID, tmpprov, newznab=newznab_host)
  File "/home/tom/.mylar/mylar/search.py", line 2067, in searcher
    send_to_nzbget = server.append(nzbpath, str(mylar.NZBGET_CATEGORY), int(nzbgetpriority), True, nzbcontent64)
  File "/usr/local/lib/python2.7/xmlrpclib.py", line 1240, in __call__
    return self.__send(self.__name, args)
  File "/usr/local/lib/python2.7/xmlrpclib.py", line 1599, in __request
    verbose=self.__verbose
  File "/usr/local/lib/python2.7/xmlrpclib.py", line 1280, in request
    return self.single_request(host, handler, request_body, verbose)
  File "/usr/local/lib/python2.7/xmlrpclib.py", line 1328, in single_request
    response.msg,
ProtocolError:

2016-03-15 20:27:21 DEBUG link given by: newznab
2016-03-15 20:27:21 INFO Found Doktor Sleepless (2007) issue: 1 using usenet crawler (newznab)
2016-03-15 20:27:21 DEBUG issues match!
2016-03-15 20:27:21 DEBUG Successfully changed permissions [0777 / 0660]
2016-03-15 20:27:21 DEBUG Cache Directory successfully found at : /home/tom/.mylar/cache. Ensuring proper permissions.
2016-03-15 20:27:21 DEBUG [FILENAME] end nzbname: Doktor.Sleepless.001.2007.digital-Empire
2016-03-15 20:27:21 DEBUG [FILENAME] nzbname (\s): Doktor Sleepless 001 2007 digital-Empire
2016-03-15 20:27:21 DEBUG [FILENAME] filename (remove chars): Doktor Sleepless 001 2007 digital-Empire
2016-03-15 20:27:20 INFO Download URL: https://usenet-crawler.com/api?apikey=##&t=get&id=9bd13eaa147248b6f08dc2e5288467f2 [VerifySSL:True]
2016-03-15 20:27:20 DEBUG payload:{'apikey': '##', 't': 'get', 'id': '9bd13eaa147248b6f08dc2e5288467f2'}
2016-03-15 20:27:20 DEBUG [newznab] link: https://www.usenet-crawler.com/getnzb/9bd13eaa147248b6f08dc2e5288467f2.nzb&i=189741&r=##
2016-03-15 20:27:20 DEBUG nzbname used for post-processing:Doktor.Sleepless.001.2007.digital-Empire
Did you specify a protocol for your nzbget host in the mylar configuration? (ie. http://localhost)
Yup. It's exactly that.
Can you paste in exactly what you have for your nzbget settings from the config.ini?
Sorry about the delay. I didn't get a notification that you had responded - still, real life and all that...
[NZBGet]
nzbget_host = http://localhost
nzbget_port = 7003
nzbget_username =
nzbget_password =
nzbget_category = comics
nzbget_priority = Default
nzbget_directory = ""
Have you tried using a host address other than localhost (the ip itself or 0.0.0.0)?
I don't know which made the difference, but I changed localhost to 0.0.0.0 and updated to the latest version, and it managed to pass a search result to nzbget, which duly downloaded it. However, Mylar didn't pick up on the download and I had to do a manual scan to import the comic into Mylar's db. Do I need to set up a post-process action in nzbget to move the download over to Mylar, or should Mylar fetch it?
Well, localhost means that it only listens on that one network interface (localhost) for connections, whereas 0.0.0.0 listens on all of your network interfaces, so Mylar's connection to nzbget can get through. That might have something to do with it.
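If it helps to narrow this kind of thing down, here's a small generic reachability check (nothing Mylar-specific; host and port are whatever nzbget is configured to listen on):

```python
import socket

def port_open(host, port, timeout=2.0):
    # True if a TCP connection to host:port succeeds - a quick way to
    # verify that nzbget's API port is reachable from the machine and
    # address Mylar is using to talk to it.
    try:
        sock = socket.create_connection((host, port), timeout=timeout)
        sock.close()
        return True
    except OSError:
        return False
```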
Did you set up the comicRN.py scripts within nzbget and enable post-processing within Mylar? Mylar won't process anything that you download unless you have the 3 scripts set up for your download client (comicRN.py, autoProcessComics.py, and autoProcessComics.cfg - the last one requires editing for proper usage). They are all found in the post-processing/ subdirectory in the root of Mylar.
You can also setup folder monitoring to monitor a given folder for comics. Point mylar to the directory and set a delay to watch the folder and as long as the series are in your watch list, mylar will post-process them accordingly.
I had enabled post processing, but not enabled the folder watch option, so enabling that has got it importing everything. I guess the previous issues that I was having meant that I hadn't got that far in the setup process. So that's great, everything is searching, downloading and post processing correctly. The next issue I have is importing my existing comics...
I've managed to import a couple of series, although I think if you have "move" selected and they are already in the correct place, then it fails silently without actually importing. I just got this as well:

Uncaught exception: Traceback (most recent call last):
  File "/home/tom/.mylar/mylar/logger.py", line 158, in new_run
    old_run(*args, **kwargs)
  File "/usr/local/lib/python2.7/threading.py", line 763, in run
    self.__target(*self.__args, **self.__kwargs)
  File "/home/tom/.mylar/mylar/webserve.py", line 3022, in preSearchit
    logger.info(comicinfo['ComicID'])
TypeError: string indices must be integers
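That TypeError ("string indices must be integers") is the classic symptom of a value arriving as a plain string where a dict was expected. A guard along these lines would tolerate both shapes (purely illustrative - comic_id_of is a made-up helper, not Mylar code):

```python
def comic_id_of(comicinfo):
    # Tolerate comicinfo being either a dict (the expected shape) or a
    # bare string id (the shape that triggers the TypeError above,
    # since 'abc'['ComicID'] indexes a string with a string).
    if isinstance(comicinfo, dict):
        return comicinfo.get('ComicID')
    return comicinfo
```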
The 'move' silently failing would be a normal type of occurrence. Mylar assumes that your import folder is different than the final destination folder (your ComicLocation). If they're the same, then Mylar will scan everything in prior to moving, and then it can't delete the files since the move had already failed.
Basically it's a recipe for disaster - the safest method is to have your import folder separate from your Comic Location folder. That way Mylar can manage both locations (removing from the old location as it moves to the new location), with much fewer problems. Even having an import folder within the Comic Location folder would be ok, because it's only going to get scanned in once (ie. your ComicLocation = /Comics, your import folder = /Comics/ImportMe).
I should add in a check/catch for when users are importing from their actual Comic Location folder, since, as I said above, that will cause issues.
The actual error you posted at the end is a problem with the importer, I'm looking into why it's happening (I haven't been able to duplicate it as of yet, but I know others have had it occur so I'm trying to figure out why exactly)
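That check/catch could be as simple as comparing the resolved paths before starting the import - a sketch under the rules described above (the same folder is refused, while a subfolder inside ComicLocation is fine); the helper name is hypothetical:

```python
import os

def safe_to_import(import_dir, comic_location):
    # Refuse an import when the import folder *is* the ComicLocation
    # itself (the move-then-delete failure mode described above). A
    # subfolder inside ComicLocation is still acceptable.
    return os.path.realpath(import_dir) != os.path.realpath(comic_location)
```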
Having a check with a failure message would be okay I guess as at least the user would know what they need to do to get it to work. The problem was that I had an existing collection from using mylar before and I just wanted to rebuild the mylar database without having to move everything out of its current location only for it to be moved back. Other applications I use for other media can import an existing library without having to move it, so it would be ideal if Mylar could do that as well.
Hi. I had Mylar set up and running a year or so ago but stopped using it. I have decided to give it another try, but I'm having a few problems. So far I have been unable to get anything from my wanted list, as I keep getting this in my logs:

2016-03-15 08:51:59 WARNING Error fetching data from Usenet crawler (newznab): Exceeded 30 redirects.
2016-03-15 08:51:57 INFO Download URL: https://www.usenet-crawler.com/api?apikey=######################&t=get&id=170ed49fac410caa362527061a25f283 [VerifySSL:False]

If I paste the URL directly into a browser it downloads an nzb file, so it would seem to be the way Mylar is retrieving it that is the problem. I am using the latest version from the master branch. If you could provide any insight, that would be great. Thanks.