evilhero / mylar

An automated Comic Book downloader (cbr/cbz) for use with SABnzbd, NZBGet and torrents
GNU General Public License v3.0

Issues Finding and Downloading NZBs #2160

Closed yangjustinc closed 3 years ago

yangjustinc commented 5 years ago

Hi there, I've been using Mylar for a few months now without issue until I decided to reinstall it from scratch on my Bytesized Hosting AppBox. I previously had no issues with adding new comics to Mylar but now I've found that it no longer finds NZBs from any of my indexers.

Looking in the logs, I noticed this error message that I never received before I reinstalled Mylar:

Uncaught exception: Traceback (most recent call last):
  File "/home/hd17/yangjustinc/apps/mylar/mylar/logger.py", line 337, in new_run
    old_run(*args, **kwargs)
  File "/usr/lib/python2.7/threading.py", line 754, in run
    self.__target(*self.__args, **self.__kwargs)
  File "/home/hd17/yangjustinc/apps/mylar/mylar/helpers.py", line 3007, in search_queue
    ss_queue = mylar.search.searchforissue(item['issueid'])
  File "/home/hd17/yangjustinc/apps/mylar/mylar/search.py", line 2141, in searchforissue
    foundNZB, prov = search_init(ComicName, IssueNumber, str(IssueYear), SeriesYear, Publisher, IssueDate, StoreDate, actissueid, AlternateSearch, UseFuzzy, ComicVersion, SARC=SARC, IssueArcID=IssueArcID, mode=mode, rsscheck=rsscheck, ComicID=ComicID, filesafe=Comicname_filesafe, allow_packs=allow_packs, oneoff=oneoff, manual=manual, torrentid_32p=TorrentID_32p)
  File "/home/hd17/yangjustinc/apps/mylar/mylar/search.py", line 336, in search_init
    findit = NZB_SEARCH(ComicName, IssueNumber, ComicYear, SeriesYear, Publisher, IssueDate, StoreDate, searchprov, send_prov_count, IssDateFix, IssueID, UseFuzzy, newznab_host, ComicVersion=ComicVersion, SARC=SARC, IssueArcID=IssueArcID, RSS="no", ComicID=ComicID, issuetitle=issuetitle, unaltered_ComicName=unaltered_ComicName, allow_packs=allow_packs, oneoff=oneoff, cmloopit=cmloopit, manual=manual, torznab_host=torznab_host, torrentid_32p=torrentid_32p)
  File "/home/hd17/yangjustinc/apps/mylar/mylar/search.py", line 623, in NZB_SEARCH
    bb = ww.wwt_connect()
  File "/home/hd17/yangjustinc/apps/mylar/mylar/wwt.py", line 49, in wwt_connect
    cf_cookievalue, cf_user_agent = s.get_tokens(newurl, user_agent=mylar.CV_HEADERS['User-Agent'])
  File "/home/hd17/yangjustinc/apps/mylar/lib/cfscrape/__init__.py", line 177, in get_tokens
    resp = scraper.get(url, **kwargs)
  File "/home/hd17/yangjustinc/apps/mylar/lib/requests/sessions.py", line 477, in get
    return self.request('GET', url, **kwargs)
  File "/home/hd17/yangjustinc/apps/mylar/lib/cfscrape/__init__.py", line 64, in request
    resp = super(CloudflareScraper, self).request(method, url, *args, **kwargs)
  File "/home/hd17/yangjustinc/apps/mylar/lib/requests/sessions.py", line 465, in request
    resp = self.send(prep, **send_kwargs)
  File "/home/hd17/yangjustinc/apps/mylar/lib/requests/sessions.py", line 573, in send
    r = adapter.send(request, **kwargs)
  File "/home/hd17/yangjustinc/apps/mylar/lib/requests/adapters.py", line 431, in send
    raise SSLError(e, request=request)
SSLError: hostname 'worldwidetorrents.me' doesn't match either of '*.parkingcrew.net', 'parkingcrew.net'

I've taken a look through past issues and changed all the indexers so that they don't use HTTPS, but it still doesn't work. Is it possible that the error above is preventing NZBs from being retrieved from the indexers? I can confirm that I can find the correct NZBs when I search the indexer websites directly.

I really apologise if this is unintelligible! Happy to take on suggestions on how I can help you help me. Thank you so much in advance!
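
A quick way to reproduce the failure outside Mylar is a minimal sketch like the one below, assuming the requests library is available on the box; the URL is the hostname from the traceback above:

    # Minimal sketch to reproduce the error outside Mylar, assuming the
    # requests library is installed; the hostname is the one from the traceback.
    import requests

    try:
        requests.get('https://worldwidetorrents.me', timeout=10)
    except requests.exceptions.SSLError as exc:
        # Expected to print the same "hostname ... doesn't match ... parkingcrew.net" message
        print(exc)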

Lusiphur commented 5 years ago

"/home/hd17/yangjustinc/apps/mylar/lib/requests/adapters.py", line 431, in send raise SSLError(e, request=request) SSLError: hostname 'worldwidetorrents.me' doesn't match either of '*.parkingcrew.net', 'parkingcrew.net'``

From the first Google result:

http://docs.python-requests.org/en/master/community/faq/

What are “hostname doesn’t match” errors? These errors occur when SSL certificate verification fails to match the certificate the server responds with to the hostname Requests thinks it’s contacting. If you’re certain the server’s SSL setup is correct (for example, because you can visit the site with your browser) and you’re using Python 2.7, a possible explanation is that you need Server-Name-Indication.

Server-Name-Indication, or SNI, is an official extension to SSL where the client tells the server what hostname it is contacting. This is important when servers are using Virtual Hosting. When such servers are hosting more than one SSL site they need to be able to return the appropriate certificate based on the hostname the client is connecting to.

Python3 and Python 2.7.9+ include native support for SNI in their SSL modules. For information on using SNI with Requests on Python < 2.7.9 refer to this Stack Overflow answer.

What version of Python is on your cloud box? If it's less than 2.7.9, you'll need to tweak it as described above.
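
A small sketch for checking that, assuming it's run with the same interpreter Mylar uses (ssl.HAS_SNI is only present on builds with native SNI support):

    # Quick check of the interpreter version and whether the ssl module can send SNI.
    import ssl
    import sys

    print(sys.version)                     # anything below 2.7.9 likely lacks native SNI
    print(getattr(ssl, 'HAS_SNI', False))  # True means the ssl module supports SNI
    # On Python < 2.7.9 the usual workaround (per the Stack Overflow answer the
    # FAQ points to) is: pip install pyopenssl ndg-httpsclient pyasn1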

evilhero commented 5 years ago

Basically, worldwidetorrents changed their domain name and the old one is now held by someone else (a parking domain). The new worldwidetorrents domain name is in the development branch, but it looks like you're on the master branch, so you'd have to switch to development to pick up the change and get things working. We expect to merge dev into master by week's end; we're currently working through a few outstanding bugs which we think we squashed earlier today, but we need to confirm that before it gets released into the master branch.
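
If you're not sure which branch your checkout is on, a rough sketch like this can tell you (it assumes the install is a git clone with git on the PATH; the path is the one from the traceback above):

    # Rough sketch to see which branch the Mylar checkout is on; assumes the
    # install is a git clone and git is available.
    import subprocess

    MYLAR_DIR = '/home/hd17/yangjustinc/apps/mylar'  # path taken from the traceback
    branch = subprocess.check_output(
        ['git', '-C', MYLAR_DIR, 'rev-parse', '--abbrev-ref', 'HEAD'])
    print(branch.decode().strip())  # 'master' means this checkout doesn't have the new domain yet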

Or, as an alternative, you could use Jackett to handle the tracker instead of Mylar's built-in provider, and just set up your Jackett instance as a torznab provider within Mylar.
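
Before pointing Mylar at it, you can sanity-check the Jackett side by querying its torznab endpoint directly. A minimal sketch, where the host, port, indexer slug, and API key are placeholders for your own setup (9117 is Jackett's usual default port):

    # Minimal sketch for querying a Jackett torznab endpoint directly; the URL,
    # indexer slug, and API key below are placeholders for your own Jackett setup.
    import requests

    JACKETT_URL = 'http://localhost:9117/api/v2.0/indexers/all/results/torznab/api'
    params = {
        't': 'search',             # standard torznab search function
        'apikey': 'YOUR_API_KEY',  # from the Jackett web UI
        'q': 'comic name 001',     # whatever issue you're testing with
    }
    resp = requests.get(JACKETT_URL, params=params, timeout=30)
    print(resp.status_code)
    print(resp.text[:500])  # torznab returns an RSS/XML feed of results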

yangjustinc commented 5 years ago

@Lusiphur @evilhero Thank you both so much for answering me so quickly! In that case, it sounds like the Python version isn't the problem (though I will separately confirm this with my provider). I'll hang tight for the new master branch release and manually retrieve comics in the interim. Thank you so much to you both -- I hope you are having an enjoyable start to your new year!