evilhero / mylar

An automated Comic Book downloader (cbr/cbz) for use with SABnzbd, NZBGet and torrents
GNU General Public License v3.0

Keeps downloading same torrent multiple times (no switch from wanted->snatched) #1646

Closed: Lasborg closed this issue 7 years ago

Lasborg commented 7 years ago

Just updated to the latest build (running on my synology)

Mylar Version: development -- git build 33b6de987b0c395a3efbd670b69fe023a6d7cce3. I am using 32p legacy mode.

When Mylar grabs a torrent it does not switch the issue status from Wanted to Snatched, so it keeps downloading the same torrent over and over again.

evilhero commented 7 years ago

Do you have any notifications enabled within Mylar for "on snatch"? Currently, if the notifier fires off and encounters an error, it will not finish the process and change the status.

Not sure if that's the case here - if you could get debug logs showing the snatch and what follows it, that would help a lot more, as at the moment I can't duplicate this.
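For context, the "on snatch" notifier switches live in config.ini, one per notification agent; when everything is disabled they look like this (key names as they appear in a standard Mylar config):

prowl_onsnatch = 0
nma_onsnatch = 0
pushover_onsnatch = 0
boxcar_onsnatch = 0
pushbullet_onsnatch = 0
telegram_onsnatch = 0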

Lasborg commented 7 years ago

No notifications, and a local watch dir download. Right now I have no logs, but I will get some when it happens again.

Lasborg commented 7 years ago

Is there a way I can send the log file to you by email, maybe?

evilhero commented 7 years ago

Sure, email it (I'll remove the addy after I get the email because of spam).

Lasborg commented 7 years ago

Mail sent

evilhero commented 7 years ago

Ok, according to that log, once it finds a match it's not even trying to send the torrent. So most likely it's something you have configured that's causing it to get confused, or it's unable to determine what to do based on what you have in the config. Either way, a fix probably needs to go into Mylar, only I don't know where it's erroring as it's not logging that part at all.

Can you paste in the config.ini you're using (remove the API keys or blank them out)? There's something amiss somewhere, as local watchdir should work as expected - I tested it just a few days ago for something else and it was working fine at that point.

Lasborg commented 7 years ago

[General]
config_version = 6
dbchoice = sqlite3
dbuser = ""
dbpass = ""
dbname = ""
dynamic_update = 4
comicvine_api = xxxxxxxxxxxxxxxxx
cvapi_rate = 2
cv_verify = 1
http_port = 9500
http_host = 0.0.0.0
http_username = lasborg
http_password = xxxxxxxxxxxxx
http_root = /
enable_https = 0
https_cert = ""
https_key = ""
https_chain = ""
https_force_on = 0
host_return = ""
api_enabled = 0
api_key = ""
launch_browser = 0
auto_update = 0
log_dir = /usr/local/mylar/var/logs
max_logsize = 1000000
git_path = /usr/local/git/bin/git
cache_dir = /usr/local/mylar/var/cache
annuals_on = 1
cv_only = 1
cv_onetimer = 1
check_github = 1
check_github_on_startup = 1
check_github_interval = 360
git_user = evilhero
git_branch = development
destination_dir = /volume1/downloads/comics
multiple_dest_dirs = ""
create_folders = 1
delete_remove_dir = 0
enforce_perms = 1
chmod_dir = 0777
chmod_file = 0777
chowner = ""
chgroup = ""
usenet_retention = 1500
alt_pull = 2
search_interval = 360
nzb_startup_search = 0
add_comics = 0
comic_dir = ""
blacklisted_publishers = None
imp_move = 0
imp_rename = 0
imp_metadata = 0
enable_check_folder = 0
download_scan_interval = 5
folder_scan_log_verbose = 0
check_folder = ""
interface = default
dupeconstraint = filesize
ddump = 0
duplicate_dump = ""
pull_refresh = 2017-06-04 17:03:00
autowant_all = 0
autowant_upcoming = 1
preferred_quality = 0
comic_cover_local = 0
correct_metadata = 0
move_files = 0
rename_files = 0
folder_format = $Series ($Year)
setdefaultvolume = 0
file_format = $Series $Issue ($Year)
blackhole_dir = ""
replace_spaces = 0
replace_char = .
zero_level = 0
zero_level_n = none
lowercase_filenames = 0
ignore_havetotal = 0
snatched_havetotal = 0
syno_fix = 0
allow_packs = 0
search_delay = 1
grabbag_dir = /volume1/downloads/comics
highcount = 0
read2filename = 0
send2read = 0
maintainseriesfolder = 0
tab_enable = 0
tab_host = ""
tab_user = ""
tab_pass = ""
tab_directory = ""
storyarcdir = 0
copy2arcdir = 0
arc_folderformat = $arc ($spanyears)
arc_fileops = copy
use_minsize = 0
minsize = ""
use_maxsize = 0
maxsize = ""
add_to_csv = 1
cvinfo = 0
log_level = 0
enable_extra_scripts = 0
extra_scripts = ""
enable_snatch_script = 0
snatch_script = ""
enable_pre_scripts = 0
pre_scripts = ""
post_processing = 0
post_processing_script = ""
file_opts = move
weekfolder = 0
weekfolder_loc = ""
weekfolder_format = 0
locmove = 0
newcom_dir = ""
fftonewcom_dir = 0
enable_meta = 0
cbr2cbz_only = 0
ct_tag_cr = 1
ct_tag_cbl = 1
ct_cbz_overwrite = 0
unrar_cmd = None
cmtag_volume = 1
cmtag_start_year_as_volume = 0
update_ended = 0
indie_pub = 75
biggie_pub = 55
upcoming_snatched = 1
enable_rss = 1
rss_checkinterval = 20
rss_lastrun = 2017-06-04 18:44:20
failed_download_handling = 1
failed_auto = 1
provider_order = 0, tpse, 1, 32p
nzb_downloader = 2
torrent_downloader = 0

[Torrents]
enable_torrents = 1
auto_snatch = 0
auto_snatch_script = ""
local_torrent_pp = 0
minseeds = 0
torrent_local = 1
local_watchdir = /volume1/downloads/Torrent
torrent_seedbox = 0
seedbox_host = ""
seedbox_port = ""
seedbox_user = ""
seedbox_pass = ""
seedbox_watchdir = ""
enable_torrent_search = 1
enable_tpse = 0
tpse_proxy = ""
tpse_verify = True
enable_32p = 1
search_32p = 0
mode_32p = 0
passkey_32p = xxxxxxxxxxxxxxxxxxxxxxxxx
rssfeed_32p = https://32pag.es/feeds.php?feed=torrents_all&user=xxxxxx&auth=xxxxxxxxxxxxxxx&passkey=xxxxxxxxxxxx&authkey=xxxxxxxxxxxxxxxx
username_32p = ""
password_32p = ""
verify_32p = 1
snatchedtorrent_notify = 0
rtorrent_host = ""
rtorrent_authentication = basic
rtorrent_rpc_url = ""
rtorrent_ssl = 0
rtorrent_verify = 0
rtorrent_ca_bundle = ""
rtorrent_username = ""
rtorrent_password = ""
rtorrent_startonload = 0
rtorrent_label = ""
rtorrent_directory = ""

[SABnzbd]
sab_host = http://http:
sab_username = ""
sab_password = ""
sab_apikey = ""
sab_category = ""
sab_priority = Default
sab_to_mylar = 0
sab_directory = ""

[NZBGet]
nzbget_host = ""
nzbget_port = ""
nzbget_username = ""
nzbget_password = ""
nzbget_category = ""
nzbget_priority = Default
nzbget_directory = ""

[NZBsu]
nzbsu = 0
nzbsu_uid = ""
nzbsu_apikey = ""
nzbsu_verify = 1

[DOGnzb]
dognzb = 0
dognzb_apikey = ""
dognzb_verify = 1

[Experimental]
experimental = 0
altexperimental = 1

[Torznab]
enable_torznab = 0
torznab_name = ""
torznab_host = ""
torznab_apikey = ""
torznab_category = ""
torznab_verify = 0

[Newznab]
newznab = 0
extra_newznabs = ,

[uTorrent]
utorrent_host = ""
utorrent_username = ""
utorrent_password = ""
utorrent_label = ""

[Transmission]
transmission_host = 192.168.1.125:5000
transmission_username = xxxxxxxxxxx
transmission_password = xxxxxxxxxxx
transmission_directory = /volume1/downloads/

[Deluge]
deluge_host = ""
deluge_username = ""
deluge_password = ""
deluge_label = ""

[qBittorrent]
qbittorrent_host = ""
qbittorrent_username = ""
qbittorrent_password = ""
qbittorrent_label = ""
qbittorrent_folder = ""
qbittorrent_startonload = 0

[Prowl]
prowl_enabled = 0
prowl_keys = ""
prowl_onsnatch = 0
prowl_priority = 0

[NMA]
nma_enabled = 0
nma_apikey = ""
nma_priority = 0
nma_onsnatch = 0

[PUSHOVER]
pushover_enabled = 0
pushover_apikey = ""
pushover_userkey = ""
pushover_priority = 0
pushover_onsnatch = 0

[BOXCAR]
boxcar_enabled = 0
boxcar_onsnatch = 0
boxcar_token = ""

[PUSHBULLET]
pushbullet_enabled = 0
pushbullet_apikey = ""
pushbullet_deviceid = None
pushbullet_onsnatch = 0

[TELEGRAM]
telegram_enabled = 0
telegram_token = ""
telegram_userid = ""
telegram_onsnatch = 0

Lasborg commented 7 years ago

I suspect that it could be "failed download handling" causing issues with my setup. I only use Mylar to grab torrents and do no post-processing or renaming: Mylar downloads the torrent, my download client gets the file, and then I copy the files to my PC where I do all my renaming and so on. So maybe "failed download handling" can't see the file, since Mylar is not managing it in its database, and then assumes that the download has failed.
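For reference, these are the failed-handling switches as they appear in the config.ini pasted above; turning the feature off should just mean setting them to 0 (assuming the GUI toggle writes these same keys):

failed_download_handling = 1
failed_auto = 1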

evilhero commented 7 years ago

It could actually be a problem due to using TPSE. It looks like they recently changed from having the torrent file downloadable to only offering magnet links. Mylar's expecting a torrent file, and receiving a magnet link instead throws off all the internal calculations. I suspect it's erroring out and, because of that, assuming it's a bad/incomplete link, marking it as Failed and attempting to continue. Any future attempts for the same file will be ignored if it's the same one that got marked as Failed.

I'm trying to come up with a fix right now, but because it involves restructuring the code to accommodate magnet links across several different torrent clients it's taking a bit of time (plus personal life). I hope to get a fix out soon to allow for TPSE usage again - maybe tonight if I can swing some extra time for this...

Lasborg commented 7 years ago

But I use 32p, not TPSE.

evilhero commented 7 years ago

TPSE is enabled in your provider search order.

Lasborg commented 7 years ago

provider_order = 0, tpse, 1, 32p

Can it be changed in the GUI?

evilhero commented 7 years ago

No, it's a config.ini-only option, but now that I think about it more it doesn't matter, since Mylar knows it's not enabled (if it were, it would be searched first in your provider order though).
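If you did want to drop it from the rotation by hand, provider_order appears to store index/name pairs, so removing tpse would presumably (this is an assumption about the format, not something confirmed above) leave:

provider_order = 0, 32p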

You're right about the post-processing though. If Mylar isn't able to post-process the file in some way, it has no way of knowing whether it downloaded OK, as the download isn't being monitored.

It sends the torrent to your client and marks the issue as Snatched, and that's where Mylar stops because it can't do anything else - it doesn't monitor your client. If you were to post-process the issues, or at least have them in the series directory, then when it does a 'Recheck Files' it would pick up the downloads and mark them as Downloaded.

Is there a reason Mylar can't do the post-processing, such that you need to manually intervene with every file?

There's also an included 'torrent auto snatcher' option now that will monitor a torrent for completion after a snatch on a remote client; once it's completed it will download the torrent to the local machine running Mylar and post-process/metatag as per your settings.
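For reference, the related switches currently sit disabled in the [Torrents] section of the config.ini pasted above (presumably auto_snatch_script also needs to point at the snatch script for the feature to do anything):

auto_snatch = 0
auto_snatch_script = ""
snatchedtorrent_notify = 0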

Lasborg commented 7 years ago

Unfortunately it was not "failed download handling". After I disabled it, torrents are still being downloaded and not marked as Snatched.

I have no settings enabled in post-processing; it is fine for me if Mylar just marks issues as Snatched.

I have never used post-processing, as Mylar runs on the Synology and I do not know how to set up ComicTagger and configparser on it, so I just auto-copy the cbr's to my PC and do my tagging, renaming and so on in ComicRack on the PC, then sync the comics to my iPad for reading. When I'm done reading, the read status is synced back to ComicRack on the PC and all read comics are archived on the Synology. If I could set up Mylar to convert to cbz, tag, rename, and place the files in the correct folder I would be very happy.

Could it be an error with "Automatically Mark Upcoming Issues as Wanted"? Maybe it is downloading the torrent before the cover date and the upcoming-issue refresh then sets it back to Wanted again.

evilhero commented 7 years ago

Well, setting up ComicTagger / configparser hasn't been needed in a very long time. It's all built in now, and the configparser requirement was even removed. All you need to do is enable metatagging and it'll handle the rest. Renaming is pretty much the same: choose your file format, save the config, restart Mylar and that's it. It will then metatag, rename, and move during each post-processing run. Set up your folder monitor to watch the given folder for new files and they'll be post-processed automatically (complete files only - if files are still being transferred over when the folder check fires off, it won't post-process those until they're finished being copied/moved into the monitored folder).
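Concretely, in the config.ini pasted earlier that would roughly mean flipping these existing keys (the values below are my assumptions; enabling the same options through the GUI and letting Mylar write the file itself is the safer route, and check_folder should point at wherever your finished downloads land - the torrent watch dir here is just a guess):

post_processing = 1
move_files = 1
rename_files = 1
enable_meta = 1
enable_check_folder = 1
check_folder = /volume1/downloads/Torrent
folder_format = $Series ($Year)
file_format = $Series $Issue ($Year)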

The 'automatically mark upcoming as wanted' option has no effect on what you're experiencing, nor does the cover date / upcoming issue that you mention. Mylar uses an algorithm to calculate all of the necessary information to decide whether an issue should be set to Wanted; if it's in a Snatched state it wouldn't search for it again, but if it's in a Failed state it will re-search, since it deems the download to have failed.

Lasborg commented 7 years ago

I have now set up post-processing and imported all my archived comics. It took a while and seems to have been mostly successful (the comics already had metadata with ComicVine IDs). I have set Mylar up to monitor my download folder and to post-process, rename and move the comics into the comics folder. Mylar sees the files but nothing happens and the files are left in place. I have tried manual post-processing and it is the same. So naturally Mylar still does not change the status to Downloaded, and continues to download the torrent file over and over again.

evilhero commented 7 years ago

Can you get a debug log of a manual post-process / folder monitor run and send it to me?

evilhero commented 7 years ago

There's something amiss going on then - according to the logs there are 2 files in your folder monitor, both of which are in a Downloaded status. Because you have the dupecheck set to retain based on filesize, it retains the one in your series directory and leaves the one in the folder monitor alone (since the file in the series directory is larger than or equal to the one in the monitored folder). Mylar will only look for cbr/cbz/webp files - the pdf you have in that directory will never be post-processed by Mylar.

The other 2 files in the folder monitor are not being post-processed because the volume check is failing:

FOLDERMONITOR : [POST-PROCESSING][ISSUE-VERIFY][Lone Volume FAILURE] Volume label of v5 indicates that there is more than one volume for this series, but the one on your watchlist has no volume label set

Basically you have to make sure the volume label for the Mylar series is set to the correct volume - it's a check Mylar has to do in order to ensure that the issue being post-processed belongs to the correct series. Setting the volume label for the given series (in this case to v5) and then post-processing should allow things to work as normal. Normally, if you have more than one series on your watchlist with the same name (i.e. different volumes of the same series), you would also have to denote the volume label in the filename so that Mylar knows where to put it (occasionally you can get by with just the issue year, but it usually depends on the series and when the next volume was started).
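For illustration (the series name here is made up), a filename that carries the volume label Mylar can match against a v5 series on the watchlist would look something like:

Example Series v5 003 (2017).cbz

or, leaning on the issue year instead of the volume label:

Example Series 003 (2017).cbz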

Once the issue is in a Snatched status, Mylar will no longer look for it - it assumes your client has taken over the downloading and just expects to post-process the given item at some point via some method. Can you confirm the status of the issues you're trying to post-process on the series detail page for the given series?

Make sure you have your post-processing file action set to the appropriate type for what you want to do - if it's set to Copy, Mylar will always leave a copy in the original folder; if it's set to Move, Mylar will move the files into the series folder. If you enable the duplicate dump option, any time Mylar encounters a duplicate during post-processing it will move the duplicate into the dump folder location so it can be checked manually before being removed.
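The corresponding keys in the config.ini pasted earlier are below; the duplicate-dump values shown here are illustrative, not taken from that config:

file_opts = move
dupeconstraint = filesize
ddump = 1
duplicate_dump = /volume1/downloads/dupes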

Lasborg commented 7 years ago

I did some more prodding around with importing comics, and there were some issues with the correct folders for watched series. So I went in and manually changed the folder locations to the correct ones. I have also changed the permissions on the folder to 777.

I do not know what has happened, but the issue with repeated downloads has stopped (since it now post-processes and moves the files to the correct folder).

One thing I have encountered in the importing process is that the import results page cannot be sorted by clicking on the column headers.