TrenxT / rutorrent

Automatically exported from code.google.com/p/rutorrent

PluginRSS: Episode Tracking #299

Open GoogleCodeExporter opened 9 years ago

GoogleCodeExporter commented 9 years ago
What steps will reproduce the problem?
Set up an RSS feed that provides multiple duplicate episode entries, regardless of release group, which appear several times over the course of a month.

What is the expected output? What do you see instead?
Because the same episode matches more than once, you end up downloading duplicate
episodes of a show. And because an episode may appear two, three, or even four
times in a given month or week, the "Match only..." drop-down does not help.

Because the releases come from different groups, there's no way to pin the
RegEx to a specific group without the fear of missing an episode. (Maybe one
month a particular group skips an episode of a specific show.)

This is using the current 3.0 Beta SVN.

The solution would be to add season/episode tracking to the filters.
uTorrent already does this behind the scenes, as do other Python-based
broadcatching clients.

For example, you could set up a simple filter that looks for the show LOST.
Whenever a match is made, the plugin would record the season and episode of
that show. On the next match, the expected episode would increment by one,
and whenever a season change is detected, the season counter would increment
by one as well.
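
A minimal sketch of what that per-filter tracking could look like in PHP, purely as an illustration; the class and method names below are assumptions, not existing ruTorrent code:

// Hypothetical per-filter season/episode tracker; all names are illustrative.
class EpisodeTracker
{
    private $seen = array();   // "filter#SxxEyy" keys already downloaded

    // Pull season/episode out of a release name, e.g. "Lost S06E02.HDTV.XviD-NoTV".
    public function parse($title)
    {
        if (preg_match('/S(\d{2})E(\d{2})/i', $title, $m))
            return array((int)$m[1], (int)$m[2]);
        return null;
    }

    // Returns true only the first time a given season/episode is seen for a filter.
    public function shouldDownload($filterName, $title)
    {
        $se = $this->parse($title);
        if ($se === null)
            return true;                      // no episode tag, fall back to normal matching
        $key = $filterName.'#'.sprintf('S%02dE%02d', $se[0], $se[1]);
        if (isset($this->seen[$key]))
            return false;                     // duplicate episode, skip it
        $this->seen[$key] = true;
        return true;
    }
}

With state like this, each episode in a feed would be grabbed once, whichever release group posts it first.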

uTorrent provides no GUI for this (as far as I'm aware), but does it behind
the scenes. TorrentWatch-X provides this and gives you the ability to modify
the current season and episode counters for each filter. rssdler (a Python
script) does this too, but it modifies the filter expression automatically
depending on the matched episode. That approach is a little clunkier, as it
requires you to use a specific RegEx.

I think the best approach is TorrentWatch-X's, as it allows you to modify the
values. However, if written well, there's no reason it couldn't be done behind
the scenes like uTorrent does.

Original issue reported on code.google.com by craig.ba...@gmail.com on 25 Feb 2010 at 12:38

GoogleCodeExporter commented 9 years ago
Sorry - this should have been marked as an Enhancement. :)

Original comment by craig.ba...@gmail.com on 25 Feb 2010 at 12:39

GoogleCodeExporter commented 9 years ago
Here's a scenario for you. This example is for one show, Lost, and it's what my
feed may offer over a period of four weeks. Please keep in mind that the release
group isn't always the same; I've just used these ones as an example, so I can't
really filter on the group with the regular expressions.

Week 1:
Lost S06E02.HDTV.XviD-NoTV.torrent
Lost S06E03.HDTV.XviD-P0W4.torrent

Week 2:
Lost S06E03.HDTV.XviD-2HD.torrent
Lost S06E04.HDTV.XviD-P0W4.torrent
Lost S06E02.HDTV.XviD-P0W4.torrent
Lost S06E03.HDTV.XviD-NoTV.torrent

Week 3:
Lost S06E05.HDTV.XviD-P0W4.torrent
Lost S06E04.HDTV.XviD-2HD.torrent
Lost S06E02.HDTV.XviD-2HD.torrent

Week 4:
Lost S06E05.HDTV.XviD-2HD.torrent
Lost S06E04.HDTV.XviD-NoTV.torrent
Lost S06E05.HDTV.XviD-NoTV.torrent

Original comment by craig.ba...@gmail.com on 25 Feb 2010 at 12:55

GoogleCodeExporter commented 9 years ago
Also, it may be worth mentioning that PROPERs should perhaps be downloaded once
against any given episode, unless of course the exclusions RegEx specifies not to.

Original comment by craig.ba...@gmail.com on 25 Feb 2010 at 12:57

GoogleCodeExporter commented 9 years ago
In my opinion, before this is done, we need to address the other RSS features
from a few earlier tickets. The main one is the ability to create multiple
groups of RSS filters, which would be very nice as well. I think it would be
nice to be able to LABEL a specific filter as being a TV show and perhaps have
the ability to enable episode tracking based on that.

It should be fairly easy to track episodes this way because they mostly follow
a default naming scheme. I think this should be a fairly simple modification to
the current "history" addon, if we can filter that as well and just look for
the name and episode. This way, if the show is LOST and you've already
downloaded LOST S06E01 and it is in your history as:
Lost S06E01 720p CTU

it will be able to keep any torrent with
Lost
and
S06E01 or s06e01
from downloading again. I agree this would be nice, but it should be tunable:
you should be able to enable it or disable it, and it should NOT be the default.
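
A rough sketch of that history check, assuming the history addon could hand back a list of downloaded names; the function below is hypothetical, not the addon's actual API:

// Hypothetical duplicate check against previously downloaded names.
function alreadyInHistory($title, array $historyNames)
{
    // Pull the show name and SxxEyy tag out of the candidate title.
    if (!preg_match('/^(.*?)[. ]S(\d{2})E(\d{2})/i', $title, $m))
        return false;                         // no episode tag, nothing to compare
    $show = preg_quote(trim($m[1]), '/');
    $tag  = sprintf('S%02dE%02d', (int)$m[2], (int)$m[3]);
    foreach ($historyNames as $name)
    {
        // Case-insensitive match on show name plus season/episode tag.
        if (preg_match('/'.$show.'.*'.$tag.'/i', $name))
            return true;
    }
    return false;
}

// Example: "Lost S06E01 720p CTU" in history would block "Lost S06E01.HDTV.XviD-2HD".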

I still think we should add the grouping of RSS filters first, though, as per
ticket 280, and I'd also like to see ticket 279 addressed first as well, which
is another RSS suggestion (deferred downloading).

Original comment by Wonslung@gmail.com on 25 Feb 2010 at 12:59

GoogleCodeExporter commented 9 years ago
I am also in favor of this enhancement.

Original comment by Saxfus...@gmail.com on 25 Feb 2010 at 1:08

GoogleCodeExporter commented 9 years ago
I agree with everyone above; I would also like to see a "smart episode filter".
Currently I'm using a private tracker that manages this itself by flagging the
torrent differently if it's released multiple times. However, for the future it
would be nice to have a backup in case this doesn't get done by the tracker ;-)

Original comment by emielvan...@gmail.com on 25 Feb 2010 at 7:57

GoogleCodeExporter commented 9 years ago

Original comment by novik65 on 25 Feb 2010 at 4:33

GoogleCodeExporter commented 9 years ago
This would be great. I have a few times had an episode download three weeks
later because someone decided to upload an iTunes rip.

These rips generally have 480p and 720p versions, so it's around 1.6GB for a
file I have to attempt to seed to 1:1 and then delete because I've already
seen it.

It would be great if it would ignore these files because that specific episode
has already been downloaded.

I don't mind the iTunes version downloading as long as I don't already have a
copy of the episode, which is why I don't want to set an exclude for it.

Original comment by TJHar...@gmail.com on 11 May 2010 at 7:30

GoogleCodeExporter commented 9 years ago
[deleted comment]
GoogleCodeExporter commented 9 years ago
To breathe some life into this interesting topic: would it really need a
database? You really just need to add a local compare option, which would mean

e.g.

Bones S05E01.HDTV.720p-MYTH

would match with e.g. Bones.*S[0-9][5-9]E[0-9][0-9].*720p.*

But in the new type, break the regex down to something like

Bones.*S([0-9][5-9])E([0-9][0-9]).*720p.*

This results in exporting the values captured by the () groups, so you'd get
group[1] = 05 and group[2] = 01.

Then the same regex could be set to run against the download folder with the
variables filled in, so that if you specified season and episode as output
variables you could end up doing a local match for

$localkey = "Bones.*S".$group[1]."E".$group[2].".*720p.*";
which essentially means: Bones.*S05E01.*720p.*

This would essentially allow the script to make an intelligent, local-match-based
decision, with the level of detail customizable by what the user chooses to specify.

So:

1. Bones.*S([0-9][5-9])E([0-9][0-9]).*720p.* (modified for variable output of
season and episode) finds a match on Bones S05E01.HDTV.720p-MYTH.
2. The plugin runs Bones.*S05E01.*720p.* against $DownloadFolder, finds a match
against Bones S05E01.WEBTV.720p-Other, and decides that the user already has
the episode, so it doesn't download it again.

With a thorough enough regex, this would also allow people to compare against
the release group they have locally to see whether a PROPER is relevant to them.
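
A minimal sketch of that capture-and-recheck idea in the same PHP style; the directory scan, the hard-coded "Bones" key, and the variable names are all assumptions for illustration:

// Hypothetical local-compare check; $downloadFolder and both regexes are illustrative.
function haveLocalCopy($feedTitle, $filterRegex, $downloadFolder)
{
    // Capture season and episode from the feed title via the filter's () groups,
    // e.g. $filterRegex = '/Bones.*S([0-9][5-9])E([0-9][0-9]).*720p.*/i'
    if (!preg_match($filterRegex, $feedTitle, $group))
        return false;                         // the filter doesn't match at all

    // Rebuild the expression with the captured values, e.g. Bones.*S05E01.*720p.*
    $localkey = '/Bones.*S'.$group[1].'E'.$group[2].'.*720p.*/i';

    // Run the rebuilt expression against what is already in the download folder.
    foreach (scandir($downloadFolder) as $entry)
    {
        if (preg_match($localkey, $entry))
            return true;                      // already have this episode locally
    }
    return false;
}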

Original comment by david.st...@gmail.com on 2 Oct 2010 at 9:29

GoogleCodeExporter commented 9 years ago
I'm on openelec where the choice of torrent clients with a RSS system is pretty 
scarce. rTorrent hits the mark on nearly everything except this cool feature 
that utorrent has - the episode filter. So I support this enhancement, for a 
few reasons, one of them being that on my wireless install, torrenting is 
already taking enough bandwidth without having to download something several 
times.

Original comment by nicolas....@gmail.com on 9 Jun 2014 at 7:18