damienhaynes / moving-pictures

Moving Pictures is a movies plug-in for the MediaPortal media center application. The goal of the plug-in is to create a very focused and refined experience that requires minimal user interaction. The plug-in emphasizes usability and ease of use in managing a movie collection consisting of ripped DVDs and movies re-encoded in common video formats supported by MediaPortal.

Scraper engine: allow getting all information (details + covers + fanart) in one request #702

Closed GoogleCodeExporter closed 9 years ago

GoogleCodeExporter commented 9 years ago
Hi,

When all information about a movie is available in a single request (one XML download), it is unnecessary to re-download the same file in the get_cover_art and get_backdrop actions.

Could you allow the cover and fanart variables to be filled directly in the get_detail action?

Thanks, and very sorry for my bad English ...
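For illustration, here is a minimal Python sketch of what is being asked for: a single XML response that already carries the details, cover URLs and fanart URLs, so one download could populate everything at once. The URL, element names and field names below are assumptions for the example, not the real Ciné Passion or Moving Pictures formats.

```python
# Hypothetical sketch: one XML response carrying details, covers and fanart.
# The URL, element names and attribute names are invented for illustration;
# they do not reflect the real scraper site's format.
import urllib.request
import xml.etree.ElementTree as ET

def fetch_movie_document(url):
    """Download the movie XML once and parse it."""
    with urllib.request.urlopen(url) as response:
        return ET.fromstring(response.read())

def extract_everything(root):
    """Pull details, cover URLs and fanart URLs from the same document."""
    details = {
        "title": root.findtext("title"),
        "year": root.findtext("year"),
        "plot": root.findtext("plot"),
    }
    covers = [node.text for node in root.findall("covers/cover")]
    fanart = [node.text for node in root.findall("fanart/image")]
    return details, covers, fanart

if __name__ == "__main__":
    doc = fetch_movie_document("http://example.org/scraper/movie.xml")  # placeholder URL
    details, covers, fanart = extract_everything(doc)
    print(details, covers, fanart)
```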

Original issue reported on code.google.com by titoftit@gmail.com on 21 Dec 2009 at 1:55

GoogleCodeExporter commented 9 years ago
Well, the reason these tasks are split up is that they are not always done at the same time. For example, it is common to want to update movie details without pulling new cover art, or to re-download covers without affecting the movie details. Combining these tasks would reduce the flexibility of the scraper system.

I realize this creates redundancy in some situations, but bandwidth is not exactly a scarce resource, and I think the minor hit in speed is worth the increased flexibility of the current system.
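To make the design defended here concrete, a rough sketch of the split actions: each one fetches and handles only what it needs, so any one of them can run on its own. The function names, parameters and XML layout are illustrative assumptions, not the real scraper API.

```python
# Hypothetical sketch of the split-action design: each action is independent,
# so details can be refreshed without touching artwork and vice versa.
# Function names, URLs and XML layout are invented for illustration.
import urllib.request
import xml.etree.ElementTree as ET

def _fetch(url):
    with urllib.request.urlopen(url) as response:
        return ET.fromstring(response.read())

def get_details(movie, details_url):
    """Refresh textual details only; artwork already on disk is untouched."""
    doc = _fetch(details_url)
    movie["title"] = doc.findtext("title")
    movie["plot"] = doc.findtext("plot")

def get_cover_art(movie, details_url):
    """Re-download covers only; movie details are untouched."""
    doc = _fetch(details_url)
    movie["covers"] = [node.text for node in doc.findall("covers/cover")]

def get_backdrop(movie, details_url):
    """Fetch fanart/backdrops only."""
    doc = _fetch(details_url)
    movie["fanart"] = [node.text for node in doc.findall("fanart/image")]
```

Note that each action fetches the same document again; that redundancy is exactly what the rest of this thread is about.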

Sorry, titoftit. We definitely appreciate the feedback, but I don't think we will be changing this.

Original comment by conrad.john on 21 Dec 2009 at 5:47

GoogleCodeExporter commented 9 years ago
The real problem with this way of working is that the plugin uses three times the necessary resources on the scraped site, in this case our database server.

It would be great if your application could add some sort of support so that it does not hammer the target site, such as a file cache, since at the moment the plugin requests the same file from our site three times.
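One way to read the "file cache" suggestion is a small on-disk cache keyed by URL, so the second and third actions reuse the first download instead of hitting the server again. A minimal sketch, with the cache directory and freshness window chosen arbitrarily for the example:

```python
# Hypothetical on-disk cache keyed by URL, so repeated actions within a short
# window reuse the first download instead of hitting the server three times.
import hashlib
import os
import time
import urllib.request

CACHE_DIR = "/tmp/scraper-cache"      # arbitrary location for the example
CACHE_TTL_SECONDS = 15 * 60           # arbitrary freshness window

def cached_fetch(url):
    os.makedirs(CACHE_DIR, exist_ok=True)
    path = os.path.join(CACHE_DIR, hashlib.sha1(url.encode()).hexdigest())
    if os.path.exists(path) and time.time() - os.path.getmtime(path) < CACHE_TTL_SECONDS:
        with open(path, "rb") as f:       # fresh enough: reuse the cached copy
            return f.read()
    with urllib.request.urlopen(url) as response:
        data = response.read()
    with open(path, "wb") as f:           # store for subsequent actions
        f.write(data)
    return data
```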

Original comment by Tolriq on 21 Dec 2009 at 6:25

GoogleCodeExporter commented 9 years ago
OK, I understand.

Another idea: could you add a node for writing a file to the movie directory?

With such a node, I could write the XBMC file directly in the get_detail action and reuse it in get_cover_art (like your XBMC script does).

What do you think?
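The suggestion amounts to using the movie's own directory as the cache: get_detail saves the XML it downloaded next to the movie, and get_cover_art reads that file back instead of fetching again. A sketch under those assumptions; the file name and XML layout are invented for the example.

```python
# Hypothetical sketch of using the movie's own directory as the cache:
# get_detail writes the downloaded XML next to the movie, and get_cover_art
# reads that local copy back instead of downloading again. The file name
# "scraper.xml" and the XML layout are arbitrary choices for the example.
import os
import urllib.request
import xml.etree.ElementTree as ET

def get_detail_and_save(movie_dir, details_url):
    with urllib.request.urlopen(details_url) as response:
        data = response.read()
    with open(os.path.join(movie_dir, "scraper.xml"), "wb") as f:
        f.write(data)                                 # keep a local copy
    return ET.fromstring(data)

def get_cover_art_from_local(movie_dir, details_url):
    local = os.path.join(movie_dir, "scraper.xml")
    if os.path.exists(local):                         # reuse what get_detail wrote
        root = ET.parse(local).getroot()
    else:                                             # fall back to downloading
        with urllib.request.urlopen(details_url) as response:
            root = ET.fromstring(response.read())
    return [node.text for node in root.findall("covers/cover")]
```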

Original comment by titoftit@gmail.com on 21 Dec 2009 at 6:27

GoogleCodeExporter commented 9 years ago
I wouldn't be opposed to some sort of caching of details, but this would be a non-trivial change. It would have to be pushed into 1.1, which is at the very least a month or so away.

What site are you from, Tolriq? If you can jump onto #moving-pictures at irc.freenode.net, we can discuss further.

Original comment by conrad.john on 21 Dec 2009 at 6:36

GoogleCodeExporter commented 9 years ago
I'm one of the authors of the Ciné Passion scraper.

This site is one of the most used databases for French movies. (OK, perhaps not yet, but it will be :p. It is mainly used by XBMC, Medioos, Windows Media Player, ...)

http://passion-xbmc.org/scraper/index2.php

It is collaborative, with an open API coming.

I'm currently not at home, but it would be a pleasure to talk when I get back, even though I'm not affiliated with your software or its use.

Original comment by Tolriq on 21 Dec 2009 at 7:44

GoogleCodeExporter commented 9 years ago
Okay, I have scheduled an enhancement (Issue #705) for our next feature release (1.1) to allow retrieved documents to be cached by a scraper script. This will eliminate the need for multiple retrievals and therefore reduce traffic to the website being scraped.
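As a rough picture of what such caching could look like, a minimal in-memory sketch where the same URL is downloaded at most once per update run; the actual Issue #705 implementation may well differ.

```python
# Hypothetical per-update memoization of retrieved documents: within one
# scraping session the same URL is downloaded at most once, which is the
# effect the scheduled enhancement is aiming for. Details are assumptions.
import urllib.request

class DocumentCache:
    def __init__(self):
        self._documents = {}              # url -> raw bytes

    def retrieve(self, url):
        if url not in self._documents:    # first request actually downloads
            with urllib.request.urlopen(url) as response:
                self._documents[url] = response.read()
        return self._documents[url]       # later requests hit the cache

# Usage: get_detail, get_cover_art and get_backdrop would all call
# cache.retrieve() with the same URL, but only the first call reaches the site.
cache = DocumentCache()
```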

Original comment by conrad.john on 21 Dec 2009 at 8:27

GoogleCodeExporter commented 9 years ago
Many Thanks !!! ;)

Original comment by titoftit@gmail.com on 21 Dec 2009 at 9:18