septor / twits

e107v2: Twitter Status Plugin

Idea for performance improvement #5

Closed nlstart closed 11 years ago

nlstart commented 11 years ago

Suppose one has a website with 50 visitors each hour (I wish...), and let's assume they each view 2 pages of the website on average. That's 100 page views each hour. Also, let's assume we have activated the twits-menu plugin in a menu area that is displayed on every page. On each page view the XML feed is retrieved, pulling 25 tweets every time. That adds up to 2,500 retrieved tweets each hour. Now, unless one is glued to Twitter and cranks out tweets by the minute, this retrieval rate is kind of ridiculous.

It would be much better to store the tweets in the e107 cache for, let's say, a maximum of 15 minutes. Then instead of 2,500 tweets an hour, only 4 × 25 = 100 tweets an hour are retrieved. That will boost the performance of any website using twits-menu and also save on (costly) bandwidth.

The downside would be that a newly posted tweet might stay 'out of sight' for up to 15 minutes before it gets picked up. But the overhead it saves more than makes up for that. :-)

Best regards, Henk / nlstart

septor commented 11 years ago

Yeah, caching is on the list of things to do. I had thought about pushing the tweet information into a database table, which would cut down on needing to pull 25 tweets after the initial gather. The number of tweets pulled could then be decreased to 5 or maybe 10. Caching could still be used, but even if it weren't, the number of tweets gathered would still be cut down a lot.

Ideally I wish Twitter allowed you to gather a set number of tweets excluding replies, which would take the total possible tweets gathered per request down to 5. Sadly, their exclude_replies parameter is only applied after the requested number of tweets has been gathered (so requesting 5 tweets of which only 2 are non-replies would net you just 2 tweets displayed).
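
In other words, a request along these lines (the endpoint format is from memory and the screen name is just a placeholder, not the plugin's actual configuration) can still come back with fewer than 5 usable tweets:

```php
<?php
// Illustrative only: endpoint/format from memory, screen name is a placeholder.
$feedUrl = 'https://api.twitter.com/1/statuses/user_timeline.xml'
         . '?screen_name=example&count=5&exclude_replies=true';

// count is applied first and replies are filtered out afterwards, so if
// 3 of the 5 newest tweets are replies this returns only 2 tweets.
$xml = @file_get_contents($feedUrl);
```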

nlstart commented 11 years ago

Personally, I would not create a database table for this; from both a functional and a technical point of view, the cache seems like the best approach.
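
As far as I can tell, using the core cache handler would look roughly like this (the cache tag, the 15-minute window and the placeholder feed URL are just examples, and I have not double-checked the exact method signatures against the core):

```php
<?php
// Rough sketch only: tag name and TTL are examples, signatures unverified.
$cache   = e107::getCache();
$tag     = 'twits_timeline';
$feedUrl = 'https://example.invalid/user_timeline.xml'; // placeholder: whatever feed the plugin already requests

// Ask the cache for a copy no older than 15 minutes
// (assuming the age parameter is in minutes).
$xml = $cache->retrieve($tag, 15);

if ($xml === false)
{
    // Nothing cached (or too old): fetch a fresh feed and store it.
    $xml = @file_get_contents($feedUrl);

    if ($xml !== false)
    {
        $cache->set($tag, $xml);
    }
}
```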

nlstart commented 11 years ago

This might be a more urgent request than I initially thought: on my own website http://e107.webstartinternet.com/index.php twits-menu v0.3.3 already doesn't show the Twitter feed anymore (and it worked fine this morning). Maybe Twitter.com got tired of so many similar XML requests within a short time, but that's pure speculation... Fact is that my tweets are not displayed when fetching the XML feed fails; with a buffered XML that would not have been the case.

A possibility could even be to store a retrieved XML file temporarily on the website and keep track of a date-time stamp. If the local XML is present and not older than X minutes, the local one is used; otherwise a fresh XML fetch is attempted... Sounds like an idea?
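
Just to sketch the idea (the cache path, the placeholder feed URL and the 15-minute value are only examples, not actual plugin code):

```php
<?php
// Sketch of the local-copy idea; paths, URL and TTL are illustrative only.
$cacheFile = e_PLUGIN.'twits/cache/timeline.xml'; // example location
$feedUrl   = 'https://example.invalid/user_timeline.xml'; // placeholder feed URL
$maxAge    = 15 * 60; // keep the local copy for 15 minutes

if (file_exists($cacheFile) && (time() - filemtime($cacheFile)) < $maxAge)
{
    // Local copy is fresh enough: use it and skip the remote request.
    $xml = file_get_contents($cacheFile);
}
else
{
    // Local copy missing or stale: try to fetch a fresh feed.
    $fresh = @file_get_contents($feedUrl);

    if ($fresh !== false)
    {
        file_put_contents($cacheFile, $fresh);
        $xml = $fresh;
    }
    elseif (file_exists($cacheFile))
    {
        // The fetch failed, but a stale copy is better than showing nothing.
        $xml = file_get_contents($cacheFile);
    }
    else
    {
        $xml = false; // nothing to display
    }
}
```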

septor commented 11 years ago

I'll work on caching next. I'll have it save a local copy of the XML file and only fetch a new one when the local copy is older than X minutes.

septor commented 11 years ago

Caching is added. I was going to try to use the built-in e107 cache handler, but I haven't the slightest idea where to begin, so file_put_contents() and file_get_contents() will do for now.