loadletter / mangaupdates-urlfix

Userscript that adds website and IRC links on the groups pages of mangaupdates.com

Improvement #8

Closed loadletter closed 9 years ago

loadletter commented 10 years ago

I'm thinking of doing something like this to make things easier:

  1. Split the groups file into pieces based on the group ID, so loading a page doesn't require fetching a single file with 3500+ groups
  2. Deprecate and remove the offline version as a consequence of 1, since people using the website are presumably already connected to the internet
  3. Replace the 'undefined' text with something nicer
  4. Change the links in the old online version to point to the new version (or show a 'you must update the script' message), then remove groups.js for good after a while

Opinions?

@rhung @brunoais @jrdp18 @inkochan @tallos @zazuge @claireanlage @projectifinity @Syntonic

MilesWilford commented 10 years ago

What's the issue with just dropping the groups in a user's LocalStorage? That'd keep the benefit of one tight file while also preventing obnoxious load issues after the first go.

It sounds like your other suggestion is to have a groups/ID###.json file for each group. Is that correct?

jrdp18 commented 10 years ago
  1. I don't really feel or notice any issue when loading the script. I don't know about computers with older hardware, so if it'll be better for them, go for it.
  2. As I mentioned here, I don't see the point of keeping an offline version anymore. So yes, remove it.
  3. Hmmm. "Add website"?
  4. I didn't get the last part, but you want to show an update notification, right? Is it possible to have it update by itself like the offline version does? Sorry, I've been using the offline version until now.
brunoais commented 10 years ago
  1. Looks good as long as there's no more than ~20 files.
  2. What if the script host (even if github) is offline? Can it be considered a non-issue?
  3. Yeah... it is weird. Maybe go with "no info"?
  4. I'll check how that works.

About 1: maybe separating it into files covering 500 possible group IDs each would do. Something like:

file 000.json holds, at most, the first 500 groups (IDs 0-499, assuming it starts with 0). file 005.json holds, at most, IDs 500-999. file 010.json holds, at most, IDs 1000-1499. etc...

It's true that the distribution will be uneven, but that's for the sake of easily hitting the right file faster.
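Under the stated assumptions (500 IDs per file, three-digit file names derived from the ID), the range scheme above could be sketched as:

```javascript
// Hypothetical helper mapping a group ID to its shard file under the
// range scheme described above: 500 IDs per file, with the name being
// floor(id / 100) zero-padded to three digits
// (0-499 -> 000.json, 500-999 -> 005.json, 1000-1499 -> 010.json, ...).
function rangeShardName(groupId) {
  var bucket = Math.floor(groupId / 500) * 5; // 0, 5, 10, ...
  return String(bucket).padStart(3, '0') + '.json';
}
```

For example, `rangeShardName(1234)` yields `'010.json'`, the file covering IDs 1000-1499.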

loadletter commented 10 years ago

@MilesWilford I could store those in localStorage with some kind of timeout, but since even the simple offline version isn't that reliable, I'm not sure. As for the splitting, I was planning on just doing groups/$(ID % N).js where N is some number, maybe 10 or 20, like @brunoais suggested.
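The modulo split mentioned here could look like the following sketch; N = 10 is just an assumed value, not the one the script settled on:

```javascript
// Sketch of the groups/$(ID % N).js split; N is an assumed shard count.
var N = 10;

function moduloShardUrl(groupId) {
  // Every group ID maps deterministically to one of N files.
  return 'groups/' + (groupId % N) + '.js';
}
```

Unlike the range scheme, this spreads group IDs evenly across the N files, at the cost of each file covering IDs scattered over the whole range.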

@jrdp18

  1. There are some tablets using the script, this may give them a speedup
  2. Ok
  3. Ok
  4. The online version just downloads the groups.js file every time; since the URL never changed, it always worked. If the new version is going to use different files, I'd add a 'please update' message to the old version before breaking it by removing the file (giving people a month or so to update).

@brunoais

  1. That was the main concern for keeping the offline version, but since it now uses GitHub Pages, which is cached on some kind of content delivery network, it should hopefully work.
brunoais commented 10 years ago

If you use localStorage (don't forget to namespace!!!) then point 2 is, most probably, a non-issue. If it is offline, it just uses the one on localStorage.

When it is time to update, you can just keep using the copy in localStorage until the new file with the updated version arrives, and then replace the data in the DOM once the update comes.
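A minimal sketch of such a namespaced localStorage cache, assuming a made-up key prefix and field names; the storage object is a parameter so the same code isn't tied to a browser:

```javascript
// Namespaced cache sketch; 'mangaupdates-urlfix/' is an assumed prefix.
var NS = 'mangaupdates-urlfix/';

function cacheGroups(data, storage) {
  storage.setItem(NS + 'groups', JSON.stringify({
    fetchedAt: Date.now(), // lets the script decide later if this is stale
    data: data
  }));
}

function loadCachedGroups(storage) {
  var raw = storage.getItem(NS + 'groups');
  return raw ? JSON.parse(raw) : null; // null when nothing is cached yet
}
```

In the userscript itself, `storage` would simply be `window.localStorage`.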

loadletter commented 10 years ago

So after a timeout, check if the urlfix_grouplist variable is defined, and otherwise load the last copy from localStorage? What would a reasonable timeout value be?

brunoais commented 10 years ago

No timeout. Think like this:

Page loads:

  1. Get info from localStorage for group.
  2. Check if info is considered up to date.
  3. If outdated:
     3.1. Initiate a request to get the updated version (use XMLHttpRequest, if possible).
  4. Update the DOM with the information gathered from localStorage.
  5. Wait for the answer from the remote request to complete.
  6. Update the information on localStorage based on the new information available.
  7. Update the DOM based on the new information available.

Makes sense?
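The steps above can be sketched with the fetch and render steps passed in as callbacks, so the ordering is explicit; all names here are hypothetical, not from the actual script:

```javascript
// brunoais' flow: render cached data immediately, refresh in the
// background only when the cache is missing or stale.
function loadGroupInfo(opts) {
  var cached = opts.readCache();                      // 1. read localStorage
  var upToDate = cached &&
    (Date.now() - cached.fetchedAt < opts.maxAgeMs);  // 2. freshness check
  if (cached) opts.render(cached.data);               // 4. show what we have
  if (!upToDate) {
    opts.fetchLatest(function (data) {                // 3.1 + 5. async fetch
      opts.writeCache({ fetchedAt: Date.now(), data: data }); // 6. store it
      opts.render(data);                              // 7. re-render fresh
    });
  }
}
```

The page never blocks on the network: stale data is shown immediately and quietly replaced when the fresh copy arrives.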

loadletter commented 10 years ago

But i didn't use XMLHttpRequest because of

> XMLHttpRequest is subject to the browser's same-origin policy: for security reasons, requests will only succeed if they are made to the same server that served the original web page.

I could use GM_XMLHttpRequest, but doing it with the script like it does now seems like the most portable way.

brunoais commented 10 years ago

It's alright then. Just use the onload callback to trigger steps 6 and 7 that I mention above.
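With the script-element approach, the onload handler is where steps 6 and 7 would run. A sketch, with the document passed in as a parameter and the callback name made up for illustration:

```javascript
// Inject a shard file via a <script> element and run a callback once it
// has loaded, i.e. once the groups file has defined its global variable.
// Steps 6 (refresh localStorage) and 7 (update the DOM) go in onReady.
function loadShard(doc, shardUrl, onReady) {
  var s = doc.createElement('script');
  s.src = shardUrl;
  s.onload = onReady;
  doc.head.appendChild(s);
  return s;
}
```

In the userscript, `doc` would be the page's `document`, and `onReady` would read the freshly defined `urlfix_grouplist` global, store it, and re-render the links.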

loadletter commented 10 years ago

And add a warning, shown when the fallback is being used and the user clicks on 'Suggest an update', like: 'Could not fetch the latest info, the website data might be outdated, consider refreshing the page before submitting a new entry'

brunoais commented 10 years ago

Don't show a warning merely because the fallback is in use. Show it when 'Suggest an update' is clicked while the fallback is in use, instead. If you think users should be warned whenever the fallback is active, try using a small image that indicates it.

loadletter commented 10 years ago

> Use it when the "suggest an update" is clicked and the fallback is under use, instead.

Yes, that's what I meant.

brunoais commented 10 years ago

Hum... I mean the situation where both happen at the same time :) (or is that how you understood it?)

loadletter commented 10 years ago

Done, after updating to a dummy version both of the old versions should switch to the new one.

xrb-rick commented 10 years ago

Since Chrome keeps blocking the file whenever it's reopened, I'd like to ask if you could provide a .crx version of it, or at least instructions to make one, since I don't know much about it...

loadletter commented 10 years ago

Does this work for you? http://a.pomf.se/srmckc.crx I'll need some time to create one with auto-updating and all.

xrb-rick commented 10 years ago

Unfortunately, it still happens... I don't get it, I have the Sad Panda crx applied and it has never been blocked by Chrome so far...

loadletter commented 10 years ago

Blocked how? Like this: http://www.ghacks.net/2012/06/12/chrome-fix-extensions-apps-and-user-scripts-cannot-be-installed-from-this-web-site/ ?

xrb-rick commented 10 years ago

Nope. I can still drop it directly from a folder into the Extensions section and have it do its magic, but when I close Chrome and reopen it, a message pops up stating that since the app wasn't from the Chrome Web Store, it may be harmful content added against my will, and Chrome simply stops it and blocks its reactivation, forcing me to remove it from the Extensions section and add it again, which is way troublesome... The Sad Panda .crx, an app that lets me check a certain site trouble-free, has never been deemed harmful, stopped, or blocked by Chrome since it was added... Hence my thinking that .crx files might be capable of steering clear of Chrome's antics...

brunoais commented 10 years ago

@xrb-rick Just open the crx file with chrome. It should work that way.

xrb-rick commented 10 years ago

It works; that's not the problem. Whenever I reopen Chrome, it stops the app and blocks its reactivation, forcing me to remove it from the Extensions section and add it again and again...

loadletter commented 10 years ago

Does it work if you install Tampermonkey from the Chrome Web Store and install the script with that?

xrb-rick commented 10 years ago

I never tried to get Tampermonkey, I'll give it a try.

EDIT: Had to get Tampermonkey, now it seems to be working just fine.

loadletter commented 10 years ago

This is what happens when your freedom is removed to force you into a walled garden.

xrb-rick commented 10 years ago

That was a good one but could you state it in more proper words, please?

loadletter commented 10 years ago

Preventing direct installation of userscripts and profile directory tampering should be enough to block malware, disabling everything is just evil.