blakearchive / archive

GNU General Public License v2.0

speeding things up #445

Open ghost opened 7 years ago

ghost commented 7 years ago

@queryluke i wonder if we might speed things up by populating the tabs, the 'i' window, and the overlay only when the user goes to them. the default open tab could be Copy Information, which seems less computationally intensive than Objects in Copy.

we would still have to first compute what tabs to show as available, which is pretty quick

queryluke commented 7 years ago

I see 4 opportunities for speeding up the site:

  1. Minifying Javascript
  2. Reducing image size for thumbnails
  3. Optimizing Postgres queries and database
  4. Refactoring code

Using Chrome, if you open up developer tools, click on the Network tab, then reload the page, you can see the log of everything that loads. I used http://www.blakearchive.org/copy/jerusalem.e?descId=jerusalem.e.illbk.32 as my example, for no particular reason.

  1. Minifying JavaScript: total time to load all the JavaScript is 300-500ms. Back in October or November, I looked at minifying, and the resulting file took about 200ms to load. So minifying would help some, but it wouldn't be significant.

  2. Compressing / cropping / reducing image size for the thumbnails: this is probably the biggest bang for your buck. I'm showing a total load time of 6-9sec.

  3. Optimizing Postgres queries: the query to http://www.blakearchive.org/api/copy/jerusalem.e/objects takes 2.5s. A query to the DB that only returns a JSON string shouldn't take that long, so there are clearly some opportunities there. As David said, he and I have been looking into this since we returned from the holiday break.

  4. Refactoring code (for example, your suggestion above): this will probably have the lowest ROI. My major concern is how those tabs interact with the comparison view; I think it would take quite a bit of developer time to make that work. But I could be wrong, and you're free to try it. I just don't think it will reduce load time significantly. I'm sure there are other micro-optimizations like this that would increase site speed, but you won't notice an improvement from any single one.
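For point 3, a common first step is to run the slow query through EXPLAIN ANALYZE and check whether the copy lookup is doing a sequential scan. A hedged sketch (the table and column names here are assumptions for illustration, not the archive's actual schema):

```sql
-- Hypothetical schema: an "object" table filtered by a "copy_id" column.
EXPLAIN ANALYZE
SELECT * FROM object WHERE copy_id = 'jerusalem.e';

-- If the plan shows "Seq Scan on object", an index on the filter column
-- usually turns the lookup into a much faster index scan:
CREATE INDEX idx_object_copy_id ON object (copy_id);
```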

ghost commented 7 years ago

ok, about 2 and 3, respectively: how does my browser load so many images (handprints) at once in a google image search? it seems like there's another opportunity for optimization here that doesn't involve reducing the image size, but if not, then i don't mind trying that. thanks for looking into the queries.

ghost commented 7 years ago

also, which of these would account for the inconsistencies in speed? sometimes i see the site hang for a very long time.

queryluke commented 7 years ago

In response to how Google does it: Google has teams devoted to developing new ways to deliver content in a search as quickly as possible. If you look at their code, you'll notice the images aren't actually image files, but base64-encoded data strings that are loaded via JavaScript. Scroll down quickly and notice the very fast load time, and the single-color placeholders. The short answer is money, resources, and genius developers.
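The base64 trick described above can be sketched in a few lines (Python here purely for illustration; the payload below is a made-up stand-in, not a real image): the image bytes are embedded in the page itself as a `data:` URI, so the browser needs no extra HTTP request to show the placeholder.

```python
import base64


def to_data_uri(image_bytes: bytes, mime: str = "image/jpeg") -> str:
    """Encode raw image bytes as a data: URI that can be inlined in HTML/JS."""
    encoded = base64.b64encode(image_bytes).decode("ascii")
    return f"data:{mime};base64,{encoded}"


# Tiny stand-in payload; in practice this would be a thumbnail's bytes.
uri = to_data_uri(b"abc")
print(uri)  # → data:image/jpeg;base64,YWJj
```

The trade-off is that inlined data can't be cached separately from the page, which is why it suits small placeholders rather than full images.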

Inconsistency in speed is likely due to traffic and/or long db queries.

ghost commented 7 years ago

yes, that is interesting, thanks for pointing it out.

ok, before i set someone to the task--is this the right thing to do:

make copies of the 100 dpi images reduced to a fixed handprint size (the size of the handprint frame as it is now) and change the references in the code?

re: traffic--the application seems too simple to need load balancing, but is this something to think about?

queryluke commented 7 years ago

RE: images, I would say crop the images to the same ratio and compress them. I've used pixlr in the past; you can choose the "quality" of your JPEG save. Less than 50kb would be better, less than 10kb would be best, but it depends on the image quality you can get. If you have Photoshop, there are probably more advanced ways to do this. I would append "-thumbnail" to each new filename, then update the code.
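The recipe above (resize, compress, save with a "-thumbnail" suffix) can also be scripted instead of done by hand in pixlr. A minimal sketch using the Pillow library, which is an assumption on my part rather than part of the archive's stack; the 200px width and quality setting are illustrative, not prescribed values:

```python
from PIL import Image  # Pillow; assumed available, not part of the archive's stack


def make_thumbnail(src_path: str, width: int = 200, quality: int = 50) -> str:
    """Resize an image to a fixed width and save a compressed JPEG copy
    alongside the original, with "-thumbnail" appended to the name."""
    img = Image.open(src_path).convert("RGB")
    ratio = width / img.width
    img = img.resize((width, int(img.height * ratio)))
    out_path = src_path.rsplit(".", 1)[0] + "-thumbnail.jpg"
    # Lower JPEG quality -> smaller file; 50 is a reasonable thumbnail setting.
    img.save(out_path, "JPEG", quality=quality)
    return out_path
```

Run over the 100 dpi source directory, this would produce the fixed-size copies the code could then reference.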

re: traffic. No, it's too small to need load balancing, but if the queries aren't optimized, too many people running heavy db queries could slow the rest of the site.