gohugoio / gohugoioTheme

A [wip] theme for the gohugo.io home page, docs and theme sites.

Don't inline ALL SVGs #65

Closed · XhmikosR closed this issue 7 years ago

XhmikosR commented 7 years ago

https://gtmetrix.com/reports/gohugo.io/1afmLJr4

This is bad, as you can see: the initial HTML size is quite big. Netlify supports http/2, so inlining everything the way you do isn't performant.

budparr commented 7 years ago

@rdwatters and I have had several conversations on this topic. I think he felt it was early, browser-wise, to go all in on http2 (and we ran some tests in which taking the extra page weight was faster than making the extra request).

Also, it looks like the assets aren't gzipped, which is odd, because I think Netlify does that automatically (I'm getting ready to switch them to hashes, so I'll look at it then). The other big fail is a piece of content, a PNG.

Maybe the title of the issue should be "performance review"?

budparr commented 7 years ago

Confirming there is no content-encoding: gzip header.

XhmikosR commented 7 years ago

My main concern was the huge HTML size due to you inlining SVGs, the logo especially.

The rest are still issues too, of course, but I thought you were aware of those.

Anyway, the means of generating the CSS and JS files are missing from the repo too.

budparr commented 7 years ago

I wasn't aware of the gzip issue. I was aware of the issue you raised, but it's something we discussed, and we decided to go with this approach for the moment.

> Anyway, the means of generating the CSS and JS files are missing from the repo too.

I might be missing your meaning, but the CSS/JS files are in the src folder, and they're generated via Webpack (or, specifically, an npm script that runs Webpack). Is that what you're looking for?
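
For context, the wiring is roughly like this (a minimal sketch, not our exact config; the entry and output paths here are placeholders):

```js
// webpack.config.js: a minimal sketch of a src -> bundle build
// (entry/output paths are placeholders, not the theme's actual layout)
const path = require('path');

module.exports = {
  entry: './src/js/main.js', // assumed entry point under src/
  output: {
    path: path.resolve(__dirname, 'static/dist'),
    filename: 'app.js',
  },
  module: {
    rules: [
      // bundle any CSS imported from the JS entry point
      { test: /\.css$/, use: ['style-loader', 'css-loader'] },
    ],
  },
};
```

You'd then run it through an npm script along the lines of `"build": "webpack --config webpack.config.js"` in package.json.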

At any rate, thanks for bringing up the issue.

XhmikosR commented 7 years ago

Ah, you are right. I was looking at the root of the repo.

You should also look into hooking up uncss, or at least purifycss, and after that critical CSS; also make sure you async-load all JS.
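
For reference, uncss has a small Node API; something like this (the paths are placeholders):

```js
// Strip selectors that the given pages never actually use.
// (File paths here are placeholders, not this repo's layout.)
const uncss = require('uncss');

uncss(['public/index.html'], { stylesheets: ['css/main.css'] }, (error, output) => {
  if (error) throw error;
  // `output` contains only the CSS rules referenced by the page(s)
  console.log(output);
});
```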

XhmikosR commented 7 years ago

BTW this is why I had an argument with rdwatters on https://github.com/gohugoio/hugo/issues/3700.

This is just overdoing things. When you make the HTML size so big (remember, the size you see is after gzip), there is extra delay before anything else starts downloading. For small files it's OK, but this is too much.

budparr commented 7 years ago

I'm quite happy with Tachyons, thank you. It gzips to around 15kb. If you want to submit a PR with critical CSS or the other things, that'd be great, but I don't think it moves the needle on a 15kb CSS file.

rdwatters commented 7 years ago

@XhmikosR I don't believe we had an argument, but FWIW, it seems like a lot of people on the forums and in GH threads are very passionate about matters of performance. To be honest, I usually just defer to @jhabdas on these matters.

But that's not why I'm responding to this thread, which has quickly grown from a minor suggestion into an unsolicited, discursive performance lecture that could have been handled more effectively via a simple PR. (For example, you could add async here.) EDIT: Bud just made a good point that this actually isn't needed.

As @budparr mentioned, we had multiple conversations about performance. The logo SVG was optimized using SVGOMG, and we inlined selected SVGs for flexibility and to reduce HTTP requests. @budparr has been very mindful about keeping our asset sizes to a minimum, so I assure you perf has been a hot topic since Bud was nice enough to start on this redesign. For context, the previous design included an animated SVG on the homepage, which was only possible inline, since you can't hit it with CSS via an <img> tag.
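
To illustrate that distinction, here's a toy sketch (assuming an inline <svg class="logo"> with a <path> inside; the class name is made up):

```js
// Inline SVG lives in the page's DOM, so CSS and scripts can reach inside it:
const logoPath = document.querySelector('svg.logo path');
if (logoPath) logoPath.style.fill = 'currentColor'; // styling/animating works

// An <img src="logo.svg"> is an opaque replaced element: its internal SVG
// document is not part of the page's DOM, so there is nothing to select.
console.log(document.querySelector('img[src$=".svg"] path')); // always null
```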

That said, I don't think this thread is about inline SVGs. I think it's an opportunity for you to engage in a spirited performance debate about http2 and multiplexing, so I'll do my best to keep you happy as a user of Hugo, but I have to tell you that I think people who consider "speed" the only metric of a good website are generally just nincompoops. It seems you've fallen into the camp that thinks of http2 as a panacea. From everything I've read by many people much smarter than me, it's not. Multiplexing is a wonderful advancement, but it's a consideration, not a singular solution to every use case.

You seem to contradict yourself by saying that we should reduce the HTML page size because of the advantages of http2, but then later saying we should focus on CRP, which would make HTML pages larger, no?

It might be worth your time looking into how gzipping works in more detail. Basically, it's a lossless compression algorithm that replaces repeated data with jump and length values. I'm not here to expand on gzipping (I'm a content strategist, not an engineer), but it's important to realize that such compression algorithms tend to be more effective on larger files since, obviously, larger files have more repeated values. This is also a consideration for stylesheets in the critical render path (another one of your suggestions unrelated to SVGs) that you may not have considered.
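
If you want to see that effect concretely, here's a toy Node sketch (nothing to do with the site itself):

```js
// Toy demo: repetitive input gzips far better than random input of equal length.
const zlib = require('zlib');
const crypto = require('crypto');

const repetitive = 'class="dib mr3 pv2" '.repeat(500); // Tachyons-style repeated class names
const random = crypto.randomBytes(repetitive.length)
  .toString('base64')
  .slice(0, repetitive.length); // same length, few repeats

console.log('repetitive:', zlib.gzipSync(repetitive).length, 'bytes');
console.log('random:    ', zlib.gzipSync(random).length, 'bytes');
// gzip (DEFLATE) replaces repeated runs with length/distance back-references,
// so the repetitive string compresses to a small fraction of the random one.
```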

So, when do you minify/concatenate/bundle vs compress multiple individual files vs inline vs make external calls to static assets?

The true, considerate, professional answer is as follows:

It depends.

As Bud already mentioned, Tachyons is pretty damn tiny as it is, and most of the pages on gohugo.io are very small. We're also distributing the entire site from a global CDN, which helps for reasons I hope you already know. Not that it's part of performance, but my favorite part is that content is now more manageable, because we treat content as discrete entries rather than standalone pages and keep it all in the world's most amazing version control.

In addition to your suggestions, do you know what else we could do?

OR we could do the following:

Stop talking about ideal states and make decisions in the light of constraints, including, but not limited to, staffing, support, maintainability, manageability, architecture, analytics/usage, etc.

This is what I meant before when I said "considerations."

With respect to your other comments, please open up individual issues for each (e.g., switching to CRP, async JS, changing the SVG partials) if you think they're worth the effort.

Thank you for your continued support of Hugo, and I'll make sure to reach out to Netlify to find out about the gzipping.

ghost commented 7 years ago

@XhmikosR thanks for contributing. @rdwatters thanks for the ping.

Lots of chat here. Anything immediately actionable on this awesome creation from @budparr? There's always room for improvement, of course. Personally, I try to rely on the following guiding principles before deciding to take action on my websites:

  1. Pareto principle. Is the improvement worth it now, given the broader context?
  2. DMAIC. How to approach the improvement/change in a methodical way. I've written a little about DMAIC here if you're interested: https://habd.as/talks/screaming-fast-wordpress-redis-vultr/

Re: HTTP/2, here's a recent article you may be interested in, with some good comments forming in the dev community: https://jakearchibald.com/2017/h2-push-tougher-than-i-thought/

I'm not sure how @budparr wants to handle this issue, but I personally try to keep the issues in my repos pruned unless there's an immediate and pressing need from the majority.

Cheers!

ghost commented 7 years ago

@XhmikosR almost forgot to mention, you may find the following of interest for your own sites and prototypes using http/2 and server push: http://caddyserver.com/

XhmikosR commented 7 years ago

Just to clarify: I'm not here to educate, nor did I claim I know everything about performance.

That being said, there was a reason I didn't CC @rdwatters, and it's exactly this one. I don't have the time to engage in such discussions; I just wanted to bring the issues with your current implementation to your attention.

Now, back to the topic:

  1. I never claimed http/2 is a panacea. BUT multiplexing is what it does by its very nature; read about http/2 multiplexing.
  2. That being said, what you have now is wrong; try to understand why I say the HTML size is too big. Read about TTFB, first meaningful paint, etc.

[Lighthouse screenshot]

This is my last post on the subject. No hard feelings; it's your issue. I just wanted to bring it to your attention, not to have to go through all this.

ghost commented 7 years ago

It's all good @XhmikosR. Thanks for the info.

@rdwatters @budparr I strongly advocate a DMAIC approach to performance and UX. In order to do so it becomes necessary to start measuring performance. Snapshots of performance are almost never useful—it'd be like looking at a stock price on any given day and trying to determine how well the stock is doing.

If you'd like to measure the site's ongoing performance, I wrote an article on how to use Speed Tracker (a FOSS tool using the WebPageTest API and Lighthouse) to measure perf: https://habd.as/monitor-pwa-website-performance/

I used it to monitor the theme site and, if you look here, you'll see what happened after I added image lazyloading: http://speed.habd.as/themes_gohugo_io-cable/?period=year

budparr commented 7 years ago

Thanks for the tips, everyone. And thanks @XhmikosR for reporting the issue. The issue around gzipping is of far more concern to me, tbh.

When we initially had the discussion, I compared the performance of requesting the image versus inlining it, and the inline version outperformed (the test took into account that not all users have browsers that support http2, so that concession wasn't a factor). So I made a decision based on that test. @jhabdas has since brought up a technique of lazy-loading images even when they're in the initial viewport, so that's something I may look at, because no matter how it's loaded, the first thing on that page is a >100kb image.
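
The technique is roughly this (a generic sketch of lazy-loading, not the specific plugin @jhabdas uses; the data-src convention is an assumption):

```js
// Defer image loading until the image approaches the viewport.
const io = new IntersectionObserver((entries, observer) => {
  for (const entry of entries) {
    if (!entry.isIntersecting) continue;
    const img = entry.target;
    img.src = img.dataset.src; // real URL was stashed in data-src
    observer.unobserve(img);   // each image only needs to load once
  }
}, { rootMargin: '200px' });   // start fetching a bit before it scrolls in

document.querySelectorAll('img[data-src]').forEach((img) => io.observe(img));
```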

As far as these other things...

> You should also look into hooking up uncss, or at least purifycss, and after that critical CSS; also make sure you async-load all JS.

Can you show me how they would improve performance in concrete terms? I tested critical CSS on the site. For a ~15kb gzipped CSS file, it did nothing to improve performance and created a slightly worse user experience. Perhaps you made the suggestion thinking our CSS file was much larger, I don't know, but critical CSS, like all performance rules, needs to be applied in context and not as an absolute, in my view.

My JS is loaded at the bottom of the page. Can you point me to a test or something showing that it needs to be async'd there to improve performance? I truly don't understand.

I think these other things you suggested strip out unused CSS(?). What you may not know is that we don't build our CSS/JS with every site build; it's not necessary. The nature of Tachyons is that a developer can come in and make changes without necessarily writing any CSS. Believe me, I agonized over the size of the uncompressed file before I decided to use it, but I think it's a good trade-off, and the way it's written is such that it compresses quite nicely. I don't think uncss is appropriate here; I'd be happy to be proven wrong. Webpack has some features that might help in this regard, but I haven't been down that road, and again, I'm not sure it's worth it.

At any rate, thanks again, everyone, for your concern and for taking the time to write here.

rdwatters commented 7 years ago

Hey guys. Thanks for the intel to both @XhmikosR and @jhabdas. (And @budparr.) As an FYI, I reached out to Netlify, and Jessica was nice enough to check on the gzipping issue for me and write a few follow-up emails. Man, they have great customer service.

Also, I learned of this site, which is probably old hat at this point for everyone on this thread and great if you're too lazy to check dev tools 😆:

https://checkgzipcompression.com/?url=https%3A%2F%2Fgohugo.io

Anyway, our compressed size is ≈43k. That's still a single packet, no? Wouldn't making this an external call to an <img> tag add an additional 1400-byte header? (This is a serious question for @jhabdas.)

Have we looked at other pages? Most are coming in between 24k and 32k for the HTML. It seems like perf geeks (not an insult) love to check the homepage, which stands to reason, since it's the single most frequently accessed page, but it only accounts for ≈18% of our traffic for 2017.

ghost commented 7 years ago

@rdwatters I know that I know nothing, except that I know one thing—we can use DMAIC. Introducing the methodology here is very simple and doesn't require deep knowledge of any particular browser, the users... none of that. All we gotta do is start measuring.

If this seems worthwhile, I can take a PR or work up a commit to add the site, and investigate Netlify's h2 (http2) options to run a RUM test.

rdwatters commented 7 years ago

Awesome! I'm a supporter of DMAIC (although hearing "Six Sigma" gives me the howling fantods, but only for personal corporate reasons). I'm deferring to @budparr, since he's the guru of the theme itself. Cheers, and thanks again @jhabdas!

ghost commented 7 years ago

> howling fantods

I'm stealing that. ><

rdwatters commented 7 years ago

> I'm stealing that. ><

I did too:

https://www.brainpickings.org/2012/09/04/words-david-foster-wallace-mom-invented/

You're a fellow Land-o-Lincoln-er, yeah? RIP, DFW 😢

budparr commented 7 years ago

@rdwatters Interesting about the gzip. I think there's a gap in my knowledge about what the headers look like with http2/push. WebPageTest gives me a warning about gzip not working, yet now I see in dev tools that the initiator is push/other, so I guess it's not the issue I thought it was. Thanks for checking with them.

I'm going to close this issue so folks who don't want the notifications don't get them. I'm happy to chat about performance on Gitter or Slack.