Setting up SpeedCurve to monitor the top 10 landing pages for va.gov (May 2019-May 2020)
Top 3 landing pages for incoming traffic - 37.33% of total incoming traffic
Top 10 landing pages for incoming traffic - 48.3% of total incoming traffic
Problem
The platform needs a proper performance monitoring process for the website: standardized ways to measure, evaluate, and govern site performance so that we can promote valuable improvements and ensure stability.
VSP Q1 + Q2 2020 OKRs
O5: All customers comply with the Platform's standards.
Measuring Success
Details and Notes
copied from: https://github.com/department-of-veterans-affairs/vets.gov-team/issues/13265#issue-360863830
We don't do a good job of monitoring front-end performance at Vets.gov. In an ideal world, we would look at our metrics and set a performance budget, then work backwards to rough bundle size requirements that could be enforced in the build. The site's performance would then be checked continuously through RUM tools, and the bundle size requirements adjusted as needed.
But given that we don't have all of that, a simpler approach would be to decide on a performance budget we think is realistic and work towards it. One calculation comes out to about 170kb of compressed HTML, JS, and CSS per page (https://infrequently.org/2017/10/can-you-afford-it-real-world-web-performance-budgets/). Vets.gov is at 485kb for the homepage, so a 250kb budget might be an achievable goal for content pages.
The above is just an example; we need to put more thought into how we arrive at those numbers.
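Once we settle on a number, it could be enforced at build time. A minimal sketch, assuming a webpack build (the thresholds below are illustrative, not decided):

```js
// webpack.config.js — hedged sketch of build-time budget enforcement.
module.exports = {
  // ...existing config...
  performance: {
    hints: 'error', // fail the build instead of just warning
    // webpack measures uncompressed bytes, so a 250kb *compressed* budget
    // maps to a larger uncompressed limit; ~750kb is a rough placeholder.
    maxEntrypointSize: 750 * 1024,
    maxAssetSize: 750 * 1024,
  },
};
```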
Hypothesis
In general, we serve too much JavaScript and CSS to users on Vets.gov, especially on content pages. We also serve too many small JS files, which delay the start of the main JS bundle and further slow down load times. This primarily impacts disadvantaged users on older mobile devices and slow networks. See https://twitter.com/slightlylate/status/1039694738905952256
Hitting performance goals like the one above is hard for React-based apps, but we can definitely get closer:
Areas for Discovery
We could cut down on our polyfills by changing the Babel polyfill config to only polyfill for modern browsers, and splitting the polyfills for non-modern browsers into a separate file that's loaded only by the browsers that need them. We can do this with the "nomodule" technique. This may slightly increase the JS downloaded by IE11 users, but it will be an improvement for all mobile users and most desktop users.
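A minimal sketch of how the split could be configured, assuming @babel/preset-env (the env names and option values are illustrative):

```js
// babel.config.js — hedged sketch of a two-bundle polyfill split.
module.exports = {
  env: {
    // Modern build: browsers that understand <script type="module"> need few polyfills.
    modern: {
      presets: [['@babel/preset-env', { targets: { esmodules: true } }]],
    },
    // Legacy build: inject polyfills for IE11 and other older browsers where used.
    legacy: {
      presets: [
        ['@babel/preset-env', { targets: { ie: '11' }, useBuiltIns: 'usage', corejs: 3 }],
      ],
    },
  },
};
```

The legacy output would then be referenced with `<script nomodule>` and the modern output with `<script type="module">`, so each browser downloads only the polyfills it needs.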
We should start removing lodash/fp, beginning with the static pages bundle, and then look to do this in other places as well.
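A hedged before/after sketch of what removal looks like in practice (the `users` data and its shape are hypothetical):

```js
// Before (pulls in lodash/fp's iteratee machinery):
// import { get, map } from 'lodash/fp';
// const names = map(get('name'), users);

// After: native array methods plus optional chaining, zero library bytes.
const users = [{ name: 'Ada' }, {}];
const names = users.map(user => user?.name ?? 'Unknown');
console.log(names); // ['Ada', 'Unknown']
```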
We should do a thorough audit of our bundles and determine other places where we can remove code.
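One way to run that audit (a tooling assumption, not something decided here) is webpack-bundle-analyzer, which renders a treemap of every module in each bundle:

```js
// webpack.config.js — hedged sketch; assumes the webpack-bundle-analyzer package.
const { BundleAnalyzerPlugin } = require('webpack-bundle-analyzer');

module.exports = {
  // ...existing config...
  plugins: [
    // Writes a static report.html showing each module's share of the bundle.
    new BundleAnalyzerPlugin({ analyzerMode: 'static', openAnalyzer: false }),
  ],
};
```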
We should do a thorough audit of our CSS and remove more dead/unused styles.
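A dead-style report could come from a tool like PurgeCSS (again an assumption on tooling); a minimal sketch of its Node API, with illustrative globs:

```js
// audit-css.js — hedged sketch; paths and globs are placeholders.
const { PurgeCSS } = require('purgecss');

async function auditCss() {
  const results = await new PurgeCSS().purge({
    content: ['build/**/*.html', 'src/**/*.jsx'], // markup that references styles
    css: ['build/**/*.css'],
    rejected: true, // also report selectors that appear unused
  });
  for (const { file, rejected } of results) {
    console.log(file, 'unused selectors:', rejected ? rejected.length : 0);
  }
}

auditCss();
```

Anything flagged here still needs a human check, since selectors added dynamically by JS can show up as false positives.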
We should evaluate combining some of the small JS scripts we load, which push the main assets down in priority.
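If those scripts come out of the webpack build, one hedged option is to raise the minimum chunk size so tiny files get merged into larger ones (the threshold is illustrative):

```js
// webpack.config.js — hedged sketch; assumes the small scripts are webpack chunks.
module.exports = {
  optimization: {
    splitChunks: {
      chunks: 'all',
      minSize: 30 * 1024, // don't emit a separate chunk for anything under ~30kb
    },
  },
};
```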
We should reevaluate how the navigation menus work. There are plans to make these work with a dynamically loaded set of menus, but if we can make this work at build time and go back to CSS menus, we could save some JS.
The login menu items are also in React. We could possibly remove React from this and load the login modal and sign-in flow only when a user clicks. This would take some more investigation.
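A sketch of what click-time loading could look like with a dynamic import, which webpack code-splits automatically (the button selector and module path are hypothetical):

```js
const button = document.querySelector('#signin-button');

if (button) {
  button.addEventListener('click', async () => {
    // React and the modal code are fetched only once the user actually clicks.
    const { showLoginModal } = await import('./login-modal');
    showLoginModal();
  });
}
```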
Look for more opportunities to do code splitting
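For React apps, route- or component-level splitting is a natural fit; a sketch with React.lazy (the component and module names are hypothetical):

```jsx
import React, { Suspense, lazy } from 'react';

// FormApp ships as its own chunk, fetched only when this route renders.
const FormApp = lazy(() => import('./FormApp'));

export default function FormPage() {
  return (
    <Suspense fallback={<div>Loading...</div>}>
      <FormApp />
    </Suspense>
  );
}
```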
Do some initial service worker investigation
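As a starting point for that investigation, a minimal cache-first worker for static assets might look like this (the cache name and asset paths are illustrative):

```js
// sw.js — hedged sketch of a cache-first strategy for static assets.
const CACHE = 'static-v1';

self.addEventListener('install', event => {
  event.waitUntil(
    caches.open(CACHE).then(cache => cache.addAll(['/static/styles.css']))
  );
});

self.addEventListener('fetch', event => {
  event.respondWith(
    caches.match(event.request).then(cached => cached || fetch(event.request))
  );
});
```

The page would register it with `navigator.serviceWorker.register('/sw.js')`, guarded by a `'serviceWorker' in navigator` check.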
Re-evaluate the cost of server-side rendering and any benefits we'd gain
There are also some suggestions in this GH issue which we should look into: department-of-veterans-affairs/vets-website#8391