w3c / sustyweb

Sustainable Web Design Community Group
https://www.w3.org/community/sustyweb/

New SC for 2.26: Performance Testing #67

Closed AlexDawsonUK closed 5 months ago

AlexDawsonUK commented 5 months ago

Issue: We currently have guidelines on Value Testing, Usability Testing, and Compatibility Testing within each release-cycle; however, at this time we do not have a strict guideline on Performance Testing.

We do have 2.26, which covers the journey itself but not the underlying performance of the website or its infrastructure. As such, expanding its coverage to include monitoring, research, and analysis to implement solutions would be ideal.

Solution:

  1. Change 2.26 Title to "Incorporate Performance Testing Into Each Major Release-Cycle"
  2. Change description to "Try to ethically measure how efficient a visitor's experience is by analyzing the performance of the website or application and how it has been constructed. By doing so, you may be able to reduce issues visitors have encountered previously, decrease loading times, and reduce the burden of loading unnecessary pages."
  3. Add SC "Performance Testing": Regularly measure with each release-cycle (using tooling or through research and auditing) the performance of a website or application to identify and resolve bottlenecks or issues in the underlying code or infrastructure which could ultimately impact the sustainability of a website or application.
  4. Add links to tooling which can assist with performance metric analysis.

Credit: @fullo
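To make the proposed SC concrete: the sketch below shows what a minimal automated per-release budget check could look like. This is illustrative only; the budget values and function names are invented here, and a real audit would use dedicated tooling such as Lighthouse or WebPageTest rather than a bare fetch.

```python
# Minimal sketch of an automated per-release performance check.
# The budget values (2 s, 500 kB) and function names are hypothetical
# placeholders, not WSG-recommended figures.
import time
import urllib.request

def within_budget(seconds: float, total_bytes: int,
                  max_seconds: float = 2.0, max_bytes: int = 500_000) -> bool:
    """Pass/fail check against simple load-time and page-weight budgets."""
    return seconds <= max_seconds and total_bytes <= max_bytes

def audit(url: str) -> dict:
    """Fetch a page once and report whether it stays within budget."""
    start = time.monotonic()
    with urllib.request.urlopen(url) as resp:
        body = resp.read()
    elapsed = time.monotonic() - start
    return {
        "url": url,
        "seconds": round(elapsed, 3),
        "bytes": len(body),
        "within_budget": within_budget(elapsed, len(body)),
    }
```

A check like this could run in CI on each release-cycle, failing the build when a page falls outside budget.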

mgifford commented 5 months ago

I was thinking about performance budgets this morning. It's related to 5.27 Define Performance and Environmental Budgets.

Can we encourage folks to measure energy-intensive data differently from material that is easily delivered?

1 MB of JS is going to consume a lot more energy, on average, than a 1 MB JPG.

How to weight this is something we'd still need to determine, but we can recommend that automated measurement tools start to collect both values.

This should make it possible to identify which pages exceed a WSG-recommended threshold and which exceed an organization-specific threshold.
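The weighting idea above can be sketched as follows. The per-type weights and both thresholds are invented for illustration; as noted, the actual weighting is still to be determined and the WSG does not prescribe specific values.

```python
# Hypothetical weighting of transferred bytes by resource type, following
# the observation that 1 MB of JavaScript costs more energy to process
# than 1 MB of JPEG. All weights and thresholds below are assumptions.
ENERGY_WEIGHTS = {
    "script": 4.0,    # JS must be parsed, compiled, and executed
    "document": 2.0,  # HTML/CSS parsing and layout
    "font": 1.5,
    "image": 1.0,     # decode only
}

def weighted_page_cost(resources):
    """resources: list of (resource_type, transfer_bytes) tuples."""
    return sum(ENERGY_WEIGHTS.get(rtype, 1.0) * nbytes
               for rtype, nbytes in resources)

def check_thresholds(resources, wsg_limit, org_limit):
    """Report which thresholds (WSG-recommended vs organization-specific)
    a page's weighted cost exceeds."""
    cost = weighted_page_cost(resources)
    return {"cost": cost,
            "over_wsg": cost > wsg_limit,
            "over_org": cost > org_limit}
```

With this scheme, a page carrying 1 MB of script and 1 MB of images scores higher than a 2 MB image-only page, even though both transfer the same number of bytes.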

AlexDawsonUK commented 5 months ago

Absolutely, my research showed it can be done using standard developer tools, so I would be happy to draft a second Success Criterion around this. It should hopefully get people thinking more holistically.
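One way standard developer tools support this: most browsers can export a page load as a HAR file, from which transfer sizes can be tallied per content type. A minimal sketch, assuming the standard HAR 1.2 structure (the file path is hypothetical):

```python
# Tally transferred bytes per MIME type from a HAR file exported via
# browser developer tools (assumes the standard HAR 1.2 format).
import json
from collections import defaultdict

def bytes_by_mime(har_path):
    """Return {mime_type: total_body_bytes} for all responses in a HAR file."""
    with open(har_path) as f:
        har = json.load(f)
    totals = defaultdict(int)
    for entry in har["log"]["entries"]:
        mime = entry["response"]["content"].get("mimeType", "unknown")
        size = entry["response"].get("bodySize", 0)
        if size > 0:
            # Drop any "; charset=..." suffix so variants group together.
            totals[mime.split(";")[0]] += size
    return dict(totals)
```

The resulting per-type totals are exactly the split (script bytes vs image bytes) that the weighting discussion above needs as input.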

AlexDawsonUK commented 5 months ago

Note: While measuring data intensity could sit under 5.27 (Budgets), I thought it made better sense to put it under 3.1, as the type of content being handled by the browser is more an issue of rendering than of scoping for budgets. (It's semantics, I know, but the budget section is already pretty crowded and I didn't want to make it even more technical than it already is; plus, it's something that can be considered when deciding how to implement solutions to meet a given budget.) Hopefully this makes sense; we can always expand upon this later with further guidance once research can establish more direct causality.

Both of these SCs have been added to the living draft; they will be published in the next edition.