Closed: AlexDawsonUK closed this 5 months ago
I was thinking about performance budgets this morning. It's related to 5.27 Define Performance and Environmental Budgets.
Can we encourage folks to measure energy-intensive data differently from material that is easily delivered? 1 MB of JS is going to consume a lot more energy, on average, than 1 MB of JPG.
How to weight this is something we'd still need to determine, but we can recommend that automated measurement tools start collecting both values.
That would make it possible to identify which pages exceed a WSG-recommended threshold and which exceed an organization-specific one.
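To make the idea concrete, here is a minimal sketch of what such a weighted check could look like. All weights and threshold values below are illustrative placeholders (the weighting question is exactly what remains to be determined), not WSG-endorsed numbers:

```javascript
// Hypothetical per-type energy weights: JS is parsed, compiled, and
// executed, so a kilobyte of script is weighted higher than a kilobyte
// of image data. These numbers are illustrative only.
const TYPE_WEIGHTS = {
  script: 3.0,
  css: 1.5,
  image: 1.0,
  other: 1.0,
};

// Sum each resource's size multiplied by its type weight.
function weightedKilobytes(resources) {
  return resources.reduce((total, { type, kilobytes }) => {
    const weight = TYPE_WEIGHTS[type] ?? TYPE_WEIGHTS.other;
    return total + kilobytes * weight;
  }, 0);
}

// Compare the weighted total against two budgets: a (hypothetical)
// WSG-recommended threshold and an organization-specific one.
function checkBudget(resources, { wsgThresholdKb, orgThresholdKb }) {
  const weighted = weightedKilobytes(resources);
  return {
    weighted,
    overWsg: weighted > wsgThresholdKb,
    overOrg: weighted > orgThresholdKb,
  };
}

// Example page: 1024 kB of JS scores very differently from 1024 kB of JPG.
const page = [
  { type: "script", kilobytes: 1024 },
  { type: "image", kilobytes: 1024 },
];
console.log(checkBudget(page, { wsgThresholdKb: 3000, orgThresholdKb: 5000 }));
```

With these placeholder weights the page's raw 2048 kB becomes a weighted 4096 kB, tripping the stricter threshold but not the looser one, which is the kind of two-tier reporting described above.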
Absolutely, my research showed it can be done using standard developer tools, so I would be happy to draft a second Success Criterion around this. It should hopefully get people thinking more holistically.
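One way standard developer tooling already exposes this data is the Resource Timing API: each `PerformanceResourceTiming` entry carries a `transferSize` and an `initiatorType`, which is enough to split a page's payload by resource type. The sketch below sums transfer sizes per type; in a browser you would pass `performance.getEntriesByType("resource")`, while here a mock array stands in for real entries:

```javascript
// Group transfer sizes by initiatorType. `entries` is expected to be
// shaped like PerformanceResourceTiming objects (only the two fields
// used here are required).
function groupTransferSizes(entries) {
  const totals = {};
  for (const { initiatorType, transferSize } of entries) {
    totals[initiatorType] = (totals[initiatorType] ?? 0) + transferSize;
  }
  return totals;
}

// Mock entries standing in for performance.getEntriesByType("resource").
const mockEntries = [
  { initiatorType: "script", transferSize: 500_000 },
  { initiatorType: "script", transferSize: 250_000 },
  { initiatorType: "img", transferSize: 800_000 },
];

console.log(groupTransferSizes(mockEntries));
// { script: 750000, img: 800000 }
```

A per-type breakdown like this is the raw input an automated measurement tool would need before any weighting scheme is applied.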
Note: While measuring data intensity could be done under 5.27 (Budgets), I thought it made better sense to put it under 3.1, since the type of content being handled by the browser is more an issue of rendering than of scoping for budgets. (It's semantic, I know, but the budget section is pretty crowded already and I didn't want to make it even more technical than it is; plus, it's something that can be considered when deciding how to implement solutions that meet said budget.) Hopefully this makes sense; we can always expand upon this later with further guidance once research can establish more direct causality.
Both of these SCs have been added to the living draft; they will be published in the next edition.
Issue: We currently have guidelines covering Value Testing, Usability Testing, and Compatibility Testing in each release cycle; however, at this time we do not have a strict guideline on Performance Testing.
We do have 2.26, which covers the journey itself but not the underlying performance of the website or its infrastructure. As such, expanding coverage to monitoring, research, and analysis in order to implement solutions would be ideal.
Solution:
Credit: @fullo