
Interop 2024 Dashboard updates #615

Open jensimmons opened 8 months ago

jensimmons commented 8 months ago

It seems that the Interop 2023 Dashboard, and the changes we made a year ago compared to the previous dashboards, have served us well. It's likely, however, that there are additional tweaks that we will want to make going into 2024. Let's articulate what those are here, and debate them.

I have a few:

  1. I believe I heard a proposal on a call to add a column for Edge, and break its scores out separately from Chrome.
  2. I believe I also heard a proposal to switch Chrome on the Experimental dashboard from Dev to Canary.
  3. I believe we agreed to round scores off according to the most commonly understood way to round numbers off, but then the dashboard was not built with rounding. Let's discuss.
  4. Shall we remove the "Total" from the "Previous Investigations" section? It seems unnecessary and unclear.
  5. It'd be great to add descriptions of what's in each focus area to the page — perhaps lower down, where people can scroll to read more. I know descriptions were published here on GitHub for Interop 2023, but that information was not easy to find.
jgraham commented 8 months ago

Our priorities for the dashboard in 2024 are:

I've done some of the backend work to support these use cases in https://github.com/jgraham/wpt-interop/tree/score_2024

foolip commented 8 months ago

We issued a call for change requests in https://github.com/web-platform-tests/interop/issues/524 and discussed in https://github.com/web-platform-tests/interop/issues/599. Quick/easy changes can probably still be done, but for anything substantial we need to make sure there's someone who can commit to implementing the changes before launch.

cc @DanielRyanSmith @past FYI

About the specifics that @jensimmons listed:

I believe I heard a proposal on a call to add a column for Edge, and break its scores out separately from Chrome.

Yes, I made a partial mockup of this in https://github.com/web-platform-tests/wpt.fyi/issues/3580#issuecomment-1833914988. In a way it's fallout from the next item.

I believe I also heard a proposal to switch Chrome on the Experimental dashboard from Dev to Canary.

Indeed, the Chrome team wants to switch to Chrome Canary as the experimental Chrome on wpt.fyi in general. https://github.com/web-platform-tests/wpt.fyi/issues/3579 tracks this. @DanielRyanSmith is working on this.

I believe we agreed to round scores off according to the most commonly understood way to round numbers off, but then the dashboard was not built with rounding. Let's discuss.

It looks like https://github.com/web-platform-tests/interop/issues/257 is where we discussed this. It seems I never filed an issue on wpt.fyi, so I suspect this request never even reached @DanielRyanSmith. Let's discuss again.
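For illustration, here's the difference between truncating extra digits and conventional half-up rounding when formatting a score for display. This is a minimal sketch with made-up function names and an arbitrary example score, not wpt.fyi's actual formatting code:

```ts
// Hypothetical sketch only; not the dashboard's real formatting code.
// Assume `score` is the fraction of tests passing, e.g. 0.9958.

// Truncation: drop everything past one decimal place. 0.9958 -> "99.5%"
function truncatePercent(score: number): string {
  return (Math.floor(score * 1000) / 10).toFixed(1) + "%";
}

// Conventional ("most commonly understood") half-up rounding. 0.9958 -> "99.6%"
function roundPercent(score: number): string {
  return (Math.round(score * 1000) / 10).toFixed(1) + "%";
}
```

One design choice to settle either way is whether a score like 0.9996 should be allowed to display as 100.0% while some tests still fail.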

Shall we remove the "Total" from the "Previous Investigations" section? It seems unnecessary and unclear.

Sounds good to me.

It'd be great to add descriptions of what's in each focus area to the page — perhaps lower down, where people can scroll to read more. I know descriptions were published here on GitHub for Interop 2023, but that information was not easy to find.

This should be trivial to add if we have the descriptions, so we'd need a volunteer to write them.

jensimmons commented 8 months ago

Also...

  1. It'd be good to make sure we are set for the Interop 2023 dashboard to be fully frozen at the end of the Interop 2023 cycle. That the scores in the big numbers don't change. The tests that were "counted" do not change. That the list of score details does not change. This might be best accomplished by simply disconnecting the page from the backend, and hard coding the numbers into the HTML as an archive. (A rough sketch of this idea follows after this list.)

Perhaps this is already happening for prior years, but I wanted to check.

  2. When does the Interop 2023 cycle officially end? One could assume Dec 31, 2023. But it likely makes more sense to end it the day of, or the day before, Interop 2024 launches. Since Interop 2023 was launched at the end of January 2023, extending to the end of Jan 2024 gives the 2023 dashboard a full 12 months to reflect scores, instead of freezing it while it's still the main project dashboard during Jan 2024.
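Here's the sketch referenced in the first point above. Everything in it (the interface, the endpoint URL, and the numbers) is a placeholder for illustration, not wpt.fyi's actual code or API:

```ts
// Hypothetical sketch of freezing the Interop 2023 page; names, URL, and
// values below are placeholders.
interface Scores {
  chrome_edge: number;
  firefox: number;
  safari: number;
  interop: number;
}

// Before the freeze: the page fetches live scores from the backend.
async function loadLiveScores(): Promise<Scores> {
  const response = await fetch("https://wpt.fyi/api/interop-2023"); // placeholder endpoint
  return response.json();
}

// After the freeze: the archived page ships with a snapshot taken at the end
// of the cycle, so later test changes can never move the numbers.
const FROZEN_SCORES: Scores = {
  chrome_edge: 0.0, // placeholder values, filled in at freeze time
  firefox: 0.0,
  safari: 0.0,
  interop: 0.0,
};

function loadArchivedScores(): Scores {
  return FROZEN_SCORES;
}
```

A checked-in JSON snapshot taken at freeze time would work just as well as literal values in the source; the key property is that nothing on the archived page reads from the live backend.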
foolip commented 8 months ago

I've filed these issues after discussion in #619:

kbrilla commented 8 months ago
  1. It'd be good to make sure we are set for the Interop 2023 dashboard to be fully frozen at the end of the Interop 2023 cycle. That the scores in the big numbers don't change. The tests that were "counted" do not change. That the list of score details does not change. This might be best accomplished by simply disconnecting the page from the backend, and hard coding the numbers into the HTML as an archive.

As a user I would prefer there to be an up-to-date score - a "fixed" number does not give any useful information - I don't care what the score was at the end of 2023, I want to know what it is now.

Also, the graph select options could include "Interop 2022" and "Interop 2023" in addition to "All Active Focus Areas".

jensimmons commented 8 months ago

As a user I would prefer there to be an up-to-date score - a "fixed" number does not give any useful information - I don't care what the score was at the end of 2023, I want to know what it is now.

The previous focus area section of the Interop dashboard is the right place to see the current scores for things that were in previous years. That will have the current score of the current versions of tests — and let you judge how each browser is doing on those technologies.

The previous years' pages truly are archives — to see what was accomplished in 2021, 2022, 2023, frozen in time. Simply as an archival record. If we let those pages update (as tests are changed, for example), we won't have any record of what the scores actually were at the end of each year. Having an archive is helpful.

The archive doesn't tell us the current test pass rates for Cascade Layers or Aspect Ratio. The current dashboard does:

[Screenshot: the dashboard's current test pass rates for previous focus areas, taken 2024-01-02]

And no, there's no "giant number" for the things that were included in 2021 vs 2022, etc., but that seems very unimportant. It doesn't matter which year something was added to the Interop project, or how many years it was included; what matters is the current test pass rate.

If you want to see who's "winning", you can see the current totals — right now it's Chrome 99.2%, Firefox 97.2%, Safari 99.6%, and the number that matters the most: Interop 96.1%.

dbaron commented 7 months ago

For what it's worth, another argument in favor of freezing the dashboard is that in some cases the tests are going to change, either by adding new tests that weren't part of Interop2023 or by updating to match clearer/stricter requirements in specifications.

One large example of this is that after we freeze the dashboard for Interop2023, we'd like to reland web-platform-tests/wpt#42857 (which was reverted in web-platform-tests/wpt#43212) which added thorough test coverage for the new transition-behavior CSS property by adding that coverage into the existing property-specific animation tests. (This is the best way to get that test coverage because we get tests for transition-behavior automatically whenever a test author uses the existing javascript functions for adding tests for CSS animations of particular properties or values, but it means that the tests are added into existing test files, many of which are part of Interop2023.) Freezing the Interop2023 dashboard means that these test results won't suddenly change when we change the tests.
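To make that mechanism concrete, here's a rough, hypothetical sketch; the helper name and signature are invented for illustration and don't match the real wpt support files:

```ts
// Hypothetical illustration only; the real wpt helpers and their names differ.
// Many property-specific animation test files call a single shared helper:
function testAnimationOfProperty(property: string, from: string, to: string): void {
  // ...existing checks that `property` animates from `from` to `to`...
  console.log(`animation test for ${property}: ${from} -> ${to}`);

  // If transition-behavior checks are added inside this helper, every existing
  // caller automatically gains that coverage, which is why results in test
  // files that are part of Interop2023 would shift.
}

// Existing call sites spread across many test files, some of them in Interop2023:
testAnimationOfProperty("width", "100px", "200px");
testAnimationOfProperty("aspect-ratio", "1 / 2", "3 / 4");
```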

This is also probably not the only example where we want to change the tests after Interop2023 is done but have been waiting -- though it may be one of the larger ones.

rianmurnen commented 7 months ago

The Interop initiative and the efforts of the browser teams have been incredible. As an observer, I check the dashboard a few times a month out of curiosity. I have a few requests about the information on the dashboard going into 2024:

The meaning of “Previous Focus Area” on the dashboard is unclear. In 2022, there were 15 areas. For 2023, 7 of 15 were listed as “Previous” and the other 8 were included in “Active”. “Active” implies that the “Previous” areas are inactive… meaning out-of-scope for the current year…?

Description Requests:

In addition to explaining the overall purpose, goal, and how to contribute to Interop in the extra-description text:

Other requests:

zcorpan commented 7 months ago
foolip commented 7 months ago

@zcorpan the smooth colors issue is tracked in https://github.com/web-platform-tests/wpt.fyi/issues/3694, PR at https://github.com/web-platform-tests/wpt.fyi/pull/3668. I'll point out the contrast issue there again.

gsnedders commented 6 months ago

I believe I heard a proposal on a call to add a column for Edge, and break its scores out separately from Chrome.

There was prior discussion in https://github.com/web-platform-tests/interop/issues/96.