Brainstorming session on developer, user and compatibility signals
nsull:
Signals from internal bug tracking systems on the given feature
User signals - things that will break for users (like security or accessibility issues) or that are heavily used. Painful areas that are missing in only one browser
Developer signals - listening to what they are telling us. Data at scale is also important - like State of * surveys
Sites adopting performance hacks could be another signal we look at
foolip:
Developer signals - surveys like state of CSS, HTML, JS etc.
Signals on the bug tracking systems (e.g. “star” (+1) counts)
Use counter data has not been very helpful in the prioritization process
bkardell:
Have always considered a mix of many signals. Happy to give this more thought and come back
dandclark:
Formal survey data - like the state of * surveys
Bug reports
Developer discussions like Stack Overflow, blog posts, etc.
Polyfill usage, if that signal is available
Issues with direct user pain in areas like accessibility need to be considered, but wondering what the signals for those would be
jgraham:
Bug reports for a specific feature, especially if it is breaking a site
Short-term, medium-term compat risk - are we seeing a lot of demand for this feature? Is it implemented in other browsers? Signals from surveys would be useful here
Use counters have not been very helpful/useful
Standards position
bkardell: There is some value in separating the signals discussion and aligning on that before the engineering prioritization conversation happens
nsull: I don't think we will agree on the priority of signals, but it would be good to understand and recognize the different perspectives.
jgraham: +1. The idea is not to create a formula for prioritization. Last time, we tried to look at broader areas where we can make meaningful progress. That would still be a meaningful discussion to have.
foolip: We don't actually have tangible user signals. Site impact is different from the user benefit a feature would provide; the latter is where we don't have enough signals.
jgraham: For broken sites, it is quite possible that an issue is caused by an implementation that is not spec-aligned. Those are ripe candidates for Interop. If something is very easy to polyfill, it might not be a big win for developers.
nsull: it's also really important for the eng teams to stay motivated and have an impact on the Interop score.
bkardell: +1. As a team we want to stretch and get agreement as much as possible. However, we also need to optimize what is achievable in a given iteration
dandclark: +1 on not biting off too much, but still finding room for alignment
jgraham: Prioritization is just as important as volume. If we work on the same things jointly (as opposed to each org doing 10 different things), it benefits developers by giving them something more usable
dandclark: The trap we should not fall into is Interop becoming a reflection of where there was already some overlap in planning. We should be steering to build consensus on areas where there might not have been overlap, if not for Interop
jgraham: this year’s selection process is aimed at providing an avenue for more discussion and consensus building. We can add more clarity on the signals that we look at, in the proposal template
nsull: Understanding signals that each org considers internally would be helpful when we champion proposals and need to build consensus with others
foolip: a shared artifact of signals would be useful
jgraham: agreed, it should be a collaborative effort where participants can add their data points like bug signals as an example
Next step: Include signals surfaced in this discussion in the Interop 2025 proposal template, with enough clarity on the importance of each. Create a shared artifact to capture additional signals.
foolip: We could use some more clarity on areas that have mixed rankings
jgraham: The idea is not to come up with an algorithm. There has to be some flexibility for us to get to an agreed ranking
nsull: capacity assessment should be the last step after rankings are done
dandclark: the visibility part was dropped from the original proposal. Our position is that we should emulate the standards process and be public about the decisions
nsull: By all means, it should be possible for organizations to surface what they are working on. The concern is about sharing Interop positions.
Here is the proposed agenda for the meeting on June 27th, 2024