jgraham: meant to be a starting point for discussion
bkardell: aside from some details, I think this is a lot more in line with what I had been suggesting as well
slightlyoff: it's unclear how this addresses our fundamental concern about silent vetoes. The problem hasn't been requiring "quite a lot of consensus"; it has been requiring total consensus, right?
nsull: Thinking more about this and coming back with comments makes sense to me. A ranking algorithm would be helpful, if that has not been tried before
foolip: we did try it last time, but there was not a lot of appetite for it at the time
nsull: reasonable for organizations to say we are doing “x” this year as an alternative to publishing vetoes.
jgraham: buckets vs. ranked choice: granular ranking takes a long time. The alternative is to spend more time upfront
dandclark: agree on the direction to have more discussion. However, the heavy number of vetoes at the end was concerning. Wish we could spend more time trying for consensus as opposed to using vetoes at the end. Are we avoiding that outcome with the new process?
bkardell: curious as to how we resolve that problem. Would be helpful to know why an organization is championing a specific area/proposal, so as to make it less contentious
dandclark: there will be disagreement regardless of the shape of the process. Hiding the process inside a black box is what's more concerning
nsull: the value of an accepted proposal is that developers feel confident that the work will get done. Prioritizing areas that can be completed/achieved is very motivating for both developers and the organization.
nairnandu: agreement on the evidence categories and types - is that a prerequisite?
jgraham: we tried doing that last year with the proposal template. It would be helpful for organizations to share the data that is used for prioritization. We don't need to have a formal list of data points.
nsull: could we have a subset of data points that we can agree on: survey data, impact on accessibility, etc.? Core beliefs for what matters most.
nsull: suggest that we do a quick response to proposals that do not meet the basic criteria
foolip: better if we can take some of that burden. Survey data is an example.
jgraham: we should be able to help the proposer supplement the gaps
nairnandu: Is grouping of proposals into focus areas a prerequisite?
jgraham: yes, we should potentially do that at the beginning
nairnandu: as a next step, would recommend that everyone review and digest the proposal and come back with suggestions/comments.
nairnandu: lolaodelola did express some interest in helping. Will circle back.
Next step: jgraham will create a PR for the backend work
Monitoring test changes - Avoid unintended scope changes to active focus areas #346
gsnedders: what would be the intent? Is it to flag the PR or to take a retrospective action?
jgraham: the simple thing would be to put a label on the PR. The difficult part is getting a start on this work (writing an action). We can always change the implementation.
foolip: we don't really have an option to block a PR, but we could leave a review and/or add a label
jgraham: required status could be an option here.
gsnedders: last year we had significant changes. This year, we don’t seem to have as many changes.
Next step: Write down the specifics for the Github action. Owner is TBD.
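As a starting point for writing down those specifics, the core of such an action could be a small piece of labeling logic: map each active focus area to the test directories it covers, then flag any PR whose changed files touch those directories. A minimal sketch in Python, where the directory mapping and label names are purely illustrative assumptions, not the real focus-area definitions:

```python
# Hypothetical mapping from focus area to the wpt directories it covers.
# The real mapping would come from the published focus-area definitions.
FOCUS_AREA_DIRS = {
    "css-nesting": ["css/css-nesting/"],
    "popover": ["html/semantics/popovers/"],
}


def labels_for_changed_files(changed_files):
    """Return the set of focus-area labels a PR should get, based on
    which focus-area directories its changed files fall under."""
    labels = set()
    for path in changed_files:
        for area, dirs in FOCUS_AREA_DIRS.items():
            if any(path.startswith(d) for d in dirs):
                labels.add(f"focus-area:{area}")
    return labels
```

In a real GitHub Action this logic would run on `pull_request` events, read the changed-file list from the GitHub API, and then apply the labels (or set a required status, per the discussion above) rather than blocking the PR outright.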
Here is the proposed agenda for May 2nd, 2024