HHS / simpler-grants-gov

https://simpler.grants.gov

Engagement Sessions #2204

Open · widal001 opened 1 month ago

widal001 commented 1 month ago

Summary

Members of the Co-Design Group and open source community have attended at least one event (per group), and user research has been conducted.

Press release

We’re excited to announce the successful completion of the Collaborative Coding Challenge Pilot, which has laid the groundwork for a scalable framework to support future hackathons. This pilot event, conducted in a fully remote environment, brought together participants who engaged in innovative problem-solving and collaboration. The feedback gathered from this event will be instrumental in refining our processes and ensuring that future hackathons are even more impactful and inclusive. Participants praised the streamlined structure and the opportunity to contribute to the Grants.gov Open Source community, marking this pilot as a significant milestone in our ongoing efforts to foster innovation and collaboration.

We're also very happy to report on a successful first Co-Design Group session, where we engaged community members – especially those currently underserved and overburdened by the grantmaking process – in a collaborative workshop to get community input on which user needs we should prioritize for Simpler.Grants.gov. Co-Design sessions included more than 12 members, and nearly everyone indicated that they would be excited to attend another session in the future.

Lastly, we're thrilled to report back on a successful round of usability testing for our new grant search feature. We conducted 10 sessions with grant applicants, grantors, and HHS staff to validate the new design. These studies surfaced several tangible issues to resolve, which are now prioritized in the team's development backlog.

Acceptance criteria

Metrics

Assumptions

Dependencies

mxk0 commented 1 month ago

"Participation: Number of attendees for all events combined is greater than 25." <-- should be "equals 25"

widal001 commented 1 month ago

"Participation: Number of attendees for all events combined is greater than 25." <-- should be "equals 25"

@mxk0 Good catch! I updated it to "greater than or equal to 25" to reflect the "at least" in the ACs and in case more folks wind up participating, but feel free to make it "equals 25" if you want to shoot for that number exactly.

lucasmbrown-usds commented 1 month ago

I added "and user research has been conducted" to the summary; maybe we can add a couple of sentences to the press release that reference the user research part of this Agile Deliverable? Otherwise the summary and press release only mention 2 of the 3 activities that are part of this Agile Deliverable.

mxk0 commented 1 month ago

On it. Done.

widal001 commented 1 month ago

Press release edits look good to me! Thanks for adding the references to the user research.

andycochran commented 1 month ago

[ ] User research: At least 10 users participate in user research sessions and are compensated for their time and expertise.

Does this mean we must compensate 10 participants or only those of the 10 who can accept compensation?

The Search UI testing plan calls for speaking to 6–8 grantors and 6–8 applicants:

Compensation plan Grantors & internal stakeholders: Incentives will not be provided to federal employees/grantors, internal stakeholders, or HHS staff. Applicants: We will compensate participants in the Applicant user type (8 participants total, $100 for 60-minute sessions).

Since we can only compensate the applicants (not grantors or internal staff), our plan does not account for compensating 10 participants. If we test w/ 6 grantors and 6 applicants, we will have >10 participants but <10 compensated. Would this fulfill the AC?

Also want to confirm the assumption that the 10 participants under "User Research" are separate from the 10 participants under "Co-Design Group."

@margaretspring @lucasmbrown-usds @mxk0 @crystabelrangel

rishalee commented 3 weeks ago

Is it a requirement for co-design group that the format must be a single group meeting of at least 10 people?

My recommendation is to conduct 1:1 interviews with at least 10 selected Co-Design Group participants before meeting as a group, to build trust with folks as individuals and to source information that I will then consolidate to optimize the first group exercise. My understanding is that this would meet the above requirement for the Co-Design Group, but of course I could be wrong. Hoping we don't have to rush this process and wanting to align with people here on the format of the group. Thanks!

mxk0 commented 3 weeks ago

@rishalee we'll get back to you asap.

margaretspring commented 3 weeks ago

@andycochran - I'll get clarity on the 10 compensated user research participants. I'd guess that because we're meeting with the grantors, there will be flexibility. I would agree that the 10 participants under "User Research" are separate from the 10 participants under "Co-Design Group." I'll also follow up on your question @rishalee - Thank you both!

lucasmbrown-usds commented 3 weeks ago

As discussed today, @juchang111 and I approve changing the language of Acceptance Criteria #1 to refer to 10 individual meetings with Co-Design Group members, if that's the best direction to build trust and support the long-term goals of the CDG. Ideally, we'll also get a group meeting on the calendar by 12/10, though that may end up being scheduled for early/mid January.

mxk0 commented 3 weeks ago

Sprint 1.2 updates

Status: 🟢 On track

Sprint goal

The main goal for this deliverable in Sprint 1.2 was to run the CC&DC.

Accomplishments

Rollover

Risks

Sprint 1.3 goals

mxk0 commented 2 weeks ago

@lucasmbrown-usds and @juchang11 here are a few proposed Acceptance Criteria/Metrics wording tweaks we'd like to review with you tomorrow (Thurs Nov 7) at our SGG Leads meeting. Please take a look in advance of the meeting and we can talk things through live. cc @andycochran @widal001 @margaretspring @mdragon

Co-Design Group acceptance criterion

Existing AC: At least one session has met of at least 10 representatives of communities currently underserved and overburdened by the grantmaking process, and attendees are compensated for their time and expertise.

Proposed new AC: 1:1 or small group kickoff calls are conducted covering at least 10 representatives of communities currently underserved and overburdened by the grantmaking process, and attendees are compensated for their time and expertise.

Notes: This is a slight tweak to Lucas’s proposed changes from last week, since we may have one or more small group kickoff calls and not just 1:1s.

User Research acceptance criterion

Existing AC: At least 10 users participate in user research sessions and are compensated for their time and expertise.

Proposed new AC: At least 10 users participate in user research sessions. Users who are eligible for compensation are compensated for their time.

Notes: The user research plan targets 14–20 users for interviewing. That group is made up of 2–4 internal staff, 6–8 grantors, and 6–8 applicants. At minimum, we will run 14 user interviews.

When we drafted the User Research AC, I think we missed that we can't compensate government employees for user research. With the current research plan, we can only compensate the 6–8 applicants. We believe that the current research plan more than addresses the spirit of the User Research AC: interviewing a wide swath of our users and compensating users who are eligible for compensation. We are requesting a tweak to the AC since we think the research plan as-is best represents our research needs; we'd prefer to tweak the AC wording rather than change the plan.

Metrics

Existing metric: At least 50% of the attendees have indicated interest in attending another session in the future.

Proposed rewording: At least 50% of the Co-Design and Collaborative Coding & Design Challenge attendees have indicated interest in attending another session in the future.

Notes: Our assumption was that this doesn't apply to user research participants, and we wanted to clarify.

lucasmbrown-usds commented 2 weeks ago

Thanks for the extremely clear formatting of this update!

  1. "Co-Design Group acceptance criterion" change - LGTM.

  2. "User Research acceptance criterion" - I had thought/hoped we would push to do 10 applicant interviews and compensate them. If the team doesn't think this is a good use of time, we can adjust it to your suggestion.

  3. "Metrics" - I had thought/hoped this would include UR participants too, so we have some indicator of whether UR sessions are going well. But on further reflection, I understand that asking them "would you like to attend another session" would be confusing if we don't plan to invite them to attend any other sessions (b/c our UR is mostly one-off).

Perhaps there's another way of framing this question, in line with Net Promoter Score: "would you recommend participating in user research with Simpler Grants.gov to a friend or colleague?" So we could either ask a question like that at the end of each UR session (and amend the Metric to reflect that), or we could change to your proposal.

Overall comments

Thanks for the thoughtful writeup. As everyone knows, in future quads, we hope to really minimize any changes like this to Acceptance Criteria or Metrics for Enhanced Deliverables. We're all learning by doing right now, but let's go through the Quad 2 ACs/Metrics with a keen eye towards making sure they're accurate to our core goals and won't need to be edited later. We really don't want to set the precedent of editing these late in the game.

Of course, for now, we're all learning-by-doing in the first run of this, so it's fine, and there's no big problem. Thanks for the thoughtful and effective work on all this.

mxk0 commented 2 weeks ago

User Research AC: 👍 to move forward with Nava proposed change

Satisfaction metric:

Goal for Quad 2: few to zero changes to Deliverable specs after they're finalized

mxk0 commented 1 week ago

Sprint 1.3 updates

Status: 🟢 On track

Sprint goal

The main goals for this deliverable in Sprint 1.3 were to begin outreach to potential Search API users, plan and ticket remaining work for this quad's Co-Design effort, and schedule/run usability testing sessions.

Accomplishments

Rollover

Risks

Sprint 1.4 goals

lucasmbrown-usds commented 1 week ago

Nice updates. Are we asking all the user research participants some version of the "Satisfaction" question, to inform our metrics?

andycochran commented 1 week ago

Are we asking all the user research participants some version of the "Satisfaction" question

^ #2860