hlashbrooke opened 4 years ago
This is a high priority for the next iteration, but we need to have some discussion about what questions we'll be asking. I'll comment here later with some thoughts.
The WordCamp speaker feedback tool asks attendees to submit a star rating for the session and then answer three questions:
This feedback is vetted by the organisers; approved feedback is then automatically shared with the speakers.
This same flow would be great for Learn WordPress, but I think we should modify the questions to focus more on the objective of learning something tangible. Here is my first pass at updating these questions:
These would be submitted along with a star rating. Since the answers would be seen by the presenters, these questions would allow them to get an idea of how well their content is being received and if there's anything they need to tweak in order to make it more effective.
Some questions coming out of this:
Adding to @hlashbrooke's suggestions for question modifications, can we add something along the lines of "Were learning objectives clearly presented?"
Related to the Collecting and Reporting Stats for Learn WordPress Discussion Groups P2 I posted, I'm wondering if Learn's implementation of the speaker feedback tool could help us determine the attendee's competency level growth on the topic by asking two questions:
Both of the questions above would be answered with a numeric value from 1-10.
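If both questions are answered on the same 1-10 scale, the growth measure could be aggregated very simply for reporting. A minimal sketch (the field names `rating_before` and `rating_after` are hypothetical, not part of any existing tool):

```python
def average_competency_growth(responses):
    """Average change in self-rated competency on a 1-10 scale.

    Each response is a dict with hypothetical keys 'rating_before'
    and 'rating_after', one per question proposed above.
    """
    if not responses:
        return 0.0
    deltas = [r["rating_after"] - r["rating_before"] for r in responses]
    return sum(deltas) / len(deltas)


# Example: two attendees rated themselves 3→7 and 5→8
responses = [
    {"rating_before": 3, "rating_after": 7},
    {"rating_before": 5, "rating_after": 8},
]
print(average_competency_growth(responses))  # 3.5
```

This kind of per-workshop average is the sort of number the stats P2 mentioned above could report on.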
I'd also suggest we can ask how valuable the user found the following components:
For both of the components above, we could present 3 options to select from: not valuable, somewhat valuable, very valuable.
I like that! How about we go with this set of questions then:
Is that too many questions? Only 3 of them are text questions - the others are ratings only.
At the time we're asking people to fill in this survey, they would have just completed the workshop and not yet taken part in a discussion group. We have a separate survey for discussion group attendees, so asking about the value of the group at this stage isn't really practical.
Since this is a set of questions to be asked after the completion of the workshop but before the discussion group, I wonder how quickly the discussion group leaders would be able to see these answers. The last two questions proposed by Hugh above would be good for the discussion group leaders to focus on.
- Is there anything else you expected to learn that wasn't included?
- Do you have any additional feedback?
I'd like to revisit this as a high priority item. Each workshop could have a button in the sidebar that links to a survey asking the following questions:
We would also need to have an identical survey after each course as well.
@hlashbrooke could this be done by linking to a Crowdsignal survey? Or maybe a Jetpack form?
That's not impossible and could be a fine solution.
To my mind, there are two problems with a survey or Jetpack form:
Those aren't necessarily blockers, though. A Crowdsignal survey would be the better of the two options, since we could then at least analyse the data to some degree; it would just add an extra step when publishing a workshop, namely adding the new workshop to the list on the survey.
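If the survey asks attendees to pick their workshop from that list, the exported responses could be grouped per workshop so feedback reaches the relevant presenter. A rough sketch, assuming rows with hypothetical `workshop` and `comment` columns as they might appear in a Crowdsignal CSV export:

```python
from collections import defaultdict


def group_feedback_by_workshop(rows):
    """Group exported survey responses by the workshop the attendee selected.

    'rows' is a list of dicts with hypothetical keys 'workshop' and
    'comment'; real export column names would need to be checked.
    """
    grouped = defaultdict(list)
    for row in rows:
        grouped[row["workshop"]].append(row["comment"])
    return dict(grouped)


rows = [
    {"workshop": "Intro to Blocks", "comment": "Clear objectives"},
    {"workshop": "Intro to Blocks", "comment": "Wanted more examples"},
    {"workshop": "Theme Basics", "comment": "Great pacing"},
]
print(group_feedback_by_workshop(rows))
```

This is only to illustrate that a single shared survey still allows per-workshop reporting; it doesn't remove the manual step of keeping the workshop list up to date.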
Given how many other dev items we would like to get through, perhaps we should simply do that for now and revisit building a more integrated tool in the future. Happy to hear some other thoughts and opinions here too!
The speaker feedback tool was built for WordCamps. It would be great to activate it on the Learn site for workshops. https://make.wordpress.org/community/handbook/wordcamp-organizer/speaker-feedback-tool/
CC @coreymckrill