mozilla / openbadger

badger badger badger badger
Mozilla Public License 2.0

Need a way to define assessment rubric (for aestima) #120

Closed cmcavoy closed 11 years ago

cmcavoy commented 11 years ago

We need a way to assign n-many boolean questions to a badge class. Those will be sent to aestimia and displayed to the person who is handling the assessment.

I don't think that we necessarily need to care about the answers to those questions in openbadger, so we can probably use some sort of text/markdown list data. @toolness (who will be implementing this) seems to agree.

toolness commented 11 years ago

I was thinking that in the criteria area, since it already takes markdown, we could either borrow from github's task list markdown, or we could even just interpret any markdown bulleted list item as a rubric, and search for the string (required) in every bulleted item to detect whether it was required. However, Brian said that rubrics and criteria are not one-to-one... Which I understand, I suppose, but isn't a rubric a subset of the criteria?
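For reference, GitHub's task-list extension marks a bulleted list item as a checkbox by prefixing it with `[ ]` (or `[x]` when checked); the items below are hypothetical examples, not actual CSOL criteria:

```markdown
- [ ] Has a screenshot of an e-fashion example been uploaded? (required)
- [ ] Have the reflection questions been answered in a thoughtful way?
```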

Also, if the proper rubric markdown isn't found, CSOL could just add a single rubric item saying "satisfies all the criteria listed at ...", which is Chris' idea.

cmcavoy commented 11 years ago

Examples of the current criteria (rubrics) are listed here, in columns Y and Z (that link is a copy of the original data; it's just an example, not the final list).

An example:

Has the PDF been uploaded of the certificate earned from the Blobz Guide to Electric Circuits? Has a screenshot of an e-fashion example been uploaded? Have the reflection questions been answered in a thoughtful way?

I'd translate that to:

* Has the PDF been uploaded of the certificate earned from the Blobz Guide to Electric Circuits?
* Has a screenshot of an e-fashion example been uploaded?
* Have the reflection questions been answered in a thoughtful way?

Where each line item is a different criteria question. I'd default them to required, unless they're explicitly set as optional, like this:

* (optional) Has the PDF been uploaded of the certificate earned from the Blobz Guide to Electric Circuits?
* Has a screenshot of an e-fashion example been uploaded?
* Have the reflection questions been answered in a thoughtful way?

The only thing missing is a way to specify the minimum number of questions that must be passed. Reading through the list, it seems like all of them are required, so for now, I think marking all questions as required will be enough.

toolness commented 11 years ago

Ah, ok. I talked to Chloe and she said that the "minimum number of questions passed" thing isn't actually required, it was just a thing she put in her mockups, so I didn't actually implement it in Aestimia. Hopefully that is not a dealbreaker.

toolness commented 11 years ago

Hmm, looking at the spreadsheet, it seems like the ASSESSMENT field is just the CRITERIA field listed in question form, rather than statement form. It seems like we could achieve both by just having the criteria field in openbadger be a bulleted list of statements, with some of them containing the string (optional), and then parsing that as the rubric (indeed, the sample rubric provided in Aestimia actually contains statements rather than questions, and nobody seemed confused by that).

I only took a quick look at the assessment and criteria columns though, so I could be wrong in my assumption that they appear to be the same thing phrased in different ways.

carlacasilli commented 11 years ago

Hey all, I think what you're looking for is actually found on the DYN spreadsheet in columns Q, R & S: https://docs.google.com/spreadsheet/ccc?key=0Atr9GCn8Bz1_dHR0RjAwOGVITXZCdVg5dDR3MzlTZ1E#gid=6

It has not been copied over into the canonical spreadsheet, but it could be.

cmcavoy commented 11 years ago

Carla, this is the first time I've seen that there are specific reflection questions for the submitters. It's not on any of our mock-ups, nor has it been discussed.


carlacasilli commented 11 years ago

Okay, quick, easy way to solve this: provide the mentors with this spreadsheet when they're reviewing content and that can inform the reflection that's shared with the badge applicant.

chloeatplay commented 11 years ago

@ all - we definitely have considered "criteria" to be the "rubric" all along in our design. I would not confuse the two; also, I think that questions are easy to turn into statements, right? Happy to help do that if you need a hand @carlacasilli @cmcavoy @toolness

Actually, in the mockups we had a reflection question for the submitters - a prompt for them to add something in the text field when they submit. Similarly, there is a prompt line for the assessors when they add their feedback. @threeqube has the copy for that.

cmcavoy commented 11 years ago

> @cmcavoy @toolness actually in the mockups- we had a reflection question for the submitters- a prompt for them to add something in the text field when they submit. Similarly there is a prompt line for the assessors when they add their feedback. @threeqube has the copy for that.

@chloeatplay Yes, we've had a place for reflection since the beginning, but they were never prompted by specific questions. At least, not that I've seen. I could be wrong, but at this point I'm not sure how we're going to prompt the submitters with specific questions to reflect on. Everything that we've built is along the lines of "upload evidence, include a written reflection on that evidence".

I think we'll have to include the specific questions in the body of the badge description or criteria, and then ask the learner to answer those questions in a text field. Not one at a time, but all at once.

carlacasilli commented 11 years ago

Perfect way to solve this, especially since the criteria for the self-paced activities ask for this, e.g., "Badge is earned by (1) answering the reflection questions thoughtfully, and (2) including evidence and/or details from what you watched."

Good call.

threeqube commented 11 years ago

@carlacasilli @chloeatplay are we all agreeing on handling this through text prompt in the open text field here in this mockup?

[mockup screenshot: aestimia-assessor]

chloeatplay commented 11 years ago

yes! agreed!


threeqube commented 11 years ago

Sounds good. Just like in @chloeatplay's original mockup. Cool! :dancers:

carlacasilli commented 11 years ago

Just to be clear, there are two things happening here:

1. We need to ensure that folks know how to get the badges, and this means including the reflection questions in the badge criteria.
2. The open text field allows the mentors to review and respond to the badge applicant's work on those reflection questions.

I'm totally down with that. :dancer:

chloeatplay commented 11 years ago

@carlacasilli yup- for 1) that happens already when you apply for the badge, you see those criteria ;) in the mockup at least. For 2) ditto! 3) I salute your emoji use- :ghost:

toolness commented 11 years ago

Ok, in toolness/clopenbadger@74c183b9308fde9c7cfc0224ab13987821446730 I added some functionality that processes the criteria text to create a rubric. From the commit log:

> This method parses the criteria content by looking for bullet points (lines that start with `*`) and converting them into rubric items that match Aestimia's rubric item schema. If a bulleted line contains the text `(optional)`, the corresponding rubric item is marked as non-required; otherwise, it's required.
>
> If no bulleted lines are found, the method returns an array with one required rubric item, consisting of the text `Satisfies the following criteria:\n` followed by the raw criteria text.

We still need to make API endpoints return the rubric information, but hopefully this helps us get closer to a solution for this issue.
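A minimal sketch of the logic that commit message describes (this is an illustration, not the actual openbadger code; `criteriaToRubricItems` is a hypothetical name, and it assumes Aestimia rubric items carry `text` and `required` fields):

```javascript
// Sketch: convert a badge's markdown criteria text into rubric items.
// A line starting with "*" becomes one rubric item; an item containing
// "(optional)" is marked non-required. With no bulleted lines, fall back
// to a single required item wrapping the raw criteria text.
function criteriaToRubricItems(criteria) {
  var items = criteria.split('\n')
    .filter(function(line) { return /^\s*\*/.test(line); })
    .map(function(line) {
      var text = line.replace(/^\s*\*\s*/, '');
      return {
        text: text,
        required: text.indexOf('(optional)') === -1
      };
    });

  if (items.length === 0)
    return [{
      text: 'Satisfies the following criteria:\n' + criteria,
      required: true
    }];

  return items;
}
```

For example, `criteriaToRubricItems('* Has a screenshot been uploaded?\n* (optional) Was a PDF included?')` would yield two items, the second with `required: false`.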

cmcavoy commented 11 years ago

:+1: