Closed: bennettscience closed this issue 5 years ago
The methods for grading multiple answer questions are modular, and the logic is contained within a function, currently lines 831-991 of the qw-engine.js file. I don't calculate the number of items that should be selected, so you would need to add some logic to do that in the `rightwrong` function.
After adding the logic to calculate the number of items that should be selected, you can probably repurpose one of the existing methods by changing just a few lines.
Of course, you'll need to make sure you're using your local copy rather than loading mine across the network. I have a local copy that I use for testing and it's essentially the metadata from a user script, the qw-engine.src file without the very last section, and then a line to invoke quizwiz.
What I think you're asking for is to give 50% of the points because they selected 50% of the items that should be selected. If so, then I have no plans to implement that as it would make it easy for people to use a fundamentally flawed logic in grading. It doesn't penalize for incorrect answers, so a student could select all of the answers and get 100% every time without knowing anything. If you want that logic, then it should be a survey rather than a graded quiz.
If Canvas ever gets their quizzes.next done correctly, the multiple answer questions should not just be all or nothing. I heard that they are going to allow partial credit in a user-specified way for multiple choice questions, and I would like to see that extended to multiple answers. If that happens, it would make it easy for your instructor to make the questions worth 50%, 50%, and 0% of the points.
Thanks for the pointers on where to look in the engine. I'll make a local copy and play around with it some more.
I agree on the logic; the same could be said for not selecting wrong answers. In fact, playing with the current version, leaving everything blank on my example question awards 0.3 points. I think for our purposes, auto-scoring the essays and so on to speed things up will be a major help, and teachers will still have to go in and manually score multiple answer questions as they normally have.
And yes, I hope Quizzes.Next builds out more fully. I show teachers what it can do and it always seems to be a half-step away from what it should do to be really helpful.
If you haven't seen it, read Understanding Multiple Answers Questions. I wrote it as part of the QuizWiz development, and it digs into why the way Canvas does it is pretty much the only way it should be done if you want to keep the interface simple and award partial credit. All of the other techniques have issues; the only reason I included them when I wrote QuizWiz was that they were available in other LMS software and people were asking for them.
As far as selecting nothing goes, you should get points with that implementation: you correctly answered 1/3 of the items by leaving them all blank. There's no good way to know the reasoning behind not answering a question (had no clue, thought there weren't any right answers, ran out of time, etc.).
What burned me the first time was that I thought it was a way to ask a series of T/F questions, but it isn't and cannot be. If you want that, you need to use the multiple dropdowns question type, which forces the student to make a choice for each item.
More of a question than an issue.
I'm working with teachers on assigning partial credit for multiple answer style questions. I think the closest I can get is setting `ma_correct` to `enabled`, but if they have set a multiple answer question to multiple points, it enters the percentage (as designed). Here's an example: a multiple answer question with three items, worth two points, with two correct answers.
The teachers would like this scored as 1 out of 2 instead of 0.67. I thought the `ma_difference` method would work instead, but it subtracts for incorrect answers, scoring it as 0. Is there a way to work around this without a full rewrite?