rstudio / learnr

Interactive Tutorials with R Markdown
https://pkgs.rstudio.com/learnr
Apache License 2.0

Multiple Choice: Individual feedback for all answer options #564

Open petzi53 opened 2 years ago

petzi53 commented 2 years ago

It is essential to be able to give individual feedback for all answer options, whether correct or incorrect. Rich feedback is an important feature of multiple-choice questions (MCs): teachers want not only to give individual feedback for wrong answers ("distractors") but also, often, to explain why the correct options are correct.

As far as I can see, individual feedback for all answer options is not possible for MCs in the learnr package. A message can be attached only to a wrong answer, and it is displayed only when students choose that option. The general incorrect-answer wording in try_again or incorrect therefore has to cover all possible combinations, including partly correct answers. The result is awkward wording that can never cover every combination.

Look, for instance, at the pizza topping example:

question(
  "Select all the toppings that belong on a Margherita Pizza:",
  answer("tomato", correct = TRUE),
  answer("mozzarella", correct = TRUE),
  answer("basil", correct = TRUE),
  answer("extra virgin olive oil", correct = TRUE),
  answer("pepperoni", message = "Great topping! ... just not on a Margherita Pizza"),
  answer("onions"),
  answer("bacon"),
  answer("spinach"),
  random_answer_order = FALSE,
  allow_retry = TRUE,
  try_again = "Be sure to select all toppings!"
)

The general wrong-answer feedback "try_again = 'Be sure to select all toppings!'" does not make sense if students choose all the correct options plus one or more wrong ones. Generic feedback like "Incorrect" is not a solution either: choosing several correct options (but not all of them) is better characterized as "only partly correct" than as incorrect. I could not think of wording for a general wrong-answer message that covers all possible outcomes.

Another side effect of the MC design in learnr is that you cannot grade an answer as partially correct. Nor is fine-grained scoring with points possible, for instance by counting the chosen correct options minus the chosen wrong options.
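To illustrate the kind of fine-grained scoring I mean, here is a minimal sketch in plain R (score_mc and its floor-at-zero rule are my own invention, not part of learnr's API):

# Hypothetical rule: +1 for each correct option chosen, -1 for each
# wrong option chosen, floored at zero. Not how learnr grades today.
score_mc <- function(chosen, correct_options) {
  hits   <- sum(chosen %in% correct_options)
  misses <- sum(!chosen %in% correct_options)
  max(hits - misses, 0)
}

correct_options <- c("tomato", "mozzarella", "basil", "extra virgin olive oil")
score_mc(c("tomato", "basil", "pepperoni"), correct_options)  # 2 hits - 1 miss = 1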

I believe that MCs need individual feedback for all chosen options plus fine-grained scoring. To demonstrate the difference and how it could be implemented, I have prepared an example of the pizza topping MC with precisely the same choices as in learnr, but built with H5P.

H5P is a free and open-source content-collaboration framework based on JavaScript. The name abbreviates "HTML5 Package"; the project aims to make it easy for everyone to create, share, and reuse interactive HTML5 content.

If you scroll down past the MC exercise, you will see a screenshot demonstrating that in H5P teachers can set different options and messages for every choice:

More important for my example: after answering, students get feedback in three different ways:

Teachers can optionally award points per correct/incorrect option, or give at most one point by requiring that all options be chosen correctly.

So my questions are:

PS: I'm using the GitHub version of learnr, 0.10.1.9009.

gadenbuie commented 2 years ago

@petzi53 Thanks for your message and your helpful comparison with the H5P multiple choice component. I agree that the current MC question design is a bit restrictive in learnr.

Currently you can provide an item-level message for both correct and incorrect answers, but the messages are shown only when the student's overall answer is, respectively, correct or incorrect. The messages are also pasted together into a single paragraph, and I agree that it takes some thought and effort to avoid awkward sentences in the feedback.
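For reference, a minimal sketch of that behavior with the pizza example (the messages on the correct answers only surface when the whole response is correct, and the pepperoni message only when the response is incorrect and pepperoni was selected):

question(
  "Select all the toppings that belong on a Margherita Pizza:",
  answer("tomato", correct = TRUE, message = "Tomato is the base of a Margherita."),
  answer("basil", correct = TRUE, message = "Yes, fresh basil belongs."),
  answer("pepperoni", message = "Great topping! ... just not on a Margherita Pizza"),
  allow_retry = TRUE
)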

Thanks again for your thorough issue, it will be very helpful when we revisit the multiple choice question design.

rpruim commented 2 years ago

I've also been investigating H5P a bit. @gadenbuie and/or @petzi53, do you know of any examples where people have combined learnr and H5P? Might there eventually be an opportunity for a {learnrh5p} package that makes it easy to use H5P functionality from within {learnr} documents? Are there any gotchas to watch for as I explore H5P?

I'm brand new to H5P, so I'm happy to hear anything (good/bad/otherwise) that might help me figure out how/if to use it.

gadenbuie commented 2 years ago

Related: https://github.com/rstudio/learnr/issues/145

petzi53 commented 2 years ago

To @rpruim: I have experimented with combining H5P and R, both with and without learnr. H5P is very powerful, providing many types of educational interactions (aka content types); there are currently about 50 different exercise types. So it would be very valuable to be able to use both together! My main goal is/was to figure out how to write an accompanying statistics book using R and H5P. I came up with two feasible alternatives:

  1. Using bookdown. See my example here. There is no connection between R chunks and H5P content types, i.e., H5P can only be used for theoretical material that involves no coding. Besides, you have to include the H5P exercises via iframes, with no possibility of grading. A workaround would be to provide links or QR codes to an H5P server application (WordPress, Drupal, Moodle, etc.). Unfortunately, in this case you can't use learnr, as it does not work in bookdown. All in all, though, I think that using H5P has some educational potential by adding interactivity to textbooks.

  2. Using shinyapps.io. See my demo example here. Again, you have to embed H5P via iframes (a minimal embedding sketch follows this list), but this is a better alternative than bookdown because you can use H5P and learnr together. Using learnr in RStudio together with H5P in R packages is not possible: for security reasons, you can't embed iframes with this approach.
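For concreteness, a minimal sketch of the iframe embedding I mean, in an R Markdown / bookdown chunk (the URL is a placeholder for an exercise hosted on any H5P-capable server):

# Placeholder URL: substitute an exercise hosted on an H5P-capable
# server (h5p.org, WordPress, Drupal, Moodle, ...); knitr renders
# this as an iframe in HTML output.
knitr::include_url("https://example.org/h5p/embed/pizza-mc", height = "500px")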

Again, in this second option there is no connection between R and H5P. Ideally, I would therefore wish for two things (I lack the technical expertise to know whether my ideas could be implemented):

Both features together would open up tremendous new educational possibilities for R stat education.

dtkaplan commented 2 years ago

@petzi53 @gadenbuie @rpruim I'm also interested in leveraging Markdown and learnr for interactivity in textbooks. I'm designing a calculus course (I know. Calculus? With R?) that ran as a prototype last year and is running with 400 students and 12 instructors this year. I'm giving you this background so that you'll understand the simple-minded workarounds I've developed while working with {learnr}.

Last year, materials were presented to students in the form of {learnr} documents served by shinyapps.io. There was one document for each class day. This worked well in many respects but presented some difficulties. The major one was that instructors were not proficient enough to be able to create and deploy their own material.

For year 2, I had to switch to a system that gave some room for instructors to create materials.

  1. Textbook materials are presented via bookdown, with multiple-choice questions embedded in the book. See, for instance, exercise 4.1 in http://www.mosaic-web.org/MOSAIC-Calculus/fun-describing.html#concavity. Hover the cursor just after the last character in an MC choice; after about 5 seconds, the feedback for that item will appear. I'm able to provide feedback for all items, right or wrong. Constructing the questions is easy; I've put an example below. All the functionality is Markdown/HTML/CSS, and a sketch of the hover idea follows this list. (With JavaScript, the layout of the questions could be greatly improved, but I don't know JavaScript.)
  2. Interactive computation is provided by a single {learnr} document: a sandbox app. https://maa-statprep.shinyapps.io/CalcZ-Sandbox
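Roughly, the hover behavior in item 1 is the sort of thing the native HTML title attribute gives you: browsers show it as a tooltip after a short hover delay. A simplified sketch via htmltools (illustrative only, not the exact markup askMC() emits):

library(htmltools)

# Each choice carries its feedback in the title attribute; hovering
# the choice text reveals it as a browser tooltip after a short delay.
tags$ul(
  tags$li(title = "Inversely proportional to the square would be d^-2.", "F = k d^2"),
  tags$li(title = "Correct!", "F = k d^-2")
)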

An advantage of the Markdown/HTML/CSS format is that all the source materials for the hundreds of questions can be handled using standard file-editing techniques within RStudio. A huge disadvantage is that there is no automatic integration with an LMS to record student answers or activity. Instead, the askMC() function below can be called in a way that generates content for insertion into an LMS. The instructor copies and pastes the LMS-formatted material into the LMS editor. That's not great, but at least it means that the instructors can set up their own courses without support from me or software that needs maintenance.

The sandbox system for computing has been working well. (Instructors find it easy to understand.) I would like to be able to inject code (via a hash or URL query string) into the code box, but this requires hacking {learnr}, which in turn creates a maintenance nightmare.

askMC(
  prompt = "The gravitational force, F, between two bodies is inversely proportional to the square of the distance $d$ between them. Then ...",
  "$F = k d^{2}$" = "Inversely proportional to the square would be $d^{-2}$", 
  "+$F = kd^{-2}$+", 
  "$F = k d^{1/2}$" = "This is a square-root relationship.", 
  "$F = k d^{-1/2}$" = "This is inversely proportional to the square root."
  )

rpruim commented 2 years ago

@petzi53 and @dtkaplan: Thanks for your posts. (I may follow up separately to avoid adding more clutter to this issue.)