Glad you find it useful! And filing an issue is better than 10 bags of candy.
The problem is not on your end--it looks like a pagination issue. Should be a relatively easy fix, but I'll have to tinker with it next week.
I didn't know about all the functions in the package, but get_submissions() served as a good workaround for the problem. It's no rush; I figured out a perfectly workable solution. Thanks so much!
You can pass the per_page=100 (or more) parameter in your API call to get more data. Ten records is the general default limit for a JSON response; Canvas does this to improve performance.
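A minimal sketch of what that looks like as a raw API call (not the package's internal code); the school URL, course/assignment IDs, and token are placeholders:

# Sketch: requesting more than the default 10 records from the Canvas API.
# The URL, IDs, and token below are illustrative placeholders.
library(httr)
library(jsonlite)

base_url <- "https://yourschool.instructure.com/api/v1"
token    <- Sys.getenv("CANVAS_API_TOKEN")

resp <- GET(
  paste0(base_url, "/courses/1234/assignments/5678/submissions"),
  add_headers(Authorization = paste("Bearer", token)),
  query = list(per_page = 100)   # without this, only ~10 records come back
)

submissions <- fromJSON(content(resp, as = "text", encoding = "UTF-8"))
nrow(submissions)  # now up to 100 rows per page instead of 10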
PR #17 reworked get_course_gradebook.
In PR #17 I didn't really get this fixed, unfortunately--the upper limit is now 100 instead of 10--and I just stumbled over the problem myself. With luck I'll get a chance to figure out a proper fix this afternoon.
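For what it's worth, a real fix would need to follow the pagination links the Canvas API returns instead of relying on per_page alone. Here is a rough, hedged sketch of that idea (the helper name, endpoint URL, and token are assumptions, not the package's actual code):

# Sketch: fetch every page of a Canvas API result by following the
# rel="next" URL in the Link header, rather than stopping at 100 rows.
library(httr)
library(jsonlite)

get_all_pages <- function(first_url, token) {
  pages <- list()
  url <- first_url
  while (!is.null(url)) {
    resp <- GET(url, add_headers(Authorization = paste("Bearer", token)))
    pages[[length(pages) + 1]] <-
      fromJSON(content(resp, as = "text", encoding = "UTF-8"))
    # Canvas puts the next page's URL in the Link header; stop when there is no rel="next"
    link <- headers(resp)[["link"]]
    m <- if (is.null(link)) character(0) else
      regmatches(link, regexec('<([^>]+)>; rel="next"', link))[[1]]
    url <- if (length(m) < 2 || is.na(m[2])) NULL else m[2]
  }
  dplyr::bind_rows(pages)
}

# Usage (placeholder URL): start with per_page=100 in the first request
# all_subs <- get_all_pages(
#   "https://yourschool.instructure.com/api/v1/courses/1234/assignments/5678/submissions?per_page=100",
#   Sys.getenv("CANVAS_API_TOKEN"))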
I'm really impressed by this package. It's super useful. I've discovered a problem, however, and I'm not certain whether it's on my end or not. When I tally the number of distinct user_ids associated with each assignment_id, the maximum value it gives is 10. For example:
require(dplyr)
df %>% group_by(assignment_id) %>% summarise(n_distinct(user_id))
These are large sections, and while I can confirm that some students fail to submit work, the majority of the assignments have been submitted and therefore should have recorded scores. The only thing I can think of that might be causing this is that the default setting on my school's Canvas website is to show only 10 entries at a time. That seems unlikely to be the real cause, but I'm fresh out of ideas and nobody in my office knows R well enough to figure this out. Any help you could offer would be awesome. I'll mail you a giant bag of candy if you can help me figure this out.