huwf opened this issue 8 years ago
Having integration with the various LMSs is definitely something that would be nice to have at some point. I was also thinking it would be along the lines of an `nbgrader export` command (or `nbgrader post`, or whatever), which could be configured to export to various formats like CSV, or to Canvas, Blackboard, etc.
It would be great if you have this working with LTI/Canvas and wanted to make a PR!
If we're thinking more generally, would we want to separate the functionality into different commands? Say:
`nbgrader export "assignment" --format=lti`
`nbgrader post "assignment" --server=example.com`
Haven't started the code yet, but I [somewhat unwisely?] promised my manager I'd have something working by Sep 30! I have had a look at some of the code and I think I mostly understand what's going on. Will aim to start in the next week or so, though it may take a while to be ready for a PR! :) I'll update this thread.
I think it's probably fine to keep the functionality as a single command.
I am working on a simple `nbgrader export` app that will export to CSV and will support plugins. That should make it a bit easier for you to build off of, I think! I should have it done later tonight.
Ok, I have opened #536 which (I think) should make it fairly straightforward to export to other formats by defining your own export plugin. Let me know if you have any questions about it or if it doesn't quite work with what you're planning here (I will go ahead and merge that PR once the tests pass, but I'm happy to make changes if they're needed).
I think we are looking at slightly different things. I was thinking of an individual student's grade, e.g., following `nbgrader autograde` the result gets `POST`ed to the LMS (using the LTI standard, plus the Canvas extension). It looks like you're thinking more about getting the grades for the whole class in CSV format?
I still think this will be very useful for me: it will allow me to easily convert to the format that I need, and it has given me more of an idea of how the code works! Maybe getting a PR for all my changes might be a bit tricky, depending on what information it is assumed that we can store about students etc. It's 11pm here though, so I'll take a more detailed look tomorrow. Thanks! :)
Ah, I see, I was thinking you would want to upload grades for the whole class to the LMS at once. I don't see any reason in principle why the exporter couldn't handle both, though -- e.g. you could define another trait in your LTI/Canvas exporter that is the student id, and if that is given you only export for a single student, otherwise you export for everybody. Or something along those lines; it's also 11:30pm for me right now 😄 But I would certainly want the functionality for my class to upload grades for everybody at once, as we have 280 students this semester and uploading for individuals would really be a pain!
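The single-student-or-whole-class idea above can be sketched as a filter on the exporter. This is a pure-Python stand-in, not nbgrader code; the hypothetical `student` argument plays the role of the suggested student-id trait:

```python
def export_grades(grades, student=None):
    """Export every student's grade, or just one if `student` is given.

    `grades` maps student id -> score; the optional filter mirrors the
    student-id trait suggested above.
    """
    if student is not None:
        return {student: grades[student]}
    return dict(grades)


grades = {"alice": 9.5, "bob": 7.0}
print(export_grades(grades))           # whole class
print(export_grades(grades, "alice"))  # single student
```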
11:30 - Are you in the UK then?!
Yeah, it does make a lot more sense in a coursework situation to upload all at once, which I totally hadn't thought of. I'll probably need to do that for one of our courses next semester (probably Blackboard rather than Canvas), but we're also looking at running CPD/MOOC courses where the student would submit an assessment, it gets autograded, and then they can see the feedback (using Canvas). I was impressed by the coursework submission on this course and wanted to do something similar: https://www.coursera.org/learn/introduction-to-algorithms
Yes, I am in the UK! (though only until the end of this week)
Ahh, I see, that makes sense that you would want to do the export individually if you're doing individual autogrades too. I think it's totally reasonable for nbgrader to be able to support both 😄
It's taken a while, but I have made some progress on this: https://github.com/huwf/nbgrader. Mine is a bit different from your CSV exporter: it doesn't write XML to a file unless a file name is specified, since the main purpose is to export to a server. Well, at the moment it doesn't write to a file at all because there's a bug. But I will fix that! :grinning:
I plan to gradually expand it so that it can start adding some of the canvas extensions as well, e.g., for homework submissions but that is not the immediate priority. Although I guess linking to the formgrader might work quite well without too much work.
I struggled a bit with the unit tests. Had to mock/monkeypatch quite a bit because it does rely on HTTP request/response. Took a while to get my head around! Still need to do quite a bit more though.
What do I need to do to get the "help" test to pass? I guess I need to add stuff to the documentation about my plugin?
Also, there are extra dependencies: notably an LTI library, which depends on oauthlib-requests, and lxml. Oauthlib and requests are simple enough, but the problem with lxml is that, at least on Ubuntu Server 14.04, it doesn't work with just pip. If I do `apt-get install python3-lxml` then it works on my current server, but in the past I have had to add another package too and I can't remember which one that was! May not be suitable for a PR.
Very cool! Just FYI, you don't necessarily need to fork all of nbgrader to use your own custom plugin -- you just need the file containing the plugin to be in your course directory, see http://nbgrader.readthedocs.io/en/latest/plugins/export-plugin.html#creating-a-plugin for details
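For reference, a minimal sketch of what such a plugin might look like, modeled loosely on the export-plugin API linked above. The `ExportPlugin` base class here is a local stand-in so the sketch runs on its own; the real base class lives in `nbgrader.plugins`, and the real `Gradebook` API differs in detail:

```python
# Sketch of a custom export plugin, modeled loosely on nbgrader's
# export-plugin API. The ExportPlugin base below is a self-contained
# stand-in, not the real nbgrader.plugins.ExportPlugin.


class ExportPlugin:
    """Stand-in for nbgrader.plugins.ExportPlugin."""

    def export(self, gradebook):
        raise NotImplementedError


class LtiExportPlugin(ExportPlugin):
    """Hypothetical exporter that turns gradebook rows into per-student records."""

    def export(self, gradebook):
        # `gradebook` is assumed to yield (student_id, score) pairs;
        # the real nbgrader Gradebook object differs in detail.
        return [{"student": sid, "score": score} for sid, score in gradebook]


plugin = LtiExportPlugin()
print(plugin.export([("alice", 9.5), ("bob", 7.0)]))
```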
> What do I need to do to get the "help" test to pass?
I am not really sure without seeing the error, can you point me to it?
That makes me happier, I was wondering how the plugins should work. I just forked it assuming that I was going to need to write a new app before you wrote the `export` one! I've just [finally!] got access to a Canvas server today, so I'll be able to start doing some tests on real data soon.
Should have been more specific about the `help` test. I just thought that there was something specific I needed to add to the documentation. On closer inspection it's a KeyError, but I have no idea where I'm supposed to declare that, or why it passes for just `nbgrader export`.
```
__________________________________________________________________________________________ TestNbGraderExport.test_help ___________________________________________________________________________________________
self = <nbgrader.tests.apps.test_nbgrader_export.TestNbGraderExport object at 0x7f1da976deb8>

    def test_help(self):
        """Does the help display without error?"""
>       run_nbgrader(["export", "--help-all"])

/home/huw/jupyter/nbgrader-fork/nbgrader/tests/apps/test_nbgrader_export.py:60:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/home/huw/jupyter/nbgrader-fork/nbgrader/tests/__init__.py:150: in run_nbgrader
    app.initialize(args)
<decorator-gen-13>:2: in initialize
    ???
/home/huw/.local/lib/python3.5/site-packages/traitlets/config/application.py:74: in catch_config_error
    return method(app, *args, **kwargs)
/home/huw/jupyter/nbgrader-fork/nbgrader/apps/nbgraderapp.py:230: in initialize
    super(NbGraderApp, self).initialize(argv)
<decorator-gen-8>:2: in initialize
    ???
/home/huw/.local/lib/python3.5/site-packages/traitlets/config/application.py:74: in catch_config_error
    return method(app, *args, **kwargs)
/home/huw/jupyter/nbgrader-fork/nbgrader/apps/baseapp.py:397: in initialize
    super(NbGrader, self).initialize(argv)
<decorator-gen-6>:2: in initialize
    ???
/home/huw/.local/lib/python3.5/site-packages/traitlets/config/application.py:74: in catch_config_error
    return method(app, *args, **kwargs)
/home/huw/.local/lib/python3.5/site-packages/jupyter_core/application.py:239: in initialize
    self.parse_command_line(argv)
<decorator-gen-4>:2: in parse_command_line
    ???
/home/huw/.local/lib/python3.5/site-packages/traitlets/config/application.py:74: in catch_config_error
    return method(app, *args, **kwargs)
/home/huw/.local/lib/python3.5/site-packages/traitlets/config/application.py:488: in parse_command_line
    return self.initialize_subcommand(subc, subargv)
<decorator-gen-3>:2: in initialize_subcommand
    ???
/home/huw/.local/lib/python3.5/site-packages/traitlets/config/application.py:74: in catch_config_error
    return method(app, *args, **kwargs)
/home/huw/.local/lib/python3.5/site-packages/traitlets/config/application.py:426: in initialize_subcommand
    self.subapp.initialize(argv)
<decorator-gen-8>:2: in initialize
    ???
/home/huw/.local/lib/python3.5/site-packages/traitlets/config/application.py:74: in catch_config_error
    return method(app, *args, **kwargs)
/home/huw/jupyter/nbgrader-fork/nbgrader/apps/baseapp.py:397: in initialize
    super(NbGrader, self).initialize(argv)
<decorator-gen-6>:2: in initialize
    ???
/home/huw/.local/lib/python3.5/site-packages/traitlets/config/application.py:74: in catch_config_error
    return method(app, *args, **kwargs)
/home/huw/.local/lib/python3.5/site-packages/jupyter_core/application.py:239: in initialize
    self.parse_command_line(argv)
<decorator-gen-4>:2: in parse_command_line
    ???
/home/huw/.local/lib/python3.5/site-packages/traitlets/config/application.py:74: in catch_config_error
    return method(app, *args, **kwargs)
/home/huw/.local/lib/python3.5/site-packages/traitlets/config/application.py:500: in parse_command_line
    self.print_help('--help-all' in interpreted_argv)
/home/huw/.local/lib/python3.5/site-packages/traitlets/config/application.py:360: in print_help
    self.print_options()
/home/huw/.local/lib/python3.5/site-packages/traitlets/config/application.py:331: in print_options
    self.print_alias_help()
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = <nbgrader.apps.exportapp.ExportApp object at 0x7f1da977fef0>

    def print_alias_help(self):
        """Print the alias part of the help."""
        if not self.aliases:
            return

        lines = []
        classdict = {}
        for cls in self.classes:
            # include all parents (up to, but excluding Configurable) in available names
            for c in cls.mro()[:-3]:
                classdict[c.__name__] = c

        for alias, longname in iteritems(self.aliases):
            classname, traitname = longname.split('.',1)
>           cls = classdict[classname]
E           KeyError: 'LtiExportPlugin'

/home/huw/.local/lib/python3.5/site-packages/traitlets/config/application.py:295: KeyError
```
Ah, yes, this is because you've defined command line flags (e.g. `'secret': 'LtiExportPlugin.secret'`) for the `ExportApp`, but then when you do `nbgrader export` it doesn't know where to find `LtiExportPlugin` (long story why this is the case...). To fix it you'd need to add the `LtiExportPlugin` to the list of classes here: https://github.com/huwf/nbgrader/blob/master/nbgrader/apps/exportapp.py#L58
Though in general I'd recommend against having command line flags that only work for some exporters and not others.
Ah, I see what you mean. This might make it more difficult, what do you suggest as the best way around that?
I see you're defining the behaviour of the `LateSubmissionPlugin` in the config file, but if I've understood that properly, it looks like something which won't change much, whereas this would change every time! And if I just want to have the plugin rather than fork the whole thing, then I would not have access to those command line flags.
Some of these things, it seems, won't change every time (e.g. `key` and `secret`?). The others you can still pass on the command line; it's just more verbose, e.g.
`nbgrader export --exporter=LtiExportPlugin --LtiExportPlugin.assignment=ps1`
Ah, that's quite obvious actually, thanks. I'll just stick to that! :smile: I was just trying to figure out how to change the class and copied what you had done for the `CsvExportPlugin`! Just passing it in like this will make my life a lot simpler :) And yes, `key` and `secret` will be staying the same. My guess is that it is usually `replace` which will be used, so that can go in the config as well unless it's specified otherwise.
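One way to picture that split is a `nbgrader_config.py` fragment. The `LtiExportPlugin` trait names here follow this thread, not any released nbgrader API, so treat it as a hedged sketch:

```python
# nbgrader_config.py -- hypothetical traits; names follow this thread,
# not any released nbgrader API
c = get_config()

# Stable values live in the config file
c.LtiExportPlugin.key = "my-consumer-key"
c.LtiExportPlugin.secret = "my-consumer-secret"
c.LtiExportPlugin.replace = True

# Per-run values are passed on the command line, e.g.:
#   nbgrader export --exporter=LtiExportPlugin --LtiExportPlugin.assignment=ps1
```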
One value which I will have to be careful of is `lis_result_sourcedid`; I just came up against an issue with it in my deployment, actually. It points to a specific cell in the Canvas gradebook, but there's no guarantee that the student would choose to submit that particular assessment! I've hacked around it with `ln -s` for now, so they can only work on the single notebook that's in that one, but I'll probably have to come up with something more robust in the future.
Hi, would anyone be able to give an update on this? We're very interested in a JupyterHub integration with Canvas for our institution. Our ideal flow would be something like this for students:
Browsing around JupyterHub, nbgrader, and this conversation, it sounds like other people have done a lot in this space. Is the flow I described at all viable? I get how we can use LTI to do the authentication dance, but can we have students launching assignments from Canvas, and can we have this post grades back when they click Validate?
@yuvipanda is developing a large hub that doesn't use nbgrader, but does launch notebooks from Canvas using LTI. The class is about to launch soon so he may not have time at the moment to contribute to this thread.
Hi @acbart I did develop a version which launches from Canvas, but I was unable to find a way to properly post it back. The trouble is, if there is more than one assignment available at any one time, there is no way to tell what the student submitted.
If you had, say, a series of labs without persistent storage, then the flow you describe should work. I wrote some code for that, although I don't think I entirely finished it, and I wasn't able to integrate it properly as a plugin. I'll see if I can find it; it was for nbgrader 0.3 so it might not work: https://github.com/huwf/nbgrader-export-plugin I think the main problem I had was trying to get some unit tests to work, but since it was unnecessary for my use case given the constraints I mentioned, I never got the time to finish it (work project).
For simple LTI Authentication, I wrote an LTIAuthenticator which might serve your purpose https://github.com/huwf/ltiauthenticator but it needs some TLC because I believe it has some specific/hardcoded stuff for my purpose.
I've encountered a similar headache when developing LTI integration for our BlockPy tool. We solved it by storing the submission postback URL in our database. Wouldn't another solution be to store the Submission URL in the IPYNB notebook file that's created for the student?
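Since an `.ipynb` file is just JSON, a hedged sketch of that idea is below. The metadata field names follow the LTI parameters discussed in this thread, and a real implementation would likely use `nbformat` rather than raw `json`:

```python
import json


def stamp_postback_info(nb_json, outcome_url, sourcedid):
    """Record LTI grade-passback details in a notebook's metadata dict."""
    nb = json.loads(nb_json)
    nb.setdefault("metadata", {})["lti"] = {
        "lis_outcome_service_url": outcome_url,
        "lis_result_sourcedid": sourcedid,
    }
    return json.dumps(nb)


# Minimal notebook skeleton for demonstration
nb_json = json.dumps({"cells": [], "metadata": {}, "nbformat": 4, "nbformat_minor": 2})
stamped = stamp_postback_info(
    nb_json, "https://canvas.example.com/api/lti/outcomes", "abc123"
)
print(json.loads(stamped)["metadata"]["lti"]["lis_result_sourcedid"])  # abc123
```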
We also have a solution for a large(ish) `jupyterhub` instance, though we have a Django front-end which does the LTI integration (for us, Blackboard), and we run our notebooks in a swarm.
Question for @huwf & @acbart: how are you managing resources? We find that with any more than about 30 concurrent users on a VM, it becomes pretty unusable.
We saw two solutions:
The other interesting question is how you tied the Jupyterhub identity to the LTI identity.... is that code available on GitHub?
I ended up going with the first option, and yes, it is (a lot!) more expensive. We don't have enough users to make it worthwhile adopting the second solution in the end. There are spawners which manage autoscaling for you, such as https://github.com/jupyterhub/kubespawner. I found Kubernetes too complicated to figure out properly, so my plan was to make use of a swarm and write my own simple autoscaler, but I never quite had time.
For the JupyterHub identities, I added a database table which maps Canvas IDs to JupyterHub IDs. Because the Canvas IDs are too long for unix usernames, I called them `user-1, ..., user-n`. `get_user` and `add_user` deal with this (called after authentication has been confirmed): https://github.com/huwf/ltiauthenticator/blob/master/ltiauthenticator/lti_db.py#L141
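The mapping described above can be sketched as a simple lookup table. The real code lives in `lti_db.py`; this shows only the shape of the idea, not that implementation:

```python
class UserMap:
    """Maps long Canvas user IDs to short, sequential unix-safe names."""

    def __init__(self):
        self._by_canvas_id = {}

    def add_user(self, canvas_id):
        """Return the existing short name, or mint the next user-N."""
        if canvas_id not in self._by_canvas_id:
            self._by_canvas_id[canvas_id] = "user-%d" % (len(self._by_canvas_id) + 1)
        return self._by_canvas_id[canvas_id]

    def get_user(self, canvas_id):
        """Look up the short name, or None if unknown."""
        return self._by_canvas_id.get(canvas_id)


m = UserMap()
print(m.add_user("1234567890abcdef"))  # user-1
print(m.add_user("fedcba0987654321"))  # user-2
print(m.get_user("1234567890abcdef"))  # user-1
```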
@huwf - at one point, we had 150 simultaneous users... and given our standard `notebook` Docker image uses about 0.5GB of memory, real tin with 16GB of RAM doesn't last long!
Our hope [in the first instance] is to be able to support somewhere in the region of 300 simultaneous users (which is just 1% of the University's undergraduate population); our current problem is that our cloud stack complains with more than 4 [notebook] worker nodes.
I think I remember reading that this might be a plan at some point but I can't immediately find the link again. It is something I need to do for my current project, and I plan to contribute some code for it anyway, just wondered whether any work had been done already or whether there were other plans.
From what I can see, we would need to:
1. Store `lis_outcome_service_url` and `lis_result_sourcedid` from the initial LTI authentication
2. `POST` the XML (containing the `lis_result_sourcedid` value) back to the `lis_outcome_service_url`

The Coursera document seems to describe it quite well: https://building.coursera.org/app-platform/lti/
I have already made a bare-bones version of a custom LTI authenticator which uses `OAuthLib`, so I guess I would need to link that so that it saves into the nbgrader database. A final point is that Canvas has extensions to LTI, so that would be a natural subclass of a simple LTI one. I imagine calling it with `nbgrader post`, but I have no particular ties to that.
In my deployment, we're not planning to manually grade any assignments, so I'm planning to run a cron job every minute to search for new submissions, and then collect/autograde/feedback and post them. If I ever get it working I'll put the code up on GitHub :)
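A hedged sketch of step 2: building the LTI 1.1 Basic Outcomes `replaceResult` payload that would be OAuth-signed and `POST`ed to the `lis_outcome_service_url`. Element names follow the LTI 1.1 Basic Outcomes service as I understand it; verify against the spec before relying on them:

```python
# Sketch of the LTI 1.1 Basic Outcomes "replaceResult" XML body.
# Only stdlib escaping is used here; the OAuth signing and HTTP POST
# (e.g. via requests + oauthlib) are out of scope for this sketch.
from xml.sax.saxutils import escape


def replace_result_xml(sourcedid, score, message_id="1"):
    """Build the XML body reporting `score` (0.0-1.0) for one submission."""
    return (
        '<?xml version="1.0" encoding="UTF-8"?>'
        '<imsx_POXEnvelopeRequest xmlns="http://www.imsglobal.org/services/ltiv1p1/xsd/imsoms_v1p0">'
        "<imsx_POXHeader><imsx_POXRequestHeaderInfo>"
        "<imsx_version>V1.0</imsx_version>"
        "<imsx_messageIdentifier>%s</imsx_messageIdentifier>"
        "</imsx_POXRequestHeaderInfo></imsx_POXHeader>"
        "<imsx_POXBody><replaceResultRequest><resultRecord>"
        "<sourcedGUID><sourcedId>%s</sourcedId></sourcedGUID>"
        "<result><resultScore><language>en</language>"
        "<textString>%s</textString></resultScore></result>"
        "</resultRecord></replaceResultRequest></imsx_POXBody>"
        "</imsx_POXEnvelopeRequest>"
    ) % (escape(message_id), escape(sourcedid), escape("%.4f" % score))


body = replace_result_xml("abc123", 0.85)
print("<sourcedId>abc123</sourcedId>" in body)  # True
```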
I'm guessing this might be related to #436 ?