Closed: kdmccormick closed this issue 1 year ago
@regisb Here's another one ready for a look when you have a chance. This would be post-V1-Plugin-API if we did it.
I'm not sure how valuable these unit tests would be? But maybe I'm not the right audience? My feeling is that integration tests would be more valuable for the maintenance of plugins. If we want to make developers more confident in their plugins, maybe we can provide a `tutor plugins debug` command? There are different solutions, but we first need to figure out what the actual problem is.
@regisb Right, good point. The actual problem as I see it is:
Other than linting and type-checking, there isn't an obvious way to automatically validate that a PR against a Tutor plugin repo won't break the plugin in some way, whether that breakage be in:

- patches
- … instead of a dict)

On the other hand, perhaps plugins are generally small enough that a combination of static analysis and manual testing is sufficient to catch those sorts of bugs.
> My feeling is that integration tests would be more valuable for the maintenance of plugins.

@regisb are you thinking "integration of a plugin with Tutor" or "integration of multiple Tutor plugins together"? If the former, then I agree entirely: my suggestion in this issue is that we provide a standard way for plugin authors to ensure that their plugin works with Tutor's plugin API. If the latter is what you mean, then I do not disagree, but I had not yet thought of testing interactions between plugins.
Anecdotally I can say I think having testing like this would be useful when creating new plugins to make sure the config variables all get set to what I think they get set to. When I was dealing with the event bus plugin, for example, it took me several tries to figure out whether or not I was setting my new Studio environment variable correctly. Having to bring containers up and down a bunch to check this was pretty time consuming, so I can imagine unit tests saving some time there.
Wow, I totally missed your comment from six months ago, Kyle, sorry about that. I guess at the time I was head down in the conference prep.
I'd like to suggest two solutions to develop plugins more confidently:

1. `make test`
2. A `tutor plugins debug PLUGIN` command, which I loosely define here as follows:

```
$ tutor plugins debug forum
Init tasks:
    bundle exec rake se...
Images:
    Build:
        forum: overhangio/openedx-forum:14.0.2
    Pull:
        forum: overhangio/openedx-forum:14.0.2
    Push:
        forum: overhangio/openedx-forum:14.0.2
...
```
We should add extra information there to match the needs of developers. Do you think this would be sufficient?
(as a side note, the debug command above could be added via a plugin, as usual :stuck_out_tongue:)
Of course I'm not opposed to adding unit tests to plugins. In my experience they would not be very helpful, but if other developers need them I would totally understand it.
All good, @regisb.

Stepping back, I see three general categories of mistakes plugin devs could make:

- … `CLI_DO_INIT_TASKS`.

The `tutor plugins debug` idea would help a lot here. I also think it would help plugin developers form a mental model for what they are doing as they build out their plugin. I'll make an issue for it!

… running `yamllint` on the generated docker-compose files or running `pylint` on the generated settings files. So, where does that leave us? I am thinking that unit tests may not be the right approach; rather I'd like to:
- implement `tutor plugins debug`, and then,
- add a `tutor plugins debug validate` subcommand, which would:
  - run `yamllint env/local/*.yml`
  - run a set of validation scripts, registered via a filter (`PLUGINS_DEBUG_VALIDATORS`?) so that we're not hard-coding anything openedx-specific into Tutor.
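To make that concrete, here's a minimal sketch of how such a filter could aggregate plugin-registered checks. Everything here is hypothetical (the registry, the callback signature, the toy check): Tutor has no `PLUGINS_DEBUG_VALIDATORS` filter today, and a real version would presumably be built on `tutor.hooks.Filters`.

```python
# Hypothetical sketch of a PLUGINS_DEBUG_VALIDATORS-style registry.
# None of these names exist in Tutor today.
from typing import Callable, List

# A validator inspects the rendered env directory and returns a list of
# human-readable error messages (an empty list means the check passed).
Validator = Callable[[str], List[str]]

VALIDATORS: List[Validator] = []

def register_validator(func: Validator) -> Validator:
    """Plugins would call this to hook their checks into `validate`."""
    VALIDATORS.append(func)
    return func

@register_validator
def check_env_dir_given(env_dir: str) -> List[str]:
    # Toy check; a real validator might shell out to yamllint here.
    return [] if env_dir else ["no env directory provided"]

def run_validators(env_dir: str) -> List[str]:
    """Aggregate errors from every registered validator."""
    errors: List[str] = []
    for validator in VALIDATORS:
        errors.extend(validator(env_dir))
    return errors
```

The point of the filter indirection is that openedx-specific checks live in plugins, while Tutor core only knows how to collect and run them.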
That's a job for a `tutor dev do validate-config` job ;)
Good point.
I'm going to close this in favor of two follow-up tickets:
**Context**
As far as I can tell, Tutor doesn't make any guarantees about the stability of its Python API, which is probably for the best.
However, this makes it hard to write tests for plugins. For tutor-contrib-coursegraph, I wrote a shell script that installs the plugin from the CLI and inspects rendered environment files. Better than nothing, but it's a little convoluted and brittle, and it clobbers config in your Tutor environment when you run it.
What if Tutor exposed a minimal stable Python API for the express purpose of writing unit tests in plugins? For example, in `myplugin/tests/test_plugin.py`:
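A minimal sketch of what such a test might look like. All names here are assumptions: `render_config` stands in for whatever helper the proposed API would expose (stubbed out below so the sketch is self-contained), and `MYPLUGIN_HOST`/`MYPLUGIN_PORT` are made-up settings for illustration.

```python
import unittest

# Stand-in for a hypothetical stable testing helper; Tutor exposes no
# such API today. It merges plugin defaults with user overrides, the
# way Tutor resolves config at runtime.
def render_config(defaults: dict, overrides: dict) -> dict:
    config = dict(defaults)
    config.update(overrides)
    return config

class TestMyPlugin(unittest.TestCase):
    # Made-up plugin settings, for illustration only.
    defaults = {"MYPLUGIN_HOST": "myplugin", "MYPLUGIN_PORT": 8000}

    def test_defaults_are_applied(self):
        config = render_config(self.defaults, {})
        self.assertEqual(config["MYPLUGIN_HOST"], "myplugin")

    def test_user_override_wins(self):
        config = render_config(self.defaults, {"MYPLUGIN_PORT": 9000})
        self.assertEqual(config["MYPLUGIN_PORT"], 9000)
```

A plugin repo could then run these with `python -m unittest` in CI, with no containers involved.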
**Acceptance**

Note: we would also want to add documentation on how to test plugins using this new API.
TBD