
Dodona JUnit Judge

This is a judge for Dodona that evaluates Java exercises using the JUnit 4 testing framework.

Creating new exercises

For exercise descriptions, please check out this wiki page.

To write tests for this judge, please read the general instructions on describing exercises first. I'll assume you have worked with JUnit before.

Exercise configuration

Each exercise has a JSON configuration file, for example:

{
  "description": {
    "names": {
      "nl": "Dag Wereld",
      "en": "Hello World"
    }
  },
  "evaluation": {
    "filename": "World.java",
    "handler": "java"
  },
  "visibility": "hidden"
}

Please use descriptive names in description/names/...; these will be shown to the user, often without context (it's not very informative to have an exercise called "1" in your list of recent exercises if you're following three courses at a time).

evaluation/filename is the name of the file the students should submit. As this is a Java judge, that's the name of the public class they should submit, followed by .java. evaluation/handler indicates which judge the exercise should use.
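For the example configuration above, a minimal submission could look like the sketch below (the class name World follows from the filename in the config; which methods to implement depends on the exercise description):

// World.java – hypothetical student submission for the example config.
// The public class name must match the configured filename.
public class World {
    public String hello() {
        return "Hello World";
    }
}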

Other configuration values used by the Java judge are:

Where to put which code?

When running the judge, the code is compiled in the order given below. This means each step can use the classes defined in the steps above. During the compilation of these files, the jars in lib are available.

After compilation, the judge is executed. It assumes the presence of a TestSuite class containing a JUnit 4 test suite, which it will run. While the source file for TestSuite can be anywhere, I'd advise evaluation/TestSuite.java.
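As a sketch, such a TestSuite can simply bundle the individual test classes with JUnit 4's Suite runner (WorldTest is a hypothetical test class; list your own test classes instead):

import org.junit.runner.RunWith;
import org.junit.runners.Suite;

// evaluation/TestSuite.java – the entry point the judge looks for.
// List the exercise's test classes in @Suite.SuiteClasses.
@RunWith(Suite.class)
@Suite.SuiteClasses({ WorldTest.class })
public class TestSuite {
}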

Good practices

While the description above leaves a lot of room for interpretation, here is how I write a new exercise:

Test files

As mentioned, test files can be pure JUnit test files, using all features of JUnit 4 (I especially recommend Parameterized; a sketch follows below). Some extras are available, though:
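A Parameterized test in plain JUnit 4, independent of any judge-specific extras, might look roughly like this (the Fibonacci helper is only a stand-in for the code a real exercise would call from the student's submission):

import static org.junit.Assert.assertEquals;

import java.util.Arrays;
import java.util.Collection;

import org.junit.Test;
import org.junit.runner.RunWith;
import org.junit.runners.Parameterized;
import org.junit.runners.Parameterized.Parameters;

// Runs testFibonacci once for every row returned by data().
@RunWith(Parameterized.class)
public class FibonacciTest {

    @Parameters
    public static Collection<Object[]> data() {
        return Arrays.asList(new Object[][] {
            { 0, 0 }, { 1, 1 }, { 2, 1 }, { 3, 2 }, { 4, 3 }, { 5, 5 }
        });
    }

    private final int input;
    private final int expected;

    public FibonacciTest(int input, int expected) {
        this.input = input;
        this.expected = expected;
    }

    @Test
    public void testFibonacci() {
        assertEquals(expected, fibonacci(input));
    }

    // In a real exercise this would call code from the student's submission;
    // a local reference implementation keeps the example self-contained.
    private static int fibonacci(int n) {
        return n <= 1 ? n : fibonacci(n - 1) + fibonacci(n - 2);
    }
}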

Running Exercises

Creating/modifying an exercise, committing your changes, pushing them to your remote, and submitting the solution on Dodona makes for quite a slow development cycle. To run your tests locally, various methods are possible.

Using shellscripts

If you're familiar with the command line, this repository provides some shell scripts to test your exercises on a POSIX-compatible machine. With testing as the working directory, run ./test.sh path/to/the/exercise/. This path points to a directory containing a config.json exercise file. It will "submit" the solution in the solution subdirectory. Reading the JSON output might take some getting used to.
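Putting the pieces from the sections above together, such an exercise directory might look roughly like this (the layout is an assumption based on the directories mentioned in this document; the file names come from the earlier examples):

path/to/the/exercise/
├── config.json
├── evaluation/
│   └── TestSuite.java
├── solution/
│   └── World.java
└── workdir/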

To handle JSON in the shell, this script (and the judge itself) requires jq to be installed.

Using IntelliJ (or other IDEs)

Since the tests are mostly just JUnit tests, IntelliJ and other IDEs provide support for running them. The directory structure makes things slightly more complicated, though. This section describes one method to get things running; people more familiar with IntelliJ might know a better way.

Per exercise, create a new project. The config.json file should be in the project root. Mark (or create) the workdir, evaluation and solution directories as "Sources Root". Add JUnit 4 as a dependency of the project.

If your exercises use some Dodona-specific features, such as the TabTitle or the AssertionStubber, add the Judge as a dependency. Opening this repository as an IntelliJ project should allow you to create a JAR.