Closed. simonlindholm closed this 3 years ago.
I don't know about this. It is not at all clear to me that one wants to be told (even if only "softly", in the form of a warning) to explicitly set a value for `show_test_data_groups` as soon as there are more groups than just sample and secret.
You definitely do. The majority of problems with subgroups below secret want test data groups to be shown, but the default is to not show them (for backwards compatibility reasons). And since verifyproblem doesn't tell you about this misconfiguration, it only shows up during a live contest. (Twice for PO now.)
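For reference, a sketch of the configuration in question, as I understand the legacy Kattis problem package format (only the relevant keys shown; exact layout may differ):

```yaml
# problem.yaml (sketch; only the keys relevant to this discussion)
type: scoring
grading:
  # Defaults to "no" for backwards compatibility. For a scoring problem
  # with subgroups below secret you usually want this set explicitly.
  show_test_data_groups: yes
```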
> The majority of problems with subgroups below secret want test data groups to be shown
I think your assessment here suffers a bit from selection bias. I personally have a selection bias in the other direction: for most problems I come into contact with that have subgroups, the default is the right thing.
If you think this warning should be added, then why not also add a warning for other unspecified values, for example:
If you get these wrong, problemtools won't (and can't) tell you, and you'll probably only discover it too late.
(To clarify: I am not arguing that these things should give warnings; I am trying to understand what makes `show_test_data_groups` different from them.)
Yes, those sound useful to warn about as well (well, not `objective` from our POV, since the default fits us; it would be hard to use `minimize` in a contest). The reason we haven't been bitten by those is that we auto-generate `data/testdata.yaml`, but not `problem.yaml`.
> for most problems I come into contact with having subgroups the default is the right thing
Oh, I'm surprised you've actually come into contact with problems that use subgroups. Are they scoring problems? Do they use custom graders? Or do they just use subgroups for grouping test data?
> The reason we haven't been bitten by those is that we auto-generate `data/testdata.yaml`, but not `problem.yaml`.
I guess then you could simply auto-generate the `problem.yaml` files as well?
> Oh, I'm surprised you've actually come into contact with problems that use subgroups. Are they scoring problems? Do they use custom graders? Or do they just use subgroups for grouping test data?
Mostly the last option: standard pass-fail problems without custom graders, just using subgroups for grouping.
> I guess then you could simply auto-generate the `problem.yaml` files as well?
We could, but then we'd need to invent another `problem.yaml`-like source format.
> Mostly the last option, standard pass-fail problems without custom graders just using subgroups for grouping.
Would you be fine with warning only for scoring problems?
Updated the PR to only warn for scoring problems (arguably the broader warning was a bug in the PR; `show_test_data_groups` is only supported for scoring problems).
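For concreteness, a sketch of what such a check could look like. All names here are hypothetical and do not correspond to problemtools internals; it only illustrates the restriction to scoring problems agreed on above:

```python
def check_show_test_data_groups(config, groups_below_secret):
    """Return a warning string if a scoring problem has subgroups below
    secret but does not explicitly set show_test_data_groups.

    Hypothetical sketch; function and key names are illustrative only.
    `config` is the parsed problem.yaml as a dict.
    """
    if config.get('type') != 'scoring':
        # show_test_data_groups is only supported for scoring problems,
        # so pass-fail problems should not trigger the warning.
        return None
    grading = config.get('grading', {})
    if groups_below_secret > 0 and 'show_test_data_groups' not in grading:
        return ("grading.show_test_data_groups is not explicitly set, but "
                "the problem has test data groups below secret; the default "
                "(groups not shown) may not be what you want")
    return None
```

Restricting the check to `type: scoring` avoids warning on the common pass-fail case where subgroups are used purely for organizing test data.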
Can we get this merged? It's a real pain point for us.
I think this looks good. I did some testing.
I am going to try the "Update branch" button in GitHub (which I don't usually use) to bring this branch up to date. If that works, I'll merge it.
The Travis CI build passed; the status did not make it back to GitHub. Merging.
This has managed to bite us repeatedly in the short time `show_test_data_groups` has been a thing.