cos opened this issue 10 years ago
Have you tried the -v option?
If something goes wrong, I don't think people would be bothered by the verbosity.
@cos the problem is that ktest is typically run with hundreds of tests, and you often want to see quickly how many passed and how many haven't, one per line. So we tried to have minimal output by default. One thing we could do, though, is to store the -v output in a file and tell the user to consult it whenever there are errors, and remove it whenever there are no errors. If you think this is worth doing, then please add an issue.
Alternatively we can do something like this: by default ktest writes error messages to a log file (something like /path/testcase.log while executing /path/testcase.xml) and only prints to screen (stdout) when -v is provided. So when some tests fail you can refer to the log file to see the details without running ktest one more time with -v.
Does that sound good?
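The proposed behavior could be sketched roughly like this (a hypothetical Python sketch, not ktest's actual implementation; `report_failure` and its arguments are made-up names for illustration):

```python
import sys

def report_failure(test_name, details, verbose=False):
    # Hypothetical helper: always write the full failure details to a
    # per-test log file, and keep the console output to one short line.
    log_path = test_name + ".log"
    with open(log_path, "w") as log:
        log.write(details)
    print("FAIL %s (details in %s)" % (test_name, log_path))
    # Echo the full details to stdout only when -v was given.
    if verbose:
        sys.stdout.write(details)

report_failure("testcase", "krun exited with status 1\n<full output here>\n",
               verbose="-v" in sys.argv)
```

The point of the design is that the default one-line-per-test output stays exactly as it is today; only the log file is new.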
Considering how long it takes to run ktest, saving all the information somewhere is imo a must. I feel awful after I run ktest for 30 minutes and see that I forgot to add the -v option.
OK, so do you like the idea of writing to a log file by default and to stdout when -v is provided? @grosu @dwightguth @daejunpark and others, what do you think? We can implement it easily.
@grosu, you can find out how many tests failed/passed by simply looking at the statistics table displayed at the very end -- no scrolling needed. If nothing goes wrong, almost nothing is displayed. If something does go wrong, I don't think anybody minds the extra verbosity. For what it's worth, this is the way other testing frameworks work, e.g., junit.
By the way, is there a particular reason ktest is not built over junit?
No reason; we just didn't consider it (because nobody suggested it) while designing it.
guys, ktest is an incredibly complex tool. And that complexity is not artificial or an artifact of a bad design. It is absolutely needed. And it looks like we may even need more cases to add to it. I doubt that junit provides all that functionality neatly, because junit was designed for something else.
It is OK with me to save the log and tell the user where it is.
Cosmin is right, I would naturally expect ktest to print more info about errors by default. As it is, what is output by default if there's an error is not useful at all.
I doubt that junit provides all that functionality neatly, because junit was designed for something else.
I think what @cos had in mind was this: we can run arbitrary computations and report a success/failure to junit. junit would then present the results in its well-known UI. So junit would be used only as a front-end/GUI for ktest.
I'm not sure if it's worth the effort but it certainly has benefits. For example, most Java IDEs have junit integration so we could see results in our IDEs.
I see. But we do have junit integration to some extent, I think, because we needed it for integration with Jenkins. Just run ktest with the option --report. We want ktest to have a very simple default interface while at the same time providing the functionality that experts need.
1) Please try the option --report and see whether that gives you what you want/need in terms of junit.
2) I've been using ktest a lot with hundreds of different definitions and exercises in cs422 and the tutorial. There was a time when the entire error output was printed and I had a very hard time finding which tests failed and where one failure ended and another started. Life became a lot simpler after printing one error per line with a short comment. So I would say to keep each error on one line with a very brief message as the default, but I agree that we should give users a quick way to check the entire error.
3) @osa1 Why /path/testcase.log and not /path/ktest.log? If we do this then we should display a short message when ktest starts saying where the log is, something like:
Check /path/ktest.log for detailed output
The --report option actually generates JUnit XML output (which is understood by Jenkins). It would be nice, though, to have other visualization tools that can display something based on this XML output.
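For readers unfamiliar with the format: the JUnit-style XML that tools like Jenkins consume has roughly the shape below. This is a hedged sketch built with Python's standard library; the suite and test names are invented for illustration and are not ktest's actual output:

```python
import xml.etree.ElementTree as ET

# One suite with one passing and one failing testcase, in JUnit's XML shape.
suite = ET.Element("testsuite", name="ktest", tests="2", failures="1")
ET.SubElement(suite, "testcase", classname="tutorial", name="lc_substitutions")
failed = ET.SubElement(suite, "testcase", classname="tutorial", name="imp")
failure = ET.SubElement(failed, "failure", message="krun exited with status 1")
failure.text = "full error output goes here"

xml_report = ET.tostring(suite, encoding="unicode")
print(xml_report)
```

Any tool that understands this schema (Jenkins, most Java IDEs) can then render the results without ktest knowing anything about the viewer.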
@grosu I had something like this in mind: if we start giving different names to the XML files for tests (or somehow have a field in the XML file that gives the test a name), we can use that name as the name of the log file. For example, for lc_substitutions.xml we can generate lc_substitutions.log. If you don't like this idea, of course, we can always generate ktest.log.
I'm going to implement this feature over the weekend, but @cos and @radumereuta, does that work for you? Basically we'll generate a ktest.log file that shows all error messages in detail and print error messages to console/stdout only when -v is provided (as usual). (So ktest output will work exactly as before, but an extra file will be generated.)
Now with a fresh mind, I actually think that we should not change the functionality of ktest. This all started pretty much because @cos didn't know about or didn't try the -v and --report options. I would really not like ktest to start infesting my folders with files which I would then immediately need to delete, or put into .gitignore, etc. This problem would all go away if we print a short message at the beginning of the execution of ktest when called without the -v or --report options, of the form:
Run ktest with option -v or --report for detailed output.
Then even experienced users, like @radumereuta, would see it, stop it, and rerun it with the -v or --report option if that's what they wanted to see.
Guys, we have major problems to fix in K, even in ktest (for example #518). Let us please focus on those.
@grosu I'm just pointing out things that may bother newcomers. I can only give feedback as a newcomer once :) Shall we close the issue then?
I still think something positive came out of this: 1) Let's have ktest output something like "Run ktest with option -v or --report for detailed output." as its first output; and 2) Let's add these ideas to the manual-to-be. Can I assign 1 to @osa1 and 2 to @daejunpark? Thanks.
To me, this just seems like a missed opportunity. If you are able to store more information about specific errors, why not? I find out that I forgot to add the -v option after 15 minutes, when krun starts failing.
If this is too hard to implement, then I can live without it, but I don't feel OK about missing out on an improvement that may save me a lot of time.
But Radu, that's the point, that it does not save you time. For a small advantage in the 1% of cases when you forget to use -v in spite of ktest telling you to run it with -v or --report (after we make the change proposed above), you will then always have to remove the ktest.log files that would otherwise infest all your folders and language definitions. Not to mention that some people will add them to git by mistake, etc. I could only approve such a thing if that log file is stored in a temporary folder, like /tmp/ktest.log, where one never sees it unless one really wants to. If we do that, then ktest should report upfront where that file is, with more details. But that should work well on all platforms; that is, one should put it in whatever folder Windows uses for temporary files, ktest should not break if one does not have write access to that folder, etc.
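The cross-platform temporary-folder requirement can be sketched like this (a hypothetical Python sketch, assuming the platform's standard temp-directory lookup is acceptable; `open_log` is a made-up name, not a ktest function):

```python
import os
import tempfile

def open_log(name="ktest.log"):
    # tempfile.gettempdir() resolves to /tmp on Unix and the user's temp
    # folder on Windows, so the log never lands in the working directory.
    path = os.path.join(tempfile.gettempdir(), name)
    try:
        return path, open(path, "w")
    except OSError:
        # No write access: skip logging rather than breaking the run.
        return None, None

path, log = open_log()
if log:
    # Announce the location upfront, as proposed above.
    print("Check %s for detailed output" % path)
    log.write("detailed error output...\n")
    log.close()
```

The two constraints from the comment above are both covered: the file stays out of the user's folders, and a missing or unwritable temp directory degrades gracefully instead of crashing ktest.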
I created a wiki page of ktest design: https://github.com/kframework/k/wiki/KTest-Design
Please feel free to update the page whenever you have a new idea.
How about we always display the error to standard error, but truncate it to a couple of lines when -v is not on? This should give at least an idea of the problem while not crowding the output.
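Truncation could be as simple as the following sketch (hypothetical Python, not ktest code; the three-line limit and the hint text are arbitrary choices for illustration):

```python
def truncate_error(message, max_lines=3):
    # Show only the first few lines of an error unless -v is on.
    lines = message.splitlines()
    if len(lines) <= max_lines:
        return message
    return "\n".join(lines[:max_lines]) + \
        "\n... (%d more lines; rerun with -v)" % (len(lines) - max_lines)

err = "\n".join("error line %d" % i for i in range(10))
print(truncate_error(err))
```

The trailing hint tells the user both that output was cut and how to see the rest, which addresses the "forgot -v" complaint earlier in the thread without flooding the screen.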
ktest doesn't display the error when kompile fails. It would be nice to show
[Error] Critical: Creating multiple kompiled definition in the same directory
somewhere after the rather uninformative failed with error (time: 350 ms).