utPLSQL / utPLSQL

Testing Framework for PL/SQL
https://www.utplsql.org/
Apache License 2.0

Possible introduction of performance criteria to tests. #698

Open lwasylow opened 6 years ago

lwasylow commented 6 years ago

We could introduce a new annotation that defines a performance criterion on the execution time of a test, e.g. --%timelimit(5s). This would fail the test once the time limit is exceeded, regardless of the assertion results. We should probably allow at least two time units: s for seconds and ms for milliseconds.

The solutions discussed so far are:

1) Use Resource Manager. This has the drawback that higher privileges need to be granted to the framework owner (https://blog.pythian.com/oracle-limiting-query-runtime-without-killing-the-session/). It could be offered as an optional feature at installation time, so that users who cannot install with the extra grants are not blocked. See the sketch after this list.
2) Run the executables as PL/SQL tasks: https://oracle-base.com/articles/11g/dbms_parallel_execute_11gR2
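
For reference, the Resource Manager approach from option 1 amounts to a plan directive that switches long-running calls to the built-in CANCEL_SQL group. A minimal sketch, assuming the framework owner has the ADMINISTER_RESOURCE_MANAGER privilege (plan name and the 5-second limit are illustrative):

```sql
begin
  dbms_resource_manager.create_pending_area;

  dbms_resource_manager.create_plan(
    plan    => 'LIMIT_TEST_RUNTIME',
    comment => 'Cancel long-running test calls');

  -- Switch any session in the default consumer group to CANCEL_SQL once a
  -- call has been running for 5 seconds; CANCEL_SQL is a built-in
  -- pseudo-group (11g+) that cancels the current call.
  dbms_resource_manager.create_plan_directive(
    plan             => 'LIMIT_TEST_RUNTIME',
    group_or_subplan => 'OTHER_GROUPS',
    comment          => 'Cancel calls running longer than 5 seconds',
    switch_group     => 'CANCEL_SQL',
    switch_time      => 5,
    switch_estimate  => false);

  dbms_resource_manager.validate_pending_area;
  dbms_resource_manager.submit_pending_area;
end;
/

-- The plan still has to be activated, which needs ALTER SYSTEM:
-- alter system set resource_manager_plan = 'LIMIT_TEST_RUNTIME';
```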

Shoelace commented 6 years ago

If you don't require the test to stop after the specified time has elapsed, then this could be included as an additional check in "after_test".

We'd probably want an option to mark it as either a failure or a warning.

jgebal commented 6 years ago

I am afraid we will not be able to run tests in separate threads, as we would need to rework how we gather coverage. Each thread would need to gather coverage data separately, and we would need to combine the coverage from each thread into a common report. What would/should happen with the coverage from a thread that has timed out?

Lots of challenges. I'm not sure the complexity, maintenance cost, and effort balance the value of the feature.

Edit: We have had separate coverage reporting in threads since 3.1.11.

lwasylow commented 6 years ago

Probably not worth spending too much effort on this one. Ultimately you can see the timing of the tests anyway, so you can work out any issues from that. Closing for now.

pesse commented 5 years ago

It's clear that we won't be able to abort tests after a given timeout. But what do you think about the following:

New annotation: --%fail_on_overtime(TimeInMilliSeconds). The test will not abort when the time is reached, but it will produce a failure when the execution time of the test exceeds the given threshold. It would be very similar to the --%throws annotation, which makes it far easier to test for expected exceptions; this new one would make it far easier to check for max-runtime situations (the need is definitely there, I can speak from personal experience :D)
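
To make the proposal concrete, a suite spec using it could look like this. A purely hypothetical sketch — the annotation is not implemented, and the package and procedure names are made up:

```sql
create or replace package test_report_generation as
  --%suite(Report generation)

  --%test(Monthly report builds in under 3 seconds)
  --%fail_on_overtime(3000)
  -- Hypothetical: the test runs to completion, but is reported as
  -- failed if it took longer than 3000 ms.
  procedure monthly_report_is_fast;
end test_report_generation;
/
```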

lwasylow commented 5 years ago

That sounds like a good compromise and gives us a nice feature, but maybe we should throw a warning instead if the test passes functionally but does not pass the performance metric.

pesse commented 5 years ago

By specifying --%fail_on_overtime I would make a specific performance metric part of the condition for passing the test. I don't see the value in treating performance checks differently from other checks. If I expect a certain test to never take longer than 3 seconds, a warning would be too little for me. If I want more buffer, I'd increase the threshold.
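
Until such an annotation exists, the same pass/fail semantics can be approximated inside the test body with the existing expectation API. A minimal sketch, with the hypothetical batch_pkg.load_data standing in for the code under test:

```sql
create or replace package test_batch_load as
  --%suite(Batch load timing)

  --%test(Batch load completes within 3 seconds)
  procedure load_is_fast_enough;
end test_batch_load;
/

create or replace package body test_batch_load as
  procedure load_is_fast_enough is
    l_start_cs number;  -- dbms_utility.get_time counts in centiseconds
  begin
    l_start_cs := dbms_utility.get_time;

    batch_pkg.load_data;  -- hypothetical procedure under test

    -- Fail the test when the call exceeded 3 seconds (300 centiseconds).
    ut.expect(dbms_utility.get_time - l_start_cs).to_be_less_or_equal(300);
  end load_is_fast_enough;
end test_batch_load;
/
```

The obvious downside is the one discussed above: the test still runs to completion, and the expectation only turns the measured time into a regular failure.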

jgebal commented 5 years ago

I would call it --%timeout and allow for two time formats without decimal places.

I agree with Sam that the behavior should be obvious and mark the test as failed.

We could even introduce a timeout on other elements like before/after/context/suite, but for those it would be harder to establish the correct behavior on timeout.

lwasylow commented 5 years ago

I think timeout implies too much about the behaviour. I like Sam's idea, or something like expected time threshold, performance limit, etc.

pesse commented 5 years ago

While I like --%timeout for its shortness, it will make additional education necessary, because the annotation will not cause a timeout (in the sense of aborting the test) after the given time.

I'd also prefer the numeric format (again: less education needed). I see the possibility of introducing this annotation for other elements too, but would suggest starting with it as a test-only annotation (there will be some additional questions around before/beforeall scenarios).

lwasylow commented 5 years ago

Maybe time_tolerance?

PhilippSalvisberg commented 5 years ago

I agree with @pesse. --%timeout does not make it clear that the test run completes and then fails. --%fail_on_overtime is longer, but the behaviour of the annotation is clearer. time_tolerance sounds like an addition to --%timeout to make it more robust (I would not go that way: complicated, almost no value, and the time limit should already include any tolerance IMHO).

JUnit also uses the term timeout, but the behaviour is slightly different: JUnit aborts the test execution once the timeout is reached and reports the test as failed, which makes timeout a very fitting term there. Personally I'd like to have the very same functionality within utPLSQL. I know it is difficult to let a test fail in a session while keeping all the database session context intact, but nonetheless that is the feature in its final state. So the question is: do we think that we will, at some point in the future, be able to abort a running test while keeping the database session for subsequent tests? If the answer is yes, then I suggest using the term timeout. Otherwise I'd use something like --%report_completed_test_as_failed_after(10s). This should make it clear that such a test may run for 30 minutes and only then be reported as failed.

I like the idea of different time formats (1s, 1000ms, 1.000) as @jgebal suggested.
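
For illustration, the three suggested formats could be normalized to seconds by a small parser. parse_time_limit is a made-up helper name, not part of utPLSQL:

```sql
create or replace function parse_time_limit(p_value in varchar2) return number is
begin
  if regexp_like(p_value, '^\d+ms$') then
    return to_number(rtrim(p_value, 'ms')) / 1000;  -- '1000ms' -> 1
  elsif regexp_like(p_value, '^\d+s$') then
    return to_number(rtrim(p_value, 's'));          -- '1s' -> 1
  else
    -- bare number taken as seconds, e.g. '1.000' -> 1
    -- (assumes '.' as the session's decimal separator)
    return to_number(p_value);
  end if;
end parse_time_limit;
/
```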

SebGt commented 3 years ago

Hi,

Any chance that this very good idea will be added in an upcoming release? If it helps, what do you think about --%max_time(0.15s)?

Thanks

ExtendHomoSapien commented 3 years ago

Also hoping for this to be prioritized. Thanks @SebGt for bumping the thread.