apache / lucene

Apache Lucene open-source search software
https://lucene.apache.org/
Apache License 2.0

Replace ant macros for running concurrent tests with ant-junit4. [LUCENE-3785] #4858

Closed. asfimport closed this issue 12 years ago.

asfimport commented 12 years ago

ant-junit4 is an Ant task for running tests in parallel (on slave JVMs). Its advantages over the current macros:

More documentation and info will follow as I roll out a patch.

NOTE. The code for this issue is being developed at: https://github.com/dweiss/lucene_solr/tree/LUCENE-3785


Migrated from LUCENE-3785 by Dawid Weiss (@dweiss), resolved Mar 02 2012
Parent: #4857
Attachments: failure-cases.patch, junit4-1.txt, junit4-2.txt, line-per-test.txt, LUCENE-3785.patch, lucene-tests.txt, trunk1.txt, trunk2.txt
Linked issues:

asfimport commented 12 years ago

Dawid Weiss (@dweiss) (migrated from JIRA)

This patch is not intended for inclusion; it provides all kinds of test failure points for testing/debugging.

asfimport commented 12 years ago

Robert Muir (@rmuir) (migrated from JIRA)

jvm-crash resilience (I wrote tests that actually crash slave JVMs to make sure such an event is reported appropriately),

It's really annoying with our current setup that you have no clue which test the JVM crashed in... you just see (crashed).

asfimport commented 12 years ago

Dawid Weiss (@dweiss) (migrated from JIRA)

An example improvement over trunk. I've designed a "super-failing" set of classes that fail at various points (initialization, rules, hooks). The output from the current trunk is interleaved and hard to read (compare trunk1.txt and trunk2.txt – they're executions with the same seed; it's hard to tell which line of output is associated with which test). The output from junit4 is also reordered (because parallel test execution is effectively a race condition...), but the output passed to listeners (in this case the console output) is always fully consistent.

Compare junit4-1.txt and junit4-2.txt. Every suite block is atomic, like this one:

   [junit4] Running org.apache.lucene.util.junitcompat.failures.TestFailOn07After
   [junit4] ERROR   0.09s S3 | TestFailOn07After.testMethod
   [junit4]    > Throwable #1: java.lang.RuntimeException: Failure on `@After`.
   [junit4]    >    at org.apache.lucene.util.junitcompat.failures.TestFailOn07After.after(TestFailOn07After.java:13)
   [junit4]    >    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
   [junit4]    >    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
   [junit4]    >    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
   [junit4]    >    at java.lang.reflect.Method.invoke(Method.java:601)
   [junit4]    >    at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:45)
   [junit4]    >    at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:15)
   [junit4]    >    at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:42)
   [junit4]    >    at org.junit.internal.runners.statements.RunAfters.evaluate(RunAfters.java:36)
   [junit4]    >    at org.apache.lucene.util.LuceneTestCase$SubclassSetupTeardownRule$1.evaluate(LuceneTestCase.java:700)
   [junit4]    >    at org.apache.lucene.util.LuceneTestCase$InternalSetupTeardownRule$1.evaluate(LuceneTestCase.java:599)
   [junit4]    >    at org.apache.lucene.util.LuceneTestCase$TestResultInterceptorRule$1.evaluate(LuceneTestCase.java:504)
   [junit4]    >    at org.apache.lucene.util.LuceneTestCase$RememberThreadRule$1.evaluate(LuceneTestCase.java:562)
   [junit4]    >    at org.junit.rules.RunRules.evaluate(RunRules.java:18)
   [junit4]    >    at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:263)
   [junit4]    >    at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:68)
   [junit4]    >    at org.apache.lucene.util.LuceneTestCaseRunner.runChild(LuceneTestCaseRunner.java:165)
   [junit4]    >    at org.apache.lucene.util.LuceneTestCaseRunner.runChild(LuceneTestCaseRunner.java:57)
   [junit4]    >    at org.junit.runners.ParentRunner$3.run(ParentRunner.java:231)
   [junit4]    >    at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:60)
   [junit4]    >    at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:229)
   [junit4]    >    at org.junit.runners.ParentRunner.access$000(ParentRunner.java:50)
   [junit4]    >    at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:222)
   [junit4]    >    at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:28)
   [junit4]    >    at org.junit.internal.runners.statements.RunAfters.evaluate(RunAfters.java:30)
   [junit4]    >    at org.junit.runners.ParentRunner.run(ParentRunner.java:300)
   [junit4]    >    at org.junit.runner.JUnitCore.run(JUnitCore.java:157)
   [junit4]    >    at com.carrotsearch.ant.tasks.junit4.slave.SlaveMain.execute(SlaveMain.java:124)
   [junit4]    >    at com.carrotsearch.ant.tasks.junit4.slave.SlaveMain.main(SlaveMain.java:186)
   [junit4]    >    at com.carrotsearch.ant.tasks.junit4.slave.SlaveMainSafe.main(SlaveMainSafe.java:12)
   [junit4]    > 
   [junit4]   2> NOTE: reproduce with: ant test -Dtestcase=TestFailOn07After -Dtestmethod=testMethod -Dtests.seed=-737472ec6455983d:77a4c0da1ce834dd:1a2ea2ed4eba1655 -Dargs="-Dfile.encoding=Cp1252"
   [junit4]   2>
   [junit4] Tests run:   1, Failures:   0, Errors:   1, Skipped:   0, Time:  0.25s <<< FAILURES!

A block like this is composed of sections. It begins with the suite name, followed by tests:

   [junit4] Running org.apache.lucene.util.junitcompat.failures.TestFailOn07After

If anything fails, it is reported as a consistent block within a suite. For example, here an @After hook failed in testMethod:

   [junit4] ERROR   0.09s S3 | TestFailOn07After.testMethod
   [junit4]    > Throwable #1: java.lang.RuntimeException: Failure on `@After`.
   [junit4]    >    at org.apache.lucene.util.junitcompat.failures.TestFailOn07After.after(TestFailOn07After.java:13)
   [junit4]    >    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
...
   [junit4]    >    at com.carrotsearch.ant.tasks.junit4.slave.SlaveMainSafe.main(SlaveMainSafe.java:12)
   [junit4]    > 
   [junit4]   2> NOTE: reproduce with: ant test -Dtestcase=TestFailOn07After -Dtestmethod=testMethod -Dtests.seed=-737472ec6455983d:77a4c0da1ce834dd:1a2ea2ed4eba1655 -Dargs="-Dfile.encoding=Cp1252"
   [junit4]   2>

An output section like this also follows a pattern. The first line contains the test case's status, execution time, slave JVM identifier (S3), and a short cause description:

   [junit4] ERROR   0.09s S3 | TestFailOn07After.testMethod

Then follows the (">"-indented) throwable cause and stack trace:

   [junit4]    > Throwable #1: java.lang.RuntimeException: Failure on `@After`.
   [junit4]    >    at org.apache.lucene.util.junitcompat.failures.TestFailOn07After.after(TestFailOn07After.java:13)
...

Finally, the standard output and standard error streams come at the end. They are also indented and prefixed with "1>" (stdout) or "2>" (stderr). stdout and stderr writes appear in the order they actually happened on the slave (they're synchronized). In this case, only stderr was used:

   [junit4]   2> NOTE: reproduce with: ant test -Dtestcase=TestFailOn07After -Dtestmethod=testMethod -Dtests.seed=-737472ec6455983d:77a4c0da1ce834dd:1a2ea2ed4eba1655 -Dargs="-Dfile.encoding=Cp1252"
   [junit4]   2>

Note that even when LuceneTestCase fails to provide a reproduce string (because the exception happened outside of @Rule boundaries), the exception cause is still nicely formatted and provided in the output. As in this suite class, which fails to even load properly:

   [junit4] Running org.apache.lucene.util.junitcompat.failures.TestFailOn01ClassLoad
   [junit4] ERROR   0.00s S0 | TestFailOn01ClassLoad (suite)
   [junit4]    > Throwable #1: java.lang.ExceptionInInitializerError
   [junit4]    >    at java.lang.Class.forName0(Native Method)
   [junit4]    >    at java.lang.Class.forName(Class.java:186)
   [junit4]    >    at com.carrotsearch.ant.tasks.junit4.slave.SlaveMain.instantiate(SlaveMain.java:138)
   [junit4]    >    at com.carrotsearch.ant.tasks.junit4.slave.SlaveMain.execute(SlaveMain.java:116)
   [junit4]    >    at com.carrotsearch.ant.tasks.junit4.slave.SlaveMain.main(SlaveMain.java:186)
   [junit4]    >    at com.carrotsearch.ant.tasks.junit4.slave.SlaveMainSafe.main(SlaveMainSafe.java:12)
   [junit4]    > Caused by: java.lang.ArithmeticException: / by zero
   [junit4]    >    at org.apache.lucene.util.junitcompat.failures.TestFailOn01ClassLoad.<clinit>(TestFailOn01ClassLoad.java:11)
   [junit4]    >    ... 6 more
   [junit4]    >
   [junit4] Tests run:   0, Failures:   0, Errors:   1, Skipped:   0, Time:  0.00s <<< FAILURES!

The summary line contains "Tests run: 0", which may seem incorrect, but in fact no test even began – the error comes from the suite loader (the status line contains no method name, just the suite name).
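
The source of TestFailOn01ClassLoad isn't shown in this issue, but a minimal sketch consistent with the trace above (a static initializer that divides by zero) would look roughly like the following:

    package org.apache.lucene.util.junitcompat.failures;

    import org.junit.Test;

    // Hypothetical reconstruction, not the actual test source: a suite whose
    // static initializer throws, so the class never finishes loading and the
    // slave reports a suite-level ERROR with "Tests run: 0".
    public class TestFailOn01ClassLoad {

      static {
        int zero = 0;
        int boom = 1 / zero; // ArithmeticException: / by zero, wrapped in ExceptionInInitializerError
      }

      @Test
      public void testMethod() {
        // never reached; class initialization fails first
      }
    }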

asfimport commented 12 years ago

Dawid Weiss (@dweiss) (migrated from JIRA)

This is the output for all Lucene tests (junit4).

asfimport commented 12 years ago

Dawid Weiss (@dweiss) (migrated from JIRA)

Output for the full Lucene test run under junit4. Note a few interesting corner cases:

1) A failed assumption (status: IGNOR/A; ignored by assumption). It is reported twice – once by junit4, then on stderr by LuceneTestCase.

   [junit4] Running org.apache.lucene.index.TestIndexWriterReader
   [junit4] IGNOR/A 0.22s S1 | TestIndexWriterReader.testNoTermsIndex
   [junit4]    > Throwable #1: org.junit.internal.AssumptionViolatedException: got: <org.apache.lucene.util.InternalAssumptionViolatedException: failed assumption: PreFlex codec does not support ReaderTermsIndexDivisor!>, expected: null
   [junit4]   2> NOTE: Assume failed in 'testNoTermsIndex(org.apache.lucene.index.TestIndexWriterReader)' (ignored): got: <org.apache.lucene.util.InternalAssumptionViolatedException: failed assumption: PreFlex codec does not support ReaderTermsIndexDivisor!>, expected: null
   [junit4]   2>
   [junit4] Tests run:  20, Failures:   0, Errors:   0, Skipped:   1, Time: 17.25s

2) Ignored tests, all reported under their suite:

   [junit4] Running org.apache.lucene.util.junitcompat.TestReproduceMessage
   [junit4] IGNORED 0.00s S0 | TestReproduceMessage.testAssumeAfterClass
   [junit4] IGNORED 0.00s S0 | TestReproduceMessage.testAssumeInitializer
   [junit4] IGNORED 0.00s S0 | TestReproduceMessage.testAssumeBeforeClass
   [junit4] IGNORED 0.00s S0 | TestReproduceMessage.testFailureBeforeClass
   [junit4] IGNORED 0.00s S0 | TestReproduceMessage.testFailureInitializer
   [junit4] IGNORED 0.00s S0 | TestReproduceMessage.testFailureAfterClass
   [junit4] IGNORED 0.00s S0 | TestReproduceMessage.testErrorBeforeClass
   [junit4] IGNORED 0.00s S0 | TestReproduceMessage.testErrorInitializer
   [junit4] IGNORED 0.00s S0 | TestReproduceMessage.testErrorAfterClass
   [junit4] Tests run:  21, Failures:   0, Errors:   0, Skipped:   9, Time:  0.14s

3) I've added past test statistics for load balancing; they seem to do a fairly good job (a rough sketch of this kind of balancing follows after the numbers below):

   [junit4] Expected execution time on slave 0:    89.37s
   [junit4] Expected execution time on slave 1:    89.37s
   [junit4] Expected execution time on slave 2:    89.37s
   [junit4] Expected execution time on slave 3:    89.37s

and the result was:

   [junit4] Slave 0:     0.47 ..    81.83 =    81.36s
   [junit4] Slave 1:     0.47 ..    84.37 =    83.89s
   [junit4] Slave 2:     0.54 ..    81.92 =    81.38s
   [junit4] Slave 3:     0.47 ..    84.75 =    84.28s
   [junit4] Execution time total: 1 minute 24 seconds
   [junit4] Tests summary: 284 suites, 1587 tests, 25 ignored (12 assumptions)
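
The exact scheduling junit4 uses isn't spelled out in this issue; as an illustration of balancing by past statistics, here is a minimal greedy sketch (longest suite first, always onto the currently least-loaded slave). SuiteCost, LoadBalancerSketch and assign(...) are made-up names for the example, not junit4 API.

    import java.util.ArrayList;
    import java.util.List;
    import java.util.Locale;

    // Illustration only (not junit4's actual scheduler): assign suites to slave
    // JVMs greedily, longest expected runtime first, always onto the currently
    // least-loaded slave, so the expected per-slave totals end up roughly equal.
    final class SuiteCost {
      final String suite;
      final double seconds; // expected runtime taken from past statistics
      SuiteCost(String suite, double seconds) {
        this.suite = suite;
        this.seconds = seconds;
      }
    }

    final class LoadBalancerSketch {
      static List<List<SuiteCost>> assign(List<SuiteCost> suites, int slaves) {
        List<List<SuiteCost>> buckets = new ArrayList<>();
        double[] load = new double[slaves];
        for (int i = 0; i < slaves; i++) {
          buckets.add(new ArrayList<SuiteCost>());
        }
        // Longest suites first, each placed on the least-loaded slave so far.
        suites.sort((a, b) -> Double.compare(b.seconds, a.seconds));
        for (SuiteCost s : suites) {
          int min = 0;
          for (int i = 1; i < slaves; i++) {
            if (load[i] < load[min]) min = i;
          }
          buckets.get(min).add(s);
          load[min] += s.seconds;
        }
        for (int i = 0; i < slaves; i++) {
          System.out.printf(Locale.ROOT, "Expected execution time on slave %d: %8.2fs%n", i, load[i]);
        }
        return buckets;
      }
    }

With a scheme like this the expected per-slave totals come out nearly equal (like the 89.37s above), while the actual times still drift a little because suites are indivisible and their real runtimes deviate from the recorded averages.
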
asfimport commented 12 years ago

Dawid Weiss (@dweiss) (migrated from JIRA)

An alternative output from Lucene tests; I like it, but it may be too orthodox for some who are used to Maven-like output. Feel free to tweak it if you like; the console formatter is configured in common-build.xml. For example, test-per-line formatting with no intermediate summaries:

                <report-text
                    showThrowable="true" 
                    showStackTraces="true" 
                    showOutputStream="true"
                    showErrorStream="true"

                    showStatusOk="true"
                    showStatusError="true"
                    showStatusFailure="true"
                    showStatusIgnored="true"

                    showSuiteSummary="false"
                />
asfimport commented 12 years ago

Dawid Weiss (@dweiss) (migrated from JIRA)

I've found and fixed the gson issue. I re-ran ant test (no nightly though) 10 times: 7 successful builds and 3 errors (BasicZkTest.testBasic, BasicDistributedZkTest.testDistribSearch), but I don't think these are caused by the testing framework.

I'm ready to merge this work into trunk once I receive acks (this will probably break your existing forks/branches/patches if you have any – there are quite a lot of tiny changes).

I will attach a patch to this issue for completeness, but it'll be a huge blob (just warning).

asfimport commented 12 years ago

Dawid Weiss (@dweiss) (migrated from JIRA)

A non-binary patch for overview. Most of the changes are around encapsulating random as a method instead of a field access.
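
For context, a hedged before/after sketch of what that change looks like in a test; the exact accessor name and signature in LuceneTestCase at that time may differ, and TestSomething is a made-up class name:

    import java.util.Random;

    import org.apache.lucene.util.LuceneTestCase;
    import org.junit.Test;

    // Sketch of the mechanical change applied across the test sources: the
    // per-test Random is obtained through an accessor rather than read from a
    // field. TestSomething is hypothetical; only the random vs. random() change
    // reflects the patch description above.
    public class TestSomething extends LuceneTestCase {
      @Test
      public void testRandomized() {
        // before the patch: direct field access
        // Random r = random;

        // after the patch: the Random is obtained via a method
        Random r = random();
        assertTrue(r.nextInt(100) < 100);
      }
    }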

asfimport commented 12 years ago

Dawid Weiss (@dweiss) (migrated from JIRA)

I talked to Robert a bit and I've decided to postpone the commit until I have the backport to the 3x branch ready too. It would make everyone's life very difficult if I applied it to trunk only, because of the differences in acquiring the Random instance from LTC.

I'll work on the backport and come back when I'm ready.

asfimport commented 12 years ago

Dawid Weiss (@dweiss) (migrated from JIRA)

Subsumed by #4881.