apache / datafusion-comet

Apache DataFusion Comet Spark Accelerator
https://datafusion.apache.org/comet

build: Upload test reports and coverage #163

Closed: advancedxy closed this pull request 2 months ago

advancedxy commented 2 months ago

Which issue does this PR close?

Partially closes #120

Rationale for this change

Upload test reports and test coverage reports so that they can be accessed after workflow runs.

What changes are included in this PR?

  1. Refactor `MAVEN_OPTS` to `maven_opts`, which is more consistent with the other variable names.
  2. Add the JaCoCo plugin to generate test coverage for both the Java and Scala tests.
  3. Upload test reports and test coverage reports for the Ubuntu-Java17-Spark-3.4 combination (see the sketch after this list).
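
As a rough illustration of item 2 (my sketch, not taken from the PR; it assumes the jacoco-maven-plugin is bound to its `prepare-agent` goal in the build), the reports could be produced locally like this:

```sh
# Run the JVM test suites with the JaCoCo agent attached, then render the report.
mvn -B clean verify jacoco:report

# By default jacoco:report writes HTML/XML output under target/site/jacoco/,
# which is the directory a CI step would then upload as a workflow artifact.
ls target/site/jacoco/jacoco.xml
```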

How are these changes tested?

Existing CI runs.

codecov-commenter commented 2 months ago

Welcome to Codecov :tada:

Once merged to your default branch, Codecov will compare your coverage reports and display the results in this comment.

Thanks for integrating Codecov - We've got you covered :open_umbrella:

advancedxy commented 2 months ago

cc @sunchao

advancedxy commented 2 months ago

Test coverage for Rust is not included in this PR. Maybe we can enable that later, or skip it entirely, since there isn't that much unit test code on the Rust side.

We may have to find a way to generate Rust coverage from the Spark side's tests.

sunchao commented 2 months ago

> We may have to find a way to generate Rust coverage from the Spark side's tests.

I think we can use cargo2junit for this. Example:

cargo test --release --color=never -- --color=never -Zunstable-options --report-time --format json | cargo2junit > target/cargo-reports/TEST-all.xml
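
For completeness (this detail is my assumption, not stated in the thread): cargo2junit is installed from crates.io, and `--format json` is unstable in libtest, so the pipeline needs a nightly toolchain. A minimal sketch:

```sh
# Install the converter and prepare the report directory.
cargo install cargo2junit
mkdir -p target/cargo-reports

# Nightly is needed because -Z unstable-options / --format json are unstable.
cargo +nightly test --release -- -Z unstable-options --format json --report-time \
  | cargo2junit > target/cargo-reports/TEST-all.xml
```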

cc @snmvaughan

sunchao commented 2 months ago

There are some merge conflicts. Please try to rebase this, @advancedxy, and I'll merge it afterwards.

advancedxy commented 2 months ago

> I think we can use cargo2junit for this. Example:

There are also other coverage tools for Rust; for example, arrow-rs uses tarpaulin in https://github.com/apache/arrow-rs/blob/master/.github/workflows/coverage.yml. We could leverage that too.
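
To make the tarpaulin option concrete, a minimal sketch (the flags are standard cargo-tarpaulin options; the output directory is illustrative):

```sh
# cargo-tarpaulin instruments and runs the Rust tests, emitting a Cobertura XML
# report that Codecov can ingest.
cargo install cargo-tarpaulin
cargo tarpaulin --out Xml --output-dir target/coverage
```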

What I meant was: since most of the test code is written on the Spark side, it would be ideal if, after running the tests on the JVM/Spark side, test coverage for the native Comet lib were generated automatically.

sunchao commented 2 months ago

> What I meant was: since most of the test code is written on the Spark side, it would be ideal if, after running the tests on the JVM/Spark side, test coverage for the native Comet lib were generated automatically.

I see what you mean. Not sure whether this is possible.

sunchao commented 2 months ago

Merged, thanks!

advancedxy commented 2 months ago

> Not sure whether this is possible.

It might be possible; see https://stackoverflow.com/questions/70410715/how-to-collect-rust-code-coverage-when-running-remote-tests and https://rustc-dev-guide.rust-lang.org/llvm-coverage-instrumentation.html for related approaches. But I think it's low priority, and we can always evaluate it when needed.
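
For reference, a sketch of the LLVM source-based coverage approach from those links, applied to a native library exercised by external JVM tests (the library name `libcomet.so` and the paths are hypothetical; the flags and environment variables are the standard rustc/LLVM ones):

```sh
# Build the native lib with coverage instrumentation compiled in.
export RUSTFLAGS="-C instrument-coverage"
# Every process that loads the lib writes its own raw profile file.
export LLVM_PROFILE_FILE="comet-%p-%m.profraw"
cargo build --release

# ...run the Spark/JVM test suites against the instrumented native lib here...

# Merge the raw profiles and report coverage for the shared library.
llvm-profdata merge -sparse comet-*.profraw -o comet.profdata
llvm-cov report --instr-profile=comet.profdata target/release/libcomet.so
```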