geserdugarov closed this pull request 3 days ago.
I made 3 attempts in my PR.
test-spark-java17-java-tests (scala-2.12, spark3.4, hudi-spark-datasource/hudi-spark3.4.x)
Error: Tests run: 256, Failures: 0, Errors: 1, Skipped: 18, Time elapsed: 1,164.568 s <<< FAILURE! - in JUnit Vintage
Error: [1] tableType=COPY_ON_WRITE(testSecondaryIndexWithClusteringAndCleaning(HoodieTableType)) Time elapsed: 11.981 s <<< ERROR!
org.scalatest.exceptions.TestFailedException: Expected Array([cde$row2,false], [def$row3,false], [fgh$row2,false], [xyz$row1,false]), but got Array([def$row3,false], [efg$row2,false], [fgh$row2,false], [xyz$row1,false])
test-spark-java17-java-tests (scala-2.12, spark3.4, hudi-spark-datasource/hudi-spark3.4.x)
Error: Tests run: 256, Failures: 0, Errors: 1, Skipped: 18, Time elapsed: 1,216.821 s <<< FAILURE! - in JUnit Vintage
Error: [1] tableType=COPY_ON_WRITE(testSecondaryIndexWithClusteringAndCleaning(HoodieTableType)) Time elapsed: 12.119 s <<< ERROR!
org.scalatest.exceptions.TestFailedException: Expected Array([cde$row2,false], [def$row3,false], [fgh$row2,false], [xyz$row1,false]), but got Array([def$row3,false], [efg$row2,false], [fgh$row2,false], [xyz$row1,false])
docker-java17-test (scala-2.12, flink1.20, spark3.4, spark3.4.0)
#16 [11/19] RUN wget https://archive.apache.org/dist/maven/maven-3/3.6.3/binaries/apache-maven-3.6.3-bin.tar.gz
#16 0.227 Connecting to archive.apache.org (65.108.204.189:443)
#16 133.9 wget: can't connect to remote host (65.108.204.189): Operation timed out
#16 ERROR: process "/bin/sh -c wget https://archive.apache.org/dist/maven/maven-3/3.6.3/binaries/apache-maven-3.6.3-bin.tar.gz" did not complete successfully: exit code: 1
test-spark-java17-java-tests (scala-2.12, spark3.4, hudi-spark-datasource/hudi-spark3.4.x)
Error: Tests run: 256, Failures: 0, Errors: 1, Skipped: 18, Time elapsed: 1,186.967 s <<< FAILURE! - in JUnit Vintage
Error: [1] tableType=COPY_ON_WRITE(testSecondaryIndexWithClusteringAndCleaning(HoodieTableType)) Time elapsed: 12.738 s <<< ERROR!
org.scalatest.exceptions.TestFailedException: Expected Array([cde$row2,false], [def$row3,false], [fgh$row2,false], [xyz$row1,false]), but got Array([def$row3,false], [efg$row2,false], [fgh$row2,false], [xyz$row1,false])
docker-java17-test (scala-2.12, flink1.20, spark3.4, spark3.4.0)
#16 [11/19] RUN wget https://archive.apache.org/dist/maven/maven-3/3.6.3/binaries/apache-maven-3.6.3-bin.tar.gz
#16 0.206 Connecting to archive.apache.org (65.108.204.189:443)
#16 134.6 wget: can't connect to remote host (65.108.204.189): Operation timed out
#16 ERROR: process "/bin/sh -c wget https://archive.apache.org/dist/maven/maven-3/3.6.3/binaries/apache-maven-3.6.3-bin.tar.gz" did not complete successfully: exit code: 1
integration-tests (spark3.5, spark-3.5.3/spark-3.5.3-bin-hadoop3.tgz)
a2a48f8c5740 Extracting [==================================================>] 1.43GB/1.43GB
a2a48f8c5740 Extracting [==================================================>] 1.43GB/1.43GB
failed to register layer: write /usr/local/trino-server-368.tar.gz: no space left on device
Error: Command execution failed.
test-spark-java-tests (scala-2.12, spark3.4, hudi-spark-datasource/hudi-spark3.4.x)
Error: Tests run: 256, Failures: 0, Errors: 1, Skipped: 18, Time elapsed: 1,214.484 s <<< FAILURE! - in JUnit Vintage
Error: [1] tableType=COPY_ON_WRITE(testSecondaryIndexWithClusteringAndCleaning(HoodieTableType)) Time elapsed: 12.897 s <<< ERROR!
org.scalatest.exceptions.TestFailedException: Expected Array([cde$row2,false], [def$row3,false], [fgh$row2,false], [xyz$row1,false]), but got Array([def$row3,false], [efg$row2,false], [fgh$row2,false], [xyz$row1,false])
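For context, this recurring failure is an exact comparison of the secondary index records returned by the test; the "Expected ..., but got ..." message format matches ScalaTest's assertResult. Below is a minimal sketch of that pattern only, with the record keys copied from the log above; the class name and tuple layout are illustrative assumptions, not the actual TestSecondaryIndexPruning code.

    import org.scalatest.funsuite.AnyFunSuite

    // Illustrative sketch: reproduces the shape of the failing check,
    // not the real TestSecondaryIndexPruning logic.
    class SecondaryIndexAssertionSketch extends AnyFunSuite {
      test("secondary index records after clustering and cleaning") {
        // Keys copied from the CI log; the boolean mirrors the `false` flag in the output.
        val expected = Seq(("cde$row2", false), ("def$row3", false), ("fgh$row2", false), ("xyz$row1", false))
        val actual   = Seq(("def$row3", false), ("efg$row2", false), ("fgh$row2", false), ("xyz$row1", false))
        // assertResult throws TestFailedException with "Expected ..., but got ..."
        // whenever the two sequences differ, which is exactly the error reported above.
        // This sketch fails by design to show the message shape.
        assertResult(expected)(actual)
      }
    }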
test-spark-scala-tests (scala-2.12, spark3.4, hudi-spark-datasource/hudi-spark3.4.x)
- Test multiple partition fields pruning
2024-11-15 15:09:15.400:INFO:oejs.Server:ScalaTest-main-running-TestInsertTable: jetty-9.4.53.v20231009; built: 2023-10-09T12:29:09.265Z; git: 27bde00a0b95a1d5bbee0eae7984f891d2d0f8c9; jvm 1.8.0_432-b06
2024-11-15 15:09:15.402:INFO:oejs.Server:ScalaTest-main-running-TestInsertTable: Started @774694ms
778457 [spark-listener-group-shared] ERROR org.apache.spark.scheduler.AsyncEventQueue [] - Listener StageParallelismListener threw an exception
org.scalatest.exceptions.TestFailedException: Expected 1, but got 2
at org.scalatest.Assertions.newAssertionFailedException(Assertions.scala:472) ~[scalatest_2.12-3.1.0.jar:3.1.0]
at org.scalatest.Assertions.newAssertionFailedException$(Assertions.scala:471) ~[scalatest_2.12-3.1.0.jar:3.1.0]
at org.scalatest.funsuite.AnyFunSuite.newAssertionFailedException(AnyFunSuite.scala:1562) ~[scalatest_2.12-3.1.0.jar:3.1.0]
at org.scalatest.Assertions.assertResult(Assertions.scala:867) ~[scalatest_2.12-3.1.0.jar:3.1.0]
at org.scalatest.Assertions.assertResult$(Assertions.scala:863) ~[scalatest_2.12-3.1.0.jar:3.1.0]
at org.scalatest.funsuite.AnyFunSuite.assertResult(AnyFunSuite.scala:1562) ~[scalatest_2.12-3.1.0.jar:3.1.0]
at org.apache.spark.sql.hudi.dml.TestInsertTable$StageParallelismListener.onStageSubmitted(TestInsertTable.scala:2302) ~[test-classes/:?]
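The stack trace shows the assertion coming from a SparkListener registered by the test (TestInsertTable$StageParallelismListener), which checks the parallelism of each submitted stage. A rough sketch of that pattern follows; the class name, constructor parameter, and the use of stageInfo.numTasks are assumptions for illustration, not the actual TestInsertTable code.

    import org.apache.spark.scheduler.{SparkListener, SparkListenerStageSubmitted}
    import org.scalatest.Assertions.assertResult

    // Illustrative sketch of a stage-parallelism check; not the real
    // StageParallelismListener from TestInsertTable.scala.
    class StageParallelismCheck(expectedParallelism: Int) extends SparkListener {
      override def onStageSubmitted(stageSubmitted: SparkListenerStageSubmitted): Unit = {
        // An unexpected task count here raises the "Expected 1, but got 2"
        // TestFailedException seen in the log, which the AsyncEventQueue then
        // surfaces as a listener error.
        assertResult(expectedParallelism)(stageSubmitted.stageInfo.numTasks)
      }
    }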
@codope, if you don't mind, could you please take a look at the functional Scala test TestSecondaryIndexPruning @testSecondaryIndexWithClusteringAndCleaning? It seems like the combination of the latest commits broke this test.
This PR has no changes compared to master, but CI could not pass.
Yeah I noticed it's flaky. I'll take a look.
Change Logs
I started facing problems with GitHub Actions CI after a force push in my PR https://github.com/apache/hudi/pull/12245; the force-pushed changes only touched a new test and rebased onto the current master. I decided to check the current CI state with this test PR.
Impact
None
Risk level (write none, low medium or high below)
None
Documentation Update
None
Contributor's checklist