NVIDIA / spark-rapids

Spark RAPIDS plugin - accelerate Apache Spark with GPUs
https://nvidia.github.io/spark-rapids
Apache License 2.0

[BUG] Premerge failing on databricks #11232

Closed: revans2 closed this issue 4 months ago

revans2 commented 4 months ago

Describe the bug

[2024-07-19T04:30:32.069Z] [INFO] --- scala-maven-plugin:4.9.1:compile (scala-compile-first) @ rapids-4-spark-delta-spark332db_2.12 ---
[2024-07-19T04:30:32.069Z] [INFO] Compiler bridge file: /home/ubuntu/spark-rapids/target/spark332db/.sbt/1.0/zinc/org.scala-sbt/org.scala-sbt-compiler-bridge_2.12-1.10.0-bin_2.12.15__52.0-1.10.0_20240505T232140.jar
[2024-07-19T04:30:38.598Z] [INFO] compiling 44 Scala sources to /home/ubuntu/spark-rapids/delta-lake/delta-spark332db/target/spark332db/classes ...
[2024-07-19T04:30:41.100Z] [ERROR] [Error] /home/ubuntu/spark-rapids/delta-lake/common/src/main/databricks/scala/com/databricks/sql/transaction/tahoe/rapids/GpuOptimisticTransactionBase.scala:103: not enough arguments for method apply: (number: Int)org.apache.spark.sql.catalyst.trees.TreeNode[_] in class TreeNode.
[2024-07-19T04:30:41.100Z] Unspecified value parameter number.
[2024-07-19T04:30:41.100Z] [ERROR] one error found

Other versions might also be failing, but those builds were canceled before they got this far.
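One plausible reading of the diagnostic: the companion apply of GpuProjectExec changed shape, so a stale call site no longer matches it, and the compiler instead resolves a trailing argument list against TreeNode's instance method apply(number: Int), which returns the number-th child. The following is a minimal, hypothetical sketch with made-up names (not the actual spark-rapids code) showing how that produces this kind of error:

```scala
// Hypothetical stand-ins for the real classes; names are illustrative only.
abstract class Node {
  def children: Seq[Node]
  // Mirrors Spark's TreeNode.apply(number: Int), which returns the number-th child.
  def apply(number: Int): Node = children(number)
}

class Leaf extends Node {
  override def children: Seq[Node] = Nil
}

// Suppose the old companion apply took a second, defaulted parameter list,
//   ProjectNode(exprs, child)(tiered = false)
// and a refactor collapsed it into a single parameter list.
class ProjectNode(val exprs: Seq[String], val child: Node) extends Node {
  override def children: Seq[Node] = Seq(child)
}

object ProjectNode {
  def apply(exprs: Seq[String], child: Node): ProjectNode =
    new ProjectNode(exprs, child)
}

object StaleCaller {
  val leaf: Node = new Leaf
  // A caller still written against the old shape,
  //   ProjectNode(Seq("a"), leaf)()
  // no longer compiles: ProjectNode(Seq("a"), leaf) now yields a Node, so the
  // trailing () resolves to Node.apply(number: Int) with no argument, and the
  // compiler reports "not enough arguments for method apply ...
  // Unspecified value parameter number."
}
```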

jlowe commented 4 months ago

I just spun up a 12.2 instance in the same Databricks workspace and built successfully. The issue is specific to the changes in #11200: the signature of the GpuProjectExec apply method was changed, but GpuOptimisticTransactionBase was still using the old signature.
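The fix is then mechanical: update the call site in GpuOptimisticTransactionBase to match the new apply signature. Continuing the hypothetical sketch above (illustrative names, not the actual plugin code), that means dropping the removed trailing argument list, or passing whatever new parameters the refactored apply requires:

```scala
// Continuing the hypothetical sketch above (illustrative names only).
object FixedCaller {
  val leaf: Node = new Leaf
  // Before (old signature): ProjectNode(Seq("a"), leaf)()  -- no longer compiles.
  // After (new signature): the call matches the current companion apply.
  val p: ProjectNode = ProjectNode(Seq("a"), leaf)
}
```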