NVIDIA / spark-rapids

Spark RAPIDS plugin - accelerate Apache Spark with GPUs
https://nvidia.github.io/spark-rapids
Apache License 2.0

[BUG] Fix spark400 build due to writeWithV1 return value change #11741

Closed gerashegalov closed 1 day ago

gerashegalov commented 1 day ago

Describe the bug

The spark400 build fails because of the upstream Spark commit https://github.com/apache/spark/commit/f1b68d897e49e77308fb75bb60d054db10f6a90c, which changed the return value of writeWithV1.

[ERROR] [Error] /home/user/gits/NVIDIA/spark-rapids/sql-plugin/src/main/spark320/scala/com/nvidia/spark/rapids/v1FallbackWriters.scala:104: type mismatch;
 found   : Unit
 required: Seq[org.apache.spark.sql.catalyst.InternalRow]
[INFO] [Info] : Unit <: Seq[org.apache.spark.sql.catalyst.InternalRow]?
[INFO] [Info] : false
[ERROR] [Error] /home/user/gits/NVIDIA/spark-rapids/sql-plugin/src/main/spark320/scala/com/nvidia/spark/rapids/v1FallbackWriters.scala:102: local val writtenRows in method run is never used
Applicable -Wconf / @nowarn filters for this fatal warning: msg=<part of the message>, cat=unused-locals, site=com.nvidia.spark.rapids.GpuV1FallbackWriters.run.writtenRows
[ERROR] two errors found
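The errors suggest the plugin's V1 fallback path still returns `Unit` where the Spark 4.0 signature now expects the written rows (`Seq[InternalRow]`). A minimal sketch of the version-dispatch ("shim") pattern that can absorb such a signature change follows; all names here (`WriteShimSketch`, `doWrite`, the `Row` alias) are illustrative, not the plugin's actual API or the actual fix:

```scala
// Hedged sketch: one shared write implementation, two shim entry points
// matching the old (Unit) and new (Seq[...]) return types.
object WriteShimSketch {
  // stand-in for org.apache.spark.sql.catalyst.InternalRow, to keep the
  // sketch self-contained without a Spark dependency
  type Row = String

  // shared write logic returns the rows it produced
  private def doWrite(): Seq[Row] = Seq("metricsRow")

  // Spark 4.0+ shim: propagate the written rows to the caller
  def runSpark400(): Seq[Row] = doWrite()

  // pre-4.0 shim: discard the result to keep the Unit signature,
  // avoiding both the type mismatch and the unused-local warning
  def runLegacy(): Unit = { doWrite(); () }
}
```

With per-buildver source directories (as spark-rapids already uses, e.g. `src/main/spark320/...`), the appropriate shim is compiled only for the matching Spark version.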

Steps/Code to reproduce bug

$ JAVA_HOME=/usr/lib/jvm/java-17-openjdk-amd64 mvnd clean package -pl sql-plugin  -Dbuildver=400 -f scala2.13 -am

Expected behavior

The spark400 build should succeed.

Environment details

Local build.
