I'm testing out Z-Ordering with the latest master branch code. It worked well with Spark 3.x, but I found an error when importing or using the ZOrderCoveringIndexConfig class in Spark 2.4. I'm not sure if there is a problem in my local setup. I know this new feature is not released yet; I'm reporting this issue just in case you want to double-check before release. Thanks!
To Reproduce
1. Build and publish to the local repo with sbt compile package publishLocal
2. Launch the Spark 2.4 shell
3. Import
➜ spark-2.4.2-bin-hadoop2.7 bin/spark-shell --packages io.delta:delta-core_2.12:0.6.1,com.microsoft.hyperspace:hyperspace-core-spark2.4_2.12:0.5.0-SNAPSHOT --conf "spark.sql.extensions=io.delta.sql.DeltaSparkSessionExtension" --conf "spark.sql.catalog.spark_catalog=org.apache.spark.sql.delta.catalog.DeltaCatalog"
...
Using Scala version 2.12.8 (OpenJDK 64-Bit Server VM, Java 1.8.0_312)
Type in expressions to have them evaluated.
Type :help for more information.
scala> import com.microsoft.hyperspace.index.zordercovering.ZOrderCoveringIndexConfig
<console>:23: error: object ZOrderCoveringIndexConfig is not a member of package com.microsoft.hyperspace.index.zordercovering
import com.microsoft.hyperspace.index.zordercovering.ZOrderCoveringIndexConfig
^
Expected behavior
This works as expected in Spark 3.x:
➜ spark-3.0.3-bin-hadoop2.7 bin/spark-shell --packages org.apache.iceberg:iceberg-spark3-runtime:0.12.1,com.microsoft.hyperspace:hyperspace-core-spark3.0_2.12:0.5.0-SNAPSHOT
...
Using Scala version 2.12.10 (OpenJDK 64-Bit Server VM, Java 1.8.0_312)
Type in expressions to have them evaluated.
Type :help for more information.
scala> import com.microsoft.hyperspace.index.zordercovering.ZOrderCoveringIndexConfig
import com.microsoft.hyperspace.index.zordercovering.ZOrderCoveringIndexConfig
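For context, once the import succeeds in Spark 3.x the config can be used to create a Z-Order covering index. A minimal sketch is below; the DataFrame path, index name, and column names are all hypothetical, and the constructor arguments are assumed to mirror the existing covering-index config shape (index name, indexed columns, included columns):

```scala
import com.microsoft.hyperspace.Hyperspace
import com.microsoft.hyperspace.index.zordercovering.ZOrderCoveringIndexConfig

// Hypothetical source data, for illustration only.
val df = spark.read.parquet("/data/events")

// Assumed config shape: indexed columns are Z-Ordered,
// included columns are stored alongside for covering lookups.
val config = ZOrderCoveringIndexConfig(
  indexName = "zorderIndex",
  indexedColumns = Seq("colA", "colB"),
  includedColumns = Seq("colC"))

val hs = new Hyperspace(spark)
hs.createIndex(df, config)
```

In Spark 2.4 the same import fails as shown above, so this sketch cannot even be attempted there.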
Environment
Please complete the following information if applicable: