apache / orc

Apache ORC - the smallest, fastest columnar storage for Hadoop workloads
https://orc.apache.org/
Apache License 2.0

ORC-1586: Fix IllegalAccessError when SparkBenchmark runs on JDK17 #1748

Closed. cxzl25 closed this pull request 9 months ago.

cxzl25 commented 9 months ago

What changes were proposed in this pull request?

Add the JVM option --add-opens=java.base/sun.nio.ch=ALL-UNNAMED so that SparkBenchmark can run on JDK 17.

Why are the changes needed?

java.lang.IllegalAccessError: class org.apache.spark.storage.StorageUtils$ (in unnamed module @0x5b2c883c) cannot access class sun.nio.ch.DirectBuffer (in module java.base) because module java.base does not export sun.nio.ch to unnamed module @0x5b2c883c
    at org.apache.spark.storage.StorageUtils$.<init>(StorageUtils.scala:213)
    at org.apache.spark.storage.StorageUtils$.<clinit>(StorageUtils.scala)
    at org.apache.spark.storage.BlockManagerMasterEndpoint.<init>(BlockManagerMasterEndpoint.scala:121)
    at org.apache.spark.SparkEnv$.$anonfun$create$9(SparkEnv.scala:358)
    at org.apache.spark.SparkEnv$.registerOrLookupEndpoint$1(SparkEnv.scala:295)
    at org.apache.spark.SparkEnv$.create(SparkEnv.scala:344)
    at org.apache.spark.SparkEnv$.createDriverEnv(SparkEnv.scala:196)
    at org.apache.spark.SparkContext.createSparkEnv(SparkContext.scala:284)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:483)
    at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2888)
    at org.apache.spark.sql.SparkSession$Builder.$anonfun$getOrCreate$2(SparkSession.scala:1099)
    at scala.Option.getOrElse(Option.scala:189)
    at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:1093)
    at org.apache.orc.bench.spark.SparkBenchmark$InputSource.setup(SparkBenchmark.java:129)
    at org.apache.orc.bench.spark.jmh_generated.SparkBenchmark_fullRead_jmhTest._jmh_tryInit_f_inputsource1_1(SparkBenchmark_fullRead_jmhTest.java:403)
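The root cause is JDK 17's strong encapsulation of JDK internals (JEP 403): java.base no longer opens sun.nio.ch to classpath code, so Spark's static initializer fails. A minimal standalone sketch (hypothetical, not part of this patch) that reports whether the package is open to the unnamed module:

```java
// Hypothetical check, not part of the ORC patch: reports whether java.base
// opens sun.nio.ch to the unnamed module (i.e. code on the classpath).
public class ModuleCheck {
    public static void main(String[] args) {
        Module javaBase = Object.class.getModule();      // named module "java.base"
        Module unnamed = ModuleCheck.class.getModule();  // unnamed when run from the classpath
        // False by default on JDK 17; true when the JVM is started with
        // --add-opens=java.base/sun.nio.ch=ALL-UNNAMED
        System.out.println("sun.nio.ch open to unnamed module: "
                + javaBase.isOpen("sun.nio.ch", unnamed));
    }
}
```

Running it with and without the flag shows the difference the option makes for the forked benchmark JVM.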

How was this patch tested?

local test

Was this patch authored or co-authored using generative AI tooling?

No

paliwalashish commented 9 months ago

It's not very clear at what point we are hitting this error. @cxzl25, could you please elaborate in the description with more details?

cxzl25 commented 9 months ago

Not very clear at what point are we hitting this error

Because Spark uses the Unsafe API and accesses the internal class sun.nio.ch.DirectBuffer, which the module system no longer exports to unnamed modules by default on JDK 17. Spark's own build passes the same --add-opens option:

https://github.com/apache/spark/blob/branch-3.5/core/src/main/scala/org/apache/spark/storage/StorageUtils.scala#L207-L213

https://github.com/apache/spark/blob/416b7b1cd5a6555a2d545d2f8f3cbd6cadff130e/pom.xml#L304-L322
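For context, the Spark code behind the stack trace manages direct buffers through the internal sun.nio.ch.DirectBuffer interface. A simplified reflective sketch of that pattern (illustrative only, not Spark's actual code) shows where the module system steps in:

```java
import java.lang.reflect.Method;
import java.nio.ByteBuffer;

// Illustrative sketch, not Spark's actual code: touch the internal
// sun.nio.ch.DirectBuffer interface the way StorageUtils does.
public class DirectBufferSketch {
    public static void main(String[] args) throws Exception {
        ByteBuffer buf = ByteBuffer.allocateDirect(16);
        // Resolving the class reflectively avoids a compile-time dependency
        // on the non-exported package.
        Class<?> dbClass = Class.forName("sun.nio.ch.DirectBuffer");
        System.out.println("is DirectBuffer: " + dbClass.isInstance(buf));
        try {
            // Invoking a member of a non-exported package fails on JDK 17
            // unless the JVM was started with
            // --add-opens=java.base/sun.nio.ch=ALL-UNNAMED
            Method address = dbClass.getMethod("address");
            System.out.println("address: " + address.invoke(buf));
        } catch (ReflectiveOperationException | RuntimeException e) {
            System.out.println("blocked by the module system: "
                    + e.getClass().getName());
        }
    }
}
```

Spark applies the corresponding --add-opens options in its own build (the linked pom.xml entries), which is why the benchmark JVM here needs the same flag.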

dongjoon-hyun commented 9 months ago

Ya, sorry for the duplication, I realized that @cxzl25 already mentioned the above pom.xml. :)

dongjoon-hyun commented 9 months ago

Merged to main/2.0/1.9.