awesome-kyuubi / hadoop-testing

Testing Sandbox for Hadoop Ecosystem Components
Apache License 2.0

Remove or change the wrong JAVA_HOME configuration in /etc/spark/conf/spark-env.sh #13

Closed: yanghua closed this issue 10 months ago

yanghua commented 10 months ago

The wrong JAVA_HOME configuration is in /etc/spark/conf/spark-env.sh.
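For illustration, a minimal sketch of the kind of entry being discussed, assuming a typical OpenJDK 17 install path (the actual path and file contents in the sandbox may differ):

# /etc/spark/conf/spark-env.sh (hypothetical excerpt; the path is an assumption)
# This hard-coded override forces every Spark launch onto Java 17.
export JAVA_HOME=/usr/lib/jvm/java-17-openjdk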

It is picked up by spark-submit; the spark-shell banner below shows Java 17.0.9 in use:

Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /___/ .__/\_,_/_/ /_/\_\   version 3.2.2
      /_/

Using Scala version 2.12.15 (OpenJDK 64-Bit Server VM, Java 17.0.9)
Type in expressions to have them evaluated.
Type :help for more information.
yanghua commented 10 months ago

cc @pan3793

pan3793 commented 10 months ago

Yea, we should use Java 8 by default for consistency; the Java 17 setting was only used to test the Spark 4.0-SNAPSHOT.

pan3793 commented 10 months ago

BTW, Spark supports Java 17 since 3.3 and requires at least Java 17 since 4.0

yanghua commented 10 months ago

> BTW, Spark supports Java 17 since 3.3 and requires at least Java 17 since 4.0

Got it. For now, I am falling back to Java 8 to stay compatible with the other components in the Hadoop ecosystem.
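For reference, one way to confirm which JVM Spark picks up after the fallback is the version banner; the JAVA_HOME path below is an assumption and may differ in the sandbox:

# Point Spark at the Java 8 install (hypothetical path), then print the version banner.
export JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64
$SPARK_HOME/bin/spark-submit --version   # the banner should now report a 1.8.x JVM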

pan3793 commented 10 months ago

Yea, using Java 8 by default makes sense. Let's remove the JAVA_HOME override in spark-defaults.conf.
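For illustration, assuming the override lives in /etc/spark/conf/spark-env.sh as the issue title says (the exact file and paths may differ), the fix is a small sketch like this:

# /etc/spark/conf/spark-env.sh
# Option 1: drop or comment out the hard-coded override so the system default Java 8 is used.
# export JAVA_HOME=/usr/lib/jvm/java-17-openjdk
# Option 2: point it explicitly at the Java 8 install (hypothetical path).
export JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64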