Homebrew / homebrew-core

🍻 Default formulae for the missing package manager for macOS (or Linux)
https://brew.sh
BSD 2-Clause "Simplified" License

ApacheSpark #15160

Closed. kant111 closed this issue 7 years ago.

kant111 commented 7 years ago

When I do brew install apache-spark, it says it is downloading version 2.1.1 (which is good, since that is the latest). However, when I run spark-shell, the Spark version is displayed as 2.1.0. I believe the binaries are incorrectly packaged, since this doesn't happen when I download and install Spark manually.
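For anyone checking a similar report: a quick way to rule out a packaging problem is to ask the Homebrew-installed binaries for their version directly, bypassing whatever happens to be first on the PATH. A minimal sketch, assuming the default Homebrew prefix of /usr/local:

# What version does Homebrew think is installed?
brew info apache-spark
# What does the Cellar binary itself report?
/usr/local/opt/apache-spark/bin/spark-submit --version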

ilovezfs commented 7 years ago
iMac-TMP:~ joe$ brew install apache-spark
==> Using the sandbox
==> Downloading https://www.apache.org/dyn/closer.lua?path=spark/spark-2.1.1/spark-2.1.1-bin-hadoop2.
==> Best Mirror http://mirror.reverse.net/pub/apache/spark/spark-2.1.1/spark-2.1.1-bin-hadoop2.7.tgz
######################################################################## 100.0%
🍺  /usr/local/Cellar/apache-spark/2.1.1: 1,271 files, 219MB, built in 1 minute 21 seconds
iMac-TMP:~ joe$ brew test -vd apache-spark
/usr/local/Homebrew/Library/Homebrew/brew.rb (Formulary::FormulaLoader): loading /usr/local/Homebrew/Library/Taps/homebrew/homebrew-core/Formula/apache-spark.rb
Testing apache-spark
==> Using the sandbox
/usr/bin/sandbox-exec -f /tmp/homebrew20170630-46458-pxcb4m.sb /usr/local/Homebrew/Library/Homebrew/vendor/portable-ruby/2.0.0-p648/bin/ruby -W0 -I /usr/local/Homebrew/Library/Homebrew -- /usr/local/Homebrew/Library/Homebrew/test.rb /usr/local/Homebrew/Library/Taps/homebrew/homebrew-core/Formula/apache-spark.rb -vd
/usr/local/Homebrew/Library/Homebrew/test.rb (Formulary::FromPathLoader): loading /usr/local/Homebrew/Library/Taps/homebrew/homebrew-core/Formula/apache-spark.rb
==> /usr/local/Cellar/apache-spark/2.1.1/bin/spark-shell
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
17/06/30 12:01:07 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
17/06/30 12:01:10 WARN ObjectStore: Version information not found in metastore. hive.metastore.schema.verification is not enabled so recording the schema version 1.2.0
17/06/30 12:01:10 WARN ObjectStore: Failed to get database default, returning NoSuchObjectException
17/06/30 12:01:11 WARN ObjectStore: Failed to get database global_temp, returning NoSuchObjectException
iMac-TMP:~ joe$ spark-shell
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
17/06/30 12:01:23 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
17/06/30 12:01:26 WARN ObjectStore: Version information not found in metastore. hive.metastore.schema.verification is not enabled so recording the schema version 1.2.0
17/06/30 12:01:26 WARN ObjectStore: Failed to get database default, returning NoSuchObjectException
17/06/30 12:01:26 WARN ObjectStore: Failed to get database global_temp, returning NoSuchObjectException
Spark context Web UI available at http://10.0.1.15:4040
Spark context available as 'sc' (master = local[*], app id = local-1498849283508).
Spark session available as 'spark'.
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /___/ .__/\_,_/_/ /_/\_\   version 2.1.1
      /_/

Using Scala version 2.11.8 (Java HotSpot(TM) 64-Bit Server VM, Java 1.8.0_121)
Type in expressions to have them evaluated.
Type :help for more information.

scala>
kant111 commented 7 years ago

Oops, sorry, I had env vars pointing to binaries I downloaded a long time ago. Closing the ticket!
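For anyone who hits the same mismatch: a stale SPARK_HOME, or an older Spark directory earlier on the PATH, will shadow the Homebrew install. A quick sanity check with standard shell commands (nothing Homebrew-specific):

# Which spark-shell is actually being run?
which spark-shell
# Is SPARK_HOME pointing at an old manual install?
echo $SPARK_HOME
# Any other Spark directories on the PATH?
echo "$PATH" | tr ':' '\n' | grep -i spark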

ilovezfs commented 7 years ago

@kant111 no problem! Thanks for reporting the issue nonetheless.

kant111 commented 7 years ago

@ilovezfs Is there a way brew can print the commands to launch a small cluster (one master, one worker, and one executor), just like it does when we do brew install kafka? I did install Spark using brew, but I am not sure where the binaries are stored so that I can launch a cluster locally.

ilovezfs commented 7 years ago

The files are in /usr/local/opt/apache-spark/libexec and there are wrapper scripts in /usr/local/opt/apache-spark/bin.
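As a rough sketch of how those files can be used: the standalone-cluster scripts ship in libexec/sbin, and executors are launched per application by the worker, so there is no separate executor daemon to start. The following assumes the default /usr/local prefix and Spark 2.1.x, where the worker script is still named start-slave.sh; the master logs its actual spark:// URL if localhost does not resolve as expected:

# Start a standalone master (web UI on http://localhost:8080 by default)
/usr/local/opt/apache-spark/libexec/sbin/start-master.sh

# Start one worker and attach it to the master
/usr/local/opt/apache-spark/libexec/sbin/start-slave.sh spark://localhost:7077

# Run a shell against the cluster instead of local[*]
spark-shell --master spark://localhost:7077

# Tear it down
/usr/local/opt/apache-spark/libexec/sbin/stop-slave.sh
/usr/local/opt/apache-spark/libexec/sbin/stop-master.sh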