jaceklaskowski / spark-activator

Spark Streaming with Scala and Akka Activator template
https://typesafe.com/activator/template/spark-streaming-scala-akka
Apache License 2.0

jar is not built (Windows) #4

Open ishepherd opened 9 years ago

ishepherd commented 9 years ago

I downloaded the template bundle from http://www.typesafe.com/activator/template/bundle/hello-apache-spark, unzipped it, and ran activator clean run.

The project builds OK, producing 4 .class files under target\scala-2.10\classes, but target\scala-2.10\spark-activator_2.10-1.0.jar is never created, so Spark fails at runtime with a FileNotFoundException.
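
For reference, the template's SimpleApp hands that jar path to the SparkContext, roughly like this (a sketch from memory, assuming the Spark 0.9-style constructor; the actual template may differ slightly):

    import org.apache.spark.SparkContext

    object SimpleApp {
      def main(args: Array[String]) {
        // This jar must already exist on disk: SparkContext.addJar copies it
        // into Spark's HTTP file server directory at startup, which is where
        // the FileNotFoundException in the trace comes from.
        val jars = List("target/scala-2.10/spark-activator_2.10-1.0.jar")
        val sc = new SparkContext("local[4]", "Simple App",
          System.getenv("SPARK_HOME"), jars)
        // ... actual job elided ...
        sc.stop()
      }
    }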

ishepherd commented 9 years ago
[info] Compiling 1 Scala source to C:\...\hello-apache-spark\project\target\scala-2.10\sbt-0.13\classes...
[info] Set current project to hello-spark (in build file:/C:/.../hello-apache-spark/)
[success] Total time: 0 s, completed 12/08/2015 4:22:18 PM
[info] Updating {file:/C:/.../hello-apache-spark/}hello-apache-spark...
[info] Formatting 1 Scala source {file:/C:/.../hello-apache-spark/}hello-apache-spark(compile) ...
[info] Resolving org.scala-lang#scala-library;2.10.4 ...
[info] Resolving org.apache.spark#spark-core_2.10;0.9.1 ...
[info] Resolving org.apache.hadoop#hadoop-client;1.0.4 ...
[info] Resolving org.apache.hadoop#hadoop-core;1.0.4 ...
...
[info] Done updating.
[info] Compiling 1 Scala source to C:\...\hello-apache-spark\target\scala-2.10\classes...
[info] Running SimpleApp
log4j:WARN No appenders could be found for logger (akka.event.slf4j.Slf4jLogger).
log4j:WARN Please initialize the log4j system properly.
log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
15/08/12 16:22:30 INFO SparkEnv: Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
15/08/12 16:22:30 INFO SparkEnv: Registering BlockManagerMaster
15/08/12 16:22:30 INFO DiskBlockManager: Created local directory at C:\Users\ishepher\AppData\Local\Temp\spark-local-20150812162230-5dbe
15/08/12 16:22:30 INFO MemoryStore: MemoryStore started with capacity 2.1 GB.
15/08/12 16:22:30 INFO ConnectionManager: Bound socket to port 65114 with id = ConnectionManagerId(DSGCLTSQ32.prod.quest.corp,65114)
15/08/12 16:22:30 INFO BlockManagerMaster: Trying to register BlockManager
15/08/12 16:22:30 INFO BlockManagerMasterActor$BlockManagerInfo: Registering block manager DSGCLTSQ32.prod.quest.corp:65114 with 2.1 GB RAM
15/08/12 16:22:30 INFO BlockManagerMaster: Registered BlockManager
15/08/12 16:22:30 INFO HttpServer: Starting HTTP Server
15/08/12 16:22:30 INFO HttpBroadcast: Broadcast server started at http://10.20.26.92:65115
15/08/12 16:22:30 INFO SparkEnv: Registering MapOutputTracker
15/08/12 16:22:30 INFO HttpFileServer: HTTP File server directory is C:\Users\ishepher\AppData\Local\Temp\spark-17175fd5-f364-4089-acfd-260f7306a276
15/08/12 16:22:30 INFO HttpServer: Starting HTTP Server
15/08/12 16:22:30 INFO SparkUI: Started Spark Web UI at http://DSGCLTSQ32.prod.quest.corp:4040
15/08/12 16:22:30 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
15/08/12 16:22:30 ERROR Shell: Failed to locate the winutils binary in the hadoop binary path
java.io.IOException: Could not locate executable null\bin\winutils.exe in the Hadoop binaries.
        at org.apache.hadoop.util.Shell.getQualifiedBinPath(Shell.java:318)
        at org.apache.hadoop.util.Shell.getWinUtilsPath(Shell.java:333)
        at org.apache.hadoop.util.Shell.<clinit>(Shell.java:326)
        ...
[error] (run-main-0) java.io.FileNotFoundException: target\scala-2.10\spark-activator_2.10-1.0.jar (The system cannot find the file specified)
java.io.FileNotFoundException: target\scala-2.10\spark-activator_2.10-1.0.jar (The system cannot find the file specified)
        at java.io.FileInputStream.open0(Native Method)
        at java.io.FileInputStream.open(FileInputStream.java:195)
        at java.io.FileInputStream.<init>(FileInputStream.java:138)
        at com.google.common.io.Files$FileByteSource.openStream(Files.java:124)
        at com.google.common.io.Files$FileByteSource.openStream(Files.java:114)
        at com.google.common.io.ByteSource.copyTo(ByteSource.java:202)
        at com.google.common.io.Files.copy(Files.java:436)
        at org.apache.spark.HttpFileServer.addFileToDir(HttpFileServer.scala:59)
        at org.apache.spark.HttpFileServer.addJar(HttpFileServer.scala:54)
        at org.apache.spark.SparkContext.addJar(SparkContext.scala:754)
        at org.apache.spark.SparkContext$$anonfun$5.apply(SparkContext.scala:165)
        at org.apache.spark.SparkContext$$anonfun$5.apply(SparkContext.scala:165)
        at scala.collection.IndexedSeqOptimized$class.foreach(IndexedSeqOptimized.scala:33)
        at scala.collection.mutable.WrappedArray.foreach(WrappedArray.scala:34)
        at org.apache.spark.SparkContext.<init>(SparkContext.scala:165)
        at org.apache.spark.SparkContext.<init>(SparkContext.scala:100)
        at SimpleApp$.main(SimpleApp.scala:7)
        at SimpleApp.main(SimpleApp.scala)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:497)
[trace] Stack trace suppressed: run last compile:run for the full output.
15/08/12 16:22:31 INFO ConnectionManager: Selector thread was interrupted!
java.lang.RuntimeException: Nonzero exit code: 1
        at scala.sys.package$.error(package.scala:27)
[trace] Stack trace suppressed: run last compile:run for the full output.
[error] (compile:run) Nonzero exit code: 1
[error] Total time: 13 s, completed 12/08/2015 4:22:31 PM

ishepherd commented 9 years ago

I got it running with the following changes:

  1. set HADOOP_HOME to someplace with Hadoop's bin\winutils.exe
  2. add package to the run targets: activator clean package run
  3. modify SimpleApp.scala to set the jars to List("target/scala-2.10/hello-spark_2.10-1.0.0-SNAPSHOT.jar") (see the sketch below)
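
For change (3), the line in SimpleApp.scala ends up looking roughly like this (a sketch; the artifact name follows the name and version set in build.sbt, so check target\scala-2.10 for the exact file name):

    // Must match the jar that `activator package` writes out; the pattern is
    // <name>_<scalaBinaryVersion>-<version>.jar, taken from build.sbt.
    val jars = List("target/scala-2.10/hello-spark_2.10-1.0.0-SNAPSHOT.jar")

As an alternative to exporting HADOOP_HOME for change (1), the same thing can apparently be done from code before the SparkContext is created, since Hadoop's Shell checks the hadoop.home.dir system property before the environment variable (C:\hadoop here is just a placeholder for wherever bin\winutils.exe lives):

    // Equivalent to setting HADOOP_HOME; the directory must contain bin\winutils.exe.
    System.setProperty("hadoop.home.dir", """C:\hadoop""")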

I don’t get why change (2) is needed; did something change in sbt, perhaps?
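
My best guess: sbt's run task depends only on compile and runs straight from target\scala-2.10\classes, so the jar is only ever written when package is invoked explicitly; that seems to be long-standing sbt behaviour rather than a recent change. If it helps, the exact jar path for change (3) should be printable from the activator/sbt shell with something like:

    > show compile:packageBin::artifactPath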

Thanks for your work!