Stratio / deep-spark

Connecting Apache Spark with different data stores [DEPRECATED]
http://stratio.github.io/deep-spark
Apache License 2.0
197 stars 42 forks

Stratio Sandbox's stratio-deep-shell sc not found (0.91) #10

Closed · heskech closed this issue 9 years ago

heskech commented 10 years ago

Version: Stratio Sandbox 0.91

To reproduce:

1. Followed the Sandbox installation instructions and imported the appliance into VirtualBox.
2. Started the Sandbox and its services after changing /etc/hosts.
3. Created the Cassandra schema in cqlsh.
4. Started stratio-deep-shell from /opt/sds/spark/bin.

[root@sandbox bin]# ./stratio-deep-shell
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/opt/sds/spark/lib/spark-assembly-1.0.0-hadoop1.0.4.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/opt/sds/spark/lib/spark-examples-1.0.0-hadoop1.0.4.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
Welcome to


(Stratio Deep shell ASCII banner)
Powered by Spark v1.0.0

Using Scala version 2.10.4 (Java HotSpot(TM) 64-Bit Server VM, Java 1.7.0_55)
Type in expressions to have them evaluated.
Type :help for more information.
21:38:49,763 INFO [spark-akka.actor.default-dispatcher-5] Slf4jLogger:80 - Slf4jLogger started
21:38:49,892 INFO [spark-akka.actor.default-dispatcher-4] Remoting:74 - Starting remoting
21:38:50,230 INFO [spark-akka.actor.default-dispatcher-5] Remoting:74 - Remoting started; listening on addresses :[akka.tcp://spark@sandbox:43350]
21:38:50,236 INFO [spark-akka.actor.default-dispatcher-4] Remoting:74 - Remoting now listens on addresses: [akka.tcp://spark@sandbox:43350]
Failed to load native Mesos library from
java.lang.UnsatisfiedLinkError: no mesos in java.library.path
    at java.lang.ClassLoader.loadLibrary(ClassLoader.java:1886)
    at java.lang.Runtime.loadLibrary0(Runtime.java:849)
    at java.lang.System.loadLibrary(System.java:1088)
    at org.apache.mesos.MesosNativeLibrary.load(MesosNativeLibrary.java:52)
    at org.apache.mesos.MesosNativeLibrary.load(MesosNativeLibrary.java:64)
    at org.apache.spark.SparkContext$.org$apache$spark$SparkContext$$createTaskScheduler(SparkContext.scala:1542)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:307)
    at org.apache.spark.repl.SparkILoop.createSparkContext(SparkILoop.scala:957)
    at $iwC$$iwC.<init>(<console>:8)
    at $iwC.<init>(<console>:14)
    at <init>(<console>:16)
    at .<init>(<console>:20)
    at .<clinit>(<console>)
    at .<init>(<console>:7)
    at .<clinit>(<console>)
    at $print(<console>)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:788)
    at org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1056)
    at org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:614)
    at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:645)
    at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:609)
    at org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:796)
    at org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:841)
    at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:753)
    at org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark$1.apply(SparkILoopInit.scala:121)
    at org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark$1.apply(SparkILoopInit.scala:120)
    at org.apache.spark.repl.SparkIMain.beQuietDuring(SparkIMain.scala:263)
    at org.apache.spark.repl.SparkILoopInit$class.initializeSpark(SparkILoopInit.scala:120)
    at org.apache.spark.repl.SparkILoop.initializeSpark(SparkILoop.scala:56)
    at org.apache.spark.repl.SparkILoop$$anonfun$process$1$$anonfun$apply$mcZ$sp$5.apply$mcV$sp(SparkILoop.scala:913)
    at org.apache.spark.repl.SparkILoopInit$class.runThunks(SparkILoopInit.scala:142)
    at org.apache.spark.repl.SparkILoop.runThunks(SparkILoop.scala:56)
    at org.apache.spark.repl.SparkILoopInit$class.postInitialization(SparkILoopInit.scala:104)
    at org.apache.spark.repl.SparkILoop.postInitialization(SparkILoop.scala:56)
    at org.apache.spark.repl.SparkILoop$$anonfun$process$1.apply$mcZ$sp(SparkILoop.scala:930)
    at org.apache.spark.repl.SparkILoop$$anonfun$process$1.apply(SparkILoop.scala:884)
    at org.apache.spark.repl.SparkILoop$$anonfun$process$1.apply(SparkILoop.scala:884)
    at scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)
    at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:884)
    at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:982)
    at org.apache.spark.repl.Main$.main(Main.scala:31)
    at org.apache.spark.repl.Main.main(Main.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at org.apache.spark.deploy.SparkSubmit$.launch(SparkSubmit.scala:292)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:55)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)

Spark context available as sc.
Loading /opt/sds/spark/bin/stratio-deep-init.scala...
import com.stratio.deep.annotations.DeepEntity
import com.stratio.deep.annotations.DeepField
import com.stratio.deep.entity.IDeepType
import org.apache.cassandra.db.marshal.Int32Type
import org.apache.cassandra.db.marshal.LongType
import com.stratio.deep.config.{DeepJobConfigFactory=>Cfg, _}
import com.stratio.deep.entity._
import com.stratio.deep.context._
import com.stratio.deep.rdd._
import com.stratio.deep.rdd.mongodb._
import com.stratio.deep.testentity._

<console>:33: error: not found: value sc
       val deepContext = new DeepSparkContext(sc)
                                              ^
scala>
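
The failing line comes from stratio-deep-init.scala, which assumes the REPL has already created sc. Below is a minimal sketch of building the contexts by hand inside the shell, assuming a reachable standalone master; the spark:// URL is the one from the workaround further down, and the app name is illustrative:

import org.apache.spark.{SparkConf, SparkContext}
import com.stratio.deep.context.DeepSparkContext

// Build the SparkContext the REPL failed to create, pointing at the
// standalone master instead of the unreachable Mesos one.
val conf = new SparkConf()
  .setMaster("spark://sandbox.stratio.com:7077")
  .setAppName("stratio-deep-shell")
val sc = new SparkContext(conf)

// This is the line from stratio-deep-init.scala that failed with
// "not found: value sc".
val deepContext = new DeepSparkContext(sc)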
heskech commented 10 years ago

The problem seems to be in /etc/sds/spark/spark-defaults.conf. I commented out the spark.master line pointing to the Mesos URI and replaced it with the Spark standalone URI; see below. After making this change, I can start spark-shell without the error.

# spark.master mesos://zk://hostname:2181/mesos
spark.master spark://sandbox.stratio.com:7077
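
After restarting the shell with this change, a quick check from the REPL shows which master the context bound to. sc.master is stock Spark API, nothing Stratio-specific, and the expected value is just the URI set above:

sc.master   // expected: spark://sandbox.stratio.com:7077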

aperez-stratio commented 10 years ago

As you say, it is a Spark configuration problem. We are uploading a new sandbox image that fixes it. Thank you very much.

aperez-stratio commented 10 years ago

A new sandbox image is available.

heskech commented 10 years ago

The workaround is to comment out the spark.master line in spark-defaults.conf and then restart. The replacement I suggested above (pointing spark.master at the standalone URI) has a side effect: it causes the issue in #11.
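
For reference, a sketch of the relevant lines in /etc/sds/spark/spark-defaults.conf under this final workaround, with both master entries disabled so the shell falls back to Spark's built-in default:

# spark.master mesos://zk://hostname:2181/mesos
# spark.master spark://sandbox.stratio.com:7077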

aperez-stratio commented 9 years ago

Already fixed