elbamos / Zeppelin-With-R

Mirror of Apache Zeppelin (Incubating)
Apache License 2.0

Question, cannot see R interpreter #14

Closed tomer-ben-david closed 8 years ago

tomer-ben-david commented 8 years ago

I'm using https://github.com/apache/incubator-zeppelin which should contain this pull request. I get:

%spark.r 2 + 2
spark.r interpreter not found

My logs:

tomerb-mac:incubator-zeppelin tomerb$ bin/zeppelin.sh 
Pid dir doesn't exist, create 
mkdir: : No such file or directory
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=512m; support was removed in 8.0
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/Users/tomerb/tmp/incubator-zeppelin/zeppelin-server/target/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/Users/tomerb/tmp/incubator-zeppelin/zeppelin-server/target/lib/zeppelin-interpreter-0.6.0-incubating-SNAPSHOT.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/Users/tomerb/tmp/incubator-zeppelin/zeppelin-zengine/target/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/Users/tomerb/tmp/incubator-zeppelin/zeppelin-zengine/target/lib/zeppelin-interpreter-0.6.0-incubating-SNAPSHOT.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/Users/tomerb/tmp/incubator-zeppelin/zeppelin-interpreter/target/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
Apr 20, 2016 6:09:31 PM com.sun.jersey.api.core.PackagesResourceConfig init
INFO: Scanning for root resource and provider classes in the packages:
  org.apache.zeppelin.rest
Apr 20, 2016 6:09:31 PM com.sun.jersey.api.core.ScanningResourceConfig logClasses
INFO: Root resource classes found:
  class org.apache.zeppelin.rest.LoginRestApi
  class org.apache.zeppelin.rest.SecurityRestApi
  class org.apache.zeppelin.rest.NotebookRestApi
  class org.apache.zeppelin.rest.ZeppelinRestApi
  class org.apache.zeppelin.rest.ConfigurationsRestApi
  class org.apache.zeppelin.rest.InterpreterRestApi
Apr 20, 2016 6:09:31 PM com.sun.jersey.api.core.ScanningResourceConfig init
INFO: No provider classes found.
Apr 20, 2016 6:09:31 PM com.sun.jersey.server.impl.application.WebApplicationImpl _initiate
INFO: Initiating Jersey application, version 'Jersey: 1.13 06/29/2012 05:14 PM'
Apr 20, 2016 6:09:32 PM com.sun.jersey.spi.inject.Errors processErrorMessages
WARNING: The following warnings have been detected with resource and/or provider classes:
  WARNING: A HTTP GET method, public javax.ws.rs.core.Response org.apache.zeppelin.rest.InterpreterRestApi.listInterpreter(java.lang.String), should not consume any entity.
  WARNING: A sub-resource method, public javax.ws.rs.core.Response org.apache.zeppelin.rest.NotebookRestApi.getNotebookList() throws java.io.IOException, with URI template, "/", is treated as a resource method
  WARNING: A sub-resource method, public javax.ws.rs.core.Response org.apache.zeppelin.rest.NotebookRestApi.createNote(java.lang.String) throws java.io.IOException, with URI template, "/", is treated as a resource method
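(Side note: the `mkdir: : No such file or directory` line at the top of the log looks like the pid-dir variable expanded to an empty string. If so, a default like the following would sidestep it; the variable name `ZEPPELIN_PID_DIR` is my assumption, not taken from the script:)

```shell
# Sketch, assuming the variable is called ZEPPELIN_PID_DIR: fall back to a
# writable default when it is unset or empty, so mkdir never sees "".
PID_DIR="${ZEPPELIN_PID_DIR:-/tmp/zeppelin-pid}"
mkdir -p "$PID_DIR"
echo "$PID_DIR"
```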

My environment:

$ echo $SPARK_HOME
/Users/tomerb/dev/spark
$ r --version
R version 3.2.4 (2016-03-10) -- "Very Secure Dishes"
Copyright (C) 2016 The R Foundation for Statistical Computing
Platform: x86_64-apple-darwin13.4.0 (64-bit)

I installed the R plugins to the best of my knowledge.

Am I missing something? Thanks.

elbamos commented 8 years ago

Did you pass -Pr when you compiled Zeppelin?
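If not, a rebuild along these lines should pick the R interpreter up. This is a sketch (run it from the incubator-zeppelin checkout; `-DskipTests` just speeds the build up):

```shell
# Sketch: the rebuild command with the R interpreter profile enabled.
# Echoed here rather than executed; run it yourself from the checkout root.
BUILD_CMD="mvn clean package -Pr -DskipTests"
echo "$BUILD_CMD"
```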

tomer-ben-david commented 8 years ago

I didn't, so I reran the build with -Pr and it looks better (I no longer get "spark.r interpreter not found"). However, when I run %spark.r 2 + 2 I get:

org.apache.thrift.TApplicationException: Internal error processing createInterpreter
    at org.apache.thrift.TApplicationException.read(TApplicationException.java:111)
    at org.apache.thrift.TServiceClient.receiveBase(TServiceClient.java:71)
    at org.apache.zeppelin.interpreter.thrift.RemoteInterpreterService$Client.recv_createInterpreter(RemoteInterpreterService.java:167)
    at org.apache.zeppelin.interpreter.thrift.RemoteInterpreterService$Client.createInterpreter(RemoteInterpreterService.java:151)
    at org.apache.zeppelin.interpreter.remote.RemoteInterpreter.init(RemoteInterpreter.java:170)
    at org.apache.zeppelin.interpreter.remote.RemoteInterpreter.getFormType(RemoteInterpreter.java:309)
    at org.apache.zeppelin.interpreter.LazyOpenInterpreter.getFormType(LazyOpenInterpreter.java:104)
    at org.apache.zeppelin.notebook.Paragraph.jobRun(Paragraph.java:243)
    at org.apache.zeppelin.scheduler.Job.run(Job.java:176)
    at org.apache.zeppelin.scheduler.RemoteScheduler$JobRunner.run(RemoteScheduler.java:328)
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$201(ScheduledThreadPoolExecutor.java:180)
    at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:293)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at java.lang.Thread.run(Thread.java:745)

I have Spark 1.6.1, so I tried to recompile with:

mvn clean package -Pr,spark-1.6.1 -DskipTests

but I get:

[WARNING] The requested profile "spark-1.6.1" could not be activated because it does not exist.

I see that internally it depends on <spark.version>1.4.1</spark.version>. I couldn't find which other Spark profile I can set. Is the problem the Spark version I have? Which Spark version is recommended, and should I then pass -Pspark-1.x.x in another mvn clean package build?

thanks

tomer-ben-david commented 8 years ago

I just saw there is a spark-1.6 profile; I will try it: Profile Id: spark-1.6 (Active: false, Source: pom)
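(For anyone else hunting for profile names: the maven-help-plugin can list every profile the pom defines, in exactly the "Profile Id: ..." format quoted above. Something like this, filtered for Spark:)

```shell
# Sketch: list the Spark profiles the pom actually defines.
# help:all-profiles is a standard maven-help-plugin goal; grep narrows
# its output to the Spark profile ids. Echoed here rather than executed.
LIST_CMD="mvn help:all-profiles | grep 'Profile Id: spark'"
echo "$LIST_CMD"
```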

tomer-ben-david commented 8 years ago

Looks better after using -Pspark-1.6. Now I get:

org.apache.zeppelin.rinterpreter.rscala.RException
    at org.apache.zeppelin.rinterpreter.RContext.testRPackage(RContext.scala:242)
    at org.apache.zeppelin.rinterpreter.RContext.sparkStartup(RContext.scala:107)
    at org.apache.zeppelin.rinterpreter.RContext.open(RContext.scala:93)
    at org.apache.zeppelin.rinterpreter.RInterpreter.open(RInterpreter.scala:51)
    at org.apache.zeppelin.rinterpreter.RRepl.open(RRepl.java:56)
    at org.apache.zeppelin.interpreter.ClassloaderInterpreter.open(ClassloaderInterpreter.java:74)
    at org.apache.zeppelin.interpreter.LazyOpenInterpreter.open(LazyOpenInterpreter.java:68)
    at org.apache.zeppelin.interpreter.LazyOpenInterpreter.interpret(LazyOpenInterpreter.java:92)
    at org.apache.zeppelin.interpreter.remote.RemoteInterpreterServer$InterpretJob.jobRun(RemoteInterpreterServer.java:345)
    at org.apache.zeppelin.scheduler.Job.run(Job.java:176)
    at org.apache.zeppelin.scheduler.FIFOScheduler$1.run(FIFOScheduler.java:139)
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$201(ScheduledThreadPoolExecutor.java:180)
    at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:293)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at java.lang.Thread.run(Thread.java:745)

I will check if there is a profile for rscala maybe... there is no such profile, so I'm not sure what to do.

elbamos commented 8 years ago

That exception means that R is returning an error when the interpreter tries to execute an R function. Can you send a complete log?
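Since the top frame is RContext.testRPackage, the failing R call is probably a package check. You can reproduce the equivalent from a terminal in the same environment Zeppelin starts from; the package name below is a guess on my part, not taken from the code, so substitute whatever your full log names:

```shell
# Hypothetical spot-check: ask R to load a package the interpreter may need.
# "evaluate" is an assumed package name; replace it as appropriate.
# Echoed here rather than executed; run the printed command yourself.
PKG="evaluate"
echo "Rscript -e 'library($PKG)'"
```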

tomer-ben-david commented 8 years ago

Yes, below are the logs. Since it was suggested to check SPARK_HOME, I verified it like this:

tomerb-mac:logs tomerb$ $SPARK_HOME/bin/sparkR

R version 3.2.4 (2016-03-10) -- "Very Secure Dishes"
Copyright (C) 2016 The R Foundation for Statistical Computing
Platform: x86_64-apple-darwin13.4.0 (64-bit)

R is free software and comes with ABSOLUTELY NO WARRANTY.
You are welcome to redistribute it under certain conditions.
Type 'license()' or 'licence()' for distribution details.

  Natural language support but running in an English locale

R is a collaborative project with many contributors.
Type 'contributors()' for more information and
'citation()' on how to cite R or R packages in publications.

Type 'demo()' for some demos, 'help()' for on-line help, or
'help.start()' for an HTML browser interface to help.
Type 'q()' to quit R.

And the logs:

tomerb-mac:logs tomerb$ ls
zeppelin-interpreter-spark-tomerb-tomerb-mac.local.log  zeppelin-tomerb-tomerb-mac.local.log
tomerb-mac:logs tomerb$ cat zeppelin-tomerb-tomerb-mac.local.log
 INFO [2016-04-21 10:14:39,452] ({main} ZeppelinConfiguration.java[create]:98) - Load configuration from file:/Users/tomerb/tmp/incubator-zeppelin/conf/zeppelin-site.xml
 INFO [2016-04-21 10:14:39,724] ({main} ZeppelinServer.java[main]:113) - Starting zeppelin server
 INFO [2016-04-21 10:14:39,727] ({main} Server.java[doStart]:272) - jetty-8.1.14.v20131031
 INFO [2016-04-21 10:14:39,751] ({main} ContextHandler.java[log]:2040) - Initializing Shiro environment
 INFO [2016-04-21 10:14:39,753] ({main} EnvironmentLoader.java[initEnvironment]:128) - Starting Shiro environment initialization.
 INFO [2016-04-21 10:14:39,927] ({main} EnvironmentLoader.java[initEnvironment]:141) - Shiro environment initialized in 174 ms.
 INFO [2016-04-21 10:14:40,108] ({main} InterpreterFactory.java[init]:119) - Reading /Users/tomerb/tmp/incubator-zeppelin/interpreter/alluxio
 INFO [2016-04-21 10:14:40,149] ({main} InterpreterFactory.java[init]:136) - Interpreter alluxio.alluxio found. class=org.apache.zeppelin.alluxio.AlluxioInterpreter
 INFO [2016-04-21 10:14:40,149] ({main} InterpreterFactory.java[init]:119) - Reading /Users/tomerb/tmp/incubator-zeppelin/interpreter/angular
 INFO [2016-04-21 10:14:40,154] ({main} InterpreterFactory.java[init]:136) - Interpreter angular.angular found. class=org.apache.zeppelin.angular.AngularInterpreter
 INFO [2016-04-21 10:14:40,159] ({main} InterpreterFactory.java[init]:119) - Reading /Users/tomerb/tmp/incubator-zeppelin/interpreter/cassandra
 INFO [2016-04-21 10:14:40,174] ({main} CassandraInterpreter.java[<clinit>]:154) - Bootstrapping Cassandra Interpreter
 INFO [2016-04-21 10:14:40,175] ({main} InterpreterFactory.java[init]:136) - Interpreter cassandra.cassandra found. class=org.apache.zeppelin.cassandra.CassandraInterpreter
 INFO [2016-04-21 10:14:40,178] ({main} InterpreterFactory.java[init]:119) - Reading /Users/tomerb/tmp/incubator-zeppelin/interpreter/elasticsearch
 INFO [2016-04-21 10:14:40,203] ({main} InterpreterFactory.java[init]:136) - Interpreter elasticsearch.elasticsearch found. class=org.apache.zeppelin.elasticsearch.ElasticsearchInterpreter
 INFO [2016-04-21 10:14:40,204] ({main} InterpreterFactory.java[init]:119) - Reading /Users/tomerb/tmp/incubator-zeppelin/interpreter/file
 INFO [2016-04-21 10:14:40,213] ({main} InterpreterFactory.java[init]:136) - Interpreter file.hdfs found. class=org.apache.zeppelin.file.HDFSFileInterpreter
 INFO [2016-04-21 10:14:40,217] ({main} InterpreterFactory.java[init]:119) - Reading /Users/tomerb/tmp/incubator-zeppelin/interpreter/flink
 INFO [2016-04-21 10:14:40,252] ({main} InterpreterFactory.java[init]:136) - Interpreter flink.flink found. class=org.apache.zeppelin.flink.FlinkInterpreter
 INFO [2016-04-21 10:14:40,256] ({main} InterpreterFactory.java[init]:119) - Reading /Users/tomerb/tmp/incubator-zeppelin/interpreter/hbase
 INFO [2016-04-21 10:14:40,292] ({main} InterpreterFactory.java[init]:136) - Interpreter hbase.hbase found. class=org.apache.zeppelin.hbase.HbaseInterpreter
 INFO [2016-04-21 10:14:40,292] ({main} InterpreterFactory.java[init]:119) - Reading /Users/tomerb/tmp/incubator-zeppelin/interpreter/hive
 INFO [2016-04-21 10:14:40,318] ({main} InterpreterFactory.java[init]:136) - Interpreter hive.hql found. class=org.apache.zeppelin.hive.HiveInterpreter
 INFO [2016-04-21 10:14:40,324] ({main} InterpreterFactory.java[init]:119) - Reading /Users/tomerb/tmp/incubator-zeppelin/interpreter/ignite
 INFO [2016-04-21 10:14:40,365] ({main} InterpreterFactory.java[init]:136) - Interpreter ignite.ignite found. class=org.apache.zeppelin.ignite.IgniteInterpreter
 INFO [2016-04-21 10:14:40,366] ({main} InterpreterFactory.java[init]:136) - Interpreter ignite.ignitesql found. class=org.apache.zeppelin.ignite.IgniteSqlInterpreter
 INFO [2016-04-21 10:14:40,370] ({main} InterpreterFactory.java[init]:119) - Reading /Users/tomerb/tmp/incubator-zeppelin/interpreter/jdbc
 INFO [2016-04-21 10:14:40,380] ({main} InterpreterFactory.java[init]:136) - Interpreter jdbc.sql found. class=org.apache.zeppelin.jdbc.JDBCInterpreter
 INFO [2016-04-21 10:14:40,383] ({main} InterpreterFactory.java[init]:119) - Reading /Users/tomerb/tmp/incubator-zeppelin/interpreter/kylin
 INFO [2016-04-21 10:14:40,394] ({main} InterpreterFactory.java[init]:136) - Interpreter kylin.kylin found. class=org.apache.zeppelin.kylin.KylinInterpreter
 INFO [2016-04-21 10:14:40,395] ({main} InterpreterFactory.java[init]:119) - Reading /Users/tomerb/tmp/incubator-zeppelin/interpreter/lens
 INFO [2016-04-21 10:14:40,436] ({main} InterpreterFactory.java[init]:136) - Interpreter lens.lens found. class=org.apache.zeppelin.lens.LensInterpreter
 INFO [2016-04-21 10:14:40,441] ({main} InterpreterFactory.java[init]:119) - Reading /Users/tomerb/tmp/incubator-zeppelin/interpreter/md
 INFO [2016-04-21 10:14:40,445] ({main} InterpreterFactory.java[init]:136) - Interpreter md.md found. class=org.apache.zeppelin.markdown.Markdown
 INFO [2016-04-21 10:14:40,450] ({main} InterpreterFactory.java[init]:119) - Reading /Users/tomerb/tmp/incubator-zeppelin/interpreter/phoenix
 INFO [2016-04-21 10:14:40,487] ({main} InterpreterFactory.java[init]:136) - Interpreter phoenix.sql found. class=org.apache.zeppelin.phoenix.PhoenixInterpreter
 INFO [2016-04-21 10:14:40,490] ({main} InterpreterFactory.java[init]:119) - Reading /Users/tomerb/tmp/incubator-zeppelin/interpreter/psql
 INFO [2016-04-21 10:14:40,499] ({main} InterpreterFactory.java[init]:136) - Interpreter psql.sql found. class=org.apache.zeppelin.postgresql.PostgreSqlInterpreter
 INFO [2016-04-21 10:14:40,501] ({main} InterpreterFactory.java[init]:119) - Reading /Users/tomerb/tmp/incubator-zeppelin/interpreter/sh
 INFO [2016-04-21 10:14:40,506] ({main} InterpreterFactory.java[init]:136) - Interpreter sh.sh found. class=org.apache.zeppelin.shell.ShellInterpreter
 INFO [2016-04-21 10:14:40,510] ({main} InterpreterFactory.java[init]:119) - Reading /Users/tomerb/tmp/incubator-zeppelin/interpreter/spark
 INFO [2016-04-21 10:14:40,556] ({main} InterpreterFactory.java[init]:136) - Interpreter spark.spark found. class=org.apache.zeppelin.spark.SparkInterpreter
 INFO [2016-04-21 10:14:40,558] ({main} InterpreterFactory.java[init]:136) - Interpreter spark.pyspark found. class=org.apache.zeppelin.spark.PySparkInterpreter
 INFO [2016-04-21 10:14:40,564] ({main} InterpreterFactory.java[init]:136) - Interpreter spark.r found. class=org.apache.zeppelin.rinterpreter.RRepl
 INFO [2016-04-21 10:14:40,566] ({main} InterpreterFactory.java[init]:136) - Interpreter spark.knitr found. class=org.apache.zeppelin.rinterpreter.KnitR
 INFO [2016-04-21 10:14:40,567] ({main} InterpreterFactory.java[init]:136) - Interpreter spark.sql found. class=org.apache.zeppelin.spark.SparkSqlInterpreter
 INFO [2016-04-21 10:14:40,568] ({main} InterpreterFactory.java[init]:136) - Interpreter spark.dep found. class=org.apache.zeppelin.spark.DepInterpreter
 INFO [2016-04-21 10:14:40,573] ({main} InterpreterFactory.java[init]:119) - Reading /Users/tomerb/tmp/incubator-zeppelin/interpreter/tajo
 INFO [2016-04-21 10:14:40,583] ({main} InterpreterFactory.java[init]:136) - Interpreter tajo.tql found. class=org.apache.zeppelin.tajo.TajoInterpreter
 INFO [2016-04-21 10:14:40,613] ({main} InterpreterFactory.java[init]:198) - Interpreter setting group flink : id=2BJ1589WS, name=flink
 INFO [2016-04-21 10:14:40,613] ({main} InterpreterFactory.java[init]:198) - Interpreter setting group jdbc : id=2BJJUNKE8, name=jdbc
 INFO [2016-04-21 10:14:40,613] ({main} InterpreterFactory.java[init]:198) - Interpreter setting group phoenix : id=2BJG7Y85B, name=phoenix
 INFO [2016-04-21 10:14:40,614] ({main} InterpreterFactory.java[init]:198) - Interpreter setting group spark : id=2BH9NN3KQ, name=spark
 INFO [2016-04-21 10:14:40,614] ({main} InterpreterFactory.java[init]:198) - Interpreter setting group sh : id=2BFUZ3843, name=sh
 INFO [2016-04-21 10:14:40,614] ({main} InterpreterFactory.java[init]:198) - Interpreter setting group alluxio : id=2BH2F4RQD, name=alluxio
 INFO [2016-04-21 10:14:40,614] ({main} InterpreterFactory.java[init]:198) - Interpreter setting group tajo : id=2BGPPTTY7, name=tajo
 INFO [2016-04-21 10:14:40,614] ({main} InterpreterFactory.java[init]:198) - Interpreter setting group ignite : id=2BJECQYWT, name=ignite
 INFO [2016-04-21 10:14:40,614] ({main} InterpreterFactory.java[init]:198) - Interpreter setting group kylin : id=2BJ24D5J4, name=kylin
 INFO [2016-04-21 10:14:40,614] ({main} InterpreterFactory.java[init]:198) - Interpreter setting group hbase : id=2BH25HAMT, name=hbase
 INFO [2016-04-21 10:14:40,615] ({main} InterpreterFactory.java[init]:198) - Interpreter setting group hive : id=2BJHS8Z8G, name=hive
 INFO [2016-04-21 10:14:40,615] ({main} InterpreterFactory.java[init]:198) - Interpreter setting group file : id=2BGHJ32CE, name=file
 INFO [2016-04-21 10:14:40,615] ({main} InterpreterFactory.java[init]:198) - Interpreter setting group psql : id=2BK1TXWRG, name=psql
 INFO [2016-04-21 10:14:40,615] ({main} InterpreterFactory.java[init]:198) - Interpreter setting group cassandra : id=2BJ6SY5FS, name=cassandra
 INFO [2016-04-21 10:14:40,615] ({main} InterpreterFactory.java[init]:198) - Interpreter setting group md : id=2BKE3YZ6X, name=md
 INFO [2016-04-21 10:14:40,615] ({main} InterpreterFactory.java[init]:198) - Interpreter setting group lens : id=2BKDXWH5G, name=lens
 INFO [2016-04-21 10:14:40,615] ({main} InterpreterFactory.java[init]:198) - Interpreter setting group angular : id=2BJ1TJEG3, name=angular
 INFO [2016-04-21 10:14:40,615] ({main} InterpreterFactory.java[init]:198) - Interpreter setting group elasticsearch : id=2BGC18KAA, name=elasticsearch
 INFO [2016-04-21 10:14:40,632] ({main} VfsLog.java[info]:138) - Using "/var/folders/cj/9m49dzl57nv1wtnt01nzt291k56hj5/T/vfs_cache" as temporary files store.
 INFO [2016-04-21 10:14:40,777] ({main} NotebookAuthorization.java[loadFromFile]:59) - /Users/tomerb/tmp/incubator-zeppelin/conf/notebook-authorization.json
 INFO [2016-04-21 10:14:40,820] ({main} StdSchedulerFactory.java[instantiate]:1184) - Using default implementation for ThreadExecutor
 INFO [2016-04-21 10:14:40,823] ({main} SimpleThreadPool.java[initialize]:268) - Job execution threads will use class loader of thread: main
 INFO [2016-04-21 10:14:40,836] ({main} SchedulerSignalerImpl.java[<init>]:61) - Initialized Scheduler Signaller of type: class org.quartz.core.SchedulerSignalerImpl
 INFO [2016-04-21 10:14:40,836] ({main} QuartzScheduler.java[<init>]:240) - Quartz Scheduler v.2.2.1 created.
 INFO [2016-04-21 10:14:40,837] ({main} RAMJobStore.java[initialize]:155) - RAMJobStore initialized.
 INFO [2016-04-21 10:14:40,838] ({main} QuartzScheduler.java[initialize]:305) - Scheduler meta-data: Quartz Scheduler (v2.2.1) 'DefaultQuartzScheduler' with instanceId 'NON_CLUSTERED'
  Scheduler class: 'org.quartz.core.QuartzScheduler' - running locally.
  NOT STARTED.
  Currently in standby mode.
  Number of jobs executed: 0
  Using thread pool 'org.quartz.simpl.SimpleThreadPool' - with 10 threads.
  Using job-store 'org.quartz.simpl.RAMJobStore' - which does not support persistence. and is not clustered.

 INFO [2016-04-21 10:14:40,838] ({main} StdSchedulerFactory.java[instantiate]:1339) - Quartz scheduler 'DefaultQuartzScheduler' initialized from default resource file in Quartz package: 'quartz.properties'
 INFO [2016-04-21 10:14:40,838] ({main} StdSchedulerFactory.java[instantiate]:1343) - Quartz scheduler version: 2.2.1
 INFO [2016-04-21 10:14:40,838] ({main} QuartzScheduler.java[start]:575) - Scheduler DefaultQuartzScheduler_$_NON_CLUSTERED started.
 INFO [2016-04-21 10:14:40,958] ({main} Notebook.java[<init>]:116) - Notebook indexing started...
 INFO [2016-04-21 10:14:41,176] ({main} LuceneSearch.java[addIndexDocs]:285) - Indexing 3 notebooks took 217ms
 INFO [2016-04-21 10:14:41,177] ({main} Notebook.java[<init>]:118) - Notebook indexing finished: 3 indexed in 0s
 INFO [2016-04-21 10:14:41,293] ({main} ServerImpl.java[initDestination]:94) - Setting the server's publish address to be /
 INFO [2016-04-21 10:14:41,389] ({main} StandardDescriptorProcessor.java[visitServlet]:284) - NO JSP Support for /, did not find org.apache.jasper.servlet.JspServlet
 INFO [2016-04-21 10:14:41,938] ({main} AbstractConnector.java[doStart]:338) - Started SelectChannelConnector@0.0.0.0:8080
 INFO [2016-04-21 10:14:41,939] ({main} ZeppelinServer.java[main]:120) - Done, zeppelin server started
 WARN [2016-04-21 10:14:42,858] ({qtp146305349-36} SecurityRestApi.java[ticket]:79) - {"status":"OK","message":"","body":{"principal":"anonymous","ticket":"anonymous","roles":"[]"}}
 INFO [2016-04-21 10:14:42,928] ({qtp146305349-29} NotebookServer.java[onOpen]:92) - New connection from 0:0:0:0:0:0:0:1 : 55660
 INFO [2016-04-21 10:14:43,039] ({qtp146305349-34} NotebookServer.java[sendNote]:402) - New operation from 0:0:0:0:0:0:0:1 : 55660 : anonymous : GET_NOTE : 2BJFNT4UG
 INFO [2016-04-21 10:14:45,549] ({qtp146305349-38} NotebookServer.java[onClose]:213) - Closed connection to 0:0:0:0:0:0:0:1 : 55660. (1001) null
 WARN [2016-04-21 10:14:46,079] ({qtp146305349-34} SecurityRestApi.java[ticket]:79) - {"status":"OK","message":"","body":{"principal":"anonymous","ticket":"anonymous","roles":"[]"}}
 INFO [2016-04-21 10:14:46,135] ({qtp146305349-38} NotebookServer.java[onOpen]:92) - New connection from 0:0:0:0:0:0:0:1 : 55661
 INFO [2016-04-21 10:14:48,439] ({qtp146305349-34} NotebookServer.java[sendNote]:402) - New operation from 0:0:0:0:0:0:0:1 : 55661 : anonymous : GET_NOTE : 2BJFNT4UG
 INFO [2016-04-21 10:14:51,981] ({qtp146305349-36} InterpreterFactory.java[createInterpretersForNote]:491) - Create interpreter instance spark for note shared_session
 INFO [2016-04-21 10:14:51,982] ({qtp146305349-36} InterpreterFactory.java[createInterpretersForNote]:521) - Interpreter org.apache.zeppelin.spark.SparkInterpreter 266377599 created
 INFO [2016-04-21 10:14:51,983] ({qtp146305349-36} InterpreterFactory.java[createInterpretersForNote]:521) - Interpreter org.apache.zeppelin.spark.PySparkInterpreter 661182170 created
 INFO [2016-04-21 10:14:51,983] ({qtp146305349-36} InterpreterFactory.java[createInterpretersForNote]:521) - Interpreter org.apache.zeppelin.rinterpreter.RRepl 1855160843 created
 INFO [2016-04-21 10:14:51,983] ({qtp146305349-36} InterpreterFactory.java[createInterpretersForNote]:521) - Interpreter org.apache.zeppelin.rinterpreter.KnitR 120700316 created
 INFO [2016-04-21 10:14:51,984] ({qtp146305349-36} InterpreterFactory.java[createInterpretersForNote]:521) - Interpreter org.apache.zeppelin.spark.SparkSqlInterpreter 202504036 created
 INFO [2016-04-21 10:14:51,984] ({qtp146305349-36} InterpreterFactory.java[createInterpretersForNote]:521) - Interpreter org.apache.zeppelin.spark.DepInterpreter 1831743034 created
 INFO [2016-04-21 10:14:51,991] ({pool-1-thread-2} SchedulerFactory.java[jobStarted]:131) - Job paragraph_1461084685142_-1352150657 started by scheduler org.apache.zeppelin.interpreter.remote.RemoteInterpretershared_session527675465
 INFO [2016-04-21 10:14:51,992] ({pool-1-thread-2} Paragraph.java[jobRun]:235) - run paragraph 20160419-195125_1406675577 using spark.r org.apache.zeppelin.interpreter.LazyOpenInterpreter@6e93820b
 INFO [2016-04-21 10:14:52,005] ({pool-1-thread-2} RemoteInterpreterProcess.java[reference]:119) - Run interpreter process [/Users/tomerb/tmp/incubator-zeppelin/bin/interpreter.sh, -d, /Users/tomerb/tmp/incubator-zeppelin/interpreter/spark, -p, 55665, -l, /Users/tomerb/tmp/incubator-zeppelin/local-repo/2BH9NN3KQ]
 INFO [2016-04-21 10:14:53,062] ({pool-1-thread-2} RemoteInterpreter.java[init]:168) - Create remote interpreter org.apache.zeppelin.rinterpreter.RRepl
 INFO [2016-04-21 10:14:53,090] ({pool-1-thread-2} RemoteInterpreter.java[pushAngularObjectRegistryToRemote]:435) - Push local angular object registry from ZeppelinServer to remote interpreter group 2BH9NN3KQ
 INFO [2016-04-21 10:14:53,114] ({pool-1-thread-2} RemoteInterpreter.java[init]:168) - Create remote interpreter org.apache.zeppelin.spark.SparkInterpreter
 INFO [2016-04-21 10:14:53,120] ({pool-1-thread-2} RemoteInterpreter.java[init]:168) - Create remote interpreter org.apache.zeppelin.spark.PySparkInterpreter
 INFO [2016-04-21 10:14:53,132] ({pool-1-thread-2} RemoteInterpreter.java[init]:168) - Create remote interpreter org.apache.zeppelin.rinterpreter.KnitR
 INFO [2016-04-21 10:14:53,134] ({pool-1-thread-2} RemoteInterpreter.java[init]:168) - Create remote interpreter org.apache.zeppelin.spark.SparkSqlInterpreter
 INFO [2016-04-21 10:14:53,135] ({pool-1-thread-2} RemoteInterpreter.java[init]:168) - Create remote interpreter org.apache.zeppelin.spark.DepInterpreter
 INFO [2016-04-21 10:14:54,523] ({pool-1-thread-3} SchedulerFactory.java[jobStarted]:131) - Job paragraph_1461084700250_1660833304 started by scheduler org.apache.zeppelin.interpreter.remote.RemoteInterpretershared_session527675465
 INFO [2016-04-21 10:14:54,524] ({pool-1-thread-3} Paragraph.java[jobRun]:235) - run paragraph 20160419-195140_108162990 using null org.apache.zeppelin.interpreter.LazyOpenInterpreter@fe0997f
 INFO [2016-04-21 10:15:00,962] ({pool-1-thread-3} NotebookServer.java[afterStatusChange]:1092) - Job 20160419-195140_108162990 is finished
ERROR [2016-04-21 10:15:00,966] ({Thread-19} JobProgressPoller.java[run]:54) - Can not get or update progress
org.apache.zeppelin.interpreter.InterpreterException: org.apache.thrift.transport.TTransportException
    at org.apache.zeppelin.interpreter.remote.RemoteInterpreter.getProgress(RemoteInterpreter.java:354)
    at org.apache.zeppelin.interpreter.LazyOpenInterpreter.getProgress(LazyOpenInterpreter.java:110)
    at org.apache.zeppelin.notebook.Paragraph.progress(Paragraph.java:220)
    at org.apache.zeppelin.scheduler.JobProgressPoller.run(JobProgressPoller.java:51)
Caused by: org.apache.thrift.transport.TTransportException
    at org.apache.thrift.transport.TIOStreamTransport.read(TIOStreamTransport.java:132)
    at org.apache.thrift.transport.TTransport.readAll(TTransport.java:86)
    at org.apache.thrift.protocol.TBinaryProtocol.readAll(TBinaryProtocol.java:429)
    at org.apache.thrift.protocol.TBinaryProtocol.readI32(TBinaryProtocol.java:318)
    at org.apache.thrift.protocol.TBinaryProtocol.readMessageBegin(TBinaryProtocol.java:219)
    at org.apache.thrift.TServiceClient.receiveBase(TServiceClient.java:69)
    at org.apache.zeppelin.interpreter.thrift.RemoteInterpreterService$Client.recv_getProgress(RemoteInterpreterService.java:279)
    at org.apache.zeppelin.interpreter.thrift.RemoteInterpreterService$Client.getProgress(RemoteInterpreterService.java:264)
    at org.apache.zeppelin.interpreter.remote.RemoteInterpreter.getProgress(RemoteInterpreter.java:351)
    ... 3 more
 INFO [2016-04-21 10:15:00,977] ({pool-1-thread-2} NotebookServer.java[afterStatusChange]:1092) - Job 20160419-195125_1406675577 is finished
 INFO [2016-04-21 10:15:00,987] ({pool-1-thread-3} SchedulerFactory.java[jobFinished]:137) - Job paragraph_1461084700250_1660833304 finished by scheduler org.apache.zeppelin.interpreter.remote.RemoteInterpretershared_session527675465
 INFO [2016-04-21 10:15:01,003] ({pool-1-thread-2} SchedulerFactory.java[jobFinished]:137) - Job paragraph_1461084685142_-1352150657 finished by scheduler org.apache.zeppelin.interpreter.remote.RemoteInterpretershared_session527675465
 INFO [2016-04-21 10:15:07,280] ({qtp146305349-36} NotebookServer.java[onOpen]:92) - New connection from 0:0:0:0:0:0:0:1 : 55689
tomerb-mac:logs tomerb$ cat zeppelin-interpreter-spark-tomerb-tomerb-mac.local.log
 INFO [2016-04-21 10:14:52,711] ({Thread-0} RemoteInterpreterServer.java[run]:84) - Starting remote interpreter server on port 55665
 INFO [2016-04-21 10:14:53,089] ({pool-1-thread-3} RemoteInterpreterServer.java[createInterpreter]:173) - Instantiate interpreter org.apache.zeppelin.rinterpreter.RRepl
 INFO [2016-04-21 10:14:53,118] ({pool-1-thread-3} RemoteInterpreterServer.java[createInterpreter]:173) - Instantiate interpreter org.apache.zeppelin.spark.SparkInterpreter
 INFO [2016-04-21 10:14:53,131] ({pool-1-thread-3} RemoteInterpreterServer.java[createInterpreter]:173) - Instantiate interpreter org.apache.zeppelin.spark.PySparkInterpreter
 INFO [2016-04-21 10:14:53,133] ({pool-1-thread-3} RemoteInterpreterServer.java[createInterpreter]:173) - Instantiate interpreter org.apache.zeppelin.rinterpreter.KnitR
 INFO [2016-04-21 10:14:53,135] ({pool-1-thread-3} RemoteInterpreterServer.java[createInterpreter]:173) - Instantiate interpreter org.apache.zeppelin.spark.SparkSqlInterpreter
 INFO [2016-04-21 10:14:53,137] ({pool-1-thread-3} RemoteInterpreterServer.java[createInterpreter]:173) - Instantiate interpreter org.apache.zeppelin.spark.DepInterpreter
 INFO [2016-04-21 10:14:53,222] ({pool-1-thread-3} RClient.scala[<init>]:469) - Trying to open ports filename: /var/folders/cj/9m49dzl57nv1wtnt01nzt291k56hj5/T/rscala-6490233470814172101
 INFO [2016-04-21 10:14:53,223] ({pool-1-thread-3} RClient.scala[<init>]:474) - Servers are running on port 55673 55674
 INFO [2016-04-21 10:14:53,513] ({pool-1-thread-3} RClient.scala[<init>]:478) - serverinaccept done
 INFO [2016-04-21 10:14:53,514] ({pool-1-thread-3} RClient.scala[<init>]:480) - in has been created
 INFO [2016-04-21 10:14:53,514] ({pool-1-thread-3} RClient.scala[<init>]:482) - serverouacceptdone
 INFO [2016-04-21 10:14:53,514] ({pool-1-thread-3} RClient.scala[<init>]:484) - out is done
 INFO [2016-04-21 10:14:53,522] ({pool-1-thread-3} RContext.scala[<init>]:250) - RContext Finished Starting
 INFO [2016-04-21 10:14:53,526] ({pool-2-thread-4} SchedulerFactory.java[jobStarted]:131) - Job remoteInterpretJob_1461222893525 started by scheduler 1397194507
 WARN [2016-04-21 10:14:54,225] ({pool-2-thread-4} NativeCodeLoader.java[<clinit>]:62) - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
 INFO [2016-04-21 10:14:54,283] ({pool-2-thread-4} Logging.scala[logInfo]:58) - Changing view acls to: tomerb
 INFO [2016-04-21 10:14:54,284] ({pool-2-thread-4} Logging.scala[logInfo]:58) - Changing modify acls to: tomerb
 INFO [2016-04-21 10:14:54,285] ({pool-2-thread-4} Logging.scala[logInfo]:58) - SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(tomerb); users with modify permissions: Set(tomerb)
 INFO [2016-04-21 10:14:54,497] ({pool-2-thread-4} Logging.scala[logInfo]:58) - Starting HTTP Server
 INFO [2016-04-21 10:14:54,528] ({pool-2-thread-5} SchedulerFactory.java[jobStarted]:131) - Job remoteInterpretJob_1461222894527 started by scheduler org.apache.zeppelin.spark.SparkInterpreter673560091
 INFO [2016-04-21 10:14:54,542] ({pool-2-thread-4} Server.java[doStart]:272) - jetty-8.y.z-SNAPSHOT
 INFO [2016-04-21 10:14:54,557] ({pool-2-thread-4} AbstractConnector.java[doStart]:338) - Started SocketConnector@0.0.0.0:55680
 INFO [2016-04-21 10:14:54,558] ({pool-2-thread-4} Logging.scala[logInfo]:58) - Successfully started service 'HTTP class server' on port 55680.
 INFO [2016-04-21 10:14:56,431] ({pool-2-thread-4} SparkInterpreter.java[createSparkContext]:257) - ------ Create new SparkContext local[*] -------
 INFO [2016-04-21 10:14:56,454] ({pool-2-thread-4} Logging.scala[logInfo]:58) - Running Spark version 1.6.1
 INFO [2016-04-21 10:14:56,483] ({pool-2-thread-4} Logging.scala[logInfo]:58) - Changing view acls to: tomerb
 INFO [2016-04-21 10:14:56,484] ({pool-2-thread-4} Logging.scala[logInfo]:58) - Changing modify acls to: tomerb
 INFO [2016-04-21 10:14:56,484] ({pool-2-thread-4} Logging.scala[logInfo]:58) - SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(tomerb); users with modify permissions: Set(tomerb)
 INFO [2016-04-21 10:14:56,647] ({pool-2-thread-4} Logging.scala[logInfo]:58) - Successfully started service 'sparkDriver' on port 55682.
 INFO [2016-04-21 10:14:56,913] ({sparkDriverActorSystem-akka.actor.default-dispatcher-2} Slf4jLogger.scala[applyOrElse]:80) - Slf4jLogger started
 INFO [2016-04-21 10:14:56,941] ({sparkDriverActorSystem-akka.actor.default-dispatcher-2} Slf4jLogger.scala[apply$mcV$sp]:74) - Starting remoting
 INFO [2016-04-21 10:14:57,078] ({sparkDriverActorSystem-akka.actor.default-dispatcher-2} Slf4jLogger.scala[apply$mcV$sp]:74) - Remoting started; listening on addresses :[akka.tcp://sparkDriverActorSystem@192.168.14.191:55683]
 INFO [2016-04-21 10:14:57,084] ({pool-2-thread-4} Logging.scala[logInfo]:58) - Successfully started service 'sparkDriverActorSystem' on port 55683.
 INFO [2016-04-21 10:14:57,093] ({pool-2-thread-4} Logging.scala[logInfo]:58) - Registering MapOutputTracker
 INFO [2016-04-21 10:14:57,106] ({pool-2-thread-4} Logging.scala[logInfo]:58) - Registering BlockManagerMaster
 INFO [2016-04-21 10:14:57,117] ({pool-2-thread-4} Logging.scala[logInfo]:58) - Created local directory at /private/var/folders/cj/9m49dzl57nv1wtnt01nzt291k56hj5/T/blockmgr-5a59f5bb-fb67-404c-ad19-77cb56d616ff
 INFO [2016-04-21 10:14:57,121] ({pool-2-thread-4} Logging.scala[logInfo]:58) - MemoryStore started with capacity 511.1 MB
 INFO [2016-04-21 10:14:57,169] ({pool-2-thread-4} Logging.scala[logInfo]:58) - Registering OutputCommitCoordinator
 INFO [2016-04-21 10:14:57,257] ({pool-2-thread-4} Server.java[doStart]:272) - jetty-8.y.z-SNAPSHOT
 INFO [2016-04-21 10:14:57,267] ({pool-2-thread-4} AbstractConnector.java[doStart]:338) - Started SelectChannelConnector@0.0.0.0:4040
 INFO [2016-04-21 10:14:57,268] ({pool-2-thread-4} Logging.scala[logInfo]:58) - Successfully started service 'SparkUI' on port 4040.
 INFO [2016-04-21 10:14:57,271] ({pool-2-thread-4} Logging.scala[logInfo]:58) - Started SparkUI at http://192.168.14.191:4040
 INFO [2016-04-21 10:14:57,287] ({pool-2-thread-4} Logging.scala[logInfo]:58) - HTTP File server directory is /private/var/folders/cj/9m49dzl57nv1wtnt01nzt291k56hj5/T/spark-7f3bc378-8caf-4d02-a10d-2e3ccc96a0da/httpd-1771da48-99c0-407f-8d6f-97b2de34b53f
 INFO [2016-04-21 10:14:57,287] ({pool-2-thread-4} Logging.scala[logInfo]:58) - Starting HTTP Server
 INFO [2016-04-21 10:14:57,288] ({pool-2-thread-4} Server.java[doStart]:272) - jetty-8.y.z-SNAPSHOT
 INFO [2016-04-21 10:14:57,289] ({pool-2-thread-4} AbstractConnector.java[doStart]:338) - Started SocketConnector@0.0.0.0:55684
 INFO [2016-04-21 10:14:57,290] ({pool-2-thread-4} Logging.scala[logInfo]:58) - Successfully started service 'HTTP file server' on port 55684.
 INFO [2016-04-21 10:14:57,338] ({pool-2-thread-4} Logging.scala[logInfo]:58) - Added JAR file:/Users/tomerb/tmp/incubator-zeppelin/interpreter/spark/zeppelin-spark-0.6.0-incubating-SNAPSHOT.jar at http://192.168.14.191:55684/jars/zeppelin-spark-0.6.0-incubating-SNAPSHOT.jar with timestamp 1461222897338
 INFO [2016-04-21 10:14:57,362] ({pool-2-thread-4} Logging.scala[logInfo]:58) - Created default pool default, schedulingMode: FIFO, minShare: 0, weight: 1
 INFO [2016-04-21 10:14:57,382] ({pool-2-thread-4} Logging.scala[logInfo]:58) - Starting executor ID driver on host localhost
 INFO [2016-04-21 10:14:57,387] ({pool-2-thread-4} Logging.scala[logInfo]:58) - Using REPL class URI: http://192.168.14.191:55680
 INFO [2016-04-21 10:14:57,396] ({pool-2-thread-4} Logging.scala[logInfo]:58) - Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 55685.
 INFO [2016-04-21 10:14:57,397] ({pool-2-thread-4} Logging.scala[logInfo]:58) - Server created on 55685
 INFO [2016-04-21 10:14:57,398] ({pool-2-thread-4} Logging.scala[logInfo]:58) - Trying to register BlockManager
 INFO [2016-04-21 10:14:57,401] ({dispatcher-event-loop-2} Logging.scala[logInfo]:58) - Registering block manager localhost:55685 with 511.1 MB RAM, BlockManagerId(driver, localhost, 55685)
 INFO [2016-04-21 10:14:57,402] ({pool-2-thread-4} Logging.scala[logInfo]:58) - Registered BlockManager
 WARN [2016-04-21 10:14:57,540] ({pool-2-thread-4} SparkInterpreter.java[getSQLContext]:221) - Can't create HiveContext. Fallback to SQLContext
java.lang.ClassNotFoundException: org.apache.spark.sql.hive.HiveContext
    at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
    at org.apache.zeppelin.spark.SparkInterpreter.getSQLContext(SparkInterpreter.java:214)
    at org.apache.zeppelin.spark.SparkInterpreter.open(SparkInterpreter.java:536)
    at org.apache.zeppelin.interpreter.ClassloaderInterpreter.open(ClassloaderInterpreter.java:74)
    at org.apache.zeppelin.interpreter.LazyOpenInterpreter.open(LazyOpenInterpreter.java:68)
    at org.apache.zeppelin.rinterpreter.RInterpreter.getSparkInterpreter(RInterpreter.scala:76)
    at org.apache.zeppelin.rinterpreter.RInterpreter.getSparkInterpreter(RInterpreter.scala:70)
    at org.apache.zeppelin.rinterpreter.RInterpreter.open(RInterpreter.scala:50)
    at org.apache.zeppelin.rinterpreter.RRepl.open(RRepl.java:56)
    at org.apache.zeppelin.interpreter.ClassloaderInterpreter.open(ClassloaderInterpreter.java:74)
    at org.apache.zeppelin.interpreter.LazyOpenInterpreter.open(LazyOpenInterpreter.java:68)
    at org.apache.zeppelin.interpreter.LazyOpenInterpreter.interpret(LazyOpenInterpreter.java:92)
    at org.apache.zeppelin.interpreter.remote.RemoteInterpreterServer$InterpretJob.jobRun(RemoteInterpreterServer.java:345)
    at org.apache.zeppelin.scheduler.Job.run(Job.java:176)
    at org.apache.zeppelin.scheduler.FIFOScheduler$1.run(FIFOScheduler.java:139)
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$201(ScheduledThreadPoolExecutor.java:180)
    at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:293)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at java.lang.Thread.run(Thread.java:745)
 INFO [2016-04-21 10:15:00,950] ({pool-2-thread-5} SchedulerFactory.java[jobFinished]:137) - Job remoteInterpretJob_1461222894527 finished by scheduler org.apache.zeppelin.spark.SparkInterpreter673560091
ERROR [2016-04-21 10:15:00,953] ({pool-2-thread-4} RContext.scala[testRPackage]:241) - The SparkR package could not be loaded. 
ERROR [2016-04-21 10:15:00,955] ({pool-2-thread-4} Job.java[run]:189) - Job failed
org.apache.zeppelin.interpreter.InterpreterException: java.lang.RuntimeException: 
      Could not connect R to Spark.  If the stack trace is not clear,
    check whether SPARK_HOME is set properly.
    at org.apache.zeppelin.interpreter.ClassloaderInterpreter.open(ClassloaderInterpreter.java:76)
    at org.apache.zeppelin.interpreter.LazyOpenInterpreter.open(LazyOpenInterpreter.java:68)
    at org.apache.zeppelin.interpreter.LazyOpenInterpreter.interpret(LazyOpenInterpreter.java:92)
    at org.apache.zeppelin.interpreter.remote.RemoteInterpreterServer$InterpretJob.jobRun(RemoteInterpreterServer.java:345)
    at org.apache.zeppelin.scheduler.Job.run(Job.java:176)
    at org.apache.zeppelin.scheduler.FIFOScheduler$1.run(FIFOScheduler.java:139)
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$201(ScheduledThreadPoolExecutor.java:180)
    at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:293)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at java.lang.Thread.run(Thread.java:745)
Caused by: java.lang.RuntimeException: 
      Could not connect R to Spark.  If the stack trace is not clear,
    check whether SPARK_HOME is set properly.
    at org.apache.zeppelin.rinterpreter.RContext.sparkStartup(RContext.scala:124)
    at org.apache.zeppelin.rinterpreter.RContext.open(RContext.scala:93)
    at org.apache.zeppelin.rinterpreter.RInterpreter.open(RInterpreter.scala:51)
    at org.apache.zeppelin.rinterpreter.RRepl.open(RRepl.java:56)
    at org.apache.zeppelin.interpreter.ClassloaderInterpreter.open(ClassloaderInterpreter.java:74)
    ... 12 more
Caused by: org.apache.zeppelin.rinterpreter.rscala.RException
    at org.apache.zeppelin.rinterpreter.RContext.testRPackage(RContext.scala:242)
    at org.apache.zeppelin.rinterpreter.RContext.sparkStartup(RContext.scala:107)
    ... 16 more
ERROR [2016-04-21 10:15:00,964] ({pool-1-thread-4} RClient.scala[eval]:79) - R Error SparkR:::connectBackend("localhost", 55688) there is no package called ‘SparkR’
ERROR [2016-04-21 10:15:00,964] ({pool-1-thread-4} TThreadPoolServer.java[run]:296) - Error occurred during processing of message.
org.apache.zeppelin.interpreter.InterpreterException: java.lang.RuntimeException: 
      Could not connect R to Spark.  If the stack trace is not clear,
    check whether SPARK_HOME is set properly.
    at org.apache.zeppelin.interpreter.ClassloaderInterpreter.open(ClassloaderInterpreter.java:76)
    at org.apache.zeppelin.interpreter.LazyOpenInterpreter.open(LazyOpenInterpreter.java:68)
    at org.apache.zeppelin.interpreter.LazyOpenInterpreter.getProgress(LazyOpenInterpreter.java:109)
    at org.apache.zeppelin.interpreter.remote.RemoteInterpreterServer.getProgress(RemoteInterpreterServer.java:408)
    at org.apache.zeppelin.interpreter.thrift.RemoteInterpreterService$Processor$getProgress.getResult(RemoteInterpreterService.java:1492)
    at org.apache.zeppelin.interpreter.thrift.RemoteInterpreterService$Processor$getProgress.getResult(RemoteInterpreterService.java:1477)
    at org.apache.thrift.ProcessFunction.process(ProcessFunction.java:39)
    at org.apache.thrift.TBaseProcessor.process(TBaseProcessor.java:39)
    at org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:285)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at java.lang.Thread.run(Thread.java:745)
Caused by: java.lang.RuntimeException: 
      Could not connect R to Spark.  If the stack trace is not clear,
    check whether SPARK_HOME is set properly.
    at org.apache.zeppelin.rinterpreter.RContext.sparkStartup(RContext.scala:124)
    at org.apache.zeppelin.rinterpreter.RContext.open(RContext.scala:93)
    at org.apache.zeppelin.rinterpreter.RInterpreter.open(RInterpreter.scala:51)
    at org.apache.zeppelin.rinterpreter.RRepl.open(RRepl.java:56)
    at org.apache.zeppelin.interpreter.ClassloaderInterpreter.open(ClassloaderInterpreter.java:74)
    ... 11 more
Caused by: org.apache.zeppelin.rinterpreter.rscala.RException
    at org.apache.zeppelin.rinterpreter.rscala.RClient.eval(RClient.scala:80)
    at org.apache.zeppelin.rinterpreter.RContext.sparkStartup(RContext.scala:118)
    ... 15 more

Also, to add: I had to run `mvn package` on Spark after downloading it, because it appeared to be a source distribution.
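For reference, a plain `mvn package` build of Spark compiles the JVM side but does not install the SparkR R package, which matches the `there is no package called 'SparkR'` error in the logs above; pre-built binary distributions ship it under `R/lib`. A quick check (a sketch only: the distribution path and the `-Psparkr` profile are assumptions for Spark 1.6-era source builds, adjust to your layout):

```shell
# Sketch: does SPARK_HOME point at a Spark distribution that includes SparkR?
SPARK_HOME="${SPARK_HOME:-$HOME/spark-1.6.1-bin-hadoop2.6}"   # hypothetical location

if [ -d "$SPARK_HOME/R/lib/SparkR" ]; then
  echo "OK: SparkR package present in $SPARK_HOME/R/lib"
else
  echo "SparkR package missing from $SPARK_HOME"
  echo "For a source tree, build a distribution that includes it, e.g.:"
  echo "  ./make-distribution.sh --name custom --tgz -Psparkr"
fi
```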

elbamos commented 8 years ago

That error implies that there's something funky in your Spark installation. What version of Spark is in your SPARK_HOME?

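One related check: `sparkR` working in an interactive shell does not guarantee the Zeppelin daemon inherits the same environment, since Zeppelin reads `conf/zeppelin-env.sh` at startup rather than your login shell's exports. A sketch of making SPARK_HOME visible to the daemon (the paths are examples, not the reporter's actual setup):

```shell
# Sketch: export SPARK_HOME where the Zeppelin daemon will see it.
# conf/zeppelin-env.sh is sourced by Zeppelin's startup scripts.
ZEPPELIN_HOME="${ZEPPELIN_HOME:-$HOME/incubator-zeppelin}"   # adjust to your checkout
mkdir -p "$ZEPPELIN_HOME/conf"

# Append the export only if one is not already present.
grep -q '^export SPARK_HOME=' "$ZEPPELIN_HOME/conf/zeppelin-env.sh" 2>/dev/null ||
  echo 'export SPARK_HOME=/opt/spark  # example path; use a binary Spark distribution' \
    >> "$ZEPPELIN_HOME/conf/zeppelin-env.sh"

# Then restart so the daemon re-reads the file:
#   bin/zeppelin-daemon.sh restart
```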
On Apr 21, 2016, at 3:20 AM, Tomer Ben David notifications@github.com wrote:

Yes, the logs are below. Since the error message suggested checking SPARK_HOME, I verified it like this:

tomerb-mac:logs tomerb$ $SPARK_HOME/bin/sparkR

R version 3.2.4 (2016-03-10) -- "Very Secure Dishes"
Copyright (C) 2016 The R Foundation for Statistical Computing
Platform: x86_64-apple-darwin13.4.0 (64-bit)

R is free software and comes with ABSOLUTELY NO WARRANTY.
You are welcome to redistribute it under certain conditions.
Type 'license()' or 'licence()' for distribution details.

  Natural language support but running in an English locale

R is a collaborative project with many contributors.
Type 'contributors()' for more information and
'citation()' on how to cite R or R packages in publications.

Type 'demo()' for some demos, 'help()' for on-line help, or
'help.start()' for an HTML browser interface to help.
Type 'q()' to quit R.

And the logs:

3 more INFO [2016-04-21 10:15:00,977]({pool-1-thread-2} NotebookServer.java[afterStatusChange]:1092) - Job 20160419-195125_1406675577 is finished INFO [2016-04-21 10:15:00,987]({pool-1-thread-3} SchedulerFactory.java[jobFinished]:137) - Job paragraph_1461084700250_1660833304 finished by scheduler org.apache.zeppelin.interpreter.remote.RemoteInterpretershared_session527675465 INFO [2016-04-21 10:15:01,003]({pool-1-thread-2} SchedulerFactory.java[jobFinished]:137) - Job paragraph1461084685142-1352150657 finished by scheduler org.apache.zeppelin.interpreter.remote.RemoteInterpretershared_session527675465 INFO [2016-04-21 10:15:07,280]({qtp146305349-36} NotebookServer.java[onOpen]:92) - New connection from 0:0:0:0:0:0:0:1 : 55689 tomerb-mac:logs tomerb$ cat zeppelin-interpreter-spark-tomerb-tomerb-mac.local.log INFO [2016-04-21 10:14:52,711]({Thread-0} RemoteInterpreterServer.java[run]:84) - Starting remote interpreter server on port 55665 INFO [2016-04-21 10:14:53,089]({pool-1-thread-3} RemoteInterpreterServer.java[createInterpreter]:173) - Instantiate interpreter org.apache.zeppelin.rinterpreter.RRepl INFO [2016-04-21 10:14:53,118]({pool-1-thread-3} RemoteInterpreterServer.java[createInterpreter]:173) - Instantiate interpreter org.apache.zeppelin.spark.SparkInterpreter INFO [2016-04-21 10:14:53,131]({pool-1-thread-3} RemoteInterpreterServer.java[createInterpreter]:173) - Instantiate interpreter org.apache.zeppelin.spark.PySparkInterpreter INFO [2016-04-21 10:14:53,133]({pool-1-thread-3} RemoteInterpreterServer.java[createInterpreter]:173) - Instantiate interpreter org.apache.zeppelin.rinterpreter.KnitR INFO [2016-04-21 10:14:53,135]({pool-1-thread-3} RemoteInterpreterServer.java[createInterpreter]:173) - Instantiate interpreter org.apache.zeppelin.spark.SparkSqlInterpreter INFO [2016-04-21 10:14:53,137]({pool-1-thread-3} RemoteInterpreterServer.java[createInterpreter]:173) - Instantiate interpreter org.apache.zeppelin.spark.DepInterpreter INFO [2016-04-21 
10:14:53,222]({pool-1-thread-3} RClient.scala[]:469) - Trying to open ports filename: /var/folders/cj/9m49dzl57nv1wtnt01nzt291k56hj5/T/rscala-6490233470814172101 INFO [2016-04-21 10:14:53,223]({pool-1-thread-3} RClient.scala[]:474) - Servers are running on port 55673 55674 INFO [2016-04-21 10:14:53,513]({pool-1-thread-3} RClient.scala[]:478) - serverinaccept done INFO [2016-04-21 10:14:53,514]({pool-1-thread-3} RClient.scala[]:480) - in has been created INFO [2016-04-21 10:14:53,514]({pool-1-thread-3} RClient.scala[]:482) - serverouacceptdone INFO [2016-04-21 10:14:53,514]({pool-1-thread-3} RClient.scala[]:484) - out is done INFO [2016-04-21 10:14:53,522]({pool-1-thread-3} RContext.scala[]:250) - RContext Finished Starting INFO [2016-04-21 10:14:53,526]({pool-2-thread-4} SchedulerFactory.java[jobStarted]:131) - Job remoteInterpretJob_1461222893525 started by scheduler 1397194507 WARN [2016-04-21 10:14:54,225]({pool-2-thread-4} NativeCodeLoader.java[]:62) - Unable to load native-hadoop library for your platform... 
using builtin-java classes where applicable INFO [2016-04-21 10:14:54,283]({pool-2-thread-4} Logging.scala[logInfo]:58) - Changing view acls to: tomerb INFO [2016-04-21 10:14:54,284]({pool-2-thread-4} Logging.scala[logInfo]:58) - Changing modify acls to: tomerb INFO [2016-04-21 10:14:54,285]({pool-2-thread-4} Logging.scala[logInfo]:58) - SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(tomerb); users with modify permissions: Set(tomerb) INFO [2016-04-21 10:14:54,497]({pool-2-thread-4} Logging.scala[logInfo]:58) - Starting HTTP Server INFO [2016-04-21 10:14:54,528]({pool-2-thread-5} SchedulerFactory.java[jobStarted]:131) - Job remoteInterpretJob_1461222894527 started by scheduler org.apache.zeppelin.spark.SparkInterpreter673560091 INFO [2016-04-21 10:14:54,542]({pool-2-thread-4} Server.java[doStart]:272) - jetty-8.y.z-SNAPSHOT INFO [2016-04-21 10:14:54,557]({pool-2-thread-4} AbstractConnector.java[doStart]:338) - Started SocketConnector@0.0.0.0:55680 INFO [2016-04-21 10:14:54,558]({pool-2-thread-4} Logging.scala[logInfo]:58) - Successfully started service 'HTTP class server' on port 55680. INFO [2016-04-21 10:14:56,431]({pool-2-thread-4} SparkInterpreter.java[createSparkContext]:257) - ------ Create new SparkContext local[*] ------- INFO [2016-04-21 10:14:56,454]({pool-2-thread-4} Logging.scala[logInfo]:58) - Running Spark version 1.6.1 INFO [2016-04-21 10:14:56,483]({pool-2-thread-4} Logging.scala[logInfo]:58) - Changing view acls to: tomerb INFO [2016-04-21 10:14:56,484]({pool-2-thread-4} Logging.scala[logInfo]:58) - Changing modify acls to: tomerb INFO [2016-04-21 10:14:56,484]({pool-2-thread-4} Logging.scala[logInfo]:58) - SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(tomerb); users with modify permissions: Set(tomerb) INFO [2016-04-21 10:14:56,647]({pool-2-thread-4} Logging.scala[logInfo]:58) - Successfully started service 'sparkDriver' on port 55682. 
INFO [2016-04-21 10:14:56,913]({sparkDriverActorSystem-akka.actor.default-dispatcher-2} Slf4jLogger.scala[applyOrElse]:80) - Slf4jLogger started INFO [2016-04-21 10:14:56,941]({sparkDriverActorSystem-akka.actor.default-dispatcher-2} Slf4jLogger.scala[apply$mcV$sp]:74) - Starting remoting INFO [2016-04-21 10:14:57,078]({sparkDriverActorSystem-akka.actor.default-dispatcher-2} Slf4jLogger.scala[apply$mcV$sp]:74) - Remoting started; listening on addresses :[akka.tcp://sparkDriverActorSystem@192.168.14.191:55683] INFO [2016-04-21 10:14:57,084]({pool-2-thread-4} Logging.scala[logInfo]:58) - Successfully started service 'sparkDriverActorSystem' on port 55683. INFO [2016-04-21 10:14:57,093]({pool-2-thread-4} Logging.scala[logInfo]:58) - Registering MapOutputTracker INFO [2016-04-21 10:14:57,106]({pool-2-thread-4} Logging.scala[logInfo]:58) - Registering BlockManagerMaster INFO [2016-04-21 10:14:57,117]({pool-2-thread-4} Logging.scala[logInfo]:58) - Created local directory at /private/var/folders/cj/9m49dzl57nv1wtnt01nzt291k56hj5/T/blockmgr-5a59f5bb-fb67-404c-ad19-77cb56d616ff INFO [2016-04-21 10:14:57,121]({pool-2-thread-4} Logging.scala[logInfo]:58) - MemoryStore started with capacity 511.1 MB INFO [2016-04-21 10:14:57,169]({pool-2-thread-4} Logging.scala[logInfo]:58) - Registering OutputCommitCoordinator INFO [2016-04-21 10:14:57,257]({pool-2-thread-4} Server.java[doStart]:272) - jetty-8.y.z-SNAPSHOT INFO [2016-04-21 10:14:57,267]({pool-2-thread-4} AbstractConnector.java[doStart]:338) - Started SelectChannelConnector@0.0.0.0:4040 INFO [2016-04-21 10:14:57,268]({pool-2-thread-4} Logging.scala[logInfo]:58) - Successfully started service 'SparkUI' on port 4040. 
INFO [2016-04-21 10:14:57,271]({pool-2-thread-4} Logging.scala[logInfo]:58) - Started SparkUI at http://192.168.14.191:4040 INFO [2016-04-21 10:14:57,287]({pool-2-thread-4} Logging.scala[logInfo]:58) - HTTP File server directory is /private/var/folders/cj/9m49dzl57nv1wtnt01nzt291k56hj5/T/spark-7f3bc378-8caf-4d02-a10d-2e3ccc96a0da/httpd-1771da48-99c0-407f-8d6f-97b2de34b53f INFO [2016-04-21 10:14:57,287]({pool-2-thread-4} Logging.scala[logInfo]:58) - Starting HTTP Server INFO [2016-04-21 10:14:57,288]({pool-2-thread-4} Server.java[doStart]:272) - jetty-8.y.z-SNAPSHOT INFO [2016-04-21 10:14:57,289]({pool-2-thread-4} AbstractConnector.java[doStart]:338) - Started SocketConnector@0.0.0.0:55684 INFO [2016-04-21 10:14:57,290]({pool-2-thread-4} Logging.scala[logInfo]:58) - Successfully started service 'HTTP file server' on port 55684. INFO [2016-04-21 10:14:57,338]({pool-2-thread-4} Logging.scala[logInfo]:58) - Added JAR file:/Users/tomerb/tmp/incubator-zeppelin/interpreter/spark/zeppelin-spark-0.6.0-incubating-SNAPSHOT.jar at http://192.168.14.191:55684/jars/zeppelin-spark-0.6.0-incubating-SNAPSHOT.jar with timestamp 1461222897338 INFO [2016-04-21 10:14:57,362]({pool-2-thread-4} Logging.scala[logInfo]:58) - Created default pool default, schedulingMode: FIFO, minShare: 0, weight: 1 INFO [2016-04-21 10:14:57,382]({pool-2-thread-4} Logging.scala[logInfo]:58) - Starting executor ID driver on host localhost INFO [2016-04-21 10:14:57,387]({pool-2-thread-4} Logging.scala[logInfo]:58) - Using REPL class URI: http://192.168.14.191:55680 INFO [2016-04-21 10:14:57,396]({pool-2-thread-4} Logging.scala[logInfo]:58) - Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 55685. 
INFO [2016-04-21 10:14:57,397]({pool-2-thread-4} Logging.scala[logInfo]:58) - Server created on 55685 INFO [2016-04-21 10:14:57,398]({pool-2-thread-4} Logging.scala[logInfo]:58) - Trying to register BlockManager INFO [2016-04-21 10:14:57,401]({dispatcher-event-loop-2} Logging.scala[logInfo]:58) - Registering block manager localhost:55685 with 511.1 MB RAM, BlockManagerId(driver, localhost, 55685) INFO [2016-04-21 10:14:57,402]({pool-2-thread-4} Logging.scala[logInfo]:58) - Registered BlockManager WARN [2016-04-21 10:14:57,540]({pool-2-thread-4} SparkInterpreter.java[getSQLContext]:221) - Can't create HiveContext. Fallback to SQLContext java.lang.ClassNotFoundException: org.apache.spark.sql.hive.HiveContext at java.net.URLClassLoader.findClass(URLClassLoader.java:381) at java.lang.ClassLoader.loadClass(ClassLoader.java:424) at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331) at java.lang.ClassLoader.loadClass(ClassLoader.java:357) at org.apache.zeppelin.spark.SparkInterpreter.getSQLContext(SparkInterpreter.java:214) at org.apache.zeppelin.spark.SparkInterpreter.open(SparkInterpreter.java:536) at org.apache.zeppelin.interpreter.ClassloaderInterpreter.open(ClassloaderInterpreter.java:74) at org.apache.zeppelin.interpreter.LazyOpenInterpreter.open(LazyOpenInterpreter.java:68) at org.apache.zeppelin.rinterpreter.RInterpreter.getSparkInterpreter(RInterpreter.scala:76) at org.apache.zeppelin.rinterpreter.RInterpreter.getSparkInterpreter(RInterpreter.scala:70) at org.apache.zeppelin.rinterpreter.RInterpreter.open(RInterpreter.scala:50) at org.apache.zeppelin.rinterpreter.RRepl.open(RRepl.java:56) at org.apache.zeppelin.interpreter.ClassloaderInterpreter.open(ClassloaderInterpreter.java:74) at org.apache.zeppelin.interpreter.LazyOpenInterpreter.open(LazyOpenInterpreter.java:68) at org.apache.zeppelin.interpreter.LazyOpenInterpreter.interpret(LazyOpenInterpreter.java:92) at 
org.apache.zeppelin.interpreter.remote.RemoteInterpreterServer$InterpretJob.jobRun(RemoteInterpreterServer.java:345) at org.apache.zeppelin.scheduler.Job.run(Job.java:176) at org.apache.zeppelin.scheduler.FIFOScheduler$1.run(FIFOScheduler.java:139) at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) at java.util.concurrent.FutureTask.run(FutureTask.java:266) at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$201(ScheduledThreadPoolExecutor.java:180) at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:293) at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) at java.lang.Thread.run(Thread.java:745) INFO [2016-04-21 10:15:00,950]({pool-2-thread-5} SchedulerFactory.java[jobFinished]:137) - Job remoteInterpretJob_1461222894527 finished by scheduler org.apache.zeppelin.spark.SparkInterpreter673560091 ERROR [2016-04-21 10:15:00,953]({pool-2-thread-4} RContext.scala[testRPackage]:241) - The SparkR package could not be loaded. ERROR [2016-04-21 10:15:00,955]({pool-2-thread-4} Job.java[run]:189) - Job failed org.apache.zeppelin.interpreter.InterpreterException: java.lang.RuntimeException: Could not connect R to Spark. If the stack trace is not clear, check whether SPARK_HOME is set properly. 
at org.apache.zeppelin.interpreter.ClassloaderInterpreter.open(ClassloaderInterpreter.java:76) at org.apache.zeppelin.interpreter.LazyOpenInterpreter.open(LazyOpenInterpreter.java:68) at org.apache.zeppelin.interpreter.LazyOpenInterpreter.interpret(LazyOpenInterpreter.java:92) at org.apache.zeppelin.interpreter.remote.RemoteInterpreterServer$InterpretJob.jobRun(RemoteInterpreterServer.java:345) at org.apache.zeppelin.scheduler.Job.run(Job.java:176) at org.apache.zeppelin.scheduler.FIFOScheduler$1.run(FIFOScheduler.java:139) at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) at java.util.concurrent.FutureTask.run(FutureTask.java:266) at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$201(ScheduledThreadPoolExecutor.java:180) at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:293) at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) at java.lang.Thread.run(Thread.java:745) Caused by: java.lang.RuntimeException: Could not connect R to Spark. If the stack trace is not clear, check whether SPARK_HOME is set properly. at org.apache.zeppelin.rinterpreter.RContext.sparkStartup(RContext.scala:124) at org.apache.zeppelin.rinterpreter.RContext.open(RContext.scala:93) at org.apache.zeppelin.rinterpreter.RInterpreter.open(RInterpreter.scala:51) at org.apache.zeppelin.rinterpreter.RRepl.open(RRepl.java:56) at org.apache.zeppelin.interpreter.ClassloaderInterpreter.open(ClassloaderInterpreter.java:74) ... 12 more Caused by: org.apache.zeppelin.rinterpreter.rscala.RException at org.apache.zeppelin.rinterpreter.RContext.testRPackage(RContext.scala:242) at org.apache.zeppelin.rinterpreter.RContext.sparkStartup(RContext.scala:107) ... 
16 more ERROR [2016-04-21 10:15:00,964]({pool-1-thread-4} RClient.scala[eval]:79) - R Error SparkR:::connectBackend("localhost", 55688) there is no package called ‘SparkR’ ERROR [2016-04-21 10:15:00,964]({pool-1-thread-4} TThreadPoolServer.java[run]:296) - Error occurred during processing of message. org.apache.zeppelin.interpreter.InterpreterException: java.lang.RuntimeException: Could not connect R to Spark. If the stack trace is not clear, check whether SPARK_HOME is set properly. at org.apache.zeppelin.interpreter.ClassloaderInterpreter.open(ClassloaderInterpreter.java:76) at org.apache.zeppelin.interpreter.LazyOpenInterpreter.open(LazyOpenInterpreter.java:68) at org.apache.zeppelin.interpreter.LazyOpenInterpreter.getProgress(LazyOpenInterpreter.java:109) at org.apache.zeppelin.interpreter.remote.RemoteInterpreterServer.getProgress(RemoteInterpreterServer.java:408) at org.apache.zeppelin.interpreter.thrift.RemoteInterpreterService$Processor$getProgress.getResult(RemoteInterpreterService.java:1492) at org.apache.zeppelin.interpreter.thrift.RemoteInterpreterService$Processor$getProgress.getResult(RemoteInterpreterService.java:1477) at org.apache.thrift.ProcessFunction.process(ProcessFunction.java:39) at org.apache.thrift.TBaseProcessor.process(TBaseProcessor.java:39) at org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:285) at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) at java.lang.Thread.run(Thread.java:745) Caused by: java.lang.RuntimeException: Could not connect R to Spark. If the stack trace is not clear, check whether SPARK_HOME is set properly. 
at org.apache.zeppelin.rinterpreter.RContext.sparkStartup(RContext.scala:124) at org.apache.zeppelin.rinterpreter.RContext.open(RContext.scala:93) at org.apache.zeppelin.rinterpreter.RInterpreter.open(RInterpreter.scala:51) at org.apache.zeppelin.rinterpreter.RRepl.open(RRepl.java:56) at org.apache.zeppelin.interpreter.ClassloaderInterpreter.open(ClassloaderInterpreter.java:74) ... 11 more Caused by: org.apache.zeppelin.rinterpreter.rscala.RException at org.apache.zeppelin.rinterpreter.rscala.RClient.eval(RClient.scala:80) at org.apache.zeppelin.rinterpreter.RContext.sparkStartup(RContext.scala:118) ... 15 more — You are receiving this because you commented. Reply to this email directly or view it on GitHub
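The decisive lines in this log are the RException from RContext.testRPackage and the R-side error "there is no package called ‘SparkR’": the SparkR package is not on R's library path, which happens when SPARK_HOME points at a Spark build that was compiled without SparkR. A quick way to check is sketched below for a POSIX shell; the function name and fallback path are illustrative, not part of Zeppelin.

```shell
# check_sparkr: report whether a Spark installation ships the SparkR R package.
# Spark source builds made without the SparkR profile lack $SPARK_HOME/R/lib/SparkR,
# which produces exactly the "there is no package called 'SparkR'" error above.
check_sparkr() {
  if [ -d "$1/R/lib/SparkR" ]; then
    echo "SparkR found under $1/R/lib"
  else
    echo "SparkR missing under $1/R/lib"
  fi
}

# Check whatever SPARK_HOME currently points at (fallback path is an example)
check_sparkr "${SPARK_HOME:-/usr/local/spark}"
```

If this reports "missing", either use a prebuilt Spark binary (which bundles SparkR) or rebuild Spark with SparkR included.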

tomer-ben-david commented 8 years ago

Cool, it worked! I re-downloaded the prebuilt binary version of Spark 1.6.1 instead of compiling from source, and it worked. Thank you so much!

%spark.r 2 + 2
2 + 2
simpleWarning in library(package, lib.loc = lib.loc, character.only = TRUE, logical.return = TRUE, : there is no package called ‘repr’
4
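For anyone following along: switching to the prebuilt Spark binary works because it bundles the SparkR package, and Zeppelin just needs SPARK_HOME to point at it. A sketch of the relevant configuration; the path below is an example, not the reporter's actual location.

```shell
# conf/zeppelin-env.sh -- example only; adjust the path to wherever the
# prebuilt Spark 1.6.1 binary (which includes SparkR under R/lib) was unpacked
export SPARK_HOME=/usr/local/spark-1.6.1-bin-hadoop2.6
```

Restart Zeppelin after changing this file so the interpreter process picks it up.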
zenonlpc commented 8 years ago

Hello guys

I ran into similar errors when trying to compile Zeppelin with the -Pr option. I was following the instructions here:

https://gist.github.com/andershammar/224e1077021d0ea376dd

It failed during the Maven build; here is the command I used:

mvn clean package -Pspark-1.6 -Dhadoop.version=2.6.0 -Phadoop-2.6 -Pyarn -Pr -DskipTests

Here is the error information:

[INFO] Nothing to compile - all classes are up to date
[INFO]
[INFO] --- exec-maven-plugin:1.2.1:exec (default) @ zeppelin-zrinterpreter ---
+++ dirname R/install-dev.sh
++ cd R
++ pwd
+ FWDIR=/home/hadoop/zeppelin/r/R
+ LIB_DIR=/home/hadoop/zeppelin/r/R/../../R/lib
+ mkdir -p /home/hadoop/zeppelin/r/R/../../R/lib
+ pushd /home/hadoop/zeppelin/r/R
+ R CMD INSTALL --library=/home/hadoop/zeppelin/r/R/../../R/lib /home/hadoop/zeppelin/r/R/rzeppelin/
ERROR: dependency ‘evaluate’ is not available for package ‘rzeppelin’
* removing ‘/home/hadoop/zeppelin/R/lib/rzeppelin’
[INFO]
[INFO] Reactor Summary:
[INFO]
[INFO] Zeppelin ........................................... SUCCESS [ 16.493 s]
[INFO] Zeppelin: Interpreter .............................. SUCCESS [ 13.782 s]
[INFO] Zeppelin: Zengine .................................. SUCCESS [  6.542 s]
[INFO] Zeppelin: Display system apis ...................... SUCCESS [ 18.446 s]
[INFO] Zeppelin: Spark dependencies ....................... SUCCESS [ 58.197 s]
[INFO] Zeppelin: Spark .................................... SUCCESS [ 28.614 s]
[INFO] Zeppelin: Markdown interpreter ..................... SUCCESS [  0.458 s]
[INFO] Zeppelin: Angular interpreter ...................... SUCCESS [  0.326 s]
[INFO] Zeppelin: Shell interpreter ........................ SUCCESS [  0.346 s]
[INFO] Zeppelin: Hive interpreter ......................... SUCCESS [  3.757 s]
[INFO] Zeppelin: HBase interpreter ........................ SUCCESS [ 15.924 s]
[INFO] Zeppelin: Apache Phoenix Interpreter ............... SUCCESS [  5.450 s]
[INFO] Zeppelin: PostgreSQL interpreter ................... SUCCESS [  1.061 s]
[INFO] Zeppelin: JDBC interpreter ......................... SUCCESS [  0.442 s]
[INFO] Zeppelin: Tajo interpreter ......................... SUCCESS [  1.216 s]
[INFO] Zeppelin: File System Interpreters ................. SUCCESS [  1.679 s]
[INFO] Zeppelin: Flink .................................... SUCCESS [ 13.549 s]
[INFO] Zeppelin: Apache Ignite interpreter ................ SUCCESS [  1.815 s]
[INFO] Zeppelin: Kylin interpreter ........................ SUCCESS [  0.435 s]
[INFO] Zeppelin: Lens interpreter ......................... SUCCESS [  5.079 s]
[INFO] Zeppelin: Cassandra ................................ SUCCESS [01:12 min]
[INFO] Zeppelin: Elasticsearch interpreter ................ SUCCESS [  2.935 s]
[INFO] Zeppelin: Alluxio interpreter ...................... SUCCESS [  3.764 s]
[INFO] Zeppelin: web Application .......................... SUCCESS [02:07 min]
[INFO] Zeppelin: Server ................................... SUCCESS [ 15.480 s]
[INFO] Zeppelin: Packaging distribution ................... SUCCESS [  0.942 s]
[INFO] Zeppelin: R Interpreter ............................ FAILURE [ 29.611 s]
[INFO]
[INFO] BUILD FAILURE

All other interpreters were built successfully.

It looks like the build is looking for the evaluate R package. Should I install R (and that package) before compiling Zeppelin with -Pr?

Thanks in advance for your help!

tomer-ben-david commented 8 years ago

I'm not sure about the problem, but the guide definitely says to install it: https://github.com/elbamos/Zeppelin-With-R

Additional requirements for the R interpreter are:

  • R 3.1 or later (earlier versions may work, but have not been tested)
  • The evaluate R package.

For full R support, you will also need the following R packages:

  • knitr
  • repr -- available with devtools::install_github("IRkernel/repr")
  • htmltools -- required for some interactive plotting
  • base64enc -- required to view R base plots
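These packages can be installed from the command line before building. The commands below are a sketch, not part of the official instructions: the CRAN mirror URL is an assumption, and they assume Rscript is on PATH.

```shell
# Install the CRAN packages the R interpreter needs (mirror URL is an example)
Rscript -e 'install.packages(c("evaluate", "knitr", "htmltools", "base64enc"), repos = "https://cloud.r-project.org")'

# 'repr' is installed from GitHub via devtools, per the README
Rscript -e 'if (!requireNamespace("devtools", quietly = TRUE)) install.packages("devtools", repos = "https://cloud.r-project.org"); devtools::install_github("IRkernel/repr")'
```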


zenonlpc commented 8 years ago

Thanks David for the quick response.

I will install those packages and try again.

On Wed, May 4, 2016 at 2:58 PM, Tomer Ben David notifications@github.com wrote:

I'm not sure about the problem but the guide definetly says to install it https://github.com/elbamos/Zeppelin-With-R

Additional requirements for the R interpreter are:

  • R 3.1 or later (earlier versions may work, but have not been tested)
  • The evaluate R package.

For full R support, you will also need the following R packages:

  • knitr
  • repr -- available with devtools::install_github("IRkernel/repr")
  • htmltools -- required for some interactive plotting
  • base64enc -- required to view R base plots


zenonlpc commented 8 years ago

Hello guys

I successfully installed the R interpreter in Zeppelin after installing the missing R packages.

But something still goes wrong when I run this command:

%r
foo <- TRUE
print(foo)
bare <- c(1, 2.5, 4)
print(bare)
double <- 15.0
print(double)

I get this error:

java.lang.ClassNotFoundException: com.amazonaws.event.ProgressListener
    at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
    at java.lang.Class.getDeclaredConstructors0(Native Method)
    at java.lang.Class.privateGetDeclaredConstructors(Class.java:2595)
    at java.lang.Class.getConstructor0(Class.java:2895)
    at java.lang.Class.newInstance(Class.java:354)
    at java.util.ServiceLoader$LazyIterator.next(ServiceLoader.java:373)
    at java.util.ServiceLoader$1.next(ServiceLoader.java:445)
    at org.apache.hadoop.fs.FileSystem.loadFileSystems(FileSystem.java:2563)
    at org.apache.hadoop.fs.FileSystem.getFileSystemClass(FileSystem.java:2574)
    at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:2591)

elbamos commented 8 years ago

@tomer-ben-david You're correct, thanks. On installation, R needs to be available to compile the R package. Evaluate is the key dependency.

@zenonlpc I've never seen that error before, and the class it's referring to, com.amazonaws.event.ProgressListener, belongs to the AWS SDK. Can you open this as a new issue and provide more detail about your configuration?
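The point about build-time dependencies can be verified before running Maven. The check below is a sketch I'm adding for context, assuming a POSIX shell (the function name is illustrative): it reports whether R is installed and whether the evaluate package, the key dependency for the -Pr build, is available.

```shell
# check_r_prereqs: report the state of the R-side build prerequisites for -Pr.
# Prints exactly one of three messages depending on what is available.
check_r_prereqs() {
  if ! command -v Rscript >/dev/null 2>&1; then
    echo "R not installed"
  elif Rscript -e 'quit(status = as.integer(!requireNamespace("evaluate", quietly = TRUE)))' >/dev/null 2>&1; then
    echo "R and evaluate available"
  else
    echo "evaluate missing"
  fi
}

check_r_prereqs
```

Running this before `mvn clean package ... -Pr` avoids the late FAILURE in the R Interpreter module seen above.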

zenonlpc commented 8 years ago

Thanks, elbamos.

I will collect more information and create the issue on Friday.
