chyh1990 / jruby-spark

Installation issues #1

Closed: gnilrets closed this issue 8 years ago

gnilrets commented 8 years ago

Hi, I tried following the directions in the README, but got an error:

➜  jruby-spark git:(master) echo $SPARK_HOME
/usr/local/Cellar/apache-spark/1.6.0/libexec
➜  jruby-spark git:(master) ./scripts/jruby-spark-repl.sh 
JRuby Spark jar not found

Then I tried rake package, but got the following error:

➜  jruby-spark git:(master) ✗ ruby --version
jruby 9.0.5.0 (2.2.3) 2016-01-26 7bee00d Java HotSpot(TM) 64-Bit Server VM 25.51-b03 on 1.8.0_51-b16 +jit [darwin-x86_64]
➜  jruby-spark git:(master) ✗ rake package             
./gradlew jar --info
Starting Build
Settings evaluated using settings file '/Users/gnilrets/git/jruby-spark/settings.gradle'.
Projects loaded. Root project using build file '/Users/gnilrets/git/jruby-spark/build.gradle'.
Included projects: [root project 'jruby-spark']
Evaluating root project 'jruby-spark' using build file '/Users/gnilrets/git/jruby-spark/build.gradle'.
All projects evaluated.
Selected primary task 'jar' from project :
Tasks to be executed: [task ':compileJava', task ':compileScala', task ':processResources', task ':classes', task ':jar']
:compileJava (Thread[main,5,main]) started.
:compileJava
Skipping task ':compileJava' as it has no source files.
:compileJava UP-TO-DATE
:compileJava (Thread[main,5,main]) completed. Took 0.008 secs.
:compileScala (Thread[main,5,main]) started.
:compileScala
Executing task ':compileScala' (up-to-date check took 0.844 secs) due to:
  No history is available.
Compiling with Ant scalac task.
[ant:scalac] Compiling 4 source files to /Users/gnilrets/git/jruby-spark/build/classes/main
[ant:scalac] Compiling 0 scala and 1 java source files to /Users/gnilrets/git/jruby-spark/build/classes/main
[ant:scalac] Compiling 0 scala and 5 java source files to /Users/gnilrets/git/jruby-spark/build/classes/main
[ant:scalac] Compiling 0 scala and 8 java source files to /Users/gnilrets/git/jruby-spark/build/classes/main
[ant:scalac] Element '/Users/gnilrets/git/jruby-spark/libs/jruby-complete-9.1.0.0-SNAPSHOT.jar' does not exist.
[ant:scalac] /Users/gnilrets/git/jruby-spark/src/main/scala/org/apache/spark/jruby/JRubyIteratableAdaptor.scala:6: error: object jruby is not a member of package org
[ant:scalac] import org.jruby.exceptions.RaiseException
[ant:scalac]            ^
[ant:scalac] /Users/gnilrets/git/jruby-spark/src/main/scala/org/apache/spark/jruby/JRubyIteratableAdaptor.scala:7: error: object jruby is not a member of package org
[ant:scalac] import org.jruby.runtime.builtin.IRubyObject
[ant:scalac]            ^
[ant:scalac] /Users/gnilrets/git/jruby-spark/src/main/scala/org/apache/spark/jruby/JRubyIteratableAdaptor.scala:8: error: object jruby is not a member of package org
[ant:scalac] import org.jruby.{Ruby, RubyEnumerator, RubyStopIteration}
[ant:scalac]            ^
[ant:scalac] /Users/gnilrets/git/jruby-spark/src/main/scala/org/apache/spark/jruby/JRubyIteratableAdaptor.scala:13: error: not found: type Ruby
[ant:scalac] class JRubyIteratableAdaptor[AnyRef](private val runtime: Ruby, private var obj: RubyEnumerator)
[ant:scalac]                                                           ^
[ant:scalac] /Users/gnilrets/git/jruby-spark/src/main/scala/org/apache/spark/jruby/JRubyIteratableAdaptor.scala:13: error: not found: type RubyEnumerator
[ant:scalac] class JRubyIteratableAdaptor[AnyRef](private val runtime: Ruby, private var obj: RubyEnumerator)
[ant:scalac]                                                                                  ^
[ant:scalac] /Users/gnilrets/git/jruby-spark/src/main/scala/org/apache/spark/jruby/JRubyIteratableAdaptor.scala:18: error: not found: type IRubyObject
[ant:scalac]     private[jruby] var nextObj: IRubyObject = null

.... plus many more. Similar to @ondra-m's issues.

Any idea what could be going on?

gnilrets commented 8 years ago

I found it confusing that it was trying to find a file called jruby-complete-9.1.0.0-SNAPSHOT.jar, since JRuby 9.0.5 is the current release. However, I tried using the latest dev build of 9.1.0 (via rbenv) and got the same errors.

gnilrets commented 8 years ago

I should also note that I'm on OS X El Capitan; here are my Java/Scala versions:

➜  jruby-spark git:(master) ✗ java -version
java version "1.8.0_51"
Java(TM) SE Runtime Environment (build 1.8.0_51-b16)
Java HotSpot(TM) 64-Bit Server VM (build 25.51-b03, mixed mode)
➜  jruby-spark git:(master) ✗ scala -version
Scala code runner version 2.10.5 -- Copyright 2002-2013, LAMP/EPFL

gnilrets commented 8 years ago

I think I'm making some progress (FYI, I know Ruby, but I'm a total noob when it comes to JRuby). Here's what I had to do to get a Spark session running

ondra-m commented 8 years ago

Now I've got

jruby-spark/src/main/scala/org/apache/spark/jruby/JRubyIteratableAdaptor.scala:17
  error: class JRubyIterator needs to be abstract, since method remove in trait 
  Iterator of type ()Unit is not defined
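
For reference, that error comes from java.util.Iterator: scalac here appears to treat remove() as abstract (its Java 8 default implementation isn't picked up by Scala 2.10), so a Scala class implementing the interface has to define remove() itself or be declared abstract. A minimal sketch of one way to satisfy it (the class and member names below are illustrative, not the repository's actual code):

import java.util.{Iterator => JIterator}

// Illustrative sketch only: wrap a Scala iterator behind java.util.Iterator
// and define remove() explicitly so the interface is fully implemented.
class JRubyIteratorSketch[T](underlying: Iterator[T]) extends JIterator[T] {
  override def hasNext: Boolean = underlying.hasNext
  override def next(): T = underlying.next()
  // Read-only iterators conventionally reject removal.
  override def remove(): Unit = throw new UnsupportedOperationException("remove")
}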

gnilrets commented 8 years ago

The latest commits work for me without having to do anything else. Thanks!