mattpap opened this issue 10 years ago
Can you show how you run IScala and resolve/import dependencies? I tried `org.apache.spark` in both the console and the notebook under 2.10, both via the command line (`-m` option) and through `%libraryDependencies` and `%update`, and Spark always imports fine. I used the `"org.apache.spark" %% "spark-core" % "0.9.1"` dependency.
Btw, although the default target is 2.11, you can build for 2.10 without modifying Build.scala. Just issue `++2.10.4 compile` (or `release`, etc.) in sbt, or issue `++2.10.4` on its own, which switches the Scala version for the remaining commands (see [1]).

[1] http://www.scala-sbt.org/0.13/docs/Command-Line-Reference.html
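For example, using the `++` command described above at the sbt 0.13 prompt, you can either run a single command under 2.10.4:

```
> ++2.10.4 compile
```

or switch the version once and have it apply to everything that follows:

```
> ++2.10.4
> compile
> release
```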
lev112 commented:

While trying to recreate the error, I managed to fix my problem, but I will describe how to recreate it, because I think there is a real issue here. I saw this behavior in both the notebook and the console.
This code works fine:
```
%libraryDependencies += "org.apache.spark" %% "spark-core" % "1.1.0"
%update
import org.apache.spark.SparkContext
```
(restart the notebook before the next step)
But if I first do the import (and it fails, because the dependency hasn't been added yet):

```
import org.apache.spark.SparkContext
<console>:7: error: object spark is not a member of package org.apache
```
then running the same code as before fails:

```
%libraryDependencies += "org.apache.spark" %% "spark-core" % "1.1.0"
%update
import org.apache.spark.SparkContext
<console>:7: error: object spark is not a member of package org.apache
```
(the `%update` logs did say that Spark was resolved)

At this point it is impossible to do the import, and you have to restart the notebook. It's not a big issue, but it can be really annoying that the notebook can get into a broken state and need a restart.
And thanks for the `++` command in sbt.
mattpap commented:

There is a very unfortunate bug currently in master: after the interpreter is started, you can't change the classpath. This used to work, and `%reset` was sufficient to achieve it (`%update` does a reset automatically). I added a special case: if you run `%libraryDependencies` and `%update` before the interpreter is initialized, then `%update` will update the classpath and no reset is necessary. Until this is fixed, I wouldn't depend on `%reset` doing anything useful. I hope to fix this soon, because it's really annoying and confusing.
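Concretely, until this is fixed, the safe pattern is to run the dependency magics as the very first thing in a fresh session, before any Scala code has initialized the interpreter:

```
%libraryDependencies += "org.apache.spark" %% "spark-core" % "1.1.0"
%update
```

and only then, in a subsequent cell, touch the interpreter:

```
import org.apache.spark.SparkContext
```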
If unsure, you can pass library dependencies via command-line arguments, e.g.:

```
bin/2.10/notebook -m org.apache.spark::spark-core:0.9.1
```

(this isn't documented anywhere yet).
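(Presumably `org.apache.spark::spark-core:0.9.1` is the command-line spelling of the `"org.apache.spark" %% "spark-core" % "0.9.1"` dependency from earlier in this thread, with `::` selecting the artifact cross-built for the current Scala version; since the flag is undocumented, treat that reading as an assumption.)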
Originally submitted in #7 by @lev112:
What is the right way to compile with Scala 2.10?

I've tried setting

```scala
scalaVersion := "2.10.4"
```

in Build.scala, and it compiles, but I see some strange behavior. When I try to import the Spark package, it works fine in the console, but in the notebook I get an error:

```
error: object spark is not a member of package org.apache
```
Is it a bug or did I do something wrong? (in IScala 0.1 the same code works fine)
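For context, a minimal Build.scala sketch of the change lev112 describes (the project and object names here are illustrative, not IScala's actual build definition):

```scala
import sbt._
import Keys._

object IScalaBuild extends Build {
  // Pin the build to Scala 2.10.x instead of the default 2.11 target.
  lazy val root = Project(id = "iscala", base = file("."))
    .settings(scalaVersion := "2.10.4")
}
```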