I grabbed your build locally on my Mac, but could not reproduce the issue.
I got "[info] Including from cache: ..." for all jars. The log you show seems to show mixed hit on cache, which is weird if you're just running assembly
three times in a row.
Hrm, I'll see if anyone else gets your outcome vs mine. Thanks.
Any updates? I cannot reproduce the issue by running either
$ sbt "; clean; assembly; clean; assembly; clean; assembly"
or
$ sbt "; assembly; assembly; assembly"
Hi @eed3si9n, thanks for checking back. The issue is resolved; it was related to some configuration in my assembly setup.
I'm closing this then.
Hello @helena - could you mention how you resolved this error? I am getting the same exception trace. However, my build is not successful even after running clean and assembly. Also, I believe there is something funky with my build.sbt, pasted below:
import AssemblyKeys._
name := "spark_streaming"
version := "1.0"
scalaVersion := "2.10.3"
ivyScala := ivyScala.value map { _.copy(overrideScalaVersion = true) }
parallelExecution := false
parallelExecution in Test := false
libraryDependencies ++= Seq(
"org.apache.spark" % "spark-core_2.10" % "1.2.0",
"org.apache.spark" % "spark-streaming_2.10" % "1.2.0",
"org.apache.spark" % "spark-streaming-kafka_2.10" % "1.2.0",
"org.apache.hadoop" % "hadoop-client" % "2.6.0",
"org.apache.kafka" % "kafka_2.10" % "0.8.2.0",
"joda-time" % "joda-time" % "2.7",
"net.java.dev.jna" % "jna" % "4.1.0",
"net.java.dev.jna" % "jna-platform" % "4.1.0")
resolvers += "Maven repo" at "https://oss.sonatype.org/content/repositories/snapshots"
unmanagedJars in Compile += file("/custom-SNAPSHOT.jar")
Revolver.settings
assemblySettings
// skip running tests during the assembly phase
test in assembly := {}
Hrm, I don't recall exactly; it was possibly related to another process modifying the output dir, or something minor in the build.
For posterity, this error was resolved. It required a bunch of tweaks in my build.sbt file.
Things to look out for when you encounter this error: overlapping jars, overly broad excludes, and how conflicting files get merged (see the sketch below).
Also, as a suggestion for debugging, create a new sbt project and add the libraryDependencies one by one in build.sbt, building an assembly jar after each addition. This will help you pin down the dependency that causes the error.
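A minimal sketch of that kind of merge override, using the 0.11.x plugin syntax from this thread (import AssemblyKeys._ plus assemblySettings); the patterns are illustrative only, so adjust them to whatever files actually conflict in your jars:
mergeStrategy in assembly <<= (mergeStrategy in assembly) { old =>
  {
    // illustrative patterns; match them to the conflicts assembly reports
    case PathList("META-INF", xs @ _*) => MergeStrategy.discard
    case "application.conf"            => MergeStrategy.concat
    case x                             => old(x)
  }
}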
Here is my updated build.sbt. My project needs spark-streaming and the Spray framework as dependencies:
import AssemblyKeys._
name := "test-sbt"
version := "1.0"
scalaVersion := "2.11.5"
resolvers ++= Seq(
"Sonatype OSS Snapshots" at "https://oss.sonatype.org/content/repositories/snapshots",
"jgit-repository" at "http://download.eclipse.org/jgit/maven",
"releases" at "http://oss.sonatype.org/content/repositories/releases",
"maven" at "https://repo1.maven.org/maven2/",
"spray repo" at "http://repo.spray.io"
)
libraryDependencies ++= {
val sprayVersion = "1.3.3"
val akkaVersion = "2.3.9"
val sprayJsonVersion = "1.3.1"
val scalaTestVersion = "2.2.1"
Seq(
// excluding transitive deps; building a fat jar with Spark is painful. See https://github.com/sbt/sbt-assembly#exclude-specific-transitive-deps
"org.apache.spark" %% "spark-core" % "1.3.0"
exclude("com.twitter", "parquet-format")
exclude("org.mortbay.jetty", "servlet-api")
exclude("commons-beanutils", "commons-beanutils-core")
exclude("commons-collections", "commons-collections")
exclude("commons-logging", "commons-logging")
exclude("com.esotericsoftware.minlog", "minlog")
exclude("org.apache.spark", "spark-network-common_2.11")
exclude("org.apache.spark", "spark-network-shuffle_2.11")
exclude("org.apache.hadoop", "hadoop-yarn-common")
exclude("org.spark-project.spark", "unused"),
"org.apache.spark" %% "spark-streaming" % "1.3.0"
exclude("org.spark-project.spark", "unused")
exclude("org.apache.spark", "spark-core"),
// JNA dependencies
"net.java.dev.jna" % "jna" % "4.1.0",
"net.java.dev.jna" % "jna-platform" % "4.1.0",
// Spray client libs
// spark-core already bundles akka-actor, so including it again causes a deduplicate error. Hence the 'provided' configuration (see the note after this build file).
// https://github.com/sbt/sbt-assembly#-provided-configuration
"com.typesafe.akka" %% "akka-actor" % akkaVersion % "provided",
"io.spray" %% "spray-client" % sprayVersion,
"io.spray" %% "spray-json" % sprayJsonVersion,
// Scala test libs
"org.scalatest" %% "scalatest" % scalaTestVersion % "test"
)
}
ivyScala := ivyScala.value map {
_.copy(overrideScalaVersion = true)
}
Revolver.settings
Revolver.reColors := Seq("blue", "green", "magenta")
unmanagedJars in Compile += file("custom.jar")
assemblySettings
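One caveat with the 'provided' trick above: marking akka-actor as provided keeps it out of the fat jar, but it also drops it from the classpath of sbt run. The sbt-assembly docs suggest restoring the full classpath for run with an override along these lines:
run in Compile <<= Defaults.runTask(fullClasspath in Compile, mainClass in (Compile, run), runner in (Compile, run))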
@eed3si9n this is caused by particular missing dependencies, ones that may have been excluded directly or indirectly in the build file. Once those are fixed, the error is resolved.
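To illustrate with a hypothetical line (not from any build in this thread): an exclude like the one below drops guava from spark-core's transitive graph, so if your own code never references it directly you get no resolution error at compile time, yet anything that still needs it at runtime will find it missing from the fat jar:
// hypothetical illustration: the excluded artifact may still be needed at runtime
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.3.0" exclude("com.google.guava", "guava")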
For future reference, it's easier to pull the jar file from the following:
http://mvnrepository.com/artifact/com.datastax.spark/spark-cassandra-connector_2.10/1.4.0
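If you'd rather let sbt resolve it than download the jar by hand, the same artifact from that page can be declared as a dependency (this is the _2.10 build, so it assumes a 2.10.x scalaVersion):
libraryDependencies += "com.datastax.spark" % "spark-cassandra-connector_2.10" % "1.4.0"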
@helena - regarding "this is caused by missing particular dependencies that may be excluded directly or indirectly in a build file. Just fix this and it is resolved." - could you please provide more details? If you had a missing dependency, wouldn't you just get an error when trying to resolve a class, e.g. new com.mycompany.Foobar, that no dependency in build.sbt provides?
Thanks
In trying to reproduce and fix https://github.com/datastax/spark-cassandra-connector/issues/459, I created a repo with this build: https://github.com/helena/fu/blob/master/project/FuBuild.scala. Each run gives me one of these outcomes; I can't get 3 repeatable successes with the same build file, no changes.
sbt 0.13.7, Scala 2.10.4, JDK 1.7.0_71, Mac OS X 10.9.5
The failures look like this:
... many many more lines of the same pattern, different jars