dportabella opened 8 years ago
I also tried another approach: to include the warcbase as an unmanaged dependency.
I put `warcbase-core-0.1.0-SNAPSHOT-fatjar.jar` in `myproject/lib/` and keep a simple `build.sbt`:

```scala
libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % sparkVersion
)
```
When I run it from the terminal, I get this error:

```
$ sbt -Dspark.master=local[2] run
ERROR SparkContext - Error initializing SparkContext.
com.typesafe.config.ConfigException$Missing: No configuration setting found for key 'akka.event-handlers'
```
What could be the problem?
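For what it's worth, I've read that this particular Akka error usually means a fat jar was assembled without concatenating the per-jar `reference.conf` files, so Akka's default settings are lost. If the fat jar were rebuilt with the sbt-assembly plugin, a merge strategy along these lines would preserve them — this is a sketch on my part, not warcbase's actual build configuration:

```scala
// build.sbt fragment (sketch, assumes the sbt-assembly plugin):
// concatenate all reference.conf files when building the fat jar,
// so Akka's defaults (including akka.event-handlers) survive the merge.
assemblyMergeStrategy in assembly := {
  case "reference.conf" => MergeStrategy.concat
  case x =>
    val oldStrategy = (assemblyMergeStrategy in assembly).value
    oldStrategy(x)
}
```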
As the `warcbase-core` artifact is not yet published in any repository (neither snapshots nor releases), I first build and install it locally; then, for my SBT project, I add it as a dependency in `build.sbt`.
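Concretely, the setup might look like the following sketch (the `org.warcbase` group id is my guess; I install the warcbase-core SNAPSHOT into the local Maven repository first, e.g. with `mvn install`, and then resolve it from there):

```scala
// build.sbt (sketch): resolve warcbase-core from the local Maven
// repository after installing it there with `mvn install`.
resolvers += Resolver.mavenLocal

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % sparkVersion,
  "org.warcbase" % "warcbase-core" % "0.1.0-SNAPSHOT"
)
```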
When I run `sbt -Dspark.master=local[2] run`, I get this exception:

```
Exception in thread "main" java.lang.NoClassDefFoundError: org/w3c/dom/ElementTraversal
```

My program (for now just a SparkPi example that does not use warcbase) runs fine if I remove the warcbase-core dependency from `build.sbt`. I also inspected the dependency graph with `sbt dependencyTree`.
I don't really understand the problem, but on this post they propose adding an `xml-apis:1.4.01` dependency, and indeed it works with that added to `build.sbt`. The question is: why do I need to add that dependency? What is the actual problem?
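For reference, the workaround amounts to one extra line in `build.sbt` (`xml-apis:xml-apis:1.4.01` is a real artifact on Maven Central):

```scala
// Workaround: pin xml-apis explicitly. org.w3c.dom.ElementTraversal is
// missing from older xml-apis versions that dependency resolution can
// otherwise pick, so pinning 1.4.01 puts the class on the classpath.
libraryDependencies += "xml-apis" % "xml-apis" % "1.4.01"
```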
And while this works from the command line, it still fails with the same error when I run it from IntelliJ.
Also, it would be great to publish the warcbase-core artifact (snapshots and releases) to a repository. Do you plan to do that? Can I help with this?