databricks / spark-corenlp

Stanford CoreNLP wrapper for Apache Spark
GNU General Public License v3.0
422 stars 120 forks

Can't execute your example code using Java #6

Closed wojtuch closed 8 years ago

wojtuch commented 8 years ago

Hello,

I'm doing some tests with your CoreNLP wrapper, but I'm unable to execute the example code you provided:

import org.apache.spark.sql.DataFrame;
import com.databricks.spark.corenlp.CoreNLP;

DataFrame df = sqlContext.read().json("corenlptest.json");
CoreNLP coreNLP = new CoreNLP()
      .setInputCol("text")
      .setAnnotators(new String[]{"tokenize", "ssplit", "lemma"})
      .setFlattenNestedFields(new String[]{"sentence_token_word"})
      .setOutputCol("parsed");
DataFrame outputDF = coreNLP.transform(df);

The stack trace is:

Exception in thread "main" java.lang.NoSuchMethodError: scala.collection.immutable.$colon$colon.hd$1()Ljava/lang/Object;
	at com.databricks.spark.corenlp.CoreNLP$.extractElementType(CoreNLP.scala:170)
	at com.databricks.spark.corenlp.CoreNLP$.com$databricks$spark$corenlp$CoreNLP$$flattenStructField(CoreNLP.scala:162)
	at com.databricks.spark.corenlp.CoreNLP$$anonfun$2.apply(CoreNLP.scala:90)
	at com.databricks.spark.corenlp.CoreNLP$$anonfun$2.apply(CoreNLP.scala:89)
	at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:245)
	at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:245)
	at scala.collection.IndexedSeqOptimized$class.foreach(IndexedSeqOptimized.scala:33)
	at scala.collection.mutable.ArrayOps$ofRef.foreach(ArrayOps.scala:186)
	at scala.collection.TraversableLike$class.map(TraversableLike.scala:245)
	at scala.collection.mutable.ArrayOps$ofRef.map(ArrayOps.scala:186)
	at com.databricks.spark.corenlp.CoreNLP.outputSchema(CoreNLP.scala:89)
	at com.databricks.spark.corenlp.CoreNLP.transform(CoreNLP.scala:80)
	at com.test.CoreNLPSpark.main(CoreNLPSpark.java:35)

Do you have an idea what could cause this? Is my usage wrong, or can this be considered a bug?

W.

mengxr commented 8 years ago

You were using a different version of Scala; the `NoSuchMethodError` on a Scala collections method indicates a binary incompatibility between Scala versions on the classpath. spark-corenlp was compiled with Scala 2.10.
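A sketch of what the fix typically looks like on the build side, assuming a Maven project. The coordinates and version numbers below are illustrative, not taken from this thread; check the project's README for the exact artifacts. The point is that every Scala-based dependency must carry the same Scala binary suffix (here `_2.10`), matching the Scala version spark-corenlp was compiled with:

```xml
<!-- Illustrative pom.xml fragment: all Scala-based artifacts must share one
     Scala binary version (here 2.10, matching spark-corenlp). Mixing a
     Scala 2.11 spark-sql with a 2.10-compiled library produces exactly the
     kind of NoSuchMethodError shown in the stack trace above. -->
<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-sql_2.10</artifactId>  <!-- note the _2.10 suffix -->
  <version>1.6.1</version>                 <!-- illustrative version -->
</dependency>
<dependency>
  <groupId>databricks</groupId>            <!-- verify against the README -->
  <artifactId>spark-corenlp</artifactId>
  <version>0.2</version>                   <!-- illustrative version -->
</dependency>
```

To check which Scala version actually ends up on the classpath at runtime, you can print `scala.util.Properties.versionString` (a real Scala API, callable from Java as `scala.util.Properties.versionString()`) from your driver program.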