So apparently there are other problems:
/home/stephane/.m2/repository/org/apache/spark/spark-core_2.10/1.2.1/spark-core_2.10-1.2.1.jar(org/apache/spark/Accumulable.class): warning: Cannot find annotation method 'bytes()' in type 'ScalaSignature'
test/src/com/redhat/ceylon/compiler/java/test/cmr/modules/apachespark/test.ceylon:23: error: the 'ExpressionVisitor' caused an exception visiting a 'InvocationExpression' node: '"com.sun.tools.javac.code.Symbol$CompletionFailure: class file for scala.collection.Seq not found"'
SparkConf conf = SparkConf();
^
So this is fixed in master, where you just have to add a dependency on Scala in your module:
module com.redhat.ceylon.compiler.java.test.cmr.modules.apachespark "1" {
    import "org.apache.spark:spark-core_2.10" "1.2.1";
    import "org.scala-lang:scala-library" "2.10.4";
    import java.base "7";
}
Then you can use Spark like this:
import java.lang { JString = String, JBoolean = Boolean }
import org.apache.spark { ... }
import org.apache.spark.api.java { ... }
import org.apache.spark.api.java.\ifunction { ... }
shared void run() {
    SparkConf conf = SparkConf().setMaster("local").setAppName("My App");
    JavaSparkContext sc = JavaSparkContext(conf);
    value textFile = sc.textFile("README.md");
    print(textFile.count());
    print(textFile.first());
    // How many lines contain "Ceylon"?
    print(textFile.filter(object satisfies Function<JString,JBoolean> {
        shared actual JBoolean call(JString line) => JBoolean(line.contains(JString("Ceylon")));
    }).count());
}
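As a rough sketch (this invocation is not from the original comment), compiling it should just be a normal module compilation, assuming your setup can resolve the Maven artifacts declared in the module descriptor:
ceylon compile com.redhat.ceylon.compiler.java.test.cmr.modules.apachespark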
At runtime, it's easier to run with ceylon run --flat-classpath --maven-overrides overrides.xml
with this overrides file:
<overrides>
    <!-- Otherwise some artifact tries to use 1.1.0 which does not exist -->
    <set groupId="org.jboss.weld" artifactId="weld-osgi-bundle" version="1.1.4.Final"/>
    <!-- Otherwise some artifact tries to use a version with a missing method definition (old?) -->
    <set groupId="xerces" artifactId="xercesImpl" version="2.11.0"/>
    <!-- Otherwise we have two artifacts with the same contents, they got renamed -->
    <replace groupId="org.jboss.netty" artifactId="netty">
        <with groupId="io.netty" artifactId="netty" version="3.5.13.Final"/>
    </replace>
</overrides>
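For reference, putting the pieces together (the module name and version are taken from the descriptor above; the exact invocation is an assumption, not from the original comment), the run command would look something like:
ceylon run --flat-classpath --maven-overrides overrides.xml com.redhat.ceylon.compiler.java.test.cmr.modules.apachespark/1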
Great work, Stef!
For 1.1 you don't have --flat-classpath, so you are going to need a much bigger module override. I'll work on this. You also have to avoid wildcard imports for Scala libraries, like import org.apache.spark { ... }, and specify the classes manually; otherwise it triggers a javac bug due to bogus Scala class files (which I work around in 1.2).
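For example, the explicit equivalent of the wildcard imports used above would look something like this (the class names are just the ones from the earlier example; the exact set depends on what your code uses):
import org.apache.spark { SparkConf }
import org.apache.spark.api.java { JavaSparkContext }
import org.apache.spark.api.java.\ifunction { Function }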
Trying Apache Spark leads to exceptions in the model loader, such as an NPE when completing packages which have a class named package (Scala has those, and package$ too), or completion failures for SparkContext$1.