memsql / singlestore-spark-connector

A connector for SingleStore and Spark
Apache License 2.0

Add support for Scala 2.11.7 #8

Closed tsindot closed 7 years ago

tsindot commented 8 years ago

First, thanks for the great project; it's just what I needed for a given use case.

Can the build.sbt be modified to add crossScalaVersions and set scalaVersion := "2.11.7"? I bumped the Java version to 1.8 because I wanted to test the deploy locally, but that should not be necessary. Here is the diff from my system; I will be glad to put together a PR if you feel this is worth doing.

RADTech-MBP:memsql-spark-connector tnist$ git diff build.sbt
diff --git a/build.sbt b/build.sbt
index 0c2e410..e5b401e 100644
--- a/build.sbt
+++ b/build.sbt
@@ -14,7 +14,8 @@ lazy val testScalastyle = taskKey[Unit]("testScalastyle")
 lazy val commonSettings = Seq(
   organization := "com.memsql",
   version := "1.2.1-SNAPSHOT",
-  scalaVersion := "2.10.5",
+  scalaVersion := "2.11.7",
+  crossScalaVersions := Seq("2.10.4", "2.11.7"),
   assemblyScalastyle := org.scalastyle.sbt.ScalastylePlugin.scalastyle.in(Compile).toTask("").value,
   assembly <<= assembly dependsOn assemblyScalastyle,
   testScalastyle := org.scalastyle.sbt.ScalastylePlugin.scalastyle.in(Test).toTask("").value,
@@ -54,7 +55,7 @@ lazy val commonSettings = Seq(
   publishMavenStyle := true,
   publishArtifact in Test := false,
   pomIncludeRepository := { _ => false },
-  javaVersionPrefix in javaVersionCheck := Some("1.7")
+  javaVersionPrefix in javaVersionCheck := Some("1.8")
 )

 lazy val connectorLib = (project in file("connectorLib")).

-Todd
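
For reference, a sketch of how the proposed change would be used: with crossScalaVersions set as in the diff, sbt's "+" task prefix runs a task once per listed Scala version (commands illustrative, assuming the build.sbt above).

```shell
# Sketch, assuming crossScalaVersions := Seq("2.10.4", "2.11.7") as in
# the diff: the "+" prefix makes sbt repeat the task for each version.
sbt +compile          # compile against both 2.10.4 and 2.11.7
sbt +test             # run the test suite against both versions
sbt +publishLocal     # publish a _2.10 and a _2.11 artifact locally
```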

choochootrain commented 8 years ago

Until https://issues.apache.org/jira/browse/SPARK-6363 is resolved, I don't think we will support Scala 2.11, but you can open a PR for when that change is made in Spark.

tsindot commented 8 years ago

I see. Which artifacts that the memsql connector depends on are not currently published under 2.11 by the Spark project? The majority of the modules are now cross-built, including spark-core, spark-streaming, spark-streaming-kafka, and spark-sql. The things I do see missing are spark-jdbc and the Thrift server, but I don't believe this project depends on those. I will put together a PR with the cross build and submit it. Hopefully you will be able to publish the 2.11 artifacts; if not, I will fall back to publishing them to our local Artifactory until this is resolved. TIA for the feedback.

-Todd
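
To illustrate why the published artifacts matter here: cross-built Scala artifacts carry a binary-version suffix (e.g. spark-core_2.10 vs spark-core_2.11), derived from the major.minor part of the Scala version. A minimal sketch of that naming convention (not sbt's actual implementation; `CrossName` and the artifact base name are hypothetical):

```scala
// Sketch of Scala's cross-version artifact naming convention.
// sbt's %% operator appends this suffix automatically at resolve time.
object CrossName {
  // "2.11.7" -> "2.11" (the binary-compatibility version)
  def binaryVersion(scalaVersion: String): String =
    scalaVersion.split('.').take(2).mkString(".")

  // "spark-core" + "2.11.7" -> "spark-core_2.11"
  def artifactName(base: String, scalaVersion: String): String =
    s"${base}_${binaryVersion(scalaVersion)}"
}

println(CrossName.artifactName("spark-core", "2.11.7")) // spark-core_2.11
```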

choochootrain commented 8 years ago

Correct, it supports 2.11 for everything we need, but until we've tested it extensively we won't publish the artifacts. Please feel free to put together that PR; once 1.2.1 is shipped I'll test and publish artifacts for the next release.

tsindot commented 8 years ago

@choochootrain Sorry, messed up the PR, closed it and created a new one.

-Todd

btrofimov commented 8 years ago

@choochootrain - https://issues.apache.org/jira/browse/SPARK-6363 is now resolved. Are there any plans to publish memsql-spark-connector with 2.11 support?

choochootrain commented 8 years ago

memsql-spark-connector runs on Spark 1.5.1 at the moment because it uses some of Spark's internal API for SQL pushdown support. Publishing with 2.11 support is blocked until memsql-spark-connector is ported to Spark 2.0; at that point, publishing with 2.11 should be straightforward.
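
For context, once cross-built Spark artifacts are in play, the dependencies would typically be declared with sbt's %% operator, which appends the Scala binary-version suffix so the same build.sbt resolves the _2.10 or _2.11 artifact depending on scalaVersion. A minimal sketch (this is not the project's actual dependency list; version numbers are illustrative):

```scala
// Hypothetical build.sbt fragment: %% appends "_2.10" or "_2.11"
// to each artifact name based on the current scalaVersion.
libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core"      % "1.5.1" % "provided",
  "org.apache.spark" %% "spark-sql"       % "1.5.1" % "provided",
  "org.apache.spark" %% "spark-streaming" % "1.5.1" % "provided"
)
```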