tsindot closed this issue 7 years ago
Until https://issues.apache.org/jira/browse/SPARK-6363 is resolved, I don't think we will support Scala 2.11, but you can open a PR for when that change is made in Spark.
I see. Which artifacts are not currently published under 2.11 by the Spark project that the memsql connector depends on? The majority of the modules are now cross-built, including spark-core, spark-streaming, spark-streaming-kafka, and spark-sql. The only things I see missing are spark-jdbc and the Thrift server, but I don't believe this project depends on those. I will put together a PR with the cross build and submit it. Hopefully you will be able to publish the 2.11 artifacts; if not, I will fall back to publishing them to our local Artifactory until this is resolved. TIA for the feedback.
-Todd
Correct, it supports 2.11 for everything we need, but until we've tested it extensively we won't publish the artifacts. Please feel free to put together that PR - once 1.2.1 is shipped I'll test and publish artifacts for the next release.
@choochootrain Sorry, messed up the PR, closed it and created a new one.
-Todd
@choochootrain - https://issues.apache.org/jira/browse/SPARK-6363 is already resolved. Are there any plans to publish memsql-spark-connector with 2.11 support?
memsql-spark-connector runs on Spark 1.5.1 at the moment because it uses some of the internal API for SQL pushdown support. Publishing with 2.11 support is blocked until memsql-spark-connector is ported to Spark 2.0 - at that point, publishing with 2.11 should be straightforward.
First, thanks for the great project - just what I needed for a given use case.
Can the `build.sbt` be modified to add `crossScalaVersions` and set `scalaVersion := "2.11.7"`? I bumped the Java version to 1.8 as I wanted to test the deploy locally, but that should not be necessary. Here is the diff from my system; I will be glad to put together a PR if you feel this is worth doing.

-Todd
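For reference, a minimal sketch of the kind of `build.sbt` change being proposed. The version numbers here are illustrative (2.10.5 is a guess at the project's existing Scala version), and the connector's real build file will have more settings:

```scala
// build.sbt - hypothetical sketch of enabling a Scala 2.10/2.11 cross build.
// The actual memsql-spark-connector build definition may differ.

scalaVersion := "2.11.7"

// Listing both versions lets sbt build and publish for each one.
crossScalaVersions := Seq("2.10.5", "2.11.7")
```

With this in place, prefixing a task with `+` (e.g. `sbt +compile` or `sbt +publishLocal`) runs it once per listed Scala version, producing artifacts suffixed `_2.10` and `_2.11`.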