Open JohnCunningham opened 8 years ago
Hi @JohnCunningham,
You are right, the Spark-MongoDB connector is not compatible with MongoDB 3.2 yet. We currently use Casbah 2.8.0, which relies on java-driver 2.13 (see the compatibility matrix).
We'd like to migrate from Casbah to the new MongoDB Scala driver, and we'll take up this issue in an upcoming release.
Thanks
I ran into the same issue too. You can still make it work with the 3.x Mongo driver by forcing sbt to build your project with a newer version of Casbah, adding this to your build:
libraryDependencies += "org.mongodb" % "casbah_2.10" % "3.1.1"
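For context, here is a minimal build.sbt sketch of that override. The connector coordinates and all version numbers below are assumptions for illustration; check Maven Central and match them to your Scala/Spark setup.

```scala
// Minimal build.sbt sketch (versions are illustrative, not prescriptive)
scalaVersion := "2.10.6"

libraryDependencies ++= Seq(
  // Spark SQL, provided by the cluster at runtime
  "org.apache.spark" %% "spark-sql" % "1.6.1" % "provided",
  // Stratio Spark-MongoDB connector (artifact/version are an assumption)
  "com.stratio.datasource" % "spark-mongodb_2.10" % "0.11.2",
  // Force the newer Casbah so the underlying MongoDB Java driver supports 3.x servers
  "org.mongodb" % "casbah_2.10" % "3.1.1"
)
```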
I successfully integrated Spark 1.6.1 with MongoDB 3.0.11 using the Stratio Spark-MongoDB connector. Add all the jar files mentioned in the docs (https://github.com/Stratio/spark-mongodb/blob/master/doc/src/site/sphinx/First_Steps.rst), except for one jar version.
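For anyone following the First_Steps doc, a rough spark-shell sketch of what the read path looks like once the jars are on the classpath; the host, database, and collection names are placeholders:

```scala
import org.apache.spark.sql.SQLContext

// In spark-shell, `sc` is the pre-built SparkContext
val sqlContext = new SQLContext(sc)

// Read a MongoDB collection through the Stratio datasource
val df = sqlContext.read
  .format("com.stratio.datasource.mongodb")
  .options(Map(
    "host"       -> "localhost:27017", // placeholder MongoDB host
    "database"   -> "test",            // placeholder database name
    "collection" -> "people"           // placeholder collection name
  ))
  .load()

df.printSchema()
df.show()
```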
We are looking to test/prototype MongoDB and would like to integrate it with Spark using the Stratio Spark-MongoDB connector. The latest version of MongoDB is 3.2.x. Are there compatible versions of Spark and the Stratio Spark-MongoDB connector that would work with MongoDB 3.2?
I presume from the absence of version 3.2 in the README.md that the answer is no. Just double-checking.
Thanks