Stratio / Spark-MongoDB

Spark library for easy MongoDB access
http://www.stratio.com
Apache License 2.0

spark.sqlContext.fromMongoDB(readConfig) retrieve error #173

Open dark-spark2 opened 7 years ago

dark-spark2 commented 7 years ago

Hi, I installed Spark 2.0.2 and ran the spark-shell.

    bin]$ ./spark-shell --jars ~/spark-mongodb_2.10-0.11.2.jar --packages org.mongodb:casbah-core_2.10:3.0.0
    Ivy Default Cache set to: /home/centos/.ivy2/cache
    The jars for the packages stored in: /home/centos/.ivy2/jars
    :: loading settings :: url = jar:file:/home/centos/spark/spark-2.0.2-bin-hadoop2.7/jars/ivy-2.4.0.jar!/org/apache/ivy/core/settings/ivysettings.xml
    org.mongodb#casbah-core_2.10 added as a dependency
    :: resolving dependencies :: org.apache.spark#spark-submit-parent;1.0
        confs: [default]
        found org.mongodb#casbah-core_2.10;3.0.0 in central
        found org.mongodb#casbah-commons_2.10;3.0.0 in central
        found com.github.nscala-time#nscala-time_2.10;1.0.0 in central
        found joda-time#joda-time;2.3 in central
        found org.joda#joda-convert;1.2 in central
        found org.mongodb#mongo-java-driver;3.0.4 in central
        found org.slf4j#slf4j-api;1.6.0 in central
        found org.mongodb#casbah-query_2.10;3.0.0 in central
    :: resolution report :: resolve 377ms :: artifacts dl 13ms
        :: modules in use:
        com.github.nscala-time#nscala-time_2.10;1.0.0 from central in [default]
        joda-time#joda-time;2.3 from central in [default]
        org.joda#joda-convert;1.2 from central in [default]
        org.mongodb#casbah-commons_2.10;3.0.0 from central in [default]
        org.mongodb#casbah-core_2.10;3.0.0 from central in [default]
        org.mongodb#casbah-query_2.10;3.0.0 from central in [default]
        org.mongodb#mongo-java-driver;3.0.4 from central in [default]
        org.slf4j#slf4j-api;1.6.0 from central in [default]

    |                  |            modules            ||   artifacts   |
    |       conf       | number| search|dwnlded|evicted|| number|dwnlded|
    ---------------------------------------------------------------------
    |      default     |   8   |   0   |   0   |   0   ||   8   |   0   |
    ---------------------------------------------------------------------

    :: retrieving :: org.apache.spark#spark-submit-parent
        confs: [default]
        0 artifacts copied, 8 already retrieved (0kB/8ms)
    Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
    Setting default log level to "WARN".
    To adjust logging level use sc.setLogLevel(newLevel).
    17/01/23 14:17:53 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
    17/01/23 14:17:55 WARN SparkContext: Use an existing SparkContext, some configuration may not take effect.
    Spark context Web UI available at http://172.31.19.188:4040
    Spark context available as 'sc' (master = local[*], app id = local-1485181074885).
    Spark session available as 'spark'.
    Welcome to


          ____              __
         / __/__  ___ _____/ /__
        _\ \/ _ \/ _ `/ __/  '_/
       /___/ .__/\_,_/_/ /_/\_\   version 2.0.2
          /_/

    Using Scala version 2.11.8 (Java HotSpot(TM) 64-Bit Server VM, Java 1.8.0_111)
    Type in expressions to have them evaluated.
    Type :help for more information.

scala> import org.apache.spark.sql._
import org.apache.spark.sql._

scala> import com.mongodb.casbah.{WriteConcern => MongodbWriteConcern}
import com.mongodb.casbah.{WriteConcern=>MongodbWriteConcern}

scala> import com.stratio.datasource.mongodb._
import com.stratio.datasource.mongodb._

scala> import com.stratio.datasource.mongodb.config._
import com.stratio.datasource.mongodb.config._

scala> import com.stratio.datasource.mongodb.config.MongodbConfig._
import com.stratio.datasource.mongodb.config.MongodbConfig._

scala> val builder = MongodbConfigBuilder(Map(Host -> List("localhost:27017"), Database -> "db1", Collection -> "coll1", SamplingRatio -> 0.001, WriteConcern -> "normal"))
builder: com.stratio.datasource.mongodb.config.MongodbConfigBuilder = MongodbConfigBuilder(Map(database -> db1, writeConcern -> normal, schema_samplingRatio -> 0.001, collection -> coll1, host -> List(localhost:27017)))

scala>

scala> val readConfig = builder.build()
readConfig: com.stratio.datasource.util.Config = com.stratio.datasource.util.ConfigBuilder$$anon$1@f3cee0fa

scala> val mongoRDD = spark.sqlContext.fromMongoDB(readConfig)
java.lang.NoSuchMethodError: com.stratio.datasource.mongodb.MongodbContext.fromMongoDB(Lcom/stratio/datasource/util/Config;Lscala/Option;)Lorg/apache/spark/sql/Dataset;
  ... 56 elided
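One detail worth noting: the banner reports "Using Scala version 2.11.8", while both the library jar and the casbah packages carry a `_2.10` suffix. Mixing Scala 2.10 artifacts into a Scala 2.11 Spark build is binary-incompatible and typically surfaces at call time exactly as a `NoSuchMethodError`, so a `_2.11` build of the datasource is likely needed. With matching artifacts on the classpath, the same read can also be expressed through Spark's generic DataFrame reader; the sketch below is an assumption-laden illustration (the format name and option keys follow the Stratio documentation, and the host/database/collection values mirror the config above):

```scala
// Sketch, assuming a Scala-2.11 build of spark-mongodb is on the classpath
// and a MongoDB instance is reachable at localhost:27017.
val df = spark.read
  .format("com.stratio.datasource.mongodb") // the datasource's documented format name
  .option("host", "localhost:27017")
  .option("database", "db1")
  .option("collection", "coll1")
  .load()

df.show()
```

This avoids the implicit `fromMongoDB` extension entirely, so even if the implicit API drifted between releases, the reader path only depends on the datasource being resolvable by name.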

sanjosh commented 7 years ago

I am seeing the same error. Did you find a workaround, @dark-spark2?

cyjj commented 7 years ago

I am currently getting the same error using Spark 2.1.0. Did you find a solution for this?

thirumalalagu commented 5 years ago

I am still getting the same error using Spark 2.3.1.