springml / spark-salesforce

Spark data source for Salesforce
Apache License 2.0

Getting NoSuchMethodError for org.apache.http.client.methods.HttpGet.setConfig #50

Open ramkrishnautpat opened 4 years ago

ramkrishnautpat commented 4 years ago

Hello all, I am getting a NoSuchMethodError for org.apache.http.client.methods.HttpGet.setConfig when running on a Cloudera VM (CentOS Linux). The same code runs fine from the IDE on a Windows machine. I have also explicitly added the httpclient dependency. Below are my project details.
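HttpGet.setConfig(RequestConfig) only exists in httpclient 4.3 and later, so this error usually means an older httpclient from the cluster classpath (CDH ships a pre-4.3 copy with Hadoop) is being loaded instead of the one bundled into the jar-with-dependencies. As a minimal sketch (the class name is hypothetical, not part of the project), you can print which jar actually supplies HttpGet at runtime:

    import org.apache.http.client.methods.HttpGet;

    // Hypothetical diagnostic: prints the jar that HttpGet was loaded from.
    // On the Cloudera VM this is expected to point at an old Hadoop-provided
    // httpclient, while in the Windows IDE it points at httpclient-4.3.2.jar.
    public class HttpClientProbe {
        public static void main(String[] args) {
            System.out.println(HttpGet.class
                    .getProtectionDomain()
                    .getCodeSource()
                    .getLocation());
        }
    }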

My pom.xml:

<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">

<modelVersion>4.0.0</modelVersion>
<groupId>com.decisionmines.etl.bigdata.spark</groupId>
<artifactId>salesforce_connector</artifactId>
<version>1.0-SNAPSHOT</version>
<build>
    <plugins>
        <plugin>
            <groupId>org.apache.maven.plugins</groupId>
            <artifactId>maven-compiler-plugin</artifactId>
            <configuration>
                <source>8</source>
                <target>8</target>
            </configuration>
        </plugin>
        <plugin>
            <artifactId>maven-assembly-plugin</artifactId>
            <executions>
                <execution>
                    <phase>package</phase>
                    <goals>
                        <goal>single</goal>
                    </goals>
                </execution>
            </executions>
            <configuration>
                <descriptorRefs>
                    <descriptorRef>jar-with-dependencies</descriptorRef>
                </descriptorRefs>
            </configuration>
        </plugin>
    </plugins>
</build>

<dependencies>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-core_2.11</artifactId>
        <version>2.2.0</version>
        <scope>provided</scope>
    </dependency>

    <!-- https://mvnrepository.com/artifact/org.apache.spark/spark-sql -->
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-sql_2.11</artifactId>
        <version>2.2.0</version>
        <scope>provided</scope>
    </dependency>
    <dependency>
        <groupId>com.springml</groupId>
        <artifactId>spark-salesforce_2.11</artifactId>
        <version>1.1.3</version>
    </dependency>
    <dependency>
        <groupId>org.apache.httpcomponents</groupId>
        <artifactId>httpclient</artifactId>
        <version>4.3.2</version>
    </dependency>
    <dependency>
        <groupId>org.apache.httpcomponents</groupId>
        <artifactId>httpcore</artifactId>
        <version>4.3.3</version>
    </dependency>
</dependencies>
</project>
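Note that the declared httpclient 4.3.2 does contain setConfig, so the POM itself looks fine; the question is which jar wins at runtime. One hedged fix, assuming the conflict comes from the cluster's older httpclient, is to relocate the httpclient packages with the maven-shade-plugin instead of using the assembly plugin, for example (the shaded prefix is illustrative):

    <plugin>
        <groupId>org.apache.maven.plugins</groupId>
        <artifactId>maven-shade-plugin</artifactId>
        <version>3.2.4</version>
        <executions>
            <execution>
                <phase>package</phase>
                <goals>
                    <goal>shade</goal>
                </goals>
                <configuration>
                    <relocations>
                        <!-- Move httpclient/httpcore classes into a private namespace
                             so the cluster's older copy cannot shadow them. -->
                        <relocation>
                            <pattern>org.apache.http</pattern>
                            <shadedPattern>shaded.org.apache.http</shadedPattern>
                        </relocation>
                    </relocations>
                </configuration>
            </execution>
        </executions>
    </plugin>

The relocation rewrites both the bundled httpclient classes and the spark-salesforce bytecode that calls them, so HTTPHelper resolves the 4.3 setConfig regardless of what Hadoop puts on the classpath.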

Error log:

20/02/27 04:53:29 INFO session.SessionState: Created local directory: /tmp/4ae44611-2b88-4076-b32c-6ba36a384daf_resources
20/02/27 04:53:30 INFO session.SessionState: Created HDFS directory: /tmp/hive/root/4ae44611-2b88-4076-b32c-6ba36a384daf
20/02/27 04:53:30 INFO session.SessionState: Created local directory: /tmp/root/4ae44611-2b88-4076-b32c-6ba36a384daf
20/02/27 04:53:30 INFO session.SessionState: Created HDFS directory: /tmp/hive/root/4ae44611-2b88-4076-b32c-6ba36a384daf/_tmp_space.db
20/02/27 04:53:30 INFO session.SessionState: No Tez session required at this point. hive.execution.engine=mr.
20/02/27 04:53:30 INFO client.HiveClientImpl: Warehouse location for Hive client (version 1.1.0) is /user/hive/warehouse
20/02/27 04:53:30 INFO hive.metastore: Trying to connect to metastore with URI thrift://etl-dn19:9083
20/02/27 04:53:30 INFO hive.metastore: Opened a connection to metastore, current connections: 1
20/02/27 04:53:30 INFO hive.metastore: Connected to metastore.
20/02/27 04:53:30 INFO session.SessionState: Created local directory: /tmp/078755a1-c91a-4485-9fe1-41ee1b8c3c09_resources
20/02/27 04:53:30 INFO session.SessionState: Created HDFS directory: /tmp/hive/root/078755a1-c91a-4485-9fe1-41ee1b8c3c09
20/02/27 04:53:30 INFO session.SessionState: Created local directory: /tmp/root/078755a1-c91a-4485-9fe1-41ee1b8c3c09
20/02/27 04:53:30 INFO session.SessionState: Created HDFS directory: /tmp/hive/root/078755a1-c91a-4485-9fe1-41ee1b8c3c09/_tmp_space.db
20/02/27 04:53:30 INFO session.SessionState: No Tez session required at this point. hive.execution.engine=mr.
20/02/27 04:53:30 INFO client.HiveClientImpl: Warehouse location for Hive client (version 1.1.0) is /user/hive/warehouse
20/02/27 04:53:30 INFO state.StateStoreCoordinatorRef: Registered StateStoreCoordinator endpoint
20/02/27 04:53:30 INFO util.SFConfig: loginURL : https://login.salesforce.com/services/Soap/u/35.0
20/02/27 04:53:32 INFO util.HTTPHelper: Executing GET request on https://na124.salesforce.com/services/data/v36.0/query?q=SELECT%20Id,Industry%20FROM%20Account
Exception in thread "main" java.lang.NoSuchMethodError: org.apache.http.client.methods.HttpGet.setConfig(Lorg/apache/http/client/config/RequestConfig;)V
    at com.springml.salesforce.wave.util.HTTPHelper.get(HTTPHelper.java:79)
    at com.springml.salesforce.wave.util.HTTPHelper.get(HTTPHelper.java:95)
    at com.springml.salesforce.wave.impl.ForceAPIImpl.query(ForceAPIImpl.java:138)
    at com.springml.salesforce.wave.impl.ForceAPIImpl.query(ForceAPIImpl.java:37)
    at com.springml.spark.salesforce.DatasetRelation.querySF(DatasetRelation.scala:105)
    at com.springml.spark.salesforce.DatasetRelation.read(DatasetRelation.scala:47)
    at com.springml.spark.salesforce.DatasetRelation.<init>(DatasetRelation.scala:39)
    at com.springml.spark.salesforce.DefaultSource.createRelation(DefaultSource.scala:99)
    at com.springml.spark.salesforce.DefaultSource.createRelation(DefaultSource.scala:50)
    at org.apache.spark.sql.execution.datasources.DataSource.resolveRelation(DataSource.scala:306)
    at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:178)
    at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:146)
    at SparkAppMain.main(SparkAppMain.java:37)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:755)
    at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:180)
    at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:205)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:119)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)

Please help.
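To confirm the diagnosis on the cluster, it may help to list which httpclient versions the Cloudera installation puts on the classpath and what Maven resolves (the parcel path below is illustrative for a CDH install; adjust to the actual location):

    # Illustrative CDH parcel location; look for a pre-4.3 httpclient jar.
    $ ls /opt/cloudera/parcels/CDH/jars/ | grep httpclient

    # Show which httpclient/httpcore versions the build itself resolves.
    $ mvn dependency:tree -Dincludes=org.apache.httpcomponents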

karthikeyankalimuthu commented 3 years ago

I'm also getting the same error

karthikeyankalimuthu commented 3 years ago

Using the Spark shell, I followed exactly the documented steps. This package can be added to Spark using the --packages command line option; for example, to include it when starting the spark shell:

$ bin/spark-shell --packages com.springml:spark-salesforce_2.11:1.1.3

Below is the exact error:

java.lang.NoSuchMethodError: org.apache.http.client.methods.HttpGet.setConfig(Lorg/apache/http/client/config/RequestConfig;)V
    at com.springml.salesforce.wave.util.HTTPHelper.get(HTTPHelper.java:79)
    at com.springml.salesforce.wave.util.HTTPHelper.get(HTTPHelper.java:95)
    at com.springml.salesforce.wave.impl.ForceAPIImpl.query(ForceAPIImpl.java:138)
    at com.springml.salesforce.wave.impl.ForceAPIImpl.query(ForceAPIImpl.java:37)
    at com.springml.spark.salesforce.DatasetRelation.querySF(DatasetRelation.scala:105)
    at com.springml.spark.salesforce.DatasetRelation.read(DatasetRelation.scala:47)
    at com.springml.spark.salesforce.DatasetRelation.<init>(DatasetRelation.scala:39)
    at com.springml.spark.salesforce.DefaultSource.createRelation(DefaultSource.scala:99)
    at com.springml.spark.salesforce.DefaultSource.createRelation(DefaultSource.scala:50)
    at org.apache.spark.sql.execution.datasources.DataSource.resolveRelation(DataSource.scala:306)
    at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:178)
    at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:146)
    ... 55 elided
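For the spark-shell case, a hedged workaround (untested here; the conf keys are standard Spark settings) is to pull in a 4.3+ httpclient explicitly and ask Spark to prefer user classes over the cluster's copies, plus putting the jar on the driver classpath in client mode (the local path below is illustrative):

    $ bin/spark-shell \
        --packages com.springml:spark-salesforce_2.11:1.1.3,org.apache.httpcomponents:httpclient:4.3.2 \
        --conf spark.executor.userClassPathFirst=true \
        --driver-class-path /path/to/httpclient-4.3.2.jar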