Open pranaygoyal02 opened 7 years ago
HBaseConfiguration is defined in org.apache.hadoop.hbase.HBaseConfiguration.
Thanks! I imported the hbase-client lib into my pom, and that let me get past this error at least. Is there a configuration I can set up so it is read from the cluster? I am using an HDP secure cluster.
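For reference, a minimal sketch of the pom dependency I mean; the version shown is an assumption and should be aligned with the HBase version deployed on your cluster (hbase-client pulls in hbase-common transitively, which is where HBaseConfiguration lives):

```xml
<!-- Version is an assumption; match it to your cluster's HBase release -->
<dependency>
  <groupId>org.apache.hbase</groupId>
  <artifactId>hbase-client</artifactId>
  <version>1.1.2</version>
</dependency>
```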
HBaseConfiguration is defined in org.apache.hadoop.hbase.HBaseConfiguration.

There is no such symbol in hbase-client-2.1.3.jar.
I am using Java to write to HBase, but I face the following issue: User class threw exception: java.lang.NoClassDefFoundError: org/apache/hadoop/hbase/HBaseConfiguration
Please find the code here: https://gist.github.com/pranaygoyal02/8d89297778107dfea2882a88febbf4a4
I cannot dictate the contract for writing to HBase with the number of region servers in the catalog, since I am not the admin on the cluster. I tried building the master code, which has the fix for the namespace not being required (HBaseTableCatalog.newTable(), "5").
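For context, this is a hedged sketch of the kind of shc catalog definition involved; the table and column names here are placeholders, and the HBaseTableCatalog.newTable -> "5" write option is what asks the connector to create the table with that many regions (which is why admin rights on the cluster matter):

```json
{
  "table": {"namespace": "default", "name": "my_table"},
  "rowkey": "key",
  "columns": {
    "key":  {"cf": "rowkey", "col": "key",  "type": "string"},
    "col1": {"cf": "cf1",    "col": "col1", "type": "string"}
  }
}
```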
I tried downloading from the repo (http://repo.hortonworks.com/content/groups/public/) as well as building a custom jar with the command: mvn clean -Pscala-2.11 -DskipTests package
Spark version used: Spark 2.1.0. I have added hbase-site.xml to the --files path (application master classpath) as well as in the DataFrameWriter options.
Command I use:

/usr/local/bin/spark-submit-2.1 \
  --master yarn \
  --deploy-mode cluster \
  --keytab \
  --principal \
  --num-executors 2 \
  --conf spark.executor.extraJavaOptions="-Djavax.net.debug=SSL" \
  --executor-cores 8 \
  --conf spark.dynamicAllocation.enabled=false \
  --conf spark.driver.extraJavaOptions="-Dhdp.version=2.5.3.0-37" \
  --conf spark.executor.extraJavaOptions="-Dlog4j.configuration=/home_dir/log4j.properties" \
  --conf spark.speculation="true" \
  --files /home_dir/log4j.properties,/etc/hive/conf/hive-site.xml,/etc/hbase/conf/hbase-site.xml \
  --verbose \
  minotaur-kafka-spark-1.0-SNAPSHOT.jar test
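Since a NoClassDefFoundError at runtime usually means the class is missing from the driver/executor classpath rather than the compile classpath, one option is to ship the HBase client jars explicitly with --jars. This is a sketch only; the jar paths below are assumptions based on a typical HDP layout and should be checked on your edge node:

```shell
# Jar paths are assumptions; point them at the HBase client jars on your node
/usr/local/bin/spark-submit-2.1 \
  --master yarn \
  --deploy-mode cluster \
  --jars /usr/hdp/current/hbase-client/lib/hbase-client.jar,/usr/hdp/current/hbase-client/lib/hbase-common.jar,/usr/hdp/current/hbase-client/lib/hbase-protocol.jar \
  --files /etc/hbase/conf/hbase-site.xml \
  minotaur-kafka-spark-1.0-SNAPSHOT.jar test
```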
Can you help me?