kasireddybhimireddy opened 5 years ago
Hi, I get the same error when I try to save my DataFrame to an HBase table with SHC in cluster mode. With spark-shell I don't have this problem. Can someone help, please? Thanks
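For anyone trying to reproduce this, the write itself is just the standard SHC data-source call; a minimal sketch (the table name, columns, and catalog below are made-up placeholders, not my real schema):

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.execution.datasources.hbase.HBaseTableCatalog

object HBaseWriteExample {
  // Hypothetical catalog: namespace, table, and column names are placeholders.
  val catalog: String =
    s"""{
       |  "table": {"namespace": "default", "name": "my_table"},
       |  "rowkey": "key",
       |  "columns": {
       |    "id":   {"cf": "rowkey", "col": "key",  "type": "string"},
       |    "name": {"cf": "cf1",    "col": "name", "type": "string"}
       |  }
       |}""".stripMargin

  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("shc-write").getOrCreate()
    import spark.implicits._

    val df = Seq(("1", "alice"), ("2", "bob")).toDF("id", "name")

    // The save() through the SHC data source is the call that fails in cluster mode.
    df.write
      .options(Map(HBaseTableCatalog.tableCatalog -> catalog, HBaseTableCatalog.newTable -> "5"))
      .format("org.apache.spark.sql.execution.datasources.hbase")
      .save()
  }
}
```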
+1, I have the same issue; I tried different versions of shc and json4s and always get the same result...
Has anybody found a solution to this issue? I just ran into the same problem.
I encountered this issue when the VM I was running this on was updated to Spark 2.4.4.
It was resolved by making sure that the Spark version in the pom.xml matched the Spark version on the running VM (see the sketch below).
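For illustration, the relevant fragment of the pom.xml would look roughly like this; the exact dependency list is an assumption, the point is only that spark.version matches what the VM is actually running:

```xml
<properties>
  <!-- Must match the Spark version installed on the VM/cluster, e.g. 2.4.4 -->
  <spark.version>2.4.4</spark.version>
  <scala.binary.version>2.11</scala.binary.version>
</properties>

<dependencies>
  <dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-sql_${scala.binary.version}</artifactId>
    <version>${spark.version}</version>
    <scope>provided</scope>
  </dependency>
</dependencies>
```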
+1, spark version 2.4
+1, spark version 2.4.4
+1 spark version 2.4.3.1
+1 spark version 2.4
+1, it's been more than a year, Spark 3.0 is now GA, and such a basic issue hasn't been addressed. Is this connector still maintained, or should hbase-spark be used instead?
Same issue for some months now with Spark 2.3.0... Does anyone know how to fix this?
Guys, @strayMat @syedhassaanahmed @Moh-BOB @thetruechar @sean-azoci @nealzh, if you are using Spark 2.4.0, you need to get a build from https://github.com/hortonworks-spark/shc/tree/branch-2.4. Then pass the HBase jars and the shc jar to your spark-submit application with --jars. Make sure the jar list is comma-separated; /hbase-path/* does not work.
Then set these two configurations (a full example invocation is sketched at the end of this comment):
--conf spark.driver.userClassPathFirst=true
--conf spark.executor.userClassPathFirst=true
I was able to make it work with spark-submit, but I couldn't run it in spark-shell; there it throws a NoClassDefFoundError.
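For reference, the spark-submit invocation looked roughly like this; the jar paths, version numbers, and main class below are placeholders, so adjust them to your own build and HBase installation:

```bash
# Sketch only: jar names/paths and the main class are placeholders.
# The --jars list must be comma-separated; directory wildcards do not work.
spark-submit \
  --master yarn \
  --deploy-mode cluster \
  --class com.example.MyHBaseApp \
  --jars /path/to/shc-core-1.1.3-2.4-s_2.11.jar,/path/to/hbase-client.jar,/path/to/hbase-common.jar,/path/to/hbase-server.jar,/path/to/hbase-protocol.jar \
  --files /etc/hbase/conf/hbase-site.xml \
  --conf spark.driver.userClassPathFirst=true \
  --conf spark.executor.userClassPathFirst=true \
  my-hbase-app.jar
```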
I am also facing this issue with Spark 2.4.
@weiqingy, if you have any resolution, please let me know. Thanks in advance...
Hey @NonStopProgrammer, sorry for the late reply. I haven't worked on this repo for a while. Adding @davidov541, as the latest commit is from David. Please feel free to comment here.
@weiqingy @NonStopProgrammer I'm in the same boat as Wei, but I'm willing to poke around and see what I can do.
As an aside, I feel like I was told in a different forum by someone at Cloudera that this plugin was not the suggested route with later versions of Spark, especially moving into the CDP distributions. Based on that, I remember having issues getting my PR approved for this repo. I could be wrong on this, but I just wanted to set expectations that I may be able to get a PR made but not be able to get anyone to review it and merge it into trunk. If all else fails I can always fork and add the PR and you can pull from there, but we won't have as nice of a distribution mechanism as here.
Looks like my normal test environment won't work here since it uses Scala 2.12, which this plugin doesn't support. Can you get me the HBase version you are targeting as well, so that I can make sure to replicate your environment? @NonStopProgrammer
I am not able to store data in HBase; it seems to be some jar file issue. I am using Spark 2.3,
submitting with: spark-submit --packages com.hortonworks:shc-core:1.1.0-2.1-s_2.11 --repositories http://repo.hortonworks.com/content/groups/public/