Yiutto closed this issue 2 years ago.
Solution 1 [Spark]: add a spark-defaults.conf under /opt/spark-2.4.3-bin-hadoop2.6/conf:

```
$ cat spark-defaults.conf
spark.driver.extraLibraryPath=/home/cloudera/parcel/CDH-5.12.1-1.cdh5.12.1.p0.3/lib/hadoop/lib/native
spark.executor.extraLibraryPath=/home/cloudera/parcel/CDH-5.12.1-1.cdh5.12.1.p0.3/lib/hadoop/lib/native
spark.yarn.am.extraLibraryPath=/home/cloudera/parcel/CDH-5.12.1-1.cdh5.12.1.p0.3/lib/hadoop/lib/native
```
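As a sketch of an alternative, the same three properties can also be passed per job on the spark-submit command line instead of being set globally in spark-defaults.conf. This reuses the CDH native-library path from this environment; the application jar name is a placeholder.

```shell
# Per-job alternative to spark-defaults.conf: same properties, same CDH
# native-library path as above. "your-application.jar" is a placeholder.
NATIVE=/home/cloudera/parcel/CDH-5.12.1-1.cdh5.12.1.p0.3/lib/hadoop/lib/native
spark-submit \
  --conf spark.driver.extraLibraryPath="$NATIVE" \
  --conf spark.executor.extraLibraryPath="$NATIVE" \
  --conf spark.yarn.am.extraLibraryPath="$NATIVE" \
  your-application.jar
```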
Solution 2 [Hive]:

```
cp /home/cloudera/parcel/CDH/jars/snappy-java-1.0.4.1.jar /home/webank/LinkisInstall/lib/linkis-engineconn-plugins/hive/dist/v1.1.0_cdh5.12.1/lib/
```
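A hedged sketch of the copy step with a basic guard, demonstrated here against temporary directories standing in for the real CDH parcel and LinkisInstall paths above; substitute those paths on an actual node.

```shell
# Stand-in directories for the CDH jars dir and the Hive engineconn plugin
# lib dir from solution 2; replace with the real paths on a cluster node.
SRC_DIR=$(mktemp -d)
DEST_DIR=$(mktemp -d)
touch "$SRC_DIR/snappy-java-1.0.4.1.jar"  # simulated CDH parcel jar

# Copy only if the source jar actually exists, so a wrong parcel path
# fails loudly instead of silently leaving the plugin lib dir unchanged.
if [ -f "$SRC_DIR/snappy-java-1.0.4.1.jar" ]; then
    cp "$SRC_DIR/snappy-java-1.0.4.1.jar" "$DEST_DIR/"
else
    echo "snappy-java jar not found in $SRC_DIR" >&2
    exit 1
fi
ls "$DEST_DIR"
```

On a real deployment the Hive engine would presumably need to be restarted after the copy so the new jar is picked up (an assumption; the issue does not say).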
Search before asking
Linkis Component
linkis-engineconn-plugin
Steps to reproduce
Querying Hive table data through Linkis returns an empty result, and the log reports an error:

```
Caused by: java.lang.UnsatisfiedLinkError: org.apache.hadoop.util.NativeCodeLoader.buildSupportsSnappy
```
Expected behavior
Hive table queries should return data successfully.
Your environment
Environment name and version:
cdh-5.12.1
hive-1.1.0-cdh5.12.1
spark-2.4.3
jdk 1.8.0_121
....
Anything else
Are you willing to submit a PR?