apache / linkis

Apache Linkis builds a computation middleware layer to facilitate connection, governance and orchestration between the upper applications and the underlying data engines.
https://linkis.apache.org/
Apache License 2.0

[Bug] Scriptis query of a Hive table returns empty data, Caused by: java.lang.UnsatisfiedLinkError: org.apache.hadoop.util.NativeCodeLoader.buildSupportsSnappy #3540

Closed Yiutto closed 2 years ago

Yiutto commented 2 years ago

Search before asking

Linkis Component

linkis-engineconn-plugin

Steps to reproduce


Querying Hive table data through Linkis returns an empty result, and the log reports an error: Caused by: java.lang.UnsatisfiedLinkError: org.apache.hadoop.util.NativeCodeLoader.buildSupportsSnappy

Expected behavior

The Hive query should return the table data successfully.

Your environment

Anything else

2022-09-26 20:34:59.034 INFO Program is substituting variables for you
2022-09-26 20:34:59.034 INFO Variables substitution ended successfully
2022-09-26 20:34:59.034 INFO SQL code check has passed
Job with jobId : 56 and execID : IDE_hadoop_hive_1 submitted 
2022-09-26 20:34:59.034 INFO You have submitted a new job, script code (after variable substitution) is
************************************SCRIPT CODE************************************
select * from yiutto.cup_mcc limit 30
************************************SCRIPT CODE************************************
2022-09-26 20:34:59.034 INFO Your job is accepted,  jobID is exec_id018009linkis-cg-entrancedn19:9104IDE_hadoop_hive_1 and taskID is 56 in ServiceInstance(linkis-cg-entrance, dn19:9104). Please wait it to be scheduled
job is scheduled.
2022-09-26 20:34:59.034 INFO Your job is Scheduled. Please wait it to run.
Your job is being scheduled by orchestrator.
2022-09-26 20:35:00.035 INFO job is running.
2022-09-26 20:35:00.035 INFO Your job is Running now. Please wait it to complete.
2022-09-26 20:35:00.035 INFO Job with jobGroupId : 56 and subJobId : 56 was submitted to Orchestrator.
2022-09-26 20:35:00.035 INFO Background is starting a new engine for you,execId TaskID_56_otJobId_astJob_20_codeExec_20 mark id is mark_20, it may take several seconds, please wait
2022-09-26 20:35:00.035 INFO Task submit to ec: ServiceInstance(linkis-cg-engineconn, dn19:27507) get engineConnExecId is: 2
2022-09-26 20:35:00.035 INFO EngineConn local log path: ServiceInstance(linkis-cg-engineconn, dn19:27507) /home/webank/LinkisInstall/tmp/hadoop/20220926/hive/9e69c3d2-d530-423f-bb3b-60879fcf2800/logs
HiveEngineExecutor_0 >> select * from yiutto.cup_mcc limit 30
Time taken: 568 ms, begin to fetch results.
Fetched  3 col(s) : 0 row(s) in hive
2022-09-26 20:35:00.802 WARN  [Linkis-Default-Scheduler-Thread-5] org.apache.linkis.engineplugin.hive.executor.HiveDriverProxy 78 apply - java.lang.reflect.InvocationTargetException: null
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_292]
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:1.8.0_292]
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_292]
    at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_292]
    at org.apache.linkis.engineplugin.hive.executor.HiveDriverProxy$$anonfun$getResults$1.apply$mcZ$sp(HiveEngineConnExecutor.scala:540) ~[linkis-engineplugin-hive-1.2.0.jar:1.2.0]
    at org.apache.linkis.engineplugin.hive.executor.HiveDriverProxy$$anonfun$getResults$1.apply(HiveEngineConnExecutor.scala:540) ~[linkis-engineplugin-hive-1.2.0.jar:1.2.0]
    at org.apache.linkis.engineplugin.hive.executor.HiveDriverProxy$$anonfun$getResults$1.apply(HiveEngineConnExecutor.scala:540) ~[linkis-engineplugin-hive-1.2.0.jar:1.2.0]
    at org.apache.linkis.common.utils.Utils$.tryCatch(Utils.scala:39) ~[linkis-common-1.2.0.jar:1.2.0]
    at org.apache.linkis.common.utils.Utils$.tryAndWarn(Utils.scala:68) ~[linkis-common-1.2.0.jar:1.2.0]
    at org.apache.linkis.engineplugin.hive.executor.HiveDriverProxy.getResults(HiveEngineConnExecutor.scala:539) ~[linkis-engineplugin-hive-1.2.0.jar:1.2.0]
    at org.apache.linkis.engineplugin.hive.executor.HiveEngineConnExecutor.sendResultSet(HiveEngineConnExecutor.scala:262) ~[linkis-engineplugin-hive-1.2.0.jar:1.2.0]
    at org.apache.linkis.engineplugin.hive.executor.HiveEngineConnExecutor.org$apache$linkis$engineplugin$hive$executor$HiveEngineConnExecutor$$executeHQL(HiveEngineConnExecutor.scala:219) ~[linkis-engineplugin-hive-1.2.0.jar:1.2.0]
    at org.apache.linkis.engineplugin.hive.executor.HiveEngineConnExecutor$$anon$1.run(HiveEngineConnExecutor.scala:145) ~[linkis-engineplugin-hive-1.2.0.jar:1.2.0]
    at org.apache.linkis.engineplugin.hive.executor.HiveEngineConnExecutor$$anon$1.run(HiveEngineConnExecutor.scala:138) ~[linkis-engineplugin-hive-1.2.0.jar:1.2.0]
    at java.security.AccessController.doPrivileged(Native Method) ~[?:1.8.0_292]
    at javax.security.auth.Subject.doAs(Subject.java:422) ~[?:1.8.0_292]
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1917) ~[hadoop-common-2.6.0-cdh5.12.1.jar:?]
    at org.apache.linkis.engineplugin.hive.executor.HiveEngineConnExecutor.executeLine(HiveEngineConnExecutor.scala:138) ~[linkis-engineplugin-hive-1.2.0.jar:1.2.0]
    at org.apache.linkis.engineconn.computation.executor.execute.ComputationExecutor$$anonfun$toExecuteTask$2$$anonfun$apply$6$$anonfun$apply$7.apply(ComputationExecutor.scala:179) ~[linkis-computation-engineconn-1.2.0.jar:1.2.0]
    at org.apache.linkis.engineconn.computation.executor.execute.ComputationExecutor$$anonfun$toExecuteTask$2$$anonfun$apply$6$$anonfun$apply$7.apply(ComputationExecutor.scala:178) ~[linkis-computation-engineconn-1.2.0.jar:1.2.0]
    at org.apache.linkis.common.utils.Utils$.tryCatch(Utils.scala:39) ~[linkis-common-1.2.0.jar:1.2.0]
    at org.apache.linkis.engineconn.computation.executor.execute.ComputationExecutor$$anonfun$toExecuteTask$2$$anonfun$apply$6.apply(ComputationExecutor.scala:180) ~[linkis-computation-engineconn-1.2.0.jar:1.2.0]
    at org.apache.linkis.engineconn.computation.executor.execute.ComputationExecutor$$anonfun$toExecuteTask$2$$anonfun$apply$6.apply(ComputationExecutor.scala:174) ~[linkis-computation-engineconn-1.2.0.jar:1.2.0]
    at scala.collection.immutable.Range.foreach(Range.scala:160) ~[scala-library-2.11.12.jar:?]
    at org.apache.linkis.engineconn.computation.executor.execute.ComputationExecutor$$anonfun$toExecuteTask$2.apply(ComputationExecutor.scala:173) ~[linkis-computation-engineconn-1.2.0.jar:1.2.0]
    at org.apache.linkis.engineconn.computation.executor.execute.ComputationExecutor$$anonfun$toExecuteTask$2.apply(ComputationExecutor.scala:149) ~[linkis-computation-engineconn-1.2.0.jar:1.2.0]
    at org.apache.linkis.common.utils.Utils$.tryFinally(Utils.scala:60) ~[linkis-common-1.2.0.jar:1.2.0]
    at org.apache.linkis.engineconn.computation.executor.execute.ComputationExecutor.toExecuteTask(ComputationExecutor.scala:226) ~[linkis-computation-engineconn-1.2.0.jar:1.2.0]
    at org.apache.linkis.engineconn.computation.executor.execute.ComputationExecutor$$anonfun$3.apply(ComputationExecutor.scala:241) ~[linkis-computation-engineconn-1.2.0.jar:1.2.0]
    at org.apache.linkis.engineconn.computation.executor.execute.ComputationExecutor$$anonfun$3.apply(ComputationExecutor.scala:241) ~[linkis-computation-engineconn-1.2.0.jar:1.2.0]
    at org.apache.linkis.common.utils.Utils$.tryFinally(Utils.scala:60) ~[linkis-common-1.2.0.jar:1.2.0]
    at org.apache.linkis.engineconn.acessible.executor.entity.AccessibleExecutor.ensureIdle(AccessibleExecutor.scala:55) ~[linkis-accessible-executor-1.2.0.jar:1.2.0]
    at org.apache.linkis.engineconn.acessible.executor.entity.AccessibleExecutor.ensureIdle(AccessibleExecutor.scala:49) ~[linkis-accessible-executor-1.2.0.jar:1.2.0]
    at org.apache.linkis.engineconn.computation.executor.execute.ComputationExecutor.ensureOp(ComputationExecutor.scala:133) ~[linkis-computation-engineconn-1.2.0.jar:1.2.0]
    at org.apache.linkis.engineconn.computation.executor.execute.ComputationExecutor.execute(ComputationExecutor.scala:240) ~[linkis-computation-engineconn-1.2.0.jar:1.2.0]
    at org.apache.linkis.engineconn.computation.executor.service.TaskExecutionServiceImpl.org$apache$linkis$engineconn$computation$executor$service$TaskExecutionServiceImpl$$executeTask(TaskExecutionServiceImpl.scala:298) ~[linkis-computation-engineconn-1.2.0.jar:1.2.0]
    at org.apache.linkis.engineconn.computation.executor.service.TaskExecutionServiceImpl$$anon$2$$anonfun$run$2.apply$mcV$sp(TaskExecutionServiceImpl.scala:231) ~[linkis-computation-engineconn-1.2.0.jar:1.2.0]
    at org.apache.linkis.engineconn.computation.executor.service.TaskExecutionServiceImpl$$anon$2$$anonfun$run$2.apply(TaskExecutionServiceImpl.scala:229) ~[linkis-computation-engineconn-1.2.0.jar:1.2.0]
    at org.apache.linkis.engineconn.computation.executor.service.TaskExecutionServiceImpl$$anon$2$$anonfun$run$2.apply(TaskExecutionServiceImpl.scala:229) ~[linkis-computation-engineconn-1.2.0.jar:1.2.0]
    at org.apache.linkis.common.utils.Utils$.tryCatch(Utils.scala:39) ~[linkis-common-1.2.0.jar:1.2.0]
    at org.apache.linkis.common.utils.Utils$.tryAndWarn(Utils.scala:68) ~[linkis-common-1.2.0.jar:1.2.0]
    at org.apache.linkis.engineconn.computation.executor.service.TaskExecutionServiceImpl$$anon$2.run(TaskExecutionServiceImpl.scala:229) ~[linkis-computation-engineconn-1.2.0.jar:1.2.0]
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) ~[?:1.8.0_292]
    at java.util.concurrent.FutureTask.run(FutureTask.java:266) ~[?:1.8.0_292]
    at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$201(ScheduledThreadPoolExecutor.java:180) ~[?:1.8.0_292]
    at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:293) ~[?:1.8.0_292]
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) ~[?:1.8.0_292]
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) ~[?:1.8.0_292]
    at java.lang.Thread.run(Thread.java:748) ~[?:1.8.0_292]
Caused by: java.lang.UnsatisfiedLinkError: org.apache.hadoop.util.NativeCodeLoader.buildSupportsSnappy()Z
    at org.apache.hadoop.util.NativeCodeLoader.buildSupportsSnappy(Native Method) ~[hadoop-common-2.6.0-cdh5.12.1.jar:?]
    at org.apache.hadoop.io.compress.SnappyCodec.checkNativeCodeLoaded(SnappyCodec.java:63) ~[hadoop-common-2.6.0-cdh5.12.1.jar:?]
    at org.apache.hadoop.io.compress.SnappyCodec.getDecompressorType(SnappyCodec.java:195) ~[hadoop-common-2.6.0-cdh5.12.1.jar:?]
    at org.apache.hadoop.io.compress.CodecPool.getDecompressor(CodecPool.java:178) ~[hadoop-common-2.6.0-cdh5.12.1.jar:?]
    at org.apache.hadoop.mapred.LineRecordReader.<init>(LineRecordReader.java:111) ~[hadoop-mapreduce-client-core-2.6.0-cdh5.12.1.jar:?]
    at org.apache.hadoop.mapred.TextInputFormat.getRecordReader(TextInputFormat.java:67) ~[hadoop-mapreduce-client-core-2.6.0-cdh5.12.1.jar:?]
    at org.apache.hadoop.hive.ql.exec.FetchOperator$FetchInputFormatSplit.getRecordReader(FetchOperator.java:674) ~[hive-exec-1.1.0-cdh5.12.1.jar:1.1.0-cdh5.12.1]
    at org.apache.hadoop.hive.ql.exec.FetchOperator.getRecordReader(FetchOperator.java:324) ~[hive-exec-1.1.0-cdh5.12.1.jar:1.1.0-cdh5.12.1]
    at org.apache.hadoop.hive.ql.exec.FetchOperator.getNextRow(FetchOperator.java:446) ~[hive-exec-1.1.0-cdh5.12.1.jar:1.1.0-cdh5.12.1]
    at org.apache.hadoop.hive.ql.exec.FetchOperator.pushRow(FetchOperator.java:415) ~[hive-exec-1.1.0-cdh5.12.1.jar:1.1.0-cdh5.12.1]
    at org.apache.hadoop.hive.ql.exec.FetchTask.fetch(FetchTask.java:140) ~[hive-exec-1.1.0-cdh5.12.1.jar:1.1.0-cdh5.12.1]
    at org.apache.hadoop.hive.ql.Driver.getResults(Driver.java:2058) ~[hive-exec-1.1.0-cdh5.12.1.jar:1.1.0-cdh5.12.1]
    ... 49 more
2022-09-26 20:35:00.035 INFO Your subjob : 56 execue with state succeed, has 1 resultsets.
2022-09-26 20:35:00.035 INFO Congratuaions! Your job : IDE_hadoop_hive_1 executed with status succeed and 0 results.
2022-09-26 20:35:00.035 INFO job is completed.
2022-09-26 20:35:00.035 INFO Task creation time(任务创建时间): 2022-09-26 20:34:59, Task scheduling time(任务调度时间): 2022-09-26 20:34:59, Task start time(任务开始时间): 2022-09-26 20:35:00, Mission end time(任务结束时间): 2022-09-26 20:35:00
2022-09-26 20:35:00.035 INFO Your mission(您的任务) 56 The total time spent is(总耗时时间为): 1.1 s
2022-09-26 20:35:00.035 INFO Congratulations. Your job completed with status Success.
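
A quick way to confirm whether the Hadoop native libraries (including Snappy support) are actually loadable on the node running the EngineConn is the standard hadoop checknative command; this is a diagnostic sketch, and the library path shown is an example for a CDH parcel install, adjust to your environment:

# List which native codecs the Hadoop client on this node can load;
# the "snappy" line should report "true" and point at libsnappy,
# otherwise the JVM cannot find the native library and buildSupportsSnappy fails
$ hadoop checknative -a

# Make the CDH native libraries visible to the process before starting it
# (example parcel path, same directory used in the solutions below)
$ export LD_LIBRARY_PATH=/home/cloudera/parcel/CDH-5.12.1-1.cdh5.12.1.p0.3/lib/hadoop/lib/native:$LD_LIBRARY_PATH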

Are you willing to submit a PR?

Yiutto commented 2 years ago

Solution 1 [Spark]: add a spark-defaults.conf under /opt/spark-2.4.3-bin-hadoop2.6/conf

$ cat spark-defaults.conf
spark.driver.extraLibraryPath=/home/cloudera/parcel/CDH-5.12.1-1.cdh5.12.1.p0.3/lib/hadoop/lib/native
spark.executor.extraLibraryPath=/home/cloudera/parcel/CDH-5.12.1-1.cdh5.12.1.p0.3/lib/hadoop/lib/native
spark.yarn.am.extraLibraryPath=/home/cloudera/parcel/CDH-5.12.1-1.cdh5.12.1.p0.3/lib/hadoop/lib/native
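
The same effect can also be obtained through Spark's standard spark-env.sh mechanism; a minimal sketch, assuming the same CDH parcel path as above:

# conf/spark-env.sh (alternative to the spark-defaults.conf entries above;
# the parcel path is an example, adjust to your installation)
export LD_LIBRARY_PATH=/home/cloudera/parcel/CDH-5.12.1-1.cdh5.12.1.p0.3/lib/hadoop/lib/native:$LD_LIBRARY_PATH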

Solution 2 [Hive]: copy the snappy-java jar into the Hive EngineConn plugin's lib directory:

cp /home/cloudera/parcel/CDH/jars/snappy-java-1.0.4.1.jar /home/webank/LinkisInstall/lib/linkis-engineconn-plugins/hive/dist/v1.1.0_cdh5.12.1/lib/
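
After copying the jar, the Hive EngineConn plugin service usually needs to be restarted before the new dependency is picked up; a minimal sketch, assuming a standard Linkis 1.x layout where sbin/linkis-daemon.sh manages the cg-engineplugin service (script and service names may differ in your deployment):

# Verify the jar landed in the plugin lib directory (path from Solution 2 above)
$ ls /home/webank/LinkisInstall/lib/linkis-engineconn-plugins/hive/dist/v1.1.0_cdh5.12.1/lib/ | grep snappy

# Restart the engine plugin service so the new dependency is loaded
# (assumes the standard linkis-daemon.sh script; adjust the service name if needed)
$ cd /home/webank/LinkisInstall
$ sh sbin/linkis-daemon.sh restart cg-engineplugin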