The SQL Scan case fails when run on Spark with the log below, while the same case runs successfully on Hadoop (MapReduce). Did I get some configuration wrong?
Environment versions:
Spark: 2.4.6
Hadoop: 2.6.5
HiBench: 7.1
2021-04-06 03:50:58,532 INFO ui.JettyUtils: Adding filter org.apache.hadoop.yarn.server.webproxy.amfilter.AmIpFilter to /metrics/json.
2021-04-06 03:50:58,540 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@44598ef7{/metrics/json,null,AVAILABLE,@Spark}
2021-04-06 03:50:58,552 INFO cluster.YarnClientSchedulerBackend: SchedulerBackend is ready for scheduling beginning after waiting maxRegisteredResourcesWaitingTime: 5000(ms)
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/spark/sql/hive/HiveContext
at com.intel.hibench.sparkbench.sql.ScalaSparkSQLBench$.main(ScalaSparkSQLBench.scala:38)
at com.intel.hibench.sparkbench.sql.ScalaSparkSQLBench.main(ScalaSparkSQLBench.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:845)
at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:161)
at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:184)
at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:86)
at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:920)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:929)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.lang.ClassNotFoundException: org.apache.spark.sql.hive.HiveContext
at java.net.URLClassLoader.findClass(URLClassLoader.java:382)
at java.lang.ClassLoader.loadClass(ClassLoader.java:418)
at java.lang.ClassLoader.loadClass(ClassLoader.java:351)
... 14 more
2021-04-06 03:50:58,561 INFO spark.SparkContext: Invoking stop() from shutdown hook
2021-04-06 03:50:58,567 INFO server.AbstractConnector: Stopped Spark@88d6f9b{HTTP/1.1,[http/1.1]}{127.0.0.1:4040}
2021-04-06 03:50:58,569 INFO ui.SparkUI: Stopped Spark web UI at http://localhost:4040
2021-04-06 03:50:58,573 INFO cluster.YarnClientSchedulerBackend: Interrupting monitor thread
2021-04-06 03:50:58,597 INFO cluster.YarnClientSchedulerBackend: Shutting down all executors
2021-04-06 03:50:58,600 INFO cluster.YarnSchedulerBackend$YarnDriverEndpoint: Asking each executor to shut down
2021-04-06 03:50:58,602 INFO cluster.SchedulerExtensionServices: Stopping SchedulerExtensionServices
(serviceOption=None,
services=List(),
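For reference, the failing lookup in the stack trace is an ordinary classpath resolution of `org.apache.spark.sql.hive.HiveContext`. A minimal sketch (my own code, not part of HiBench; the object and method names are mine) that reproduces the same probe outside the benchmark:

```scala
// Sketch: probe whether Spark's Hive support class is visible on the
// current classpath -- the same lookup that fails in the stack trace above.
object HiveClasspathCheck {
  def hiveContextPresent(): Boolean =
    try {
      // Exactly the class named in the NoClassDefFoundError.
      Class.forName("org.apache.spark.sql.hive.HiveContext")
      true
    } catch {
      case _: ClassNotFoundException => false
    }

  def main(args: Array[String]): Unit = {
    if (hiveContextPresent())
      println("HiveContext is on the classpath")
    else
      println("HiveContext is NOT on the classpath")
  }
}
```

If the spark-hive jar (e.g. `spark-hive_2.11` for Spark 2.4.x) and its dependencies are on the driver classpath, the probe succeeds; in my run it evidently does not, which is why I suspect a build or config issue rather than a bug in the benchmark itself.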
Thanks!