apache / linkis

Apache Linkis builds a computation middleware layer to facilitate connection, governance and orchestration between the upper applications and the underlying data engines.
https://linkis.apache.org/
Apache License 2.0
3.3k stars 1.17k forks source link

hive读取不了hbase外部表 #437

Closed ittechblog closed 3 years ago

ittechblog commented 4 years ago

错误信息如下: 59469: 2020-06-05 15:03:29-915 WARN [hiveEngineEngine-Thread-2] com.webank.wedatasphere.linkis.engine.hive.executor.HiveDriverProxy$ com.webank.wedatasphere.linkis.common.utils.Utils$$anonfun$tryAndWarn$1.apply(Utils.scala:84) apply - java.lang.ClassNotFoundException: org.apache.hadoop.hive.ql.IDriver 59469: at java.net.URLClassLoader.findClass(URLClassLoader.java:381) ~[?:1.8.0_181] 59469: at java.lang.ClassLoader.loadClass(ClassLoader.java:424) ~[?:1.8.0_181] 59469: at java.lang.ClassLoader.loadClass(ClassLoader.java:357) ~[?:1.8.0_181] 59469: at com.webank.wedatasphere.linkis.engine.hive.executor.HiveDriverProxy$$anonfun$3.apply(HiveEngineExecutor.scala:378) ~[linkis-hive-engine-0.9.3.jar:?] 59469: at com.webank.wedatasphere.linkis.engine.hive.executor.HiveDriverProxy$$anonfun$3.apply(HiveEngineExecutor.scala:378) ~[linkis-hive-engine-0.9.3.jar:?] 59469: at com.webank.wedatasphere.linkis.common.utils.Utils$.tryCatch(Utils.scala:48) [linkis-common-0.9.3.jar:?] 59469: at com.webank.wedatasphere.linkis.common.utils.Utils$.tryAndWarn(Utils.scala:74) [linkis-common-0.9.3.jar:?] 59469: at com.webank.wedatasphere.linkis.engine.hive.executor.HiveDriverProxy$.(HiveEngineExecutor.scala:376) [linkis-hive-engine-0.9.3.jar:?] 59469: at com.webank.wedatasphere.linkis.engine.hive.executor.HiveDriverProxy$.(HiveEngineExecutor.scala) [linkis-hive-engine-0.9.3.jar:?] 59469: at com.webank.wedatasphere.linkis.engine.hive.executor.HiveEngineExecutor$$anon$1.run(HiveEngineExecutor.scala:124) [linkis-hive-engine-0.9.3.jar:?] 59469: at com.webank.wedatasphere.linkis.engine.hive.executor.HiveEngineExecutor$$anon$1.run(HiveEngineExecutor.scala:121) [linkis-hive-engine-0.9.3.jar:?] 59469: at java.security.AccessController.doPrivileged(Native Method) [?:1.8.0_181] 59469: at javax.security.auth.Subject.doAs(Subject.java:422) [?:1.8.0_181] 59469: at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657) [hadoop-common-2.7.2.jar:?] 59469: at com.webank.wedatasphere.linkis.engine.hive.executor.HiveEngineExecutor.executeLine(HiveEngineExecutor.scala:121) [linkis-hive-engine-0.9.3.jar:?] 59469: at com.webank.wedatasphere.linkis.engine.execute.EngineExecutor$$anonfun$execute$1$$anonfun$apply$9$$anonfun$apply$10.apply(EngineExecutor.scala:141) [linkis-ujes-engine-0.9.3.jar:?] 59469: at com.webank.wedatasphere.linkis.engine.execute.EngineExecutor$$anonfun$execute$1$$anonfun$apply$9$$anonfun$apply$10.apply(EngineExecutor.scala:140) [linkis-ujes-engine-0.9.3.jar:?] 59469: at com.webank.wedatasphere.linkis.common.utils.Utils$.tryCatch(Utils.scala:48) [linkis-common-0.9.3.jar:?] 59469: at com.webank.wedatasphere.linkis.engine.execute.EngineExecutor$$anonfun$execute$1$$anonfun$apply$9.apply(EngineExecutor.scala:141) [linkis-ujes-engine-0.9.3.jar:?] 59469: at com.webank.wedatasphere.linkis.engine.execute.EngineExecutor$$anonfun$execute$1$$anonfun$apply$9.apply(EngineExecutor.scala:136) [linkis-ujes-engine-0.9.3.jar:?] 59469: at scala.collection.immutable.Range.foreach(Range.scala:160) [scala-library-2.11.8.jar:?] 59469: at com.webank.wedatasphere.linkis.engine.execute.EngineExecutor$$anonfun$execute$1.apply(EngineExecutor.scala:136) [linkis-ujes-engine-0.9.3.jar:?] 59469: at com.webank.wedatasphere.linkis.engine.execute.EngineExecutor$$anonfun$execute$1.apply(EngineExecutor.scala:118) [linkis-ujes-engine-0.9.3.jar:?] 59469: at com.webank.wedatasphere.linkis.common.utils.Utils$.tryFinally(Utils.scala:62) [linkis-common-0.9.3.jar:?] 
59469: at com.webank.wedatasphere.linkis.scheduler.executer.AbstractExecutor.ensureIdle(AbstractExecutor.scala:60) [linkis-scheduler-0.9.3.jar:?] 59469: at com.webank.wedatasphere.linkis.scheduler.executer.AbstractExecutor.ensureIdle(AbstractExecutor.scala:54) [linkis-scheduler-0.9.3.jar:?] 59469: at com.webank.wedatasphere.linkis.engine.execute.EngineExecutor.ensureOp$1(EngineExecutor.scala:117) [linkis-ujes-engine-0.9.3.jar:?] 59469: at com.webank.wedatasphere.linkis.engine.execute.EngineExecutor.execute(EngineExecutor.scala:118) [linkis-ujes-engine-0.9.3.jar:?] 59469: at com.webank.wedatasphere.linkis.scheduler.queue.Job$$anonfun$3.apply(Job.scala:254) [linkis-scheduler-0.9.3.jar:?] 59469: at com.webank.wedatasphere.linkis.scheduler.queue.Job$$anonfun$3.apply(Job.scala:254) [linkis-scheduler-0.9.3.jar:?] 59469: at com.webank.wedatasphere.linkis.common.utils.Utils$.tryCatch(Utils.scala:48) [linkis-common-0.9.3.jar:?] 59469: at com.webank.wedatasphere.linkis.scheduler.queue.Job.run(Job.scala:254) [linkis-scheduler-0.9.3.jar:?] 59469: at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) [?:1.8.0_181] 59469: at java.util.concurrent.FutureTask.run(FutureTask.java:266) [?:1.8.0_181] 59469: at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) [?:1.8.0_181] 59469: at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) [?:1.8.0_181] 59469: at java.lang.Thread.run(Thread.java:748) [?:1.8.0_181] 59469:

org.apache.hadoop.hive.ql.IDriver在hive-exec.jar里并没有这个类只有org.apache.hadoop.hive.ql.Driver

wForget commented 4 years ago

这里是为了做Hive的兼容,并不是问题所在,这个提示后面会改掉,https://github.com/WeBankFinTech/Linkis/pull/349/commits/bc48e799c220a8dccd8a69c7bded6937c97a6f0a

wForget commented 4 years ago

具体原因需要看一下其他的 Error 日志

ittechblog commented 4 years ago

具体原因需要看一下其他的 Error 日志

2020-06-05 10:46:44,858 INFO (background-preinit) INFO Version - HV000001: Hibernate Validator 5.1.2.Final 2020-06-05 10:46:45,140 INFO (main) INFO AnnotationConfigApplicationContext - Refreshing org.springframework.context.annotation.AnnotationConfigApplicationContext@4b6579e8: startup date [Fri Jun 05 10:46:45 CST 2020]; root of context hierarchy 2020-06-05 10:46:45,371 INFO (main) INFO AutowiredAnnotationBeanPostProcessor - JSR-330 'javax.inject.Inject' annotation found and supported for autowiring 2020-06-05 10:46:45,405 INFO (main) INFO PostProcessorRegistrationDelegate$BeanPostProcessorChecker - Bean 'configurationPropertiesRebinderAutoConfiguration' of type [org.springframework.cloud.autoconfigure.ConfigurationPropertiesRebinderAutoConfiguration$$EnhancerBySpringCGLIB$$a94202bb] is not eligible for getting processed by all BeanPostProcessors (for example: not eligible for auto-proxying)

. _ _ /\ / '_ () \ \ \ \ ( ( )\ | ' | '| | ' \/ ` | \ \ \ \ \/ _)| |)| | | | | || (| | ) ) ) ) ' |__| .|| ||| |\, | / / / / =========|_|==============|__/=//// :: Spring Boot :: (v2.0.3.RELEASE)

2020-06-05 10:46:45,653 INFO (main) INFO ConfigServicePropertySourceLocator - Fetching config from server at : http://localhost:8888 2020-06-05 10:46:45,749 INFO (main) INFO ConfigServicePropertySourceLocator - Connect Timeout Exception on Url - http://localhost:8888. Will be trying the next url if available 2020-06-05 10:46:45,749 WARN (main) WARN ConfigServicePropertySourceLocator - Could not locate PropertySource: I/O error on GET request for "http://localhost:8888/hiveEngineManager/default": Connection refused (Connection refused); nested exception is java.net.ConnectException: Connection refused (Connection refused) 2020-06-05 10:46:45,751 INFO (main) INFO DataWorkCloudApplication - No active profile set, falling back to default profiles: default 2020-06-05 10:46:45,770 INFO (main) INFO DataWorkCloudApplication - add config from config server... 2020-06-05 10:46:45,770 INFO (main) INFO DataWorkCloudApplication - initialize DataWorkCloud spring application... 2020-06-05 10:46:45,773 INFO (main) INFO AnnotationConfigServletWebServerApplicationContext - Refreshing org.springframework.boot.web.servlet.context.AnnotationConfigServletWebServerApplicationContext@444548a0: startup date [Fri Jun 05 10:46:45 CST 2020]; parent: org.springframework.context.annotation.AnnotationConfigApplicationContext@4b6579e8 2020-06-05 10:46:46,479 INFO (main) INFO DefaultListableBeanFactory - Overriding bean definition for bean 'resources' with a different definition: replacing [Root bean: class [null]; scope=; abstract=false; lazyInit=false; autowireMode=3; dependencyCheck=0; autowireCandidate=true; primary=false; factoryBeanName=engineManagerSpringConfiguration; factoryMethodName=createResource; initMethodName=null; destroyMethodName=(inferred); defined in class path resource [com/webank/wedatasphere/linkis/enginemanager/impl/EngineManagerSpringConfiguration.class]] with [Root bean: class [null]; scope=; abstract=false; lazyInit=false; autowireMode=3; dependencyCheck=0; autowireCandidate=true; primary=false; factoryBeanName=hiveEngineManagerSpringConfiguration; factoryMethodName=createResource; initMethodName=null; destroyMethodName=(inferred); defined in class path resource [com/webank/wedatasphere/linkis/enginemanager/hive/conf/HiveEngineManagerSpringConfiguration.class]] 2020-06-05 10:46:46,479 INFO (main) INFO DefaultListableBeanFactory - Overriding bean definition for bean 'hooks' with a different definition: replacing [Root bean: class [null]; scope=; abstract=false; lazyInit=false; autowireMode=3; dependencyCheck=0; autowireCandidate=true; primary=false; factoryBeanName=engineManagerSpringConfiguration; factoryMethodName=createEngineHook; initMethodName=null; destroyMethodName=(inferred); defined in class path resource [com/webank/wedatasphere/linkis/enginemanager/impl/EngineManagerSpringConfiguration.class]] with [Root bean: class [null]; scope=; abstract=false; lazyInit=false; autowireMode=3; dependencyCheck=0; autowireCandidate=true; primary=false; factoryBeanName=hiveEngineManagerSpringConfiguration; factoryMethodName=createEngineHook; initMethodName=null; destroyMethodName=(inferred); defined in class path resource [com/webank/wedatasphere/linkis/enginemanager/hive/conf/HiveEngineManagerSpringConfiguration.class]] 2020-06-05 10:46:46,999 INFO (main) INFO GenericScope - BeanFactory id=572d531d-6d22-33ef-9a3b-6fb666d9b9e3 2020-06-05 10:46:47,015 INFO (main) INFO AutowiredAnnotationBeanPostProcessor - JSR-330 'javax.inject.Inject' annotation found and supported for autowiring 2020-06-05 10:46:47,243 
INFO (main) INFO PostProcessorRegistrationDelegate$BeanPostProcessorChecker - Bean 'org.springframework.cloud.autoconfigure.ConfigurationPropertiesRebinderAutoConfiguration' of type [org.springframework.cloud.autoconfigure.ConfigurationPropertiesRebinderAutoConfiguration$$EnhancerBySpringCGLIB$$a94202bb] is not eligible for getting processed by all BeanPostProcessors (for example: not eligible for auto-proxying) 2020-06-05 10:46:47.436 INFO [main] org.eclipse.jetty.util.log 193 initialized - Logging initialized @3678ms to org.eclipse.jetty.util.log.Slf4jLog 2020-06-05 10:46:47,545 INFO (main) INFO JettyServletWebServerFactory - Server initialized with port: 10099 2020-06-05 10:46:47.575 INFO [main] org.eclipse.jetty.server.Server 374 doStart - jetty-9.4.11.v20180605; built: 2018-06-05T18:24:03.829Z; git: d5fc0523cfa96bfebfbda19606cad384d772f04c; jvm 1.8.0_181-b13 2020-06-05 10:46:47.687 INFO [main] org.eclipse.jetty.server.session 365 doStart - DefaultSessionIdManager workerName=node0 2020-06-05 10:46:47.687 INFO [main] org.eclipse.jetty.server.session 370 doStart - No SessionScavenger set, using defaults 2020-06-05 10:46:47.689 INFO [main] org.eclipse.jetty.server.session 149 startScavenging - node0 Scavenging every 660000ms 2020-06-05 10:46:47.694 INFO [main] org.eclipse.jetty.server.handler.ContextHandler.application 2318 log - Initializing Spring embedded WebApplicationContext 2020-06-05 10:46:47,694 INFO (main) INFO ContextLoader - Root WebApplicationContext: initialization completed in 1921 ms 2020-06-05 10:46:47.910 WARN [main] com.netflix.config.sources.URLConfigurationSource 121 - No URLs will be polled as dynamic configuration sources. 2020-06-05 10:46:47.911 INFO [main] com.netflix.config.sources.URLConfigurationSource 122 - To enable URLs as dynamic configuration sources, define System property archaius.configurationSource.additionalUrls or make config.properties available on classpath. 
2020-06-05 10:46:47.922 INFO [main] com.netflix.config.DynamicPropertyFactory 281 getInstance - DynamicPropertyFactory is initialized with configuration sources: com.netflix.config.ConcurrentCompositeConfiguration@6614289a 2020-06-05 10:46:48,611 INFO (main) INFO ServletRegistrationBean - Servlet dispatcherServlet mapped to [/] 2020-06-05 10:46:48,613 INFO (main) INFO FilterRegistrationBean - Mapping filter: 'characterEncodingFilter' to: [/] 2020-06-05 10:46:48,613 INFO (main) INFO FilterRegistrationBean - Mapping filter: 'hiddenHttpMethodFilter' to: [/] 2020-06-05 10:46:48,613 INFO (main) INFO FilterRegistrationBean - Mapping filter: 'httpPutFormContentFilter' to: [/] 2020-06-05 10:46:48,613 INFO (main) INFO FilterRegistrationBean - Mapping filter: 'requestContextFilter' to: [/] 2020-06-05 10:46:48,613 INFO (main) INFO FilterRegistrationBean - Mapping filter: 'httpTraceFilter' to: [/] 2020-06-05 10:46:48,613 INFO (main) INFO FilterRegistrationBean - Mapping filter: 'webMvcMetricsFilter' to: [/] 2020-06-05 10:46:48.617 INFO [main] org.eclipse.jetty.server.handler.ContextHandler 851 doStart - Started o.s.b.w.e.j.JettyEmbeddedWebAppContext@4d16975b{application,/,[file:///tmp/jetty-docbase.5430824108114653056.10099/],AVAILABLE} 2020-06-05 10:46:48.617 INFO [main] org.eclipse.jetty.server.Server 411 doStart - Started @4864ms 2020-06-05 10:46:48,625 INFO (main) INFO CglibAopProxy - Final method [protected final void org.springframework.boot.web.servlet.support.SpringBootServletInitializer.setRegisterErrorPageFilter(boolean)] cannot get proxied via CGLIB: Calls to this method will NOT be routed to the target instance and might lead to NPEs against uninitialized fields in the proxy instance. 2020-06-05 10:46:48.680 INFO [main] com.webank.wedatasphere.linkis.resourcemanager.client.ResourceManagerClient 42 info - ResourceManagerClient init 2020-06-05 10:46:48.685 INFO [main] com.webank.wedatasphere.linkis.enginemanager.hive.conf.HiveEngineManagerSpringConfiguration$$EnhancerBySpringCGLIB$$306074d8 45 createResource - create resource for hive 2020-06-05 10:46:48.747 INFO [main] com.webank.wedatasphere.linkis.rpc.RPCReceiveRestful 42 info - init all receiverChoosers in spring beans, list => List(com.webank.wedatasphere.linkis.rpc.CommonReceiverChooser@726ef6aa) 2020-06-05 10:46:48.750 INFO [main] com.webank.wedatasphere.linkis.rpc.RPCReceiveRestful 42 info - init all receiverSenderBuilders in spring beans, list => List(com.webank.wedatasphere.linkis.rpc.CommonReceiverSenderBuilder@e2f6a45) 2020-06-05 10:46:48.755 INFO [main] com.webank.wedatasphere.linkis.rpc.RPCReceiveRestful 42 info - init RPCReceiverListenerBus with queueSize 1000 and consumeThreadSize 10. 
2020-06-05 10:46:48.757 INFO [main] com.webank.wedatasphere.linkis.rpc.AsynRPCMessageBus 42 info - RPC-Receiver-Asyn-Thread-ListenerBus add a new listener => class com.webank.wedatasphere.linkis.rpc.RPCReceiveRestful$$anon$1 2020-06-05 10:46:48,810 INFO (main) INFO InstanceInfoFactory - Setting initial instance status as: STARTING 2020-06-05 10:46:48.868 INFO [main] com.netflix.discovery.DiscoveryClient 349 - Initializing Eureka in region us-east-1 2020-06-05 10:46:49.045 INFO [main] com.netflix.discovery.provider.DiscoveryJerseyProvider 70 - Using JSON encoding codec LegacyJacksonJson 2020-06-05 10:46:49.045 INFO [main] com.netflix.discovery.provider.DiscoveryJerseyProvider 71 - Using JSON decoding codec LegacyJacksonJson 2020-06-05 10:46:49.187 INFO [main] com.netflix.discovery.provider.DiscoveryJerseyProvider 80 - Using XML encoding codec XStreamXml 2020-06-05 10:46:49.188 INFO [main] com.netflix.discovery.provider.DiscoveryJerseyProvider 81 - Using XML decoding codec XStreamXml 2020-06-05 10:46:49.343 INFO [main] com.netflix.discovery.shared.resolver.aws.ConfigClusterResolver 43 getClusterEndpoints - Resolving eureka endpoints via configuration 2020-06-05 10:46:49.364 INFO [main] com.netflix.discovery.DiscoveryClient 958 fetchRegistry - Disable delta property : false 2020-06-05 10:46:49.364 INFO [main] com.netflix.discovery.DiscoveryClient 959 fetchRegistry - Single vip registry refresh property : null 2020-06-05 10:46:49.364 INFO [main] com.netflix.discovery.DiscoveryClient 960 fetchRegistry - Force full registry fetch : false 2020-06-05 10:46:49.364 INFO [main] com.netflix.discovery.DiscoveryClient 961 fetchRegistry - Application is null : false 2020-06-05 10:46:49.365 INFO [main] com.netflix.discovery.DiscoveryClient 962 fetchRegistry - Registered Applications size is zero : true 2020-06-05 10:46:49.365 INFO [main] com.netflix.discovery.DiscoveryClient 964 fetchRegistry - Application version is -1: true 2020-06-05 10:46:49.365 INFO [main] com.netflix.discovery.DiscoveryClient 1047 getAndStoreFullRegistry - Getting all instance registry info from the eureka server 2020-06-05 10:46:49.496 INFO [main] com.netflix.discovery.DiscoveryClient 1056 getAndStoreFullRegistry - The response status is 200 2020-06-05 10:46:49.499 INFO [main] com.netflix.discovery.DiscoveryClient 1264 initScheduledTasks - Starting heartbeat executor: renew interval is: 30 2020-06-05 10:46:49.502 INFO [main] com.netflix.discovery.InstanceInfoReplicator 60 - InstanceInfoReplicator onDemand update allowed rate per min is 4 2020-06-05 10:46:49.505 INFO [main] com.netflix.discovery.DiscoveryClient 449 - Discovery Client initialized at timestamp 1591325209504 with initial instances count: 18 2020-06-05 10:46:49.739 WARN [main] com.netflix.config.sources.URLConfigurationSource 121 - No URLs will be polled as dynamic configuration sources. 2020-06-05 10:46:49.739 INFO [main] com.netflix.config.sources.URLConfigurationSource 122 - To enable URLs as dynamic configuration sources, define System property archaius.configurationSource.additionalUrls or make config.properties available on classpath. 
2020-06-05 10:46:49,826 INFO (main) INFO SimpleUrlHandlerMapping - Mapped URL path [//favicon.ico] onto handler of type [class org.springframework.web.servlet.resource.ResourceHttpRequestHandler] 2020-06-05 10:46:49,955 INFO (main) INFO RequestMappingHandlerAdapter - Looking for @ControllerAdvice: org.springframework.boot.web.servlet.context.AnnotationConfigServletWebServerApplicationContext@444548a0: startup date [Fri Jun 05 10:46:45 CST 2020]; parent: org.springframework.context.annotation.AnnotationConfigApplicationContext@4b6579e8 2020-06-05 10:46:50,067 INFO (main) INFO RequestMappingHandlerMapping - Mapped "{[/error]}" onto public org.springframework.http.ResponseEntity<java.util.Map<java.lang.String, java.lang.Object>> org.springframework.boot.autoconfigure.web.servlet.error.BasicErrorController.error(javax.servlet.http.HttpServletRequest) 2020-06-05 10:46:50,068 INFO (main) INFO RequestMappingHandlerMapping - Mapped "{[/error],produces=[text/html]}" onto public org.springframework.web.servlet.ModelAndView org.springframework.boot.autoconfigure.web.servlet.error.BasicErrorController.errorHtml(javax.servlet.http.HttpServletRequest,javax.servlet.http.HttpServletResponse) 2020-06-05 10:46:50,115 INFO (main) INFO SimpleUrlHandlerMapping - Mapped URL path [/webjars/] onto handler of type [class org.springframework.web.servlet.resource.ResourceHttpRequestHandler] 2020-06-05 10:46:50,115 INFO (main) INFO SimpleUrlHandlerMapping - Mapped URL path [/**] onto handler of type [class org.springframework.web.servlet.resource.ResourceHttpRequestHandler] 2020-06-05 10:46:50,590 WARN (main) WARN EurekaStarterDeprecationWarningAutoConfiguration - spring-cloud-starter-eureka is deprecated as of Spring Cloud Netflix 1.4.0, please migrate to spring-cloud-starter-netflix-eureka 2020-06-05 10:46:50,602 INFO (main) INFO EndpointLinksResolver - Exposing 2 endpoint(s) beneath base path '/actuator' 2020-06-05 10:46:50,614 INFO (main) INFO WebMvcEndpointHandlerMapping - Mapped "{[/actuator/info],methods=[GET],produces=[application/vnd.spring-boot.actuator.v2+json || application/json]}" onto public java.lang.Object org.springframework.boot.actuate.endpoint.web.servlet.AbstractWebMvcEndpointHandlerMapping$OperationHandler.handle(javax.servlet.http.HttpServletRequest,java.util.Map<java.lang.String, java.lang.String>) 2020-06-05 10:46:50,615 INFO (main) INFO WebMvcEndpointHandlerMapping - Mapped "{[/actuator/refresh],methods=[POST],produces=[application/vnd.spring-boot.actuator.v2+json || application/json]}" onto public java.lang.Object org.springframework.boot.actuate.endpoint.web.servlet.AbstractWebMvcEndpointHandlerMapping$OperationHandler.handle(javax.servlet.http.HttpServletRequest,java.util.Map<java.lang.String, java.lang.String>) 2020-06-05 10:46:50,615 INFO (main) INFO WebMvcEndpointHandlerMapping - Mapped "{[/actuator],methods=[GET],produces=[application/vnd.spring-boot.actuator.v2+json || application/json]}" onto protected java.util.Map<java.lang.String, java.util.Map<java.lang.String, org.springframework.boot.actuate.endpoint.web.Link>> org.springframework.boot.actuate.endpoint.web.servlet.WebMvcEndpointHandlerMapping.links(javax.servlet.http.HttpServletRequest,javax.servlet.http.HttpServletResponse) 2020-06-05 10:46:50,663 INFO (main) INFO AnnotationMBeanExporter - Registering beans for JMX exposure on startup 2020-06-05 10:46:50,671 INFO (main) INFO AnnotationMBeanExporter - Bean with name 'environmentManager' has been autodetected for JMX exposure 2020-06-05 10:46:50,672 INFO (main) INFO 
AnnotationMBeanExporter - Bean with name 'configurationPropertiesRebinder' has been autodetected for JMX exposure 2020-06-05 10:46:50,672 INFO (main) INFO AnnotationMBeanExporter - Bean with name 'refreshScope' has been autodetected for JMX exposure 2020-06-05 10:46:50,675 INFO (main) INFO AnnotationMBeanExporter - Located managed bean 'environmentManager': registering with JMX server as MBean [org.springframework.cloud.context.environment:name=environmentManager,type=EnvironmentManager] 2020-06-05 10:46:50,683 INFO (main) INFO AnnotationMBeanExporter - Located managed bean 'refreshScope': registering with JMX server as MBean [org.springframework.cloud.context.scope.refresh:name=refreshScope,type=RefreshScope] 2020-06-05 10:46:50,693 INFO (main) INFO AnnotationMBeanExporter - Located managed bean 'configurationPropertiesRebinder': registering with JMX server as MBean [org.springframework.cloud.context.properties:name=configurationPropertiesRebinder,context=444548a0,type=ConfigurationPropertiesRebinder] 2020-06-05 10:46:50,704 INFO (main) INFO DefaultLifecycleProcessor - Starting beans in phase 0 2020-06-05 10:46:50,704 INFO (main) INFO EurekaServiceRegistry - Registering application hiveEngineManager with eureka with status UP 2020-06-05 10:46:50.704 INFO [main] com.netflix.discovery.DiscoveryClient 1299 notify - Saw local status change event StatusChangeEvent [timestamp=1591325210704, current=UP, previous=STARTING] 2020-06-05 10:46:50.706 INFO [DiscoveryClient-InstanceInfoReplicator-0] com.netflix.discovery.DiscoveryClient 826 register - DiscoveryClient_HIVEENGINEMANAGER/n4:hiveEngineManager:10099: registering service... 2020-06-05 10:46:50.720 INFO [main] com.webank.wedatasphere.linkis.rpc.conf.RPCSpringConfiguration$$EnhancerBySpringCGLIB$$ed3e1350 42 info - DataWorkCloud RPC need register RPCReceiveRestful, now add it to configuration. 2020-06-05 10:46:50,720 INFO (main) INFO DataWorkCloudApplication - add config from config server... 2020-06-05 10:46:50,724 INFO (main) INFO DataWorkCloudApplication - initialize DataWorkCloud spring application... 2020-06-05 10:46:50.746 INFO [main] org.eclipse.jetty.server.handler.ContextHandler.application 2318 log - Initializing Spring FrameworkServlet 'dispatcherServlet' 2020-06-05 10:46:50,746 INFO (main) INFO DispatcherServlet - FrameworkServlet 'dispatcherServlet': initialization started 2020-06-05 10:46:50.750 INFO [DiscoveryClient-InstanceInfoReplicator-0] com.netflix.discovery.DiscoveryClient 835 register - DiscoveryClient_HIVEENGINEMANAGER/n4:hiveEngineManager:10099 - registration status: 204 2020-06-05 10:46:50,763 INFO (main) INFO DispatcherServlet - FrameworkServlet 'dispatcherServlet': initialization completed in 17 ms 2020-06-05 10:46:50.783 INFO [main] org.eclipse.jetty.server.AbstractConnector 289 doStart - Started ServerConnector@38d895e8{HTTP/1.1,[http/1.1]}{0.0.0.0:10099} 2020-06-05 10:46:50,785 INFO (main) INFO JettyWebServer - Jetty started on port(s) 10099 (http/1.1) with context path '/' 2020-06-05 10:46:50,786 INFO (main) INFO EurekaAutoServiceRegistration - Updating port to 10099 2020-06-05 10:46:50,787 INFO (main) INFO DataWorkCloudApplication - Started DataWorkCloudApplication in 6.457 seconds (JVM running for 7.034) 2020-06-05 10:46:50.790 INFO [main] com.webank.wedatasphere.linkis.resourcemanager.service.annotation.RMAnnotationParser 37 info - Prepare to register resources with RM(准备向RM注册资源)... 
2020-06-05 10:46:50.797 INFO [main] com.webank.wedatasphere.linkis.resourcemanager.service.annotation.RMAnnotationParser 37 info - Start registering resources with RM(开始向RM注册资源):ModuleInfo(ServiceInstance(hiveEngineManager, n4:10099),Number of instances(实例数):20,(RAM)内存:40.0 GB,cpu:20,Number of instances(实例数):2,(RAM)内存:4.0 GB,cpu:2,LoadInstance) 2020-06-05 10:46:50.833 INFO [main] com.webank.wedatasphere.linkis.rpc.transform.RPCProduct$ 42 info - RPC Serializers: List(JavaMapSerializer$, JavaCollectionSerializer$, ModuleInfoSerializer$, ModuleInstanceSerializer$, ModuleResourceInfoSerializer$, ResourceSerializer$, ResultResourceSerializer$), serializerClasses: List(class com.webank.wedatasphere.linkis.resourcemanager.domain.ModuleInfo, class com.webank.wedatasphere.linkis.common.ServiceInstance, class com.webank.wedatasphere.linkis.resourcemanager.domain.ModuleResourceInfo, class com.webank.wedatasphere.linkis.resourcemanager.Resource, interface com.webank.wedatasphere.linkis.resourcemanager.ResultResource, interface java.util.List, interface java.util.Map) 2020-06-05 10:46:50.876 INFO [main] com.webank.wedatasphere.linkis.rpc.AsynRPCMessageBus 42 info - RPC-Sender-Asyn-Thread-ListenerBus add a new listener => class com.webank.wedatasphere.linkis.rpc.BaseRPCSender$$anon$1 2020-06-05 10:46:50,968 INFO (main) INFO AnnotationConfigApplicationContext - Refreshing SpringClientFactory-ResourceManager: startup date [Fri Jun 05 10:46:50 CST 2020]; parent: org.springframework.boot.web.servlet.context.AnnotationConfigServletWebServerApplicationContext@444548a0 2020-06-05 10:46:51,014 INFO (main) INFO AutowiredAnnotationBeanPostProcessor - JSR-330 'javax.inject.Inject' annotation found and supported for autowiring 2020-06-05 10:46:51.136 INFO [main] com.netflix.config.ChainedDynamicProperty 115 checkAndFlip - Flipping property: ResourceManager.ribbon.ActiveConnectionsLimit to use NEXT property: niws.loadbalancer.availabilityFilteringRule.activeConnectionsLimit = 2147483647 2020-06-05 10:46:51.153 INFO [main] com.netflix.util.concurrent.ShutdownEnabledTimer 58 - Shutdown hook installed for: NFLoadBalancer-PingTimer-ResourceManager 2020-06-05 10:46:51.158 INFO [main] com.netflix.loadbalancer.BaseLoadBalancer 192 initWithConfig - Client: ResourceManager instantiated a LoadBalancer: DynamicServerListLoadBalancer:{NFLoadBalancer:name=ResourceManager,current list of Servers=[],Load balancer stats=Zone stats: {},Server stats: []}ServerList:null 2020-06-05 10:46:51.164 INFO [main] com.netflix.loadbalancer.DynamicServerListLoadBalancer 222 enableAndInitLearnNewServersFeature - Using serverListUpdater PollingServerListUpdater 2020-06-05 10:46:51.183 INFO [main] com.netflix.config.ChainedDynamicProperty 115 checkAndFlip - Flipping property: ResourceManager.ribbon.ActiveConnectionsLimit to use NEXT property: niws.loadbalancer.availabilityFilteringRule.activeConnectionsLimit = 2147483647 2020-06-05 10:46:51.185 INFO [main] com.netflix.loadbalancer.DynamicServerListLoadBalancer 150 restOfInit - DynamicServerListLoadBalancer for client ResourceManager initialized: DynamicServerListLoadBalancer:{NFLoadBalancer:name=ResourceManager,current list of Servers=[n4:9104],Load balancer stats=Zone stats: {defaultzone=[Zone:defaultzone; Instance count:1; Active connections count: 0; Circuit breaker tripped count: 0; Active connections per server: 0.0;] },Server stats: [[Server:n4:9104; Zone:defaultZone; Total Requests:0; Successive connection failure:0; Total blackout seconds:0; Last connection made:Thu Jan 01 08:00:00 CST 
1970; First connection made: Thu Jan 01 08:00:00 CST 1970; Active Connections:0; total failure count in last (1000) msecs:0; average resp time:0.0; 90 percentile resp time:0.0; 95 percentile resp time:0.0; min resp time:0.0; max resp time:0.0; stddev resp time:0.0] ]}ServerList:org.springframework.cloud.netflix.ribbon.eureka.DomainExtractingServerList@2e7563f6 2020-06-05 10:46:52.167 INFO [PollingServerListUpdater-0] com.netflix.config.ChainedDynamicProperty 115 checkAndFlip - Flipping property: ResourceManager.ribbon.ActiveConnectionsLimit to use NEXT property: niws.loadbalancer.availabilityFilteringRule.activeConnectionsLimit = 2147483647 2020-06-05 10:47:02,234 INFO (qtp232681351-23) INFO RestfulApplication - register com.webank.wedatasphere.linkis.rpc.RPCReceiveRestful 2020-06-05 10:47:02,236 INFO (qtp232681351-23) INFO RestfulApplication - packages com.webank.wedatasphere.linkis.entrance.restful 2020-06-05 10:47:02.724 INFO [qtp232681351-23] com.webank.wedatasphere.linkis.enginemanager.impl.EngineManagerImpl 42 info - User hadoop wants to request a new engine, messages: TimeoutRequestNewEngine(1200000,hadoop,IDE,{_req_entrance_instance=hiveEntrance,n4:9108}) 2020-06-05 10:47:02,747 INFO (qtp232681351-23) INFO AnnotationConfigApplicationContext - Refreshing SpringClientFactory-cloud-publicservice: startup date [Fri Jun 05 10:47:02 CST 2020]; parent: org.springframework.boot.web.servlet.context.AnnotationConfigServletWebServerApplicationContext@444548a0 2020-06-05 10:47:02,776 INFO (qtp232681351-23) INFO AutowiredAnnotationBeanPostProcessor - JSR-330 'javax.inject.Inject' annotation found and supported for autowiring 2020-06-05 10:47:02.861 INFO [qtp232681351-23] com.netflix.config.ChainedDynamicProperty 115 checkAndFlip - Flipping property: cloud-publicservice.ribbon.ActiveConnectionsLimit to use NEXT property: niws.loadbalancer.availabilityFilteringRule.activeConnectionsLimit = 2147483647 2020-06-05 10:47:02.870 INFO [qtp232681351-23] com.netflix.util.concurrent.ShutdownEnabledTimer 58 - Shutdown hook installed for: NFLoadBalancer-PingTimer-cloud-publicservice 2020-06-05 10:47:02.871 INFO [qtp232681351-23] com.netflix.loadbalancer.BaseLoadBalancer 192 initWithConfig - Client: cloud-publicservice instantiated a LoadBalancer: DynamicServerListLoadBalancer:{NFLoadBalancer:name=cloud-publicservice,current list of Servers=[],Load balancer stats=Zone stats: {},Server stats: []}ServerList:null 2020-06-05 10:47:02.873 INFO [qtp232681351-23] com.netflix.loadbalancer.DynamicServerListLoadBalancer 222 enableAndInitLearnNewServersFeature - Using serverListUpdater PollingServerListUpdater 2020-06-05 10:47:02.876 INFO [qtp232681351-23] com.netflix.config.ChainedDynamicProperty 115 checkAndFlip - Flipping property: cloud-publicservice.ribbon.ActiveConnectionsLimit to use NEXT property: niws.loadbalancer.availabilityFilteringRule.activeConnectionsLimit = 2147483647 2020-06-05 10:47:02.877 INFO [qtp232681351-23] com.netflix.loadbalancer.DynamicServerListLoadBalancer 150 restOfInit - DynamicServerListLoadBalancer for client cloud-publicservice initialized: DynamicServerListLoadBalancer:{NFLoadBalancer:name=cloud-publicservice,current list of Servers=[n4:9102],Load balancer stats=Zone stats: {defaultzone=[Zone:defaultzone; Instance count:1; Active connections count: 0; Circuit breaker tripped count: 0; Active connections per server: 0.0;] },Server stats: [[Server:n4:9102; Zone:defaultZone; Total Requests:0; Successive connection failure:0; Total blackout seconds:0; Last connection made:Thu Jan 01 
08:00:00 CST 1970; First connection made: Thu Jan 01 08:00:00 CST 1970; Active Connections:0; total failure count in last (1000) msecs:0; average resp time:0.0; 90 percentile resp time:0.0; 95 percentile resp time:0.0; min resp time:0.0; max resp time:0.0; stddev resp time:0.0] ]}ServerList:org.springframework.cloud.netflix.ribbon.eureka.DomainExtractingServerList@71abf639 2020-06-05 10:47:02.930 INFO [qtp232681351-23] com.webank.wedatasphere.linkis.enginemanager.hook.JarLoaderEngineHook 42 info - start loading UDFs 2020-06-05 10:47:03.000 INFO [qtp232681351-23] com.webank.wedatasphere.linkis.enginemanager.hook.JarLoaderEngineHook 42 info - added jars: 2020-06-05 10:47:03.000 INFO [qtp232681351-23] com.webank.wedatasphere.linkis.enginemanager.hook.JarLoaderEngineHook 42 info - end loading UDFs 2020-06-05 10:47:03.384 INFO [qtp232681351-23] com.webank.wedatasphere.linkis.enginemanager.EngineManagerReceiver 42 info - User hadoopcreated a new engine CommonProcessEngine(port: 41008, creator: IDE, user: hadoop), creator=IDE 2020-06-05 10:47:03.391 INFO [Engine-Manager-Thread-1] com.webank.wedatasphere.linkis.enginemanager.hive.process.HiveQLProcessBuilder 42 info - Running /opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/bin/rootScript.sh hadoop /usr/java/jdk1.8.0_181-cloudera/jre/bin/java -Xmx2g -Xms2g -server -XX:+UseG1GC -XX:MaxPermSize=250m -XX:PermSize=128m -Xloggc:/appcom/logs/dataworkcloud/hadoop/hiveEngine/hiveEngine/41008-gc.log20200605-10_47 -XX:+PrintGCDetails -XX:+PrintGCTimeStamps -XX:+PrintGCDateStamps -Dwds.linkis.configuration=linkis-engine.properties -Djava.library.path=/appcom/Install/hadoop/lib/native -cp /opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/linkis-ujes-enginemanager-0.9.3.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/conf:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/:/appcom/commonlib/webank_bdp_udf.jar:/etc/hadoop/conf:/etc/hbase/conf:/appcom/config/spark-config:/etc/hive/conf: com.webank.wedatasphere.linkis.engine.DataWorkCloudEngineApplication --dwc-conf _req_entrance_instance=hiveEntrance,n4:9108 --dwc-conf wds.linkis.yarnqueue.memory.max=300G --dwc-conf wds.linkis.preheating.time=9:00 --dwc-conf wds.linkis.instance=3 --dwc-conf hive.client.memory=2g --dwc-conf wds.linkis.tmpfile.clean.time=10:00 --dwc-conf mapred.reduce.tasks=10 --dwc-conf ticketId=734b1127-0289-4626-a629-d66af440ff87 --dwc-conf dwc.application.instance=n4:10099 --dwc-conf creator=IDE --dwc-conf wds.linkis.yarnqueue=default --dwc-conf dfs.block.size=10 --dwc-conf wds.linkis.yarnqueue.cores.max=150 --dwc-conf hive.exec.reduce.bytes.per.reducer=10 --dwc-conf wds.linkis.client.memory.max=20G --dwc-conf dwc.application.name=hiveEngineManager --dwc-conf user=hadoop --spring-conf eureka.client.serviceUrl.defaultZone=http://localhost:20303/eureka/ --spring-conf logging.config=classpath:log4j2-engine.xml --spring-conf spring.profiles.active=engine --spring-conf server.port=41008 --spring-conf spring.application.name=hiveEngine 41008: spawn sudo su - 41008: Last login: Fri Jun 5 10:46:34 CST 2020 on pts/4 41008: ]0;root@n4:~[?1034h[root@n4 ~]# su - hadoop 41008: Last login: Fri Jun 5 10:46:34 CST 2020 on pts/4 41008: ]0;hadoop@n4:~[?1034h[hadoop@n4 ~]$ /usr/java/jdk1.8.0_181-cloudera/jre/bin/java -Xmx2g -Xms2g -server -XX:+UseG1GC -XX:MaxPermSize=250m -XX:PermSize=128m -Xloggc:/appcom/logs/datawor 41008: kcloud/hadoop/hiveEngine/hiveEngine/41008-gc.log20200605-10_47 -XX:+PrintGCDetails -XX:+PrintGCTimeStamps -XX:+PrintGCDateStamps 
-Dwds.linkis.configuration=linkis 41008: -engine.properties -Djava.library.path=/appcom/Install/hadoop/lib/native -cp /opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/linkis-ujes-enginemana 41008: ger-0.9.3.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/conf:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/:/appcom/commonlib/w 41008: ebank_bdp_udf.jar:/etc/hadoop/conf:/etc/hbase/conf:/appcom/config/spark-config:/etc/hive/conf: com.webank.wedatasphere.linkis.engine.DataWorkCloudEngineApplicatio 41008: n --dwc-conf _req_entrance_instance=hiveEntrance,n4:9108 --dwc-conf wds.linkis.yarnqueue.memory.max=300G --dwc-conf wds.linkis.preheating.time=9:00 --dwc-conf wds 41008: .linkis.instance=3 --dwc-conf hive.client.memory=2g --dwc-conf wds.linkis.tmpfile.clean.time=10:00 --dwc-conf mapred.reduce.tasks=10 --dwc-conf ticketId=734b1127- 41008: 0289-4626-a629-d66af440ff87 --dwc-conf dwc.application.instance=n4:10099 --dwc-conf creator=IDE --dwc-conf wds.linkis.yarnqueue=default --dwc-conf dfs.block.size= 41008: 10 --dwc-conf wds.linkis.yarnqueue.cores.max=150 --dwc-conf hive.exec.reduce.bytes.per.reducer=10 --dwc-conf wds.linkis.client.memory.max=20G --dwc-conf dwc.appli 41008: cation.name=hiveEngineManager --dwc-conf user=hadoop --spring-conf eureka.client.serviceUrl.defaultZone=http://localhost:20303/eureka/ --spring-conf logging.confi 41008: g=classpath:log4j2-engine.xml --spring-conf spring.profiles.active=engine --spring-conf server.port=41008 --spring-conf spring.application.name=hiveEngine 41008: Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=250m; support was removed in 8.0 41008: Java HotSpot(TM) 64-Bit Server VM warning: ignoring option PermSize=128m; support was removed in 8.0 41008: Java HotSpot(TM) 64-Bit Server VM warning: Cannot open file /appcom/logs/dataworkcloud/hadoop/hiveEngine/hiveEngine/41008-gc.log20200605-10_47 due to No such file or directory 41008: 2020-06-05 10:47:03.875 INFO [PollingServerListUpdater-0] com.netflix.config.ChainedDynamicProperty 115 checkAndFlip - Flipping property: cloud-publicservice.ribbon.ActiveConnectionsLimit to use NEXT property: niws.loadbalancer.availabilityFilteringRule.activeConnectionsLimit = 2147483647 41008: 2020-06-05 10:47:04.284 INFO [main] com.webank.wedatasphere.linkis.engine.DataWorkCloudEngineApplication$ 61 - Now log4j2 Rolling File is set to be /appcom/logs/dataworkcloud/hiveEngine/hadoop/hiveEngine_n4_hadoop_2020-06-05_10:47:03.log 41008: 2020-06-05 10:47:04.289 INFO [main] com.webank.wedatasphere.linkis.engine.DataWorkCloudEngineApplication$ 62 - Now shortLogFile is set to be hiveEngine_n4_hadoop_2020-06-0510:47:03.log 41008: 2020-06-05 10:47:04,676 INFO (background-preinit) INFO Version - HV000001: Hibernate Validator 5.1.2.Final 41008: 2020-06-05 10:47:04,973 INFO (main) INFO AnnotationConfigApplicationContext - Refreshing org.springframework.context.annotation.AnnotationConfigApplicationContext@538613b3: startup date [Fri Jun 05 10:47:04 CST 2020]; root of context hierarchy 41008: 2020-06-05 10:47:05,210 INFO (main) INFO AutowiredAnnotationBeanPostProcessor - JSR-330 'javax.inject.Inject' annotation found and supported for autowiring 41008: 2020-06-05 10:47:05,244 INFO (main) INFO PostProcessorRegistrationDelegate$BeanPostProcessorChecker - Bean 'configurationPropertiesRebinderAutoConfiguration' of type [org.springframework.cloud.autoconfigure.ConfigurationPropertiesRebinderAutoConfiguration$$EnhancerBySpringCGLIB$$b3acd2be] is not eligible for getting processed 
by all BeanPostProcessors (for example: not eligible for auto-proxying) 41008: 41008: . ____ 41008: /\ / __' () _ \ \ \ \ 41008: ( ( )__ | ' | '| | ' \/ _` | \ \ \ \ 41008: \/ _)| |)| | | | | || (| | ) ) ) ) 41008: ' |__| ._|| ||| |_, | / / / / 41008: =========|_|==============|__/=//// 41008:  :: Spring Boot ::   (v2.0.3.RELEASE) 41008: 41008: 2020-06-05 10:47:05,512 INFO (main) INFO ConfigServicePropertySourceLocator - Fetching config from server at : http://localhost:8888 41008: 2020-06-05 10:47:05,603 INFO (main) INFO ConfigServicePropertySourceLocator - Connect Timeout Exception on Url - http://localhost:8888. Will be trying the next url if available 41008: 2020-06-05 10:47:05,603 WARN (main) WARN ConfigServicePropertySourceLocator - Could not locate PropertySource: I/O error on GET request for "http://localhost:8888/hiveEngine/engine": Connection refused (Connection refused); nested exception is java.net.ConnectException: Connection refused (Connection refused) 41008: 2020-06-05 10:47:05,605 INFO (main) INFO DataWorkCloudApplication - The following profiles are active: engine 41008: 2020-06-05 10:47:05,625 INFO (main) INFO DataWorkCloudApplication - add config from config server... 41008: 2020-06-05 10:47:05,625 INFO (main) INFO DataWorkCloudApplication - initialize DataWorkCloud spring application... 41008: 2020-06-05 10:47:05,626 INFO (main) INFO AnnotationConfigServletWebServerApplicationContext - Refreshing org.springframework.boot.web.servlet.context.AnnotationConfigServletWebServerApplicationContext@3104351d: startup date [Fri Jun 05 10:47:05 CST 2020]; parent: org.springframework.context.annotation.AnnotationConfigApplicationContext@538613b3 41008: 2020-06-05 10:47:06,361 INFO (main) INFO DefaultListableBeanFactory - Overriding bean definition for bean 'codeParser' with a different definition: replacing [Root bean: class [null]; scope=; abstract=false; lazyInit=false; autowireMode=3; dependencyCheck=0; autowireCandidate=true; primary=false; factoryBeanName=engineServerSpringConfiguration; factoryMethodName=createCodeParser; initMethodName=null; destroyMethodName=(inferred); defined in class path resource [com/webank/wedatasphere/linkis/engine/EngineServerSpringConfiguration.class]] with [Root bean: class [null]; scope=; abstract=false; lazyInit=false; autowireMode=3; dependencyCheck=0; autowireCandidate=true; primary=false; factoryBeanName=hiveEngineSpringConfiguration; factoryMethodName=generateCodeParser; initMethodName=null; destroyMethodName=(inferred); defined in class path resource [com/webank/wedatasphere/linkis/engine/hive/HiveEngineSpringConfiguration.class]] 41008: 2020-06-05 10:47:06,361 INFO (main) INFO DefaultListableBeanFactory - Overriding bean definition for bean 'engineHooks' with a different definition: replacing [Root bean: class [null]; scope=; abstract=false; lazyInit=false; autowireMode=3; dependencyCheck=0; autowireCandidate=true; primary=false; factoryBeanName=engineServerSpringConfiguration; factoryMethodName=createEngineHooks; initMethodName=null; destroyMethodName=(inferred); defined in class path resource [com/webank/wedatasphere/linkis/engine/EngineServerSpringConfiguration.class]] with [Root bean: class [null]; scope=; abstract=false; lazyInit=false; autowireMode=3; dependencyCheck=0; autowireCandidate=true; primary=false; factoryBeanName=hiveEngineSpringConfiguration; factoryMethodName=generateEngineHooks; initMethodName=null; destroyMethodName=(inferred); defined in class path resource 
[com/webank/wedatasphere/linkis/engine/hive/HiveEngineSpringConfiguration.class]] 41008: 2020-06-05 10:47:06,908 INFO (main) INFO GenericScope - BeanFactory id=3941643f-ac46-39ab-bd99-d05290400d6c 41008: 2020-06-05 10:47:06,929 INFO (main) INFO AutowiredAnnotationBeanPostProcessor - JSR-330 'javax.inject.Inject' annotation found and supported for autowiring 41008: 2020-06-05 10:47:07,176 INFO (main) INFO PostProcessorRegistrationDelegate$BeanPostProcessorChecker - Bean 'org.springframework.cloud.autoconfigure.ConfigurationPropertiesRebinderAutoConfiguration' of type [org.springframework.cloud.autoconfigure.ConfigurationPropertiesRebinderAutoConfiguration$$EnhancerBySpringCGLIB$$b3acd2be] is not eligible for getting processed by all BeanPostProcessors (for example: not eligible for auto-proxying) 41008: 2020-06-05 10:47:07-379 INFO [main] org.eclipse.jetty.util.log org.eclipse.jetty.util.log.Log.initialized(Log.java:193) initialized - Logging initialized @3864ms to org.eclipse.jetty.util.log.Slf4jLog 41008: 2020-06-05 10:47:07,468 INFO (main) INFO JettyServletWebServerFactory - Server initialized with port: 41008 41008: 2020-06-05 10:47:07-503 INFO [main] org.eclipse.jetty.server.Server org.eclipse.jetty.server.Server.doStart(Server.java:374) doStart - jetty-9.4.11.v20180605; built: 2018-06-05T18:24:03.829Z; git: d5fc0523cfa96bfebfbda19606cad384d772f04c; jvm 1.8.0_181-b13 41008: 2020-06-05 10:47:07-630 INFO [main] org.eclipse.jetty.server.session org.eclipse.jetty.server.session.DefaultSessionIdManager.doStart(DefaultSessionIdManager.java:365) doStart - DefaultSessionIdManager workerName=node0 41008: 2020-06-05 10:47:07-630 INFO [main] org.eclipse.jetty.server.session org.eclipse.jetty.server.session.DefaultSessionIdManager.doStart(DefaultSessionIdManager.java:370) doStart - No SessionScavenger set, using defaults 41008: 2020-06-05 10:47:07-632 INFO [main] org.eclipse.jetty.server.session org.eclipse.jetty.server.session.HouseKeeper.startScavenging(HouseKeeper.java:149) startScavenging - node0 Scavenging every 600000ms 41008: 2020-06-05 10:47:07-637 INFO [main] org.eclipse.jetty.server.handler.ContextHandler.application org.eclipse.jetty.server.handler.ContextHandler$Context.log(ContextHandler.java:2318) log - Initializing Spring embedded WebApplicationContext 41008: 2020-06-05 10:47:07,638 INFO (main) INFO ContextLoader - Root WebApplicationContext: initialization completed in 2012 ms 41008: 2020-06-05 10:47:07-896 WARN [main] com.netflix.config.sources.URLConfigurationSource com.netflix.config.sources.URLConfigurationSource.(URLConfigurationSource.java:121) - No URLs will be polled as dynamic configuration sources. 41008: 2020-06-05 10:47:07-897 INFO [main] com.netflix.config.sources.URLConfigurationSource com.netflix.config.sources.URLConfigurationSource.(URLConfigurationSource.java:122) - To enable URLs as dynamic configuration sources, define System property archaius.configurationSource.additionalUrls or make config.properties available on classpath. 
41008: 2020-06-05 10:47:07-910 INFO [main] com.netflix.config.DynamicPropertyFactory com.netflix.config.DynamicPropertyFactory.getInstance(DynamicPropertyFactory.java:281) getInstance - DynamicPropertyFactory is initialized with configuration sources: com.netflix.config.ConcurrentCompositeConfiguration@4bca8eaf 41008: 2020-06-05 10:47:08,688 INFO (main) INFO ServletRegistrationBean - Servlet dispatcherServlet mapped to [/] 41008: 2020-06-05 10:47:08,690 INFO (main) INFO FilterRegistrationBean - Mapping filter: 'characterEncodingFilter' to: [/] 41008: 2020-06-05 10:47:08,691 INFO (main) INFO FilterRegistrationBean - Mapping filter: 'hiddenHttpMethodFilter' to: [/] 41008: 2020-06-05 10:47:08,691 INFO (main) INFO FilterRegistrationBean - Mapping filter: 'httpPutFormContentFilter' to: [/] 41008: 2020-06-05 10:47:08,691 INFO (main) INFO FilterRegistrationBean - Mapping filter: 'requestContextFilter' to: [/] 41008: 2020-06-05 10:47:08,691 INFO (main) INFO FilterRegistrationBean - Mapping filter: 'httpTraceFilter' to: [/] 41008: 2020-06-05 10:47:08,692 INFO (main) INFO FilterRegistrationBean - Mapping filter: 'webMvcMetricsFilter' to: [/] 41008: 2020-06-05 10:47:08-696 INFO [main] org.eclipse.jetty.server.handler.ContextHandler org.eclipse.jetty.server.handler.ContextHandler.doStart(ContextHandler.java:851) doStart - Started o.s.b.w.e.j.JettyEmbeddedWebAppContext@705d914f{application,/,[file:///tmp/jetty-docbase.9160803241436237391.41008/],AVAILABLE} 41008: 2020-06-05 10:47:08-697 INFO [main] org.eclipse.jetty.server.Server org.eclipse.jetty.server.Server.doStart(Server.java:411) doStart - Started @5183ms 41008: 2020-06-05 10:47:08,706 INFO (main) INFO CglibAopProxy - Final method [protected final void org.springframework.boot.web.servlet.support.SpringBootServletInitializer.setRegisterErrorPageFilter(boolean)] cannot get proxied via CGLIB: Calls to this method will NOT be routed to the target instance and might lead to NPEs against uninitialized fields in the proxy instance. 41008: 2020-06-05 10:47:08-724 INFO [main] com.webank.wedatasphere.linkis.resourcemanager.client.ResourceManagerClient com.webank.wedatasphere.linkis.common.utils.Logging$class.info(Logging.scala:42) info - ResourceManagerClient init 41008: 2020-06-05 10:47:08-751 INFO [hiveEngineEngineConsumerThread] com.webank.wedatasphere.linkis.scheduler.queue.fifoqueue.FIFOUserConsumer com.webank.wedatasphere.linkis.common.utils.Logging$class.info(Logging.scala:42) info - hiveEngineEngineConsumer thread started! 41008: 2020-06-05 10:47:08-769 INFO [main] com.webank.wedatasphere.linkis.engine.hive.HiveEngineSpringConfiguration$$EnhancerBySpringCGLIB$$66ccebb3 com.webank.wedatasphere.linkis.engine.hive.HiveEngineSpringConfiguration.generateCodeParser(HiveEngineSpringConfiguration.scala:36) generateCodeParser - code Parser is set in hive 41008: 2020-06-05 10:47:08-778 INFO [main] com.webank.wedatasphere.linkis.engine.hive.HiveEngineSpringConfiguration$$EnhancerBySpringCGLIB$$66ccebb3 com.webank.wedatasphere.linkis.engine.hive.HiveEngineSpringConfiguration.generateEngineHooks(HiveEngineSpringConfiguration.scala:42) generateEngineHooks - engineHooks are set in hive. 41008: 2020-06-05 10:47:08-827 INFO [main] com.webank.wedatasphere.linkis.engine.EngineReceiver com.webank.wedatasphere.linkis.common.utils.Logging$class.info(Logging.scala:42) info - Starting engineServer(17335)... 
41008: 2020-06-05 10:47:08-984 INFO [main] com.webank.wedatasphere.linkis.rpc.transform.RPCProduct$ com.webank.wedatasphere.linkis.common.utils.Logging$class.info(Logging.scala:42) info - RPC Serializers: List(JavaMapSerializer$, JavaCollectionSerializer$, ModuleInfoSerializer$, ModuleInstanceSerializer$, ModuleResourceInfoSerializer$, ResourceSerializer$, ResultResourceSerializer$), serializerClasses: List(class com.webank.wedatasphere.linkis.resourcemanager.domain.ModuleInfo, class com.webank.wedatasphere.linkis.common.ServiceInstance, class com.webank.wedatasphere.linkis.resourcemanager.domain.ModuleResourceInfo, class com.webank.wedatasphere.linkis.resourcemanager.Resource, interface com.webank.wedatasphere.linkis.resourcemanager.ResultResource, interface java.util.List, interface java.util.Map) 41008: 2020-06-05 10:47:09-067 INFO [main] com.webank.wedatasphere.linkis.rpc.AsynRPCMessageBus com.webank.wedatasphere.linkis.common.utils.Logging$class.info(Logging.scala:42) info - RPC-Sender-Asyn-Thread-ListenerBus add a new listener => class com.webank.wedatasphere.linkis.rpc.BaseRPCSender$$anon$1 41008: 2020-06-05 10:47:09,418 INFO (main) INFO AnnotationConfigApplicationContext - Refreshing SpringClientFactory-hiveEngineManager: startup date [Fri Jun 05 10:47:09 CST 2020]; parent: org.springframework.boot.web.servlet.context.AnnotationConfigServletWebServerApplicationContext@3104351d 41008: 2020-06-05 10:47:09,467 INFO (main) INFO AutowiredAnnotationBeanPostProcessor - JSR-330 'javax.inject.Inject' annotation found and supported for autowiring 41008: 2020-06-05 10:47:09-716 INFO [main] com.netflix.config.ChainedDynamicProperty com.netflix.config.ChainedDynamicProperty$ChainLink.checkAndFlip(ChainedDynamicProperty.java:115) checkAndFlip - Flipping property: hiveEngineManager.ribbon.ActiveConnectionsLimit to use NEXT property: niws.loadbalancer.availabilityFilteringRule.activeConnectionsLimit = 2147483647 41008: 2020-06-05 10:47:09-738 INFO [main] com.netflix.util.concurrent.ShutdownEnabledTimer com.netflix.util.concurrent.ShutdownEnabledTimer.(ShutdownEnabledTimer.java:58) - Shutdown hook installed for: NFLoadBalancer-PingTimer-hiveEngineManager 41008: 2020-06-05 10:47:09-749 INFO [main] com.netflix.loadbalancer.BaseLoadBalancer com.netflix.loadbalancer.BaseLoadBalancer.initWithConfig(BaseLoadBalancer.java:192) initWithConfig - Client: hiveEngineManager instantiated a LoadBalancer: DynamicServerListLoadBalancer:{NFLoadBalancer:name=hiveEngineManager,current list of Servers=[],Load balancer stats=Zone stats: {},Server stats: []}ServerList:null 41008: 2020-06-05 10:47:09-756 INFO [main] com.netflix.loadbalancer.DynamicServerListLoadBalancer com.netflix.loadbalancer.DynamicServerListLoadBalancer.enableAndInitLearnNewServersFeature(DynamicServerListLoadBalancer.java:222) enableAndInitLearnNewServersFeature - Using serverListUpdater PollingServerListUpdater 41008: 2020-06-05 10:47:09,793 INFO (main) INFO InstanceInfoFactory - Setting initial instance status as: STARTING 41008: 2020-06-05 10:47:09-812 INFO [main] com.netflix.discovery.DiscoveryClient com.netflix.discovery.DiscoveryClient.(DiscoveryClient.java:349) - Initializing Eureka in region us-east-1 41008: 2020-06-05 10:47:09-853 INFO [main] com.netflix.discovery.provider.DiscoveryJerseyProvider com.netflix.discovery.provider.DiscoveryJerseyProvider.(DiscoveryJerseyProvider.java:70) - Using JSON encoding codec LegacyJacksonJson 41008: 2020-06-05 10:47:09-854 INFO [main] com.netflix.discovery.provider.DiscoveryJerseyProvider 
com.netflix.discovery.provider.DiscoveryJerseyProvider.(DiscoveryJerseyProvider.java:71) - Using JSON decoding codec LegacyJacksonJson 41008: 2020-06-05 10:47:10-018 INFO [main] com.netflix.discovery.provider.DiscoveryJerseyProvider com.netflix.discovery.provider.DiscoveryJerseyProvider.(DiscoveryJerseyProvider.java:80) - Using XML encoding codec XStreamXml 41008: 2020-06-05 10:47:10-018 INFO [main] com.netflix.discovery.provider.DiscoveryJerseyProvider com.netflix.discovery.provider.DiscoveryJerseyProvider.(DiscoveryJerseyProvider.java:81) - Using XML decoding codec XStreamXml 41008: 2020-06-05 10:47:10-284 INFO [main] com.netflix.discovery.shared.resolver.aws.ConfigClusterResolver com.netflix.discovery.shared.resolver.aws.ConfigClusterResolver.getClusterEndpoints(ConfigClusterResolver.java:43) getClusterEndpoints - Resolving eureka endpoints via configuration 41008: 2020-06-05 10:47:10-306 INFO [main] com.netflix.discovery.DiscoveryClient com.netflix.discovery.DiscoveryClient.fetchRegistry(DiscoveryClient.java:958) fetchRegistry - Disable delta property : false 41008: 2020-06-05 10:47:10-306 INFO [main] com.netflix.discovery.DiscoveryClient com.netflix.discovery.DiscoveryClient.fetchRegistry(DiscoveryClient.java:959) fetchRegistry - Single vip registry refresh property : null 41008: 2020-06-05 10:47:10-306 INFO [main] com.netflix.discovery.DiscoveryClient com.netflix.discovery.DiscoveryClient.fetchRegistry(DiscoveryClient.java:960) fetchRegistry - Force full registry fetch : false 41008: 2020-06-05 10:47:10-307 INFO [main] com.netflix.discovery.DiscoveryClient com.netflix.discovery.DiscoveryClient.fetchRegistry(DiscoveryClient.java:961) fetchRegistry - Application is null : false 41008: 2020-06-05 10:47:10-307 INFO [main] com.netflix.discovery.DiscoveryClient com.netflix.discovery.DiscoveryClient.fetchRegistry(DiscoveryClient.java:962) fetchRegistry - Registered Applications size is zero : true 41008: 2020-06-05 10:47:10-307 INFO [main] com.netflix.discovery.DiscoveryClient com.netflix.discovery.DiscoveryClient.fetchRegistry(DiscoveryClient.java:964) fetchRegistry - Application version is -1: true 41008: 2020-06-05 10:47:10-308 INFO [main] com.netflix.discovery.DiscoveryClient com.netflix.discovery.DiscoveryClient.getAndStoreFullRegistry(DiscoveryClient.java:1047) getAndStoreFullRegistry - Getting all instance registry info from the eureka server 41008: 2020-06-05 10:47:10-423 INFO [main] com.netflix.discovery.DiscoveryClient com.netflix.discovery.DiscoveryClient.getAndStoreFullRegistry(DiscoveryClient.java:1056) getAndStoreFullRegistry - The response status is 200 41008: 2020-06-05 10:47:10-426 INFO [main] com.netflix.discovery.DiscoveryClient com.netflix.discovery.DiscoveryClient.initScheduledTasks(DiscoveryClient.java:1264) initScheduledTasks - Starting heartbeat executor: renew interval is: 30 41008: 2020-06-05 10:47:10-430 INFO [main] com.netflix.discovery.InstanceInfoReplicator com.netflix.discovery.InstanceInfoReplicator.(InstanceInfoReplicator.java:60) - InstanceInfoReplicator onDemand update allowed rate per min is 4 41008: 2020-06-05 10:47:10-433 INFO [main] com.netflix.discovery.DiscoveryClient com.netflix.discovery.DiscoveryClient.(DiscoveryClient.java:449) - Discovery Client initialized at timestamp 1591325230432 with initial instances count: 19 41008: 2020-06-05 10:47:10-456 INFO [main] com.netflix.config.ChainedDynamicProperty com.netflix.config.ChainedDynamicProperty$ChainLink.checkAndFlip(ChainedDynamicProperty.java:115) checkAndFlip - Flipping property: 
hiveEngineManager.ribbon.ActiveConnectionsLimit to use NEXT property: niws.loadbalancer.availabilityFilteringRule.activeConnectionsLimit = 2147483647 41008: 2020-06-05 10:47:10-458 INFO [main] com.netflix.loadbalancer.DynamicServerListLoadBalancer com.netflix.loadbalancer.DynamicServerListLoadBalancer.restOfInit(DynamicServerListLoadBalancer.java:150) restOfInit - DynamicServerListLoadBalancer for client hiveEngineManager initialized: DynamicServerListLoadBalancer:{NFLoadBalancer:name=hiveEngineManager,current list of Servers=[n4:10099],Load balancer stats=Zone stats: {defaultzone=[Zone:defaultzone; Instance count:1; Active connections count: 0; Circuit breaker tripped count: 0; Active connections per server: 0.0;] 41008: },Server stats: [[Server:n4:10099; Zone:defaultZone; Total Requests:0; Successive connection failure:0; Total blackout seconds:0; Last connection made:Thu Jan 01 08:00:00 CST 1970; First connection made: Thu Jan 01 08:00:00 CST 1970; Active Connections:0; total failure count in last (1000) msecs:0; average resp time:0.0; 90 percentile resp time:0.0; 95 percentile resp time:0.0; min resp time:0.0; max resp time:0.0; stddev resp time:0.0] 41008: ]}ServerList:org.springframework.cloud.netflix.ribbon.eureka.DomainExtractingServerList@4b3eaf39 2020-06-05 10:47:10.619 INFO [RPC-Receiver-Asyn-Thread-Thread-0] com.webank.wedatasphere.linkis.rpc.AsynRPCMessageBus 42 info - RPC-Receiver-Asyn-Thread-Thread-0 begin. 41008: 2020-06-05 10:47:10-627 INFO [main] com.webank.wedatasphere.linkis.engine.hive.hook.HiveAddJarsEngineHook com.webank.wedatasphere.linkis.engine.hive.hook.HiveAddJarsEngineHook.beforeCreateEngine(HiveAddJarsEngineHook.scala:44) beforeCreateEngine - jarArray is null 41008: 2020-06-05 10:47:10-628 INFO [main] com.webank.wedatasphere.linkis.engine.execute.hook.JarUdfEngineHook com.webank.wedatasphere.linkis.common.utils.Logging$class.info(Logging.scala:42) info - start loading UDFs 41008: 2020-06-05 10:47:10,636 INFO (main) INFO AnnotationConfigApplicationContext - Refreshing SpringClientFactory-cloud-publicservice: startup date [Fri Jun 05 10:47:10 CST 2020]; parent: org.springframework.boot.web.servlet.context.AnnotationConfigServletWebServerApplicationContext@3104351d 41008: 2020-06-05 10:47:10,665 INFO (main) INFO AutowiredAnnotationBeanPostProcessor - JSR-330 'javax.inject.Inject' annotation found and supported for autowiring 41008: 2020-06-05 10:47:10-737 INFO [main] com.netflix.config.ChainedDynamicProperty com.netflix.config.ChainedDynamicProperty$ChainLink.checkAndFlip(ChainedDynamicProperty.java:115) checkAndFlip - Flipping property: cloud-publicservice.ribbon.ActiveConnectionsLimit to use NEXT property: niws.loadbalancer.availabilityFilteringRule.activeConnectionsLimit = 2147483647 41008: 2020-06-05 10:47:10-745 INFO [main] com.netflix.util.concurrent.ShutdownEnabledTimer com.netflix.util.concurrent.ShutdownEnabledTimer.(ShutdownEnabledTimer.java:58) - Shutdown hook installed for: NFLoadBalancer-PingTimer-cloud-publicservice 41008: 2020-06-05 10:47:10-746 INFO [main] com.netflix.loadbalancer.BaseLoadBalancer com.netflix.loadbalancer.BaseLoadBalancer.initWithConfig(BaseLoadBalancer.java:192) initWithConfig - Client: cloud-publicservice instantiated a LoadBalancer: DynamicServerListLoadBalancer:{NFLoadBalancer:name=cloud-publicservice,current list of Servers=[],Load balancer stats=Zone stats: {},Server stats: []}ServerList:null 41008: 2020-06-05 10:47:10-747 INFO [main] com.netflix.loadbalancer.DynamicServerListLoadBalancer 
com.netflix.loadbalancer.DynamicServerListLoadBalancer.enableAndInitLearnNewServersFeature(DynamicServerListLoadBalancer.java:222) enableAndInitLearnNewServersFeature - Using serverListUpdater PollingServerListUpdater 41008: 2020-06-05 10:47:10-749 INFO [main] com.netflix.config.ChainedDynamicProperty com.netflix.config.ChainedDynamicProperty$ChainLink.checkAndFlip(ChainedDynamicProperty.java:115) checkAndFlip - Flipping property: cloud-publicservice.ribbon.ActiveConnectionsLimit to use NEXT property: niws.loadbalancer.availabilityFilteringRule.activeConnectionsLimit = 2147483647 41008: 2020-06-05 10:47:10-751 INFO [main] com.netflix.loadbalancer.DynamicServerListLoadBalancer com.netflix.loadbalancer.DynamicServerListLoadBalancer.restOfInit(DynamicServerListLoadBalancer.java:150) restOfInit - DynamicServerListLoadBalancer for client cloud-publicservice initialized: DynamicServerListLoadBalancer:{NFLoadBalancer:name=cloud-publicservice,current list of Servers=[n4:9102],Load balancer stats=Zone stats: {defaultzone=[Zone:defaultzone; Instance count:1; Active connections count: 0; Circuit breaker tripped count: 0; Active connections per server: 0.0;] 41008: },Server stats: [[Server:n4:9102; Zone:defaultZone; Total Requests:0; Successive connection failure:0; Total blackout seconds:0; Last connection made:Thu Jan 01 08:00:00 CST 1970; First connection made: Thu Jan 01 08:00:00 CST 1970; Active Connections:0; total failure count in last (1000) msecs:0; average resp time:0.0; 90 percentile resp time:0.0; 95 percentile resp time:0.0; min resp time:0.0; max resp time:0.0; stddev resp time:0.0] 41008: ]}ServerList:org.springframework.cloud.netflix.ribbon.eureka.DomainExtractingServerList@3cdc5155 41008: 2020-06-05 10:47:10-764 INFO [PollingServerListUpdater-0] com.netflix.config.ChainedDynamicProperty com.netflix.config.ChainedDynamicProperty$ChainLink.checkAndFlip(ChainedDynamicProperty.java:115) checkAndFlip - Flipping property: hiveEngineManager.ribbon.ActiveConnectionsLimit to use NEXT property: niws.loadbalancer.availabilityFilteringRule.activeConnectionsLimit = 2147483647 41008: 2020-06-05 10:47:10-867 INFO [main] org.apache.hadoop.hive.conf.HiveConf org.apache.hadoop.hive.conf.HiveConf.findConfigFile(HiveConf.java:176) findConfigFile - Found configuration file file:/etc/hive/conf.cloudera.hive/hive-site.xml 41008: 2020-06-05 10:47:11-101 WARN [main] org.apache.hadoop.hive.conf.HiveConf org.apache.hadoop.hive.conf.HiveConf.initialize(HiveConf.java:3765) initialize - HiveConf of name hive.vectorized.use.checked.expressions does not exist 41008: 2020-06-05 10:47:11-101 WARN [main] org.apache.hadoop.hive.conf.HiveConf org.apache.hadoop.hive.conf.HiveConf.initialize(HiveConf.java:3765) initialize - HiveConf of name hive.strict.checks.no.partition.filter does not exist 41008: 2020-06-05 10:47:11-102 WARN [main] org.apache.hadoop.hive.conf.HiveConf org.apache.hadoop.hive.conf.HiveConf.initialize(HiveConf.java:3765) initialize - HiveConf of name hive.strict.checks.orderby.no.limit does not exist 41008: 2020-06-05 10:47:11-102 WARN [main] org.apache.hadoop.hive.conf.HiveConf org.apache.hadoop.hive.conf.HiveConf.initialize(HiveConf.java:3765) initialize - HiveConf of name hive.vectorized.adaptor.usage.mode does not exist 41008: 2020-06-05 10:47:11-102 WARN [main] org.apache.hadoop.hive.conf.HiveConf org.apache.hadoop.hive.conf.HiveConf.initialize(HiveConf.java:3765) initialize - HiveConf of name hive.vectorized.input.format.excludes does not exist 41008: 2020-06-05 10:47:11-103 WARN [main] 
org.apache.hadoop.hive.conf.HiveConf org.apache.hadoop.hive.conf.HiveConf.initialize(HiveConf.java:3765) initialize - HiveConf of name hive.strict.checks.bucketing does not exist 41008: 2020-06-05 10:47:11-106 INFO [main] com.webank.wedatasphere.linkis.engine.hive.executor.HiveEngineExecutorFactory com.webank.wedatasphere.linkis.engine.hive.executor.HiveEngineExecutorFactory$$anonfun$createExecutor$2.apply(HiveEngineExecutorFactory.scala:46) apply - key is wds.linkis.instance, value is 3 41008: 2020-06-05 10:47:11-106 INFO [main] com.webank.wedatasphere.linkis.engine.hive.executor.HiveEngineExecutorFactory com.webank.wedatasphere.linkis.engine.hive.executor.HiveEngineExecutorFactory$$anonfun$createExecutor$2.apply(HiveEngineExecutorFactory.scala:46) apply - key is mapred.reduce.tasks, value is 10 41008: 2020-06-05 10:47:11-107 INFO [main] com.webank.wedatasphere.linkis.engine.hive.executor.HiveEngineExecutorFactory com.webank.wedatasphere.linkis.engine.hive.executor.HiveEngineExecutorFactory$$anonfun$createExecutor$2.apply(HiveEngineExecutorFactory.scala:46) apply - key is creator, value is IDE 41008: 2020-06-05 10:47:11-108 INFO [main] com.webank.wedatasphere.linkis.engine.hive.executor.HiveEngineExecutorFactory com.webank.wedatasphere.linkis.engine.hive.executor.HiveEngineExecutorFactory$$anonfun$createExecutor$2.apply(HiveEngineExecutorFactory.scala:46) apply - key is wds.linkis.yarnqueue.cores.max, value is 150 41008: 2020-06-05 10:47:11-108 INFO [main] com.webank.wedatasphere.linkis.engine.hive.executor.HiveEngineExecutorFactory com.webank.wedatasphere.linkis.engine.hive.executor.HiveEngineExecutorFactory$$anonfun$createExecutor$2.apply(HiveEngineExecutorFactory.scala:46) apply - key is wds.linkis.tmpfile.clean.time, value is 10:00 41008: 2020-06-05 10:47:11-108 INFO [main] com.webank.wedatasphere.linkis.engine.hive.executor.HiveEngineExecutorFactory com.webank.wedatasphere.linkis.engine.hive.executor.HiveEngineExecutorFactory$$anonfun$createExecutor$2.apply(HiveEngineExecutorFactory.scala:46) apply - key is wds.linkis.yarnqueue.memory.max, value is 300G 41008: 2020-06-05 10:47:11-109 INFO [main] com.webank.wedatasphere.linkis.engine.hive.executor.HiveEngineExecutorFactory com.webank.wedatasphere.linkis.engine.hive.executor.HiveEngineExecutorFactory$$anonfun$createExecutor$2.apply(HiveEngineExecutorFactory.scala:46) apply - key is dfs.block.size, value is 10 41008: 2020-06-05 10:47:11-110 INFO [main] com.webank.wedatasphere.linkis.engine.hive.executor.HiveEngineExecutorFactory com.webank.wedatasphere.linkis.engine.hive.executor.HiveEngineExecutorFactory$$anonfun$createExecutor$2.apply(HiveEngineExecutorFactory.scala:46) apply - key is hive.client.memory, value is 2g 41008: 2020-06-05 10:47:11-110 INFO [main] com.webank.wedatasphere.linkis.engine.hive.executor.HiveEngineExecutorFactory com.webank.wedatasphere.linkis.engine.hive.executor.HiveEngineExecutorFactory$$anonfun$createExecutor$2.apply(HiveEngineExecutorFactory.scala:46) apply - key is _req_entrance_instance, value is hiveEntrance,n4:9108 41008: 2020-06-05 10:47:11-110 INFO [main] com.webank.wedatasphere.linkis.engine.hive.executor.HiveEngineExecutorFactory com.webank.wedatasphere.linkis.engine.hive.executor.HiveEngineExecutorFactory$$anonfun$createExecutor$2.apply(HiveEngineExecutorFactory.scala:46) apply - key is hive.exec.reduce.bytes.per.reducer, value is 10 41008: 2020-06-05 10:47:11-110 INFO [main] com.webank.wedatasphere.linkis.engine.hive.executor.HiveEngineExecutorFactory 
com.webank.wedatasphere.linkis.engine.hive.executor.HiveEngineExecutorFactory$$anonfun$createExecutor$2.apply(HiveEngineExecutorFactory.scala:46) apply - key is dwc.application.instance, value is n4:10099 41008: 2020-06-05 10:47:11-111 INFO [main] com.webank.wedatasphere.linkis.engine.hive.executor.HiveEngineExecutorFactory com.webank.wedatasphere.linkis.engine.hive.executor.HiveEngineExecutorFactory$$anonfun$createExecutor$2.apply(HiveEngineExecutorFactory.scala:46) apply - key is dwc.application.name, value is hiveEngineManager 41008: 2020-06-05 10:47:11-111 INFO [main] com.webank.wedatasphere.linkis.engine.hive.executor.HiveEngineExecutorFactory com.webank.wedatasphere.linkis.engine.hive.executor.HiveEngineExecutorFactory$$anonfun$createExecutor$2.apply(HiveEngineExecutorFactory.scala:46) apply - key is wds.linkis.preheating.time, value is 9:00 41008: 2020-06-05 10:47:11-111 INFO [main] com.webank.wedatasphere.linkis.engine.hive.executor.HiveEngineExecutorFactory com.webank.wedatasphere.linkis.engine.hive.executor.HiveEngineExecutorFactory$$anonfun$createExecutor$2.apply(HiveEngineExecutorFactory.scala:46) apply - key is wds.linkis.yarnqueue, value is default 41008: 2020-06-05 10:47:11-111 INFO [main] com.webank.wedatasphere.linkis.engine.hive.executor.HiveEngineExecutorFactory com.webank.wedatasphere.linkis.engine.hive.executor.HiveEngineExecutorFactory$$anonfun$createExecutor$2.apply(HiveEngineExecutorFactory.scala:46) apply - key is user, value is hadoop 41008: 2020-06-05 10:47:11-111 INFO [main] com.webank.wedatasphere.linkis.engine.hive.executor.HiveEngineExecutorFactory com.webank.wedatasphere.linkis.engine.hive.executor.HiveEngineExecutorFactory$$anonfun$createExecutor$2.apply(HiveEngineExecutorFactory.scala:46) apply - key is ticketId, value is 734b1127-0289-4626-a629-d66af440ff87 41008: 2020-06-05 10:47:11-111 INFO [main] com.webank.wedatasphere.linkis.engine.hive.executor.HiveEngineExecutorFactory com.webank.wedatasphere.linkis.engine.hive.executor.HiveEngineExecutorFactory$$anonfun$createExecutor$2.apply(HiveEngineExecutorFactory.scala:46) apply - key is wds.linkis.client.memory.max, value is 20G 41008: 2020-06-05 10:47:11-113 INFO [main] com.webank.wedatasphere.linkis.engine.hive.executor.HiveEngineExecutorFactory com.webank.wedatasphere.linkis.engine.hive.executor.HiveEngineExecutorFactory$$anonfun$createExecutor$4.apply(HiveEngineExecutorFactory.scala:48) apply - key is wds.linkis.instance, value is 3 41008: 2020-06-05 10:47:11-113 INFO [main] com.webank.wedatasphere.linkis.engine.hive.executor.HiveEngineExecutorFactory com.webank.wedatasphere.linkis.engine.hive.executor.HiveEngineExecutorFactory$$anonfun$createExecutor$4.apply(HiveEngineExecutorFactory.scala:48) apply - key is wds.linkis.yarnqueue.cores.max, value is 150 41008: 2020-06-05 10:47:11-114 INFO [main] com.webank.wedatasphere.linkis.engine.hive.executor.HiveEngineExecutorFactory com.webank.wedatasphere.linkis.engine.hive.executor.HiveEngineExecutorFactory$$anonfun$createExecutor$4.apply(HiveEngineExecutorFactory.scala:48) apply - key is hive.exec.reduce.bytes.per.reducer, value is 10 41008: 2020-06-05 10:47:11-114 INFO [main] com.webank.wedatasphere.linkis.engine.hive.executor.HiveEngineExecutorFactory com.webank.wedatasphere.linkis.engine.hive.executor.HiveEngineExecutorFactory$$anonfun$createExecutor$4.apply(HiveEngineExecutorFactory.scala:48) apply - key is wds.linkis.tmpfile.clean.time, value is 10:00 41008: 2020-06-05 10:47:11-114 INFO [main] 
com.webank.wedatasphere.linkis.engine.hive.executor.HiveEngineExecutorFactory com.webank.wedatasphere.linkis.engine.hive.executor.HiveEngineExecutorFactory$$anonfun$createExecutor$4.apply(HiveEngineExecutorFactory.scala:48) apply - key is wds.linkis.yarnqueue.memory.max, value is 300G 41008: 2020-06-05 10:47:11-115 INFO [main] com.webank.wedatasphere.linkis.engine.hive.executor.HiveEngineExecutorFactory com.webank.wedatasphere.linkis.engine.hive.executor.HiveEngineExecutorFactory$$anonfun$createExecutor$4.apply(HiveEngineExecutorFactory.scala:48) apply - key is wds.linkis.preheating.time, value is 9:00 41008: 2020-06-05 10:47:11-115 INFO [main] com.webank.wedatasphere.linkis.engine.hive.executor.HiveEngineExecutorFactory com.webank.wedatasphere.linkis.engine.hive.executor.HiveEngineExecutorFactory$$anonfun$createExecutor$4.apply(HiveEngineExecutorFactory.scala:48) apply - key is hive.client.memory, value is 2g 41008: 2020-06-05 10:47:11-115 INFO [main] com.webank.wedatasphere.linkis.engine.hive.executor.HiveEngineExecutorFactory com.webank.wedatasphere.linkis.engine.hive.executor.HiveEngineExecutorFactory$$anonfun$createExecutor$4.apply(HiveEngineExecutorFactory.scala:48) apply - key is wds.linkis.yarnqueue, value is default 41008: 2020-06-05 10:47:11-115 INFO [main] com.webank.wedatasphere.linkis.engine.hive.executor.HiveEngineExecutorFactory com.webank.wedatasphere.linkis.engine.hive.executor.HiveEngineExecutorFactory$$anonfun$createExecutor$4.apply(HiveEngineExecutorFactory.scala:48) apply - key is wds.linkis.client.memory.max, value is 20G 41008: 2020-06-05 10:47:11-370 INFO [main] hive.metastore org.apache.hadoop.hive.metastore.HiveMetaStoreClient.open(HiveMetaStoreClient.java:433) open - Trying to connect to metastore with URI thrift://n1:9083 41008: 2020-06-05 10:47:11-385 INFO [main] hive.metastore org.apache.hadoop.hive.metastore.HiveMetaStoreClient.open(HiveMetaStoreClient.java:478) open - Opened a connection to metastore, current connections: 1 41008: 2020-06-05 10:47:11-398 INFO [main] hive.metastore org.apache.hadoop.hive.metastore.HiveMetaStoreClient.open(HiveMetaStoreClient.java:530) open - Connected to metastore. 2020-06-05 10:47:11.655 INFO [RPC-Receiver-Asyn-Thread-Thread-0] com.webank.wedatasphere.linkis.enginemanager.process.CommonProcessEngine 42 info - Call back to change engine state0 error msg is null 2020-06-05 10:47:11.657 INFO [RPC-Receiver-Asyn-Thread-Thread-0] com.webank.wedatasphere.linkis.enginemanager.process.CommonProcessEngine 42 info - CommonProcessEngine(port: 41008, creator: IDE, user: hadoop) change state Starting => Starting. 
2020-06-05 10:47:11,662 INFO (RPC-Receiver-Asyn-Thread-Thread-0) INFO AnnotationConfigApplicationContext - Refreshing SpringClientFactory-hiveEntrance: startup date [Fri Jun 05 10:47:11 CST 2020]; parent: org.springframework.boot.web.servlet.context.AnnotationConfigServletWebServerApplicationContext@444548a0 2020-06-05 10:47:11,702 INFO (RPC-Receiver-Asyn-Thread-Thread-0) INFO AutowiredAnnotationBeanPostProcessor - JSR-330 'javax.inject.Inject' annotation found and supported for autowiring 41008: 2020-06-05 10:47:11-749 INFO [PollingServerListUpdater-1] com.netflix.config.ChainedDynamicProperty com.netflix.config.ChainedDynamicProperty$ChainLink.checkAndFlip(ChainedDynamicProperty.java:115) checkAndFlip - Flipping property: cloud-publicservice.ribbon.ActiveConnectionsLimit to use NEXT property: niws.loadbalancer.availabilityFilteringRule.activeConnectionsLimit = 2147483647 2020-06-05 10:47:11.793 INFO [RPC-Receiver-Asyn-Thread-Thread-0] com.netflix.config.ChainedDynamicProperty 115 checkAndFlip - Flipping property: hiveEntrance.ribbon.ActiveConnectionsLimit to use NEXT property: niws.loadbalancer.availabilityFilteringRule.activeConnectionsLimit = 2147483647 2020-06-05 10:47:11.801 INFO [RPC-Receiver-Asyn-Thread-Thread-0] com.netflix.util.concurrent.ShutdownEnabledTimer 58 - Shutdown hook installed for: NFLoadBalancer-PingTimer-hiveEntrance 2020-06-05 10:47:11.802 INFO [RPC-Receiver-Asyn-Thread-Thread-0] com.netflix.loadbalancer.BaseLoadBalancer 192 initWithConfig - Client: hiveEntrance instantiated a LoadBalancer: DynamicServerListLoadBalancer:{NFLoadBalancer:name=hiveEntrance,current list of Servers=[],Load balancer stats=Zone stats: {},Server stats: []}ServerList:null 2020-06-05 10:47:11.803 INFO [RPC-Receiver-Asyn-Thread-Thread-0] com.netflix.loadbalancer.DynamicServerListLoadBalancer 222 enableAndInitLearnNewServersFeature - Using serverListUpdater PollingServerListUpdater 2020-06-05 10:47:11.804 INFO [RPC-Receiver-Asyn-Thread-Thread-0] com.netflix.config.ChainedDynamicProperty 115 checkAndFlip - Flipping property: hiveEntrance.ribbon.ActiveConnectionsLimit to use NEXT property: niws.loadbalancer.availabilityFilteringRule.activeConnectionsLimit = 2147483647 2020-06-05 10:47:11.805 INFO [RPC-Receiver-Asyn-Thread-Thread-0] com.netflix.loadbalancer.DynamicServerListLoadBalancer 150 restOfInit - DynamicServerListLoadBalancer for client hiveEntrance initialized: DynamicServerListLoadBalancer:{NFLoadBalancer:name=hiveEntrance,current list of Servers=[n4:9108],Load balancer stats=Zone stats: {defaultzone=[Zone:defaultzone; Instance count:1; Active connections count: 0; Circuit breaker tripped count: 0; Active connections per server: 0.0;] },Server stats: [[Server:n4:9108; Zone:defaultZone; Total Requests:0; Successive connection failure:0; Total blackout seconds:0; Last connection made:Thu Jan 01 08:00:00 CST 1970; First connection made: Thu Jan 01 08:00:00 CST 1970; Active Connections:0; total failure count in last (1000) msecs:0; average resp time:0.0; 90 percentile resp time:0.0; 95 percentile resp time:0.0; min resp time:0.0; max resp time:0.0; stddev resp time:0.0] ]}ServerList:org.springframework.cloud.netflix.ribbon.eureka.DomainExtractingServerList@3bdd067e 41008: 2020-06-05 10:47:12,153 WARN (main) WARN NativeCodeLoader - Unable to load native-hadoop library for your platform... 
using builtin-java classes where applicable 41008: 2020-06-05 10:47:12-280 INFO [main] org.apache.hadoop.hive.ql.session.SessionState org.apache.hadoop.hive.ql.session.SessionState.createPath(SessionState.java:734) createPath - Created HDFS directory: /tmp/hive/hadoop/ca056791-7702-477c-b15e-a25ad1fa6dcd 41008: 2020-06-05 10:47:12-287 INFO [main] org.apache.hadoop.hive.ql.session.SessionState org.apache.hadoop.hive.ql.session.SessionState.createPath(SessionState.java:734) createPath - Created local directory: /tmp/hadoop/ca056791-7702-477c-b15e-a25ad1fa6dcd 41008: 2020-06-05 10:47:12-294 INFO [main] org.apache.hadoop.hive.ql.session.SessionState org.apache.hadoop.hive.ql.session.SessionState.createPath(SessionState.java:734) createPath - Created HDFS directory: /tmp/hive/hadoop/ca056791-7702-477c-b15e-a25ad1fa6dcd/_tmp_space.db 41008: 2020-06-05 10:47:12-329 INFO [main] com.webank.wedatasphere.linkis.engine.hive.executor.HiveEngineExecutor com.webank.wedatasphere.linkis.common.utils.Logging$class.info(Logging.scala:42) info - com.webank.wedatasphere.linkis.engine.hive.executor.HiveEngineExecutor@2ed9e59b change state Starting => Idle. 41008: 2020-06-05 10:47:12-330 INFO [main] com.webank.wedatasphere.linkis.engine.hive.executor.HiveEngineExecutor com.webank.wedatasphere.linkis.engine.hive.executor.HiveEngineExecutor.init(HiveEngineExecutor.scala:91) init - Ready to change engine state! 41008: 2020-06-05 10:47:12-331 WARN [main] com.webank.wedatasphere.linkis.engine.hive.hook.HiveAddJarsEngineHook com.webank.wedatasphere.linkis.engine.hive.hook.HiveAddJarsEngineHook.afterCreatedEngine(HiveAddJarsEngineHook.scala:50) afterCreatedEngine - hive added jars is empty 41008: 2020-06-05 10:47:12-332 INFO [main] com.webank.wedatasphere.linkis.engine.EngineReceiver com.webank.wedatasphere.linkis.common.utils.Logging$class.info(Logging.scala:42) info - created engine com.webank.wedatasphere.linkis.engine.hive.executor.HiveEngineExecutor@2ed9e59b. 41008: 2020-06-05 10:47:12-333 INFO [main] com.webank.wedatasphere.linkis.engine.hive.executor.HiveEngineExecutor com.webank.wedatasphere.linkis.common.utils.Logging$class.info(Logging.scala:42) info - com.webank.wedatasphere.linkis.engine.hive.executor.HiveEngineExecutor@2ed9e59b change state Idle => Idle. 
41008: 2020-06-05 10:47:12-334 INFO [main] com.webank.wedatasphere.linkis.resourcemanager.client.ResourceManagerClient com.webank.wedatasphere.linkis.common.utils.Logging$class.info(Logging.scala:42) info - ResourceManagerClient init 41008: 2020-06-05 10:47:12,343 INFO (main) INFO AnnotationConfigApplicationContext - Refreshing SpringClientFactory-ResourceManager: startup date [Fri Jun 05 10:47:12 CST 2020]; parent: org.springframework.boot.web.servlet.context.AnnotationConfigServletWebServerApplicationContext@3104351d 41008: 2020-06-05 10:47:12,377 INFO (main) INFO AutowiredAnnotationBeanPostProcessor - JSR-330 'javax.inject.Inject' annotation found and supported for autowiring 41008: 2020-06-05 10:47:12-470 INFO [main] com.netflix.config.ChainedDynamicProperty com.netflix.config.ChainedDynamicProperty$ChainLink.checkAndFlip(ChainedDynamicProperty.java:115) checkAndFlip - Flipping property: ResourceManager.ribbon.ActiveConnectionsLimit to use NEXT property: niws.loadbalancer.availabilityFilteringRule.activeConnectionsLimit = 2147483647 41008: 2020-06-05 10:47:12-480 INFO [main] com.netflix.util.concurrent.ShutdownEnabledTimer com.netflix.util.concurrent.ShutdownEnabledTimer.(ShutdownEnabledTimer.java:58) - Shutdown hook installed for: NFLoadBalancer-PingTimer-ResourceManager 41008: 2020-06-05 10:47:12-481 INFO [main] com.netflix.loadbalancer.BaseLoadBalancer com.netflix.loadbalancer.BaseLoadBalancer.initWithConfig(BaseLoadBalancer.java:192) initWithConfig - Client: ResourceManager instantiated a LoadBalancer: DynamicServerListLoadBalancer:{NFLoadBalancer:name=ResourceManager,current list of Servers=[],Load balancer stats=Zone stats: {},Server stats: []}ServerList:null 41008: 2020-06-05 10:47:12-483 INFO [main] com.netflix.loadbalancer.DynamicServerListLoadBalancer com.netflix.loadbalancer.DynamicServerListLoadBalancer.enableAndInitLearnNewServersFeature(DynamicServerListLoadBalancer.java:222) enableAndInitLearnNewServersFeature - Using serverListUpdater PollingServerListUpdater 41008: 2020-06-05 10:47:12-484 INFO [main] com.netflix.config.ChainedDynamicProperty com.netflix.config.ChainedDynamicProperty$ChainLink.checkAndFlip(ChainedDynamicProperty.java:115) checkAndFlip - Flipping property: ResourceManager.ribbon.ActiveConnectionsLimit to use NEXT property: niws.loadbalancer.availabilityFilteringRule.activeConnectionsLimit = 2147483647 41008: 2020-06-05 10:47:12-485 INFO [main] com.netflix.loadbalancer.DynamicServerListLoadBalancer com.netflix.loadbalancer.DynamicServerListLoadBalancer.restOfInit(DynamicServerListLoadBalancer.java:150) restOfInit - DynamicServerListLoadBalancer for client ResourceManager initialized: DynamicServerListLoadBalancer:{NFLoadBalancer:name=ResourceManager,current list of Servers=[n4:9104],Load balancer stats=Zone stats: {defaultzone=[Zone:defaultzone; Instance count:1; Active connections count: 0; Circuit breaker tripped count: 0; Active connections per server: 0.0;] 41008: },Server stats: [[Server:n4:9104; Zone:defaultZone; Total Requests:0; Successive connection failure:0; Total blackout seconds:0; Last connection made:Thu Jan 01 08:00:00 CST 1970; First connection made: Thu Jan 01 08:00:00 CST 1970; Active Connections:0; total failure count in last (1000) msecs:0; average resp time:0.0; 90 percentile resp time:0.0; 95 percentile resp time:0.0; min resp time:0.0; max resp time:0.0; stddev resp time:0.0] 41008: ]}ServerList:org.springframework.cloud.netflix.ribbon.eureka.DomainExtractingServerList@23c24c0e 2020-06-05 10:47:12.532 INFO 
[RPC-Receiver-Asyn-Thread-Thread-0] com.webank.wedatasphere.linkis.enginemanager.process.CommonProcessEngine 42 info - Call back to change engine state1 error msg is null 2020-06-05 10:47:12.533 INFO [RPC-Receiver-Asyn-Thread-Thread-0] com.webank.wedatasphere.linkis.enginemanager.process.CommonProcessEngine 42 info - CommonProcessEngine(port: 41008, creator: IDE, user: hadoop) change state Starting => Idle. 41008: 2020-06-05 10:47:12-536 INFO [main] com.webank.wedatasphere.linkis.engine.EngineReceiver com.webank.wedatasphere.linkis.common.utils.Logging$class.info(Logging.scala:42) info - hiveEngine engineServer started. 2020-06-05 10:47:12.547 WARN [RPC-Receiver-Asyn-Thread-Thread-0] com.webank.wedatasphere.linkis.enginemanager.EngineManagerReceiver 51 warn - CommonProcessEngine(port: 41008, creator: IDE, user: hadoop) start successfully, now try to broadcast it to all related entrances(RPCSender(hiveEntrance, n4:9108)). 41008: 2020-06-05 10:47:12-573 INFO [main] com.webank.wedatasphere.linkis.rpc.RPCReceiveRestful com.webank.wedatasphere.linkis.common.utils.Logging$class.info(Logging.scala:42) info - init all receiverChoosers in spring beans, list => List(com.webank.wedatasphere.linkis.rpc.CommonReceiverChooser@56d4481f) 41008: 2020-06-05 10:47:12-578 INFO [main] com.webank.wedatasphere.linkis.rpc.RPCReceiveRestful com.webank.wedatasphere.linkis.common.utils.Logging$class.info(Logging.scala:42) info - init all receiverSenderBuilders in spring beans, list => List(com.webank.wedatasphere.linkis.rpc.CommonReceiverSenderBuilder@468dbd07) 41008: 2020-06-05 10:47:12-580 INFO [main] com.webank.wedatasphere.linkis.rpc.RPCReceiveRestful com.webank.wedatasphere.linkis.common.utils.Logging$class.info(Logging.scala:42) info - init RPCReceiverListenerBus with queueSize 1000 and consumeThreadSize 10. 41008: 2020-06-05 10:47:12-581 INFO [main] com.webank.wedatasphere.linkis.rpc.AsynRPCMessageBus com.webank.wedatasphere.linkis.common.utils.Logging$class.info(Logging.scala:42) info - RPC-Receiver-Asyn-Thread-ListenerBus add a new listener => class com.webank.wedatasphere.linkis.rpc.RPCReceiveRestful$$anon$1 41008: 2020-06-05 10:47:12-707 WARN [main] com.netflix.config.sources.URLConfigurationSource com.netflix.config.sources.URLConfigurationSource.(URLConfigurationSource.java:121) - No URLs will be polled as dynamic configuration sources. 41008: 2020-06-05 10:47:12-708 INFO [main] com.netflix.config.sources.URLConfigurationSource com.netflix.config.sources.URLConfigurationSource.(URLConfigurationSource.java:122) - To enable URLs as dynamic configuration sources, define System property archaius.configurationSource.additionalUrls or make config.properties available on classpath. 
2020-06-05 10:47:12.804 INFO [PollingServerListUpdater-1] com.netflix.config.ChainedDynamicProperty 115 checkAndFlip - Flipping property: hiveEntrance.ribbon.ActiveConnectionsLimit to use NEXT property: niws.loadbalancer.availabilityFilteringRule.activeConnectionsLimit = 2147483647 41008: 2020-06-05 10:47:12,863 INFO (main) INFO SimpleUrlHandlerMapping - Mapped URL path [//favicon.ico] onto handler of type [class org.springframework.web.servlet.resource.ResourceHttpRequestHandler] 41008: 2020-06-05 10:47:12,934 INFO (main) INFO RequestMappingHandlerAdapter - Looking for @ControllerAdvice: org.springframework.boot.web.servlet.context.AnnotationConfigServletWebServerApplicationContext@3104351d: startup date [Fri Jun 05 10:47:05 CST 2020]; parent: org.springframework.context.annotation.AnnotationConfigApplicationContext@538613b3 41008: 2020-06-05 10:47:13,066 INFO (main) INFO RequestMappingHandlerMapping - Mapped "{[/error],produces=[text/html]}" onto public org.springframework.web.servlet.ModelAndView org.springframework.boot.autoconfigure.web.servlet.error.BasicErrorController.errorHtml(javax.servlet.http.HttpServletRequest,javax.servlet.http.HttpServletResponse) 41008: 2020-06-05 10:47:13,076 INFO (main) INFO RequestMappingHandlerMapping - Mapped "{[/error]}" onto public org.springframework.http.ResponseEntity<java.util.Map<java.lang.String, java.lang.Object>> org.springframework.boot.autoconfigure.web.servlet.error.BasicErrorController.error(javax.servlet.http.HttpServletRequest) 41008: 2020-06-05 10:47:13,150 INFO (main) INFO SimpleUrlHandlerMapping - Mapped URL path [/webjars/] onto handler of type [class org.springframework.web.servlet.resource.ResourceHttpRequestHandler] 41008: 2020-06-05 10:47:13,150 INFO (main) INFO SimpleUrlHandlerMapping - Mapped URL path [/*] onto handler of type [class org.springframework.web.servlet.resource.ResourceHttpRequestHandler] 41008: 2020-06-05 10:47:13-484 INFO [PollingServerListUpdater-0] com.netflix.config.ChainedDynamicProperty com.netflix.config.ChainedDynamicProperty$ChainLink.checkAndFlip(ChainedDynamicProperty.java:115) checkAndFlip - Flipping property: ResourceManager.ribbon.ActiveConnectionsLimit to use NEXT property: niws.loadbalancer.availabilityFilteringRule.activeConnectionsLimit = 2147483647 41008: 2020-06-05 10:47:13,745 WARN (main) WARN EurekaStarterDeprecationWarningAutoConfiguration - spring-cloud-starter-eureka is deprecated as of Spring Cloud Netflix 1.4.0, please migrate to spring-cloud-starter-netflix-eureka 41008: 2020-06-05 10:47:13,763 INFO (main) INFO EndpointLinksResolver - Exposing 2 endpoint(s) beneath base path '/actuator' 41008: 2020-06-05 10:47:13,782 INFO (main) INFO WebMvcEndpointHandlerMapping - Mapped "{[/actuator/info],methods=[GET],produces=[application/vnd.spring-boot.actuator.v2+json || application/json]}" onto public java.lang.Object org.springframework.boot.actuate.endpoint.web.servlet.AbstractWebMvcEndpointHandlerMapping$OperationHandler.handle(javax.servlet.http.HttpServletRequest,java.util.Map<java.lang.String, java.lang.String>) 41008: 2020-06-05 10:47:13,784 INFO (main) INFO WebMvcEndpointHandlerMapping - Mapped "{[/actuator/refresh],methods=[POST],produces=[application/vnd.spring-boot.actuator.v2+json || application/json]}" onto public java.lang.Object org.springframework.boot.actuate.endpoint.web.servlet.AbstractWebMvcEndpointHandlerMapping$OperationHandler.handle(javax.servlet.http.HttpServletRequest,java.util.Map<java.lang.String, java.lang.String>) 41008: 2020-06-05 10:47:13,785 INFO (main) INFO 
WebMvcEndpointHandlerMapping - Mapped "{[/actuator],methods=[GET],produces=[application/vnd.spring-boot.actuator.v2+json || application/json]}" onto protected java.util.Map<java.lang.String, java.util.Map<java.lang.String, org.springframework.boot.actuate.endpoint.web.Link>> org.springframework.boot.actuate.endpoint.web.servlet.WebMvcEndpointHandlerMapping.links(javax.servlet.http.HttpServletRequest,javax.servlet.http.HttpServletResponse) 41008: 2020-06-05 10:47:13,865 INFO (main) INFO AnnotationMBeanExporter - Registering beans for JMX exposure on startup 41008: 2020-06-05 10:47:13,875 INFO (main) INFO AnnotationMBeanExporter - Bean with name 'environmentManager' has been autodetected for JMX exposure 41008: 2020-06-05 10:47:13,876 INFO (main) INFO AnnotationMBeanExporter - Bean with name 'configurationPropertiesRebinder' has been autodetected for JMX exposure 41008: 2020-06-05 10:47:13,877 INFO (main) INFO AnnotationMBeanExporter - Bean with name 'refreshScope' has been autodetected for JMX exposure 41008: 2020-06-05 10:47:13,880 INFO (main) INFO AnnotationMBeanExporter - Located managed bean 'environmentManager': registering with JMX server as MBean [org.springframework.cloud.context.environment:name=environmentManager,type=EnvironmentManager] 41008: 2020-06-05 10:47:13,893 INFO (main) INFO AnnotationMBeanExporter - Located managed bean 'refreshScope': registering with JMX server as MBean [org.springframework.cloud.context.scope.refresh:name=refreshScope,type=RefreshScope] 2020-06-05 10:47:13.907 INFO [Engine-Manager-Thread-2] com.webank.wedatasphere.linkis.enginemanager.impl.EngineManagerImpl 42 info - init CommonProcessEngine(port: 41008, creator: IDE, user: hadoop) succeed. 41008: 2020-06-05 10:47:13,911 INFO (main) INFO AnnotationMBeanExporter - Located managed bean 'configurationPropertiesRebinder': registering with JMX server as MBean [org.springframework.cloud.context.properties:name=configurationPropertiesRebinder,context=3104351d,type=ConfigurationPropertiesRebinder] 41008: 2020-06-05 10:47:13,927 INFO (main) INFO DefaultLifecycleProcessor - Starting beans in phase 0 41008: 2020-06-05 10:47:13,927 INFO (main) INFO EurekaServiceRegistry - Registering application hiveEngine with eureka with status UP 41008: 2020-06-05 10:47:13-928 INFO [main] com.netflix.discovery.DiscoveryClient com.netflix.discovery.DiscoveryClient$3.notify(DiscoveryClient.java:1299) notify - Saw local status change event StatusChangeEvent [timestamp=1591325233928, current=UP, previous=STARTING] 41008: 2020-06-05 10:47:13-930 INFO [DiscoveryClient-InstanceInfoReplicator-0] com.netflix.discovery.DiscoveryClient com.netflix.discovery.DiscoveryClient.register(DiscoveryClient.java:826) register - DiscoveryClient_HIVEENGINE/n4:hiveEngine:41008: registering service... 41008: 2020-06-05 10:47:13-943 INFO [main] com.webank.wedatasphere.linkis.rpc.conf.RPCSpringConfiguration$$EnhancerBySpringCGLIB$$f7a8e353 com.webank.wedatasphere.linkis.common.utils.Logging$class.info(Logging.scala:42) info - DataWorkCloud RPC need register RPCReceiveRestful, now add it to configuration. 41008: 2020-06-05 10:47:13,943 INFO (main) INFO DataWorkCloudApplication - add config from config server... 41008: 2020-06-05 10:47:13,943 INFO (main) INFO DataWorkCloudApplication - initialize DataWorkCloud spring application... 
41008: 2020-06-05 10:47:13-948 INFO [DiscoveryClient-InstanceInfoReplicator-0] com.netflix.discovery.DiscoveryClient com.netflix.discovery.DiscoveryClient.register(DiscoveryClient.java:835) register - DiscoveryClient_HIVEENGINE/n4:hiveEngine:41008 - registration status: 204 41008: 2020-06-05 10:47:13-968 INFO [main] org.eclipse.jetty.server.handler.ContextHandler.application org.eclipse.jetty.server.handler.ContextHandler$Context.log(ContextHandler.java:2318) log - Initializing Spring FrameworkServlet 'dispatcherServlet' 41008: 2020-06-05 10:47:13,968 INFO (main) INFO DispatcherServlet - FrameworkServlet 'dispatcherServlet': initialization started 41008: 2020-06-05 10:47:13,990 INFO (main) INFO DispatcherServlet - FrameworkServlet 'dispatcherServlet': initialization completed in 22 ms 41008: 2020-06-05 10:47:14-009 INFO [main] org.eclipse.jetty.server.AbstractConnector org.eclipse.jetty.server.AbstractConnector.doStart(AbstractConnector.java:289) doStart - Started ServerConnector@7882379b{HTTP/1.1,[http/1.1]}{0.0.0.0:41008} 41008: 2020-06-05 10:47:14,012 INFO (main) INFO JettyWebServer - Jetty started on port(s) 41008 (http/1.1) with context path '/' 41008: 2020-06-05 10:47:14,012 INFO (main) INFO EurekaAutoServiceRegistration - Updating port to 41008 41008: 2020-06-05 10:47:14,014 INFO (main) INFO DataWorkCloudApplication - Started DataWorkCloudApplication in 9.488 seconds (JVM running for 10.5) 41008: 2020-06-05 10:47:16,035 INFO (qtp1867326100-43) INFO RestfulApplication - register com.webank.wedatasphere.linkis.rpc.RPCReceiveRestful 41008: 2020-06-05 10:47:16,037 INFO (qtp1867326100-43) INFO RestfulApplication - packages com.webank.wedatasphere.linkis.engine.restful 41008: 2020-06-05 10:47:16-475 INFO [qtp1867326100-43] com.webank.wedatasphere.linkis.engine.EngineReceiver com.webank.wedatasphere.linkis.common.utils.Logging$class.info(Logging.scala:42) info - broadcastSender => RPCSender(hiveEntrance) 41008: 2020-06-05 10:47:16-480 INFO [qtp1867326100-43] com.webank.wedatasphere.linkis.engine.EngineReceiver com.webank.wedatasphere.linkis.common.utils.Logging$class.info(Logging.scala:42) info - locked a lock 0 for instance n4:41008. 41008: 2020-06-05 10:47:16-579 INFO [qtp1867326100-39] com.webank.wedatasphere.linkis.engine.EngineReceiver com.webank.wedatasphere.linkis.common.utils.Logging$class.info(Logging.scala:42) info - received a new request 41008: 41008: 41008: 41008: select from test11 limit 5000 41008: 2020-06-05 10:47:16-584 INFO [hiveEngineEngineConsumerThread] com.webank.wedatasphere.linkis.engine.execute.CommonEngineJob com.webank.wedatasphere.linkis.common.utils.Logging$class.info(Logging.scala:42) info - hiveEngineEngine_0 change state Inited => Scheduled. 
41008: 2020-06-05 10:47:16,598 INFO (hiveEngineEngineConsumerThread) INFO AnnotationConfigApplicationContext - Refreshing SpringClientFactory-hiveEntrance: startup date [Fri Jun 05 10:47:16 CST 2020]; parent: org.springframework.boot.web.servlet.context.AnnotationConfigServletWebServerApplicationContext@3104351d 41008: 2020-06-05 10:47:16,627 INFO (hiveEngineEngineConsumerThread) INFO AutowiredAnnotationBeanPostProcessor - JSR-330 'javax.inject.Inject' annotation found and supported for autowiring 41008: 2020-06-05 10:47:16-717 INFO [hiveEngineEngineConsumerThread] com.netflix.config.ChainedDynamicProperty com.netflix.config.ChainedDynamicProperty$ChainLink.checkAndFlip(ChainedDynamicProperty.java:115) checkAndFlip - Flipping property: hiveEntrance.ribbon.ActiveConnectionsLimit to use NEXT property: niws.loadbalancer.availabilityFilteringRule.activeConnectionsLimit = 2147483647 41008: 2020-06-05 10:47:16-728 INFO [hiveEngineEngineConsumerThread] com.netflix.util.concurrent.ShutdownEnabledTimer com.netflix.util.concurrent.ShutdownEnabledTimer.(ShutdownEnabledTimer.java:58) - Shutdown hook installed for: NFLoadBalancer-PingTimer-hiveEntrance 41008: 2020-06-05 10:47:16-729 INFO [hiveEngineEngineConsumerThread] com.netflix.loadbalancer.BaseLoadBalancer com.netflix.loadbalancer.BaseLoadBalancer.initWithConfig(BaseLoadBalancer.java:192) initWithConfig - Client: hiveEntrance instantiated a LoadBalancer: DynamicServerListLoadBalancer:{NFLoadBalancer:name=hiveEntrance,current list of Servers=[],Load balancer stats=Zone stats: {},Server stats: []}ServerList:null 41008: 2020-06-05 10:47:16-730 INFO [hiveEngineEngineConsumerThread] com.netflix.loadbalancer.DynamicServerListLoadBalancer com.netflix.loadbalancer.DynamicServerListLoadBalancer.enableAndInitLearnNewServersFeature(DynamicServerListLoadBalancer.java:222) enableAndInitLearnNewServersFeature - Using serverListUpdater PollingServerListUpdater 41008: 2020-06-05 10:47:16-734 INFO [hiveEngineEngineConsumerThread] com.netflix.config.ChainedDynamicProperty com.netflix.config.ChainedDynamicProperty$ChainLink.checkAndFlip(ChainedDynamicProperty.java:115) checkAndFlip - Flipping property: hiveEntrance.ribbon.ActiveConnectionsLimit to use NEXT property: niws.loadbalancer.availabilityFilteringRule.activeConnectionsLimit = 2147483647 41008: 2020-06-05 10:47:16-735 INFO [hiveEngineEngineConsumerThread] com.netflix.loadbalancer.DynamicServerListLoadBalancer com.netflix.loadbalancer.DynamicServerListLoadBalancer.restOfInit(DynamicServerListLoadBalancer.java:150) restOfInit - DynamicServerListLoadBalancer for client hiveEntrance initialized: DynamicServerListLoadBalancer:{NFLoadBalancer:name=hiveEntrance,current list of Servers=[n4:9108],Load balancer stats=Zone stats: {defaultzone=[Zone:defaultzone; Instance count:1; Active connections count: 0; Circuit breaker tripped count: 0; Active connections per server: 0.0;] 41008: },Server stats: [[Server:n4:9108; Zone:defaultZone; Total Requests:0; Successive connection failure:0; Total blackout seconds:0; Last connection made:Thu Jan 01 08:00:00 CST 1970; First connection made: Thu Jan 01 08:00:00 CST 1970; Active Connections:0; total failure count in last (1000) msecs:0; average resp time:0.0; 90 percentile resp time:0.0; 95 percentile resp time:0.0; min resp time:0.0; max resp time:0.0; stddev resp time:0.0] 41008: ]}ServerList:org.springframework.cloud.netflix.ribbon.eureka.DomainExtractingServerList@768a1e7f 41008: 2020-06-05 10:47:16-776 INFO [hiveEngineEngine-Thread-2] 
com.webank.wedatasphere.linkis.engine.execute.CommonEngineJob com.webank.wedatasphere.linkis.common.utils.Logging$class.info(Logging.scala:42) info - hiveEngineEngine_0 change state Scheduled => Running. 41008: 2020-06-05 10:47:16-804 INFO [hiveEngineEngine-Thread-2] com.webank.wedatasphere.linkis.engine.hive.executor.HiveEngineExecutor com.webank.wedatasphere.linkis.common.utils.Logging$class.info(Logging.scala:42) info - com.webank.wedatasphere.linkis.engine.hive.executor.HiveEngineExecutor@2ed9e59b change state Idle => Busy. 41008: 2020-06-05 10:47:16-819 INFO [hiveEngineEngine-Thread-2] com.webank.wedatasphere.linkis.engine.EngineReceiver com.webank.wedatasphere.linkis.common.utils.Logging$class.info(Logging.scala:42) info - broadcast the state of UserWithCreator(hadoop,IDE) from Idle to Busy. 41008: 2020-06-05 10:47:16-873 INFO [hiveEngineEngine-Thread-2] com.webank.wedatasphere.linkis.engine.hive.executor.HiveEngineExecutor com.webank.wedatasphere.linkis.engine.execute.EngineExecutor$$anonfun$execute$1$$anonfun$apply$1$$anonfun$apply$mcV$sp$1.apply(EngineExecutor.scala:122) apply - BmlEnginePreExecuteHook begins to do a hook 41008: 2020-06-05 10:47:16-874 INFO [hiveEngineEngine-Thread-2] com.webank.wedatasphere.linkis.engine.hive.executor.HiveEngineExecutor com.webank.wedatasphere.linkis.engine.execute.EngineExecutor$$anonfun$execute$1$$anonfun$apply$1$$anonfun$apply$mcV$sp$1.apply(EngineExecutor.scala:124) apply - BmlEnginePreExecuteHook ends to do a hook 41008: 2020-06-05 10:47:16-876 INFO [hiveEngineEngine-Thread-2] com.webank.wedatasphere.linkis.engine.hive.executor.HiveEngineExecutor com.webank.wedatasphere.linkis.engine.hive.executor.HiveEngineExecutor.executeLine(HiveEngineExecutor.scala:108) executeLine - hive client begins to run hql code: 41008: select from test11 limit 5000 41008: 2020-06-05 10:47:16-922 WARN [hiveEngineEngine-Thread-2] com.webank.wedatasphere.linkis.engine.hive.executor.HiveDriverProxy$ com.webank.wedatasphere.linkis.common.utils.Utils$$anonfun$tryAndWarn$1.apply(Utils.scala:84) apply - java.lang.ClassNotFoundException: org.apache.hadoop.hive.ql.IDriver 41008: at java.net.URLClassLoader.findClass(URLClassLoader.java:381) ~[?:1.8.0_181] 41008: at java.lang.ClassLoader.loadClass(ClassLoader.java:424) ~[?:1.8.0_181] 41008: at java.lang.ClassLoader.loadClass(ClassLoader.java:357) ~[?:1.8.0_181] 41008: at com.webank.wedatasphere.linkis.engine.hive.executor.HiveDriverProxy$$anonfun$3.apply(HiveEngineExecutor.scala:378) ~[linkis-hive-engine-0.9.3.jar:?] 41008: at com.webank.wedatasphere.linkis.engine.hive.executor.HiveDriverProxy$$anonfun$3.apply(HiveEngineExecutor.scala:378) ~[linkis-hive-engine-0.9.3.jar:?] 41008: at com.webank.wedatasphere.linkis.common.utils.Utils$.tryCatch(Utils.scala:48) [linkis-common-0.9.3.jar:?] 41008: at com.webank.wedatasphere.linkis.common.utils.Utils$.tryAndWarn(Utils.scala:74) [linkis-common-0.9.3.jar:?] 41008: at com.webank.wedatasphere.linkis.engine.hive.executor.HiveDriverProxy$.(HiveEngineExecutor.scala:376) [linkis-hive-engine-0.9.3.jar:?] 41008: at com.webank.wedatasphere.linkis.engine.hive.executor.HiveDriverProxy$.(HiveEngineExecutor.scala) [linkis-hive-engine-0.9.3.jar:?] 41008: at com.webank.wedatasphere.linkis.engine.hive.executor.HiveEngineExecutor$$anon$1.run(HiveEngineExecutor.scala:124) [linkis-hive-engine-0.9.3.jar:?] 41008: at com.webank.wedatasphere.linkis.engine.hive.executor.HiveEngineExecutor$$anon$1.run(HiveEngineExecutor.scala:121) [linkis-hive-engine-0.9.3.jar:?] 
41008: at java.security.AccessController.doPrivileged(Native Method) [?:1.8.0_181] 41008: at javax.security.auth.Subject.doAs(Subject.java:422) [?:1.8.0_181] 41008: at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657) [hadoop-common-2.7.2.jar:?] 41008: at com.webank.wedatasphere.linkis.engine.hive.executor.HiveEngineExecutor.executeLine(HiveEngineExecutor.scala:121) [linkis-hive-engine-0.9.3.jar:?] 41008: at com.webank.wedatasphere.linkis.engine.execute.EngineExecutor$$anonfun$execute$1$$anonfun$apply$9$$anonfun$apply$10.apply(EngineExecutor.scala:141) [linkis-ujes-engine-0.9.3.jar:?] 41008: at com.webank.wedatasphere.linkis.engine.execute.EngineExecutor$$anonfun$execute$1$$anonfun$apply$9$$anonfun$apply$10.apply(EngineExecutor.scala:140) [linkis-ujes-engine-0.9.3.jar:?] 41008: at com.webank.wedatasphere.linkis.common.utils.Utils$.tryCatch(Utils.scala:48) [linkis-common-0.9.3.jar:?] 41008: at com.webank.wedatasphere.linkis.engine.execute.EngineExecutor$$anonfun$execute$1$$anonfun$apply$9.apply(EngineExecutor.scala:141) [linkis-ujes-engine-0.9.3.jar:?] 41008: at com.webank.wedatasphere.linkis.engine.execute.EngineExecutor$$anonfun$execute$1$$anonfun$apply$9.apply(EngineExecutor.scala:136) [linkis-ujes-engine-0.9.3.jar:?] 41008: at scala.collection.immutable.Range.foreach(Range.scala:160) [scala-library-2.11.8.jar:?] 41008: at com.webank.wedatasphere.linkis.engine.execute.EngineExecutor$$anonfun$execute$1.apply(EngineExecutor.scala:136) [linkis-ujes-engine-0.9.3.jar:?] 41008: at com.webank.wedatasphere.linkis.engine.execute.EngineExecutor$$anonfun$execute$1.apply(EngineExecutor.scala:118) [linkis-ujes-engine-0.9.3.jar:?] 41008: at com.webank.wedatasphere.linkis.common.utils.Utils$.tryFinally(Utils.scala:62) [linkis-common-0.9.3.jar:?] 41008: at com.webank.wedatasphere.linkis.scheduler.executer.AbstractExecutor.ensureIdle(AbstractExecutor.scala:60) [linkis-scheduler-0.9.3.jar:?] 41008: at com.webank.wedatasphere.linkis.scheduler.executer.AbstractExecutor.ensureIdle(AbstractExecutor.scala:54) [linkis-scheduler-0.9.3.jar:?] 41008: at com.webank.wedatasphere.linkis.engine.execute.EngineExecutor.ensureOp$1(EngineExecutor.scala:117) [linkis-ujes-engine-0.9.3.jar:?] 41008: at com.webank.wedatasphere.linkis.engine.execute.EngineExecutor.execute(EngineExecutor.scala:118) [linkis-ujes-engine-0.9.3.jar:?] 41008: at com.webank.wedatasphere.linkis.scheduler.queue.Job$$anonfun$3.apply(Job.scala:254) [linkis-scheduler-0.9.3.jar:?] 41008: at com.webank.wedatasphere.linkis.scheduler.queue.Job$$anonfun$3.apply(Job.scala:254) [linkis-scheduler-0.9.3.jar:?] 41008: at com.webank.wedatasphere.linkis.common.utils.Utils$.tryCatch(Utils.scala:48) [linkis-common-0.9.3.jar:?] 41008: at com.webank.wedatasphere.linkis.scheduler.queue.Job.run(Job.scala:254) [linkis-scheduler-0.9.3.jar:?] 
41008: at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) [?:1.8.0_181] 41008: at java.util.concurrent.FutureTask.run(FutureTask.java:266) [?:1.8.0_181] 41008: at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) [?:1.8.0_181] 41008: at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) [?:1.8.0_181] 41008: at java.lang.Thread.run(Thread.java:748) [?:1.8.0_181] 41008: 41008: 2020-06-05 10:47:16-985 INFO [hiveEngineEngine-Thread-2] org.apache.hadoop.hive.ql.Driver org.apache.hadoop.hive.ql.Driver.compile(Driver.java:408) compile - Compiling command(queryId=hadoop_20200605104716_f032415c-a7f0-41e6-bfbf-5d36e73d84c4): select from test11 limit 5000 41008: 2020-06-05 10:47:17-676 INFO [hiveEngineEngine-Thread-2] hive.metastore org.apache.hadoop.hive.metastore.HiveMetaStoreClient.open(HiveMetaStoreClient.java:433) open - Trying to connect to metastore with URI thrift://n1:9083 41008: 2020-06-05 10:47:17-677 INFO [hiveEngineEngine-Thread-2] hive.metastore org.apache.hadoop.hive.metastore.HiveMetaStoreClient.open(HiveMetaStoreClient.java:478) open - Opened a connection to metastore, current connections: 2 41008: 2020-06-05 10:47:17-678 INFO [hiveEngineEngine-Thread-2] hive.metastore org.apache.hadoop.hive.metastore.HiveMetaStoreClient.open(HiveMetaStoreClient.java:530) open - Connected to metastore. 41008: 2020-06-05 10:47:17-732 INFO [PollingServerListUpdater-1] com.netflix.config.ChainedDynamicProperty com.netflix.config.ChainedDynamicProperty$ChainLink.checkAndFlip(ChainedDynamicProperty.java:115) checkAndFlip - Flipping property: hiveEntrance.ribbon.ActiveConnectionsLimit to use NEXT property: niws.loadbalancer.availabilityFilteringRule.activeConnectionsLimit = 2147483647 41008: 2020-06-05 10:47:17-791 INFO [hiveEngineEngine-Thread-3] com.webank.wedatasphere.linkis.engine.hive.executor.HiveEngineExecutor com.webank.wedatasphere.linkis.engine.hive.executor.HiveEngineExecutor.progress(HiveEngineExecutor.scala:276) progress - hive progress is 0.0 41008: 2020-06-05 10:47:17-794 INFO [hiveEngineEngine-Thread-2] org.apache.hadoop.hive.ql.parse.SemanticAnalyzer org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnalyzer.java:10825) analyzeInternal - Starting Semantic Analysis 41008: 2020-06-05 10:47:17-809 INFO [hiveEngineEngine-Thread-2] org.apache.hadoop.hive.ql.security.authorization.plugin.sqlstd.SQLStdHiveAccessController org.apache.hadoop.hive.ql.security.authorization.plugin.sqlstd.SQLStdHiveAccessController.(SQLStdHiveAccessController.java:95) - Created SQLStdHiveAccessController for session context : HiveAuthzSessionContext [sessionString=ca056791-7702-477c-b15e-a25ad1fa6dcd, clientType=HIVECLI] 41008: 2020-06-05 10:47:17-812 WARN [hiveEngineEngine-Thread-2] org.apache.hadoop.hive.ql.session.SessionState org.apache.hadoop.hive.ql.session.SessionState.setAuthorizerV2Config(SessionState.java:888) setAuthorizerV2Config - METASTORE_FILTER_HOOK will be ignored, since hive.security.authorization.manager is set to instance of HiveAuthorizerFactory. 
41008: 2020-06-05 10:47:17-812 INFO [hiveEngineEngine-Thread-2] hive.metastore org.apache.hadoop.hive.metastore.HiveMetaStoreClient.isCompatibleWith(HiveMetaStoreClient.java:353) isCompatibleWith - Mestastore configuration hive.metastore.filter.hook changed from org.apache.hadoop.hive.metastore.DefaultMetaStoreFilterHookImpl to org.apache.hadoop.hive.ql.security.authorization.plugin.AuthorizationMetaStoreFilterHook 41008: 2020-06-05 10:47:17-814 INFO [hiveEngineEngine-Thread-2] hive.metastore org.apache.hadoop.hive.metastore.HiveMetaStoreClient.close(HiveMetaStoreClient.java:559) close - Closed a connection to metastore, current connections: 1 41008: 2020-06-05 10:47:17-816 INFO [hiveEngineEngine-Thread-2] hive.metastore org.apache.hadoop.hive.metastore.HiveMetaStoreClient.open(HiveMetaStoreClient.java:433) open - Trying to connect to metastore with URI thrift://n1:9083 41008: 2020-06-05 10:47:17-817 INFO [hiveEngineEngine-Thread-2] hive.metastore org.apache.hadoop.hive.metastore.HiveMetaStoreClient.open(HiveMetaStoreClient.java:478) open - Opened a connection to metastore, current connections: 2 41008: 2020-06-05 10:47:17-818 INFO [hiveEngineEngine-Thread-2] hive.metastore org.apache.hadoop.hive.metastore.HiveMetaStoreClient.open(HiveMetaStoreClient.java:530) open - Connected to metastore. 41008: 2020-06-05 10:47:17-830 INFO [hiveEngineEngine-Thread-2] org.apache.hadoop.hive.ql.parse.SemanticAnalyzer org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genResolvedParseTree(SemanticAnalyzer.java:10771) genResolvedParseTree - Completed phase 1 of Semantic Analysis 41008: 2020-06-05 10:47:17-830 INFO [hiveEngineEngine-Thread-2] org.apache.hadoop.hive.ql.parse.SemanticAnalyzer org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.getMetaData(SemanticAnalyzer.java:1877) getMetaData - Get metadata for source tables 41008: 2020-06-05 10:47:17-831 INFO [hiveEngineEngine-Thread-2] hive.metastore org.apache.hadoop.hive.metastore.HiveMetaStoreClient.open(HiveMetaStoreClient.java:433) open - Trying to connect to metastore with URI thrift://n1:9083 41008: 2020-06-05 10:47:17-834 INFO [hiveEngineEngine-Thread-2] hive.metastore org.apache.hadoop.hive.metastore.HiveMetaStoreClient.open(HiveMetaStoreClient.java:478) open - Opened a connection to metastore, current connections: 3 41008: 2020-06-05 10:47:17-836 INFO [hiveEngineEngine-Thread-2] hive.metastore org.apache.hadoop.hive.metastore.HiveMetaStoreClient.open(HiveMetaStoreClient.java:530) open - Connected to metastore. 41008: 2020-06-05 10:47:17-978 INFO [hiveEngineEngine-Thread-2] org.apache.hadoop.hive.ql.parse.SemanticAnalyzer org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.getMetaData(SemanticAnalyzer.java:2025) getMetaData - Get metadata for subqueries 41008: 2020-06-05 10:47:17-983 INFO [hiveEngineEngine-Thread-2] org.apache.hadoop.hive.ql.parse.SemanticAnalyzer org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.getMetaData(SemanticAnalyzer.java:2049) getMetaData - Get metadata for destination tables 41008: 2020-06-05 10:47:17,989 ERROR (hiveEngineEngine-Thread-2) ERROR KeyProviderCache - Could not find uri with key [dfs.encryption.key.provider.uri] to create a keyProvider !! 
41008: 2020-06-05 10:47:18-029 INFO [hiveEngineEngine-Thread-2] hive.ql.Context org.apache.hadoop.hive.ql.Context.getMRScratchDir(Context.java:342) getMRScratchDir - New scratch dir is hdfs://n1:8020/tmp/hive/hadoop/ca056791-7702-477c-b15e-a25ad1fa6dcd/hive_2020-06-05_10-47-17_000_3385156849931869068-1 41008: 2020-06-05 10:47:18-032 INFO [hiveEngineEngine-Thread-2] org.apache.hadoop.hive.ql.parse.SemanticAnalyzer org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genResolvedParseTree(SemanticAnalyzer.java:10776) genResolvedParseTree - Completed getting MetaData in Semantic Analysis 41008: 2020-06-05 10:47:18-139 INFO [hiveEngineEngine-Thread-2] org.apache.hadoop.hive.common.FileUtils org.apache.hadoop.hive.common.FileUtils.mkdir(FileUtils.java:521) mkdir - Creating directory if it doesn't exist: hdfs://n1:8020/tmp/hive/hadoop/ca056791-7702-477c-b15e-a25ad1fa6dcd/hive_2020-06-05_10-47-17_000_3385156849931869068-1/-mr-10000/.hive-staging_hive_2020-06-05_10-47-17_000_3385156849931869068-1 41008: 2020-06-05 10:47:18-289 INFO [hiveEngineEngine-Thread-2] org.apache.hadoop.hive.ql.ppd.OpProcFactory org.apache.hadoop.hive.ql.ppd.OpProcFactory$DefaultPPD.process(OpProcFactory.java:741) process - Processing for FS(3) 41008: 2020-06-05 10:47:18-289 INFO [hiveEngineEngine-Thread-2] org.apache.hadoop.hive.ql.ppd.OpProcFactory org.apache.hadoop.hive.ql.ppd.OpProcFactory$DefaultPPD.process(OpProcFactory.java:741) process - Processing for SEL(1) 41008: 2020-06-05 10:47:18-290 INFO [hiveEngineEngine-Thread-2] org.apache.hadoop.hive.ql.ppd.OpProcFactory org.apache.hadoop.hive.ql.ppd.OpProcFactory$TableScanPPD.process(OpProcFactory.java:415) process - Processing for TS(0) 41008: 2020-06-05 10:47:18-332 INFO [hiveEngineEngine-Thread-2] org.apache.hadoop.hive.ql.parse.SemanticAnalyzer org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnalyzer.java:10950) analyzeInternal - Completed plan generation 41008: 2020-06-05 10:47:18-332 INFO [hiveEngineEngine-Thread-2] org.apache.hadoop.hive.ql.Driver org.apache.hadoop.hive.ql.Driver.compile(Driver.java:483) compile - Semantic Analysis Completed 41008: 2020-06-05 10:47:18-334 INFO [hiveEngineEngine-Thread-2] org.apache.hadoop.hive.ql.Driver org.apache.hadoop.hive.ql.Driver.getSchema(Driver.java:275) getSchema - Returning Hive schema: Schema(fieldSchemas:[FieldSchema(name:test11.add5, type:int, comment:null), FieldSchema(name:test11.add6, type:decimal(10,2), comment:null)], properties:null) 41008: 2020-06-05 10:47:18-357 INFO [hiveEngineEngine-Thread-2] org.apache.hadoop.hive.ql.exec.TableScanOperator org.apache.hadoop.hive.ql.exec.Operator.initialize(Operator.java:326) initialize - Initializing operator TS[0] 41008: 2020-06-05 10:47:18-358 INFO [hiveEngineEngine-Thread-2] org.apache.hadoop.hive.ql.exec.SelectOperator org.apache.hadoop.hive.ql.exec.Operator.initialize(Operator.java:326) initialize - Initializing operator SEL[1] 41008: 2020-06-05 10:47:18-363 INFO [hiveEngineEngine-Thread-2] org.apache.hadoop.hive.ql.exec.SelectOperator org.apache.hadoop.hive.ql.exec.SelectOperator.initializeOp(SelectOperator.java:73) initializeOp - SELECT struct<add5:int,add6:decimal(10,2)> 41008: 2020-06-05 10:47:18-364 INFO [hiveEngineEngine-Thread-2] org.apache.hadoop.hive.ql.exec.LimitOperator org.apache.hadoop.hive.ql.exec.Operator.initialize(Operator.java:326) initialize - Initializing operator LIM[2] 41008: 2020-06-05 10:47:18,364 INFO (hiveEngineEngine-Thread-2) INFO deprecation - mapred.task.is.map is deprecated. 
Instead, use mapreduce.task.ismap 41008: 2020-06-05 10:47:18-364 INFO [hiveEngineEngine-Thread-2] org.apache.hadoop.hive.ql.exec.ListSinkOperator org.apache.hadoop.hive.ql.exec.Operator.initialize(Operator.java:326) initialize - Initializing operator LIST_SINK[4] 41008: 2020-06-05 10:47:18-377 INFO [hiveEngineEngine-Thread-2] org.apache.hadoop.hive.ql.Driver org.apache.hadoop.hive.ql.Driver.compile(Driver.java:591) compile - Completed compiling command(queryId=hadoop_20200605104716_f032415c-a7f0-41e6-bfbf-5d36e73d84c4); Time taken: 1.42 seconds 41008: 2020-06-05 10:47:18-378 INFO [hiveEngineEngine-Thread-2] org.apache.hadoop.hive.ql.lockmgr.DummyTxnManager org.apache.hadoop.hive.ql.lockmgr.DummyTxnManager.getLockManager(DummyTxnManager.java:82) getLockManager - Creating lock manager of type org.apache.hadoop.hive.ql.lockmgr.zookeeper.ZooKeeperHiveLockManager 41008: 2020-06-05 10:47:18-454 INFO [hiveEngineEngine-Thread-2] org.apache.curator.utils.Compatibility org.apache.curator.utils.Compatibility.(Compatibility.java:41) - Running in ZooKeeper 3.4.x compatibility mode 41008: 2020-06-05 10:47:18-489 INFO [hiveEngineEngine-Thread-2] org.apache.curator.framework.imps.CuratorFrameworkImpl org.apache.curator.framework.imps.CuratorFrameworkImpl.start(CuratorFrameworkImpl.java:290) start - Starting 41008: 2020-06-05 10:47:18-503 INFO [hiveEngineEngine-Thread-2] org.apache.zookeeper.ZooKeeper org.apache.zookeeper.Environment.logEnv(Environment.java:100) logEnv - Client environment:zookeeper.version=3.4.9-1757313, built on 08/23/2016 06:50 GMT 41008: 2020-06-05 10:47:18-504 INFO [hiveEngineEngine-Thread-2] org.apache.zookeeper.ZooKeeper org.apache.zookeeper.Environment.logEnv(Environment.java:100) logEnv - Client environment:host.name=n4 41008: 2020-06-05 10:47:18-504 INFO [hiveEngineEngine-Thread-2] org.apache.zookeeper.ZooKeeper org.apache.zookeeper.Environment.logEnv(Environment.java:100) logEnv - Client environment:java.version=1.8.0_181 41008: 2020-06-05 10:47:18-504 INFO [hiveEngineEngine-Thread-2] org.apache.zookeeper.ZooKeeper org.apache.zookeeper.Environment.logEnv(Environment.java:100) logEnv - Client environment:java.vendor=Oracle Corporation 41008: 2020-06-05 10:47:18-504 INFO [hiveEngineEngine-Thread-2] org.apache.zookeeper.ZooKeeper org.apache.zookeeper.Environment.logEnv(Environment.java:100) logEnv - Client environment:java.home=/usr/java/jdk1.8.0_181-cloudera/jre 41008: 2020-06-05 10:47:18-504 INFO [hiveEngineEngine-Thread-2] org.apache.zookeeper.ZooKeeper org.apache.zookeeper.Environment.logEnv(Environment.java:100) logEnv - Client 
environment:java.class.path=/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/linkis-ujes-enginemanager-0.9.3.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/conf:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/commons-exec-1.3.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/linkis-ujes-enginemanager-0.9.3.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/linkis-udf-0.9.3.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/linkis-resourcemanager-client-0.9.3.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/linkis-resourcemanager-common-0.9.3.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/linkis-protocol-0.9.3.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/scala-library-2.11.8.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/scala-reflect-2.11.8.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/scala-xml_2.11-1.0.2.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/guava-14.0.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/paranamer-2.8.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/slf4j-api-1.7.12.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/javassist-3.19.0-GA.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/reflections-0.9.10.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/annotations-2.0.1.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/json4s-jackson_2.11-3.5.3.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/json4s-core_2.11-3.5.3.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/json4s-ast_2.11-3.5.3.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/json4s-scalap_2.11-3.5.3.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/hive-exec-2.1.1.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/linkis-ujes-engine-0.9.3.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/linkis-scheduler-0.9.3.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/py4j-0.10.4.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/scalatest_2.11-2.2.6.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/linkis-bml-hook-0.9.3.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/linkis-bmlclient-0.9.3.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/linkis-bmlcommon-0.9.3.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/linkis-gateway-httpclient-support-0.9.3.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/linkis-httpclient-0.9.3.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/dispatch-core_2.11-0.11.2.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/async-http-client-1.8.10.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/netty-3.10.5.Final.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/dispatch-json4s-jackson_2.11-0.11.2.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/commons-beanutils-1.9.3.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/commons-logging-1.1.3.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/accessors-smart-1.2.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/activation-1.1.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-engine
manager/lib/antlr-2.7.7.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/antlr-runtime-3.4.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/aopalliance-1.0.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/aopalliance-repackaged-2.4.0-b31.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/apache-el-8.5.24.2.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/archaius-core-0.7.6.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/asm-3.3.1.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/asm-5.0.4.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/asm-all-repackaged-2.4.0-b34.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/asm-analysis-6.2.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/asm-commons-6.2.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/asm-tree-6.2.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/aspectjweaver-1.8.13.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/bcpkix-jdk15on-1.56.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/bcprov-jdk15on-1.56.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/bean-validator-2.4.0-b34.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/cglib-2.2.2.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/class-model-2.4.0-b34.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/commons-cli-1.2.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/commons-codec-1.10.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/commons-collections-3.2.2.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/commons-collections4-4.1.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/commons-compress-1.4.1.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/commons-configuration-1.8.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/commons-configuration2-2.1.1.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/commons-daemon-1.0.13.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/commons-dbcp-1.4.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/commons-io-2.4.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/commons-jxpath-1.3.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/commons-lang-2.6.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/commons-lang3-3.4.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/commons-logging-1.2.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/commons-math-2.2.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/commons-math3-3.1.1.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/commons-net-3.1.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/commons-pool-1.5.4.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/compactmap-1.2.1.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/config-types-2.4.0-b34.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/hbase-client-2.1.0.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/curator-framework-4.0.1.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/curator-recipes-4.0.1.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanage
r/lib/curvesapi-1.04.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/dexx-collections-0.2.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/eureka-client-1.9.2.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/eureka-core-1.9.2.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/feign-core-9.5.1.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/feign-hystrix-9.5.1.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/feign-java8-9.5.1.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/feign-slf4j-9.5.1.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/gson-2.8.5.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/guice-4.1.0.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/libfb303-0.9.2.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/libthrift-0.9.2.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/htrace-core-3.1.0-incubating.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/hive-common-2.1.1.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/HdrHistogram-2.1.10.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/hk2-2.4.0-b34.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/hk2-api-2.4.0-b31.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/hk2-config-2.4.0-b34.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/hk2-core-2.4.0-b34.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/hk2-locator-2.4.0-b31.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/hk2-runlevel-2.4.0-b34.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/hk2-utils-2.4.0-b34.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/hbase-shaded-miscellaneous-2.1.0.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/httpclient-4.5.4.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/httpcore-4.4.7.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/hystrix-core-1.5.12.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/icu4j-4.6.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/poi-3.17.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/jackson-annotations-2.9.6.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/re2j-1.1.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/jackson-core-2.9.6.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/jackson-core-asl-1.9.2.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/jackson-databind-2.9.6.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/jackson-datatype-jdk8-2.9.6.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/jackson-datatype-jsr310-2.9.6.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/jackson-jaxrs-1.9.2.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/jackson-jaxrs-base-2.3.2.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/jackson-jaxrs-json-provider-2.3.2.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/jackson-mapper-asl-1.9.2.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/jackson-module-jaxb-annotations-2.3.2.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/jackson-module-parameter-names-2.9.6.jar:/opt/soft/dss_linki
s/linkis/linkis-ujes-hive-enginemanager/lib/jackson-module-paranamer-2.9.5.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/jackson-module-scala_2.11-2.9.5.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/jackson-xc-1.9.2.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/javacsv-2.0.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/java-cup-10k.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/javax.annotation-api-1.3.2.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/javax.inject-1.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/javax.inject-2.4.0-b34.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/javax.servlet-api-3.1.0.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/javax.websocket-api-1.0.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/javax-websocket-client-impl-9.4.11.v20180605.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/javax-websocket-server-impl-9.4.11.v20180605.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/javax.ws.rs-api-2.0.1.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/jaxb-api-2.2.2.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/jaxb-impl-2.2.3-1.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/jaxrs-ri-2.21.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/jcip-annotations-1.0-1.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/jersey-apache-client4-1.19.1.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/jersey-client-1.19.1.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/jersey-client-2.21.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/jersey-common-2.21.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/jersey-container-servlet-2.23.1.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/jersey-container-servlet-core-2.23.1.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/jersey-core-1.19.1.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/jersey-entity-filtering-2.16.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/jersey-guava-2.21.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/jersey-json-1.19.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/jersey-media-jaxb-2.21.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/jersey-media-json-jackson-2.16.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/jersey-media-multipart-2.16.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/jersey-server-1.19.1.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/jersey-server-2.21.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/jersey-servlet-1.19.1.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/jersey-spring3-2.23.1.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/jettison-1.1.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/jetty-annotations-9.4.11.v20180605.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/jetty-client-9.4.11.v20180605.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/jetty-continuation-9.4.11.v20180605.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/jetty-http-9.4.11.v20180605.jar:/opt/soft/dss
_linkis/linkis/linkis-ujes-hive-enginemanager/lib/jetty-io-9.4.11.v20180605.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/jetty-plus-9.4.11.v20180605.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/jetty-security-9.4.11.v20180605.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/jetty-server-9.4.11.v20180605.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/jetty-servlet-9.4.11.v20180605.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/jetty-servlets-9.4.11.v20180605.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/jetty-util-9.4.11.v20180605.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/jetty-webapp-9.4.11.v20180605.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/jetty-xml-9.4.11.v20180605.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/jline-0.9.94.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/joda-time-2.3.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/json-smart-2.3.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/jsp-api-2.1.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/jsr305-3.0.1.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/jul-to-slf4j-1.7.25.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/kerb-admin-1.0.1.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/kerb-client-1.0.1.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/kerb-common-1.0.1.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/kerb-core-1.0.1.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/kerb-crypto-1.0.1.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/kerb-identity-1.0.1.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/kerb-server-1.0.1.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/kerb-simplekdc-1.0.1.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/kerb-util-1.0.1.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/kerby-asn1-1.0.1.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/kerby-config-1.0.1.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/kerby-pkix-1.0.1.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/kerby-util-1.0.1.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/kerby-xdr-1.0.1.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/LatencyUtils-2.0.3.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/leveldbjni-all-1.8.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/linkis-cloudRPC-0.9.3.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/linkis-common-0.9.3.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/linkis-hadoop-common-0.9.3.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/linkis-module-0.9.3.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/linkis-storage-0.9.3.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/log4j-1.2.17.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/log4j-api-2.10.0.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/log4j-core-2.10.0.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/log4j-jul-2.10.0.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/log4j-slf4j-i
mpl-2.10.0.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/micrometer-core-1.0.5.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/mimepull-1.9.3.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/mysql-connector-java-5.1.34.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/netflix-commons-util-0.3.0.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/netflix-eventbus-0.3.0.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/netflix-infix-0.3.0.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/netflix-statistics-0.1.1.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/netty-all-4.1.17.Final.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/nimbus-jose-jwt-4.41.1.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/ooxml-schemas-1.3.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/org.eclipse.wst.xml.xpath2.processor-2.1.100.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/osgi-resource-locator-1.0.1.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/poi-ooxml-3.17.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/poi-ooxml-schemas-3.17.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/protobuf-java-2.5.0.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/ribbon-2.2.5.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/ribbon-core-2.2.5.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/ribbon-eureka-2.2.5.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/ribbon-httpclient-2.2.5.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/ribbon-loadbalancer-2.2.5.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/ribbon-transport-2.2.5.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/rxjava-1.2.0.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/rxnetty-0.4.9.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/rxnetty-contexts-0.4.9.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/rxnetty-servo-0.4.9.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/scala-compiler-2.11.8.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/scalap-2.11.8.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/scala-parser-combinators_2.11-1.0.4.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/scala-xml_2.11-1.0.4.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/servo-core-0.12.21.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/snakeyaml-1.19.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/spring-aop-5.0.7.RELEASE.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/spring-beans-5.0.7.RELEASE.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/spring-boot-2.0.3.RELEASE.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/spring-boot-actuator-2.0.3.RELEASE.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/spring-boot-actuator-autoconfigure-2.0.3.RELEASE.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/spring-boot-autoconfigure-2.0.3.RELEASE.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/spring-boot-starter-2.0.3.RELEASE.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/spring-boot-starter
-actuator-2.0.3.RELEASE.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/spring-boot-starter-aop-2.0.3.RELEASE.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/spring-boot-starter-jetty-2.0.3.RELEASE.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/spring-boot-starter-json-2.0.3.RELEASE.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/spring-boot-starter-log4j2-2.0.3.RELEASE.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/spring-boot-starter-web-2.0.3.RELEASE.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/spring-bridge-2.4.0-b34.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/spring-cloud-commons-2.0.0.RELEASE.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/spring-cloud-config-client-2.0.0.RELEASE.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/spring-cloud-context-2.0.0.RELEASE.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/spring-cloud-netflix-archaius-2.0.0.RELEASE.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/spring-cloud-netflix-core-2.0.0.RELEASE.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/spring-cloud-netflix-eureka-client-2.0.0.RELEASE.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/spring-cloud-netflix-ribbon-2.0.0.RELEASE.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/spring-cloud-openfeign-core-2.0.0.RELEASE.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/spring-cloud-starter-2.0.0.RELEASE.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/spring-cloud-starter-config-2.0.0.RELEASE.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/spring-cloud-starter-eureka-1.4.4.RELEASE.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/spring-cloud-starter-feign-1.4.4.RELEASE.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/spring-cloud-starter-netflix-archaius-2.0.0.RELEASE.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/spring-cloud-starter-netflix-eureka-client-2.0.0.RELEASE.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/spring-cloud-starter-netflix-ribbon-2.0.0.RELEASE.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/spring-cloud-starter-openfeign-2.0.0.RELEASE.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/spring-context-5.0.7.RELEASE.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/spring-core-5.0.7.RELEASE.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/spring-expression-5.0.7.RELEASE.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/spring-jcl-5.0.7.RELEASE.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/spring-security-crypto-5.0.6.RELEASE.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/spring-security-rsa-1.0.5.RELEASE.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/spring-web-5.0.7.RELEASE.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/spring-webmvc-5.0.7.RELEASE.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/stax2-api-3.1.4.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/stax-api-1.0.1.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/stax-api-1.0-2.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/stringtemplate-3.2.1.jar:/opt/soft/dss_linkis/linkis/linkis
-ujes-hive-enginemanager/lib/tiger-types-1.4.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/token-provider-1.0.1.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/validation-api-1.1.0.Final.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/websocket-api-9.4.11.v20180605.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/websocket-client-9.4.11.v20180605.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/websocket-common-9.4.11.v20180605.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/websocket-server-9.4.11.v20180605.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/websocket-servlet-9.4.11.v20180605.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/woodstox-core-5.0.3.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/woodstox-core-asl-4.4.1.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/xerces2-xsd11-2.11.1.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/xlsx-streamer-1.2.1.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/xml-apis-1.4.01.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/xmlbeans-2.3.0.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/xmlpull-1.1.3.1.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/xml-resolver-1.2.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/xpp3_min-1.1.4c.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/xstream-1.4.10.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/xz-1.0.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/zookeeper-3.4.9.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/hive-shims-0.23-2.1.1.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/linkis-hive-engine-0.9.3.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/linkis-hive-engineManager-0.9.3.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/hive-serde-2.1.1.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/hive-service-rpc-2.1.1.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/hive-shims-2.1.1.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/hbase-annotations-1.1.1.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/hive-llap-client-2.1.1.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/hive-llap-server-2.1.1.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/hive-orc-2.1.1.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/eigenbase-properties-1.1.5.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/pentaho-aggdesigner-algorithm-5.1.5-jhyde.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/hbase-common-2.0.0.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/hbase-mapreduce-2.0.0.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/curator-client-4.0.1.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/hbase-shaded-protobuf-2.1.0.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/hbase-protocol-2.1.0.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/hbase-protocol-shaded-2.1.0.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/hbase-shaded-netty-2.1.0.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/hive-shims-common-2.1.1.
jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/hive-shims-scheduler-2.1.1.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/hive-service-2.1.1.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/hive-metastore-2.1.1.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/hive-hbase-handler.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/hbase-server-2.1.0.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/hive-storage-api-2.1.1.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/htrace-core4-4.1.0-incubating.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/hadoop-annotations-2.7.2.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/hadoop-auth-2.7.2.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/hadoop-client-2.7.2.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/hadoop-common-2.7.2.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/hadoop-hdfs-2.7.2.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/hadoop-mapreduce-client-app-2.7.2.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/hadoop-mapreduce-client-common-2.7.2.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/hadoop-mapreduce-client-core-2.7.2.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/hadoop-mapreduce-client-jobclient-2.7.2.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/hadoop-mapreduce-client-shuffle-2.7.2.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/hadoop-yarn-api-2.7.2.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/hadoop-yarn-client-2.7.2.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/hadoop-yarn-common-2.7.2.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/hadoop-yarn-server-common-2.7.2.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/hive-hbase-handler-2.1.1.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/hive-llap-common-2.1.1.jar:/opt/soft/dss_linkis/linkis/linkis-ujes-hive-enginemanager/lib/hive-llap-tez-2.1.1.jar:/appcom/commonlib/webank_bdp_udf.jar:/etc/hadoop/conf:/etc/hbase/conf:/appcom/config/spark-config:/etc/hive/conf: 41008: 2020-06-05 10:47:18-507 INFO [hiveEngineEngine-Thread-2] org.apache.zookeeper.ZooKeeper org.apache.zookeeper.Environment.logEnv(Environment.java:100) logEnv - Client environment:java.library.path=/appcom/Install/hadoop/lib/native 41008: 2020-06-05 10:47:18-510 INFO [hiveEngineEngine-Thread-2] org.apache.zookeeper.ZooKeeper org.apache.zookeeper.Environment.logEnv(Environment.java:100) logEnv - Client environment:java.io.tmpdir=/tmp 41008: 2020-06-05 10:47:18-510 INFO [hiveEngineEngine-Thread-2] org.apache.zookeeper.ZooKeeper org.apache.zookeeper.Environment.logEnv(Environment.java:100) logEnv - Client environment:java.compiler= 41008: 2020-06-05 10:47:18-511 INFO [hiveEngineEngine-Thread-2] org.apache.zookeeper.ZooKeeper org.apache.zookeeper.Environment.logEnv(Environment.java:100) logEnv - Client environment:os.name=Linux 41008: 2020-06-05 10:47:18-511 INFO [hiveEngineEngine-Thread-2] org.apache.zookeeper.ZooKeeper org.apache.zookeeper.Environment.logEnv(Environment.java:100) logEnv - Client environment:os.arch=amd64 41008: 2020-06-05 10:47:18-511 INFO [hiveEngineEngine-Thread-2] org.apache.zookeeper.ZooKeeper org.apache.zookeeper.Environment.logEnv(Environment.java:100) logEnv - Client 
environment:os.version=3.10.0-327.el7.x86_64 41008: 2020-06-05 10:47:18-511 INFO [hiveEngineEngine-Thread-2] org.apache.zookeeper.ZooKeeper org.apache.zookeeper.Environment.logEnv(Environment.java:100) logEnv - Client environment:user.name=hadoop 41008: 2020-06-05 10:47:18-511 INFO [hiveEngineEngine-Thread-2] org.apache.zookeeper.ZooKeeper org.apache.zookeeper.Environment.logEnv(Environment.java:100) logEnv - Client environment:user.home=/home/hadoop 41008: 2020-06-05 10:47:18-512 INFO [hiveEngineEngine-Thread-2] org.apache.zookeeper.ZooKeeper org.apache.zookeeper.Environment.logEnv(Environment.java:100) logEnv - Client environment:user.dir=/home/hadoop 41008: 2020-06-05 10:47:18-513 INFO [hiveEngineEngine-Thread-2] org.apache.zookeeper.ZooKeeper org.apache.zookeeper.ZooKeeper.(ZooKeeper.java:438) - Initiating client connection, connectString=n1:2181 sessionTimeout=1200000 watcher=org.apache.curator.ConnectionState@6ac062c5 41008: 2020-06-05 10:47:18-548 INFO [hiveEngineEngine-Thread-2-SendThread(n1:2181)] org.apache.zookeeper.ClientCnxn org.apache.zookeeper.ClientCnxn$SendThread.logStartConnect(ClientCnxn.java:1032) logStartConnect - Opening socket connection to server n1/172.18.50.193:2181. Will not attempt to authenticate using SASL (unknown error) 41008: 2020-06-05 10:47:18-549 INFO [hiveEngineEngine-Thread-2-SendThread(n1:2181)] org.apache.zookeeper.ClientCnxn org.apache.zookeeper.ClientCnxn$SendThread.primeConnection(ClientCnxn.java:876) primeConnection - Socket connection established to n1/172.18.50.193:2181, initiating session 41008: 2020-06-05 10:47:18-550 INFO [hiveEngineEngine-Thread-2] org.apache.curator.framework.imps.CuratorFrameworkImpl org.apache.curator.framework.imps.CuratorFrameworkImpl.start(CuratorFrameworkImpl.java:332) start - Default schema 41008: 2020-06-05 10:47:18-563 INFO [hiveEngineEngine-Thread-2-SendThread(n1:2181)] org.apache.zookeeper.ClientCnxn org.apache.zookeeper.ClientCnxn$SendThread.onConnected(ClientCnxn.java:1299) onConnected - Session establishment complete on server n1/172.18.50.193:2181, sessionid = 0x172821f79440227, negotiated timeout = 60000 41008: 2020-06-05 10:47:18-577 INFO [hiveEngineEngine-Thread-2-EventThread] org.apache.curator.framework.state.ConnectionStateManager org.apache.curator.framework.state.ConnectionStateManager.postState(ConnectionStateManager.java:237) postState - State change: CONNECTED 41008: 2020-06-05 10:47:18-641 INFO [hiveEngineEngine-Thread-2] org.apache.hadoop.hive.ql.Driver org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1658) execute - Executing command(queryId=hadoop_20200605104716_f032415c-a7f0-41e6-bfbf-5d36e73d84c4): select from test11 limit 5000 41008: 2020-06-05 10:47:18-644 INFO [hiveEngineEngine-Thread-2] org.apache.hadoop.hive.ql.Driver org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1940) execute - Completed executing command(queryId=hadoop_20200605104716_f032415c-a7f0-41e6-bfbf-5d36e73d84c4); Time taken: 0.003 seconds 41008: OK 41008: 2020-06-05 10:47:18-644 INFO [hiveEngineEngine-Thread-2] org.apache.hadoop.hive.ql.Driver org.apache.hadoop.hive.ql.session.SessionState$LogHelper.printInfo(SessionState.java:1088) printInfo - OK 41008: 2020-06-05 10:47:18-690 INFO [hiveEngineEngine-Thread-2] com.webank.wedatasphere.linkis.engine.hive.executor.HiveEngineExecutor com.webank.wedatasphere.linkis.engine.hive.executor.HiveEngineExecutor$$anon$1.run(HiveEngineExecutor.scala:145) run - n4:41008 >> Time taken: 1.8 s, begin to fetch results. 
41008: 2020-06-05 10:47:18-792 INFO [hiveEngineEngine-Thread-3] com.webank.wedatasphere.linkis.engine.hive.executor.HiveEngineExecutor com.webank.wedatasphere.linkis.engine.hive.executor.HiveEngineExecutor.progress(HiveEngineExecutor.scala:276) progress - hive progress is 0.0 41008: 2020-06-05 10:47:18-880 INFO [hiveEngineEngine-Thread-2] com.webank.wedatasphere.linkis.storage.utils.FileSystemUtils$ com.webank.wedatasphere.linkis.common.utils.Logging$class.info(Logging.scala:42) info - doesn't need to call setOwner 41008: 2020-06-05 10:47:18-894 INFO [hiveEngineEngine-Thread-2] com.webank.wedatasphere.linkis.storage.utils.FileSystemUtils$ com.webank.wedatasphere.linkis.common.utils.Logging$class.info(Logging.scala:42) info - doesn't need to call setOwner 41008: 2020-06-05 10:47:18-903 INFO [hiveEngineEngine-Thread-2] com.webank.wedatasphere.linkis.storage.utils.FileSystemUtils$ com.webank.wedatasphere.linkis.common.utils.Logging$class.info(Logging.scala:42) info - doesn't need to call setOwner 41008: 2020-06-05 10:47:18-963 INFO [hiveEngineEngine-Thread-2] com.webank.wedatasphere.linkis.storage.utils.FileSystemUtils$ com.webank.wedatasphere.linkis.common.utils.Logging$class.info(Logging.scala:42) info - doesn't need to call setOwner 41008: 2020-06-05 10:47:18-982 INFO [hiveEngineEngine-Thread-2] com.webank.wedatasphere.linkis.storage.resultset.StorageResultSetWriter com.webank.wedatasphere.linkis.common.utils.Logging$class.info(Logging.scala:42) info - Succeed to create a new file:FsPath{path=/tmp/linkis/hadoop/dwc/20200605/IDE/421/_0.dolphin; isDirectory=false; length=0; modification_time=0; access_time=0; owner=null; group=null; permission=null} 41008: 2020-06-05 10:47:19-152 INFO [hiveEngineEngine-Thread-2] com.webank.wedatasphere.linkis.engine.hive.executor.HiveEngineExecutor com.webank.wedatasphere.linkis.engine.hive.executor.HiveEngineExecutor$$anon$1.run(HiveEngineExecutor.scala:200) run - n4:41008 >> Fetched 2 col(s) : 0 row(s) in hive 41008: 2020-06-05 10:47:19-178 INFO [hiveEngineEngine-Thread-2] com.webank.wedatasphere.linkis.engine.hive.executor.HiveEngineExecutor com.webank.wedatasphere.linkis.common.utils.Logging$class.info(Logging.scala:42) info - com.webank.wedatasphere.linkis.engine.hive.executor.HiveEngineExecutor@2ed9e59b change state Busy => Idle. 41008: 2020-06-05 10:47:19-178 INFO [hiveEngineEngine-Thread-2] com.webank.wedatasphere.linkis.engine.EngineReceiver com.webank.wedatasphere.linkis.common.utils.Logging$class.info(Logging.scala:42) info - broadcast the state of UserWithCreator(hadoop,IDE) from Busy to Idle. 41008: 2020-06-05 10:47:19-188 INFO [hiveEngineEngine-Thread-2] com.webank.wedatasphere.linkis.engine.execute.CommonEngineJob com.webank.wedatasphere.linkis.common.utils.Logging$class.info(Logging.scala:42) info - hiveEngineEngine_0 change state Running => Succeed. 
41008: 2020-06-05 10:47:19-193 INFO [hiveEngineEngine-Thread-2] com.webank.wedatasphere.linkis.engine.log.LogHelper$ com.webank.wedatasphere.linkis.engine.log.LogHelper$.pushAllRemainLogs(LogHelper.scala:39) pushAllRemainLogs - start to push all remain logs, and size is 0 41008: 2020-06-05 10:47:19-224 INFO [hiveEngineEngine-Thread-2] com.webank.wedatasphere.linkis.engine.log.LogHelper$ com.webank.wedatasphere.linkis.engine.log.LogHelper$.pushAllRemainLogs(LogHelper.scala:58) pushAllRemainLogs - end to push all remain logs 41008: 2020-06-05 10:47:33-469 INFO [qtp1867326100-38] com.webank.wedatasphere.linkis.engine.EngineReceiver com.webank.wedatasphere.linkis.common.utils.Logging$class.info(Logging.scala:42) info - locked a lock 0 for instance n4:41008. 41008: 2020-06-05 10:47:33-521 INFO [qtp1867326100-42] com.webank.wedatasphere.linkis.engine.EngineReceiver com.webank.wedatasphere.linkis.common.utils.Logging$class.info(Logging.scala:42) info - received a new request select from dwd.dwd_o_prjshuijin limit 100 41008: 2020-06-05 10:47:33-522 INFO [hiveEngineEngineConsumerThread] com.webank.wedatasphere.linkis.engine.execute.CommonEngineJob com.webank.wedatasphere.linkis.common.utils.Logging$class.info(Logging.scala:42) info - hiveEngineEngine_1 change state Inited => Scheduled. 41008: 2020-06-05 10:47:33-528 INFO [hiveEngineEngine-Thread-2] com.webank.wedatasphere.linkis.engine.hive.executor.HiveEngineExecutor com.webank.wedatasphere.linkis.engine.hive.executor.HiveEngineExecutor.progress(HiveEngineExecutor.scala:276) progress - hive progress is -0.0 41008: 2020-06-05 10:47:33-528 INFO [hiveEngineEngine-Thread-3] com.webank.wedatasphere.linkis.engine.execute.CommonEngineJob com.webank.wedatasphere.linkis.common.utils.Logging$class.info(Logging.scala:42) info - hiveEngineEngine_1 change state Scheduled => Running. 41008: 2020-06-05 10:47:33-542 INFO [hiveEngineEngine-Thread-3] com.webank.wedatasphere.linkis.engine.hive.executor.HiveEngineExecutor com.webank.wedatasphere.linkis.common.utils.Logging$class.info(Logging.scala:42) info - com.webank.wedatasphere.linkis.engine.hive.executor.HiveEngineExecutor@2ed9e59b change state Idle => Busy. 41008: 2020-06-05 10:47:33-542 INFO [hiveEngineEngine-Thread-3] com.webank.wedatasphere.linkis.engine.EngineReceiver com.webank.wedatasphere.linkis.common.utils.Logging$class.info(Logging.scala:42) info - broadcast the state of UserWithCreator(hadoop,IDE) from Idle to Busy. 
41008: 2020-06-05 10:47:33-551 INFO [hiveEngineEngine-Thread-3] com.webank.wedatasphere.linkis.engine.hive.executor.HiveEngineExecutor com.webank.wedatasphere.linkis.engine.execute.EngineExecutor$$anonfun$execute$1$$anonfun$apply$1$$anonfun$apply$mcV$sp$1.apply(EngineExecutor.scala:122) apply - BmlEnginePreExecuteHook begins to do a hook 41008: 2020-06-05 10:47:33-552 INFO [hiveEngineEngine-Thread-3] com.webank.wedatasphere.linkis.engine.hive.executor.HiveEngineExecutor com.webank.wedatasphere.linkis.engine.execute.EngineExecutor$$anonfun$execute$1$$anonfun$apply$1$$anonfun$apply$mcV$sp$1.apply(EngineExecutor.scala:124) apply - BmlEnginePreExecuteHook ends to do a hook 41008: 2020-06-05 10:47:33-552 INFO [hiveEngineEngine-Thread-3] com.webank.wedatasphere.linkis.engine.hive.executor.HiveEngineExecutor com.webank.wedatasphere.linkis.engine.hive.executor.HiveEngineExecutor.executeLine(HiveEngineExecutor.scala:108) executeLine - hive client begins to run hql code: 41008: select from dwd.dwd_o_prjshuijin limit 100 41008: 2020-06-05 10:47:33-586 INFO [hiveEngineEngine-Thread-3] org.apache.hadoop.hive.ql.Driver org.apache.hadoop.hive.ql.Driver.compile(Driver.java:408) compile - Compiling command(queryId=hadoop_20200605104733_fe8f3d35-b78c-4ad9-ac99-8fdf3ad6281e): select from dwd.dwd_o_prjshuijin limit 100 41008: 2020-06-05 10:47:33-588 INFO [hiveEngineEngine-Thread-3] hive.metastore org.apache.hadoop.hive.metastore.HiveMetaStoreClient.open(HiveMetaStoreClient.java:433) open - Trying to connect to metastore with URI thrift://n1:9083 41008: 2020-06-05 10:47:33-589 INFO [hiveEngineEngine-Thread-3] hive.metastore org.apache.hadoop.hive.metastore.HiveMetaStoreClient.open(HiveMetaStoreClient.java:478) open - Opened a connection to metastore, current connections: 4 41008: 2020-06-05 10:47:33-589 INFO [hiveEngineEngine-Thread-3] hive.metastore org.apache.hadoop.hive.metastore.HiveMetaStoreClient.open(HiveMetaStoreClient.java:530) open - Connected to metastore. 
41008: 2020-06-05 10:47:33-692 INFO [hiveEngineEngine-Thread-3] org.apache.hadoop.hive.ql.parse.SemanticAnalyzer org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnalyzer.java:10825) analyzeInternal - Starting Semantic Analysis 41008: 2020-06-05 10:47:33-692 INFO [hiveEngineEngine-Thread-3] org.apache.hadoop.hive.ql.parse.SemanticAnalyzer org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genResolvedParseTree(SemanticAnalyzer.java:10771) genResolvedParseTree - Completed phase 1 of Semantic Analysis 41008: 2020-06-05 10:47:33-692 INFO [hiveEngineEngine-Thread-3] org.apache.hadoop.hive.ql.parse.SemanticAnalyzer org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.getMetaData(SemanticAnalyzer.java:1877) getMetaData - Get metadata for source tables 41008: 2020-06-05 10:47:33-752 INFO [hiveEngineEngine-Thread-3] org.apache.hadoop.hive.ql.parse.SemanticAnalyzer org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.getMetaData(SemanticAnalyzer.java:2025) getMetaData - Get metadata for subqueries 41008: 2020-06-05 10:47:33-752 INFO [hiveEngineEngine-Thread-3] org.apache.hadoop.hive.ql.parse.SemanticAnalyzer org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.getMetaData(SemanticAnalyzer.java:2049) getMetaData - Get metadata for destination tables 41008: 2020-06-05 10:47:33-762 INFO [hiveEngineEngine-Thread-3] hive.ql.Context org.apache.hadoop.hive.ql.Context.getMRScratchDir(Context.java:342) getMRScratchDir - New scratch dir is hdfs://n1:8020/tmp/hive/hadoop/ca056791-7702-477c-b15e-a25ad1fa6dcd/hive_2020-06-05_10-47-33_587_6302378125791493795-2 41008: 2020-06-05 10:47:33-763 INFO [hiveEngineEngine-Thread-3] org.apache.hadoop.hive.ql.parse.SemanticAnalyzer org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genResolvedParseTree(SemanticAnalyzer.java:10776) genResolvedParseTree - Completed getting MetaData in Semantic Analysis 41008: 2020-06-05 10:47:33-785 INFO [hiveEngineEngine-Thread-3] org.apache.hadoop.hive.common.FileUtils org.apache.hadoop.hive.common.FileUtils.mkdir(FileUtils.java:521) mkdir - Creating directory if it doesn't exist: hdfs://n1:8020/tmp/hive/hadoop/ca056791-7702-477c-b15e-a25ad1fa6dcd/hive_2020-06-05_10-47-33_587_6302378125791493795-2/-mr-10000/.hive-staging_hive_2020-06-05_10-47-33_587_6302378125791493795-2 41008: 2020-06-05 10:47:33-822 INFO [hiveEngineEngine-Thread-3] org.apache.hadoop.hive.ql.ppd.OpProcFactory org.apache.hadoop.hive.ql.ppd.OpProcFactory$DefaultPPD.process(OpProcFactory.java:741) process - Processing for FS(3) 41008: 2020-06-05 10:47:33-822 INFO [hiveEngineEngine-Thread-3] org.apache.hadoop.hive.ql.ppd.OpProcFactory org.apache.hadoop.hive.ql.ppd.OpProcFactory$DefaultPPD.process(OpProcFactory.java:741) process - Processing for SEL(1) 41008: 2020-06-05 10:47:33-822 INFO [hiveEngineEngine-Thread-3] org.apache.hadoop.hive.ql.ppd.OpProcFactory org.apache.hadoop.hive.ql.ppd.OpProcFactory$TableScanPPD.process(OpProcFactory.java:415) process - Processing for TS(0) 41008: 2020-06-05 10:47:33-855 INFO [hiveEngineEngine-Thread-3] org.apache.hadoop.hive.hbase.HBaseStorageHandler org.apache.hadoop.hive.hbase.HBaseStorageHandler.configureTableJobProperties(HBaseStorageHandler.java:215) configureTableJobProperties - Configuring input job properties 41008: 2020-06-05 10:47:33-858 INFO [hiveEngineEngine-Thread-3] org.apache.hadoop.hive.ql.parse.SemanticAnalyzer org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnalyzer.java:10950) analyzeInternal - Completed plan generation 41008: 2020-06-05 10:47:33-858 INFO [hiveEngineEngine-Thread-3] 
org.apache.hadoop.hive.ql.Driver org.apache.hadoop.hive.ql.Driver.compile(Driver.java:483) compile - Semantic Analysis Completed 41008: 2020-06-05 10:47:33-858 INFO [hiveEngineEngine-Thread-3] org.apache.hadoop.hive.ql.Driver org.apache.hadoop.hive.ql.Driver.getSchema(Driver.java:275) getSchema - Returning Hive schema: Schema(fieldSchemas:[FieldSchema(name:dwd_o_prjshuijin.id, type:varchar(256), comment:null), FieldSchema(name:dwd_o_prjshuijin.prj_name, type:varchar(400), comment:null), FieldSchema(name:dwd_o_prjshuijin.prj_code, type:varchar(201), comment:null), FieldSchema(name:dwd_o_prjshuijin.shuijin_month, type:varchar(10), comment:null), FieldSchema(name:dwd_o_prjshuijin.zengzhishui_fujian_amt, type:decimal(18,2), comment:null), FieldSchema(name:dwd_o_prjshuijin.tudi_zengzhi_amt, type:decimal(18,2), comment:null), FieldSchema(name:dwd_o_prjshuijin.qiyi_suodeshui_amt, type:decimal(18,2), comment:null), FieldSchema(name:dwd_o_prjshuijin.tidi_shiyongshui_amt, type:decimal(18,2), comment:null), FieldSchema(name:dwd_o_prjshuijin.fangchanshui_amt, type:decimal(18,2), comment:null), FieldSchema(name:dwd_o_prjshuijin.other_amt, type:decimal(18,2), comment:null)], properties:null) 41008: 2020-06-05 10:47:33-862 INFO [hiveEngineEngine-Thread-3] org.apache.hadoop.hive.ql.exec.TableScanOperator org.apache.hadoop.hive.ql.exec.Operator.initialize(Operator.java:326) initialize - Initializing operator TS[0] 41008: 2020-06-05 10:47:33-863 INFO [hiveEngineEngine-Thread-3] org.apache.hadoop.hive.ql.exec.SelectOperator org.apache.hadoop.hive.ql.exec.Operator.initialize(Operator.java:326) initialize - Initializing operator SEL[1] 41008: 2020-06-05 10:47:33-863 INFO [hiveEngineEngine-Thread-3] org.apache.hadoop.hive.ql.exec.SelectOperator org.apache.hadoop.hive.ql.exec.SelectOperator.initializeOp(SelectOperator.java:73) initializeOp - SELECT struct<id:varchar(256),prj_name:varchar(400),prj_code:varchar(201),shuijin_month:varchar(10),zengzhishui_fujian_amt:decimal(18,2),tudi_zengzhi_amt:decimal(18,2),qiyi_suodeshui_amt:decimal(18,2),tidi_shiyongshui_amt:decimal(18,2),fangchanshui_amt:decimal(18,2),other_amt:decimal(18,2)> 41008: 2020-06-05 10:47:33-864 INFO [hiveEngineEngine-Thread-3] org.apache.hadoop.hive.ql.exec.LimitOperator org.apache.hadoop.hive.ql.exec.Operator.initialize(Operator.java:326) initialize - Initializing operator LIM[2] 41008: 2020-06-05 10:47:33-870 INFO [hiveEngineEngine-Thread-3] org.apache.hadoop.hive.ql.exec.ListSinkOperator org.apache.hadoop.hive.ql.exec.Operator.initialize(Operator.java:326) initialize - Initializing operator LIST_SINK[4] 41008: 2020-06-05 10:47:33-871 INFO [hiveEngineEngine-Thread-3] org.apache.hadoop.hive.ql.Driver org.apache.hadoop.hive.ql.Driver.compile(Driver.java:591) compile - Completed compiling command(queryId=hadoop_20200605104733_fe8f3d35-b78c-4ad9-ac99-8fdf3ad6281e); Time taken: 0.31 seconds 41008: 2020-06-05 10:47:33-907 INFO [hiveEngineEngine-Thread-3] org.apache.hadoop.hive.ql.Driver org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1658) execute - Executing command(queryId=hadoop_20200605104733_fe8f3d35-b78c-4ad9-ac99-8fdf3ad6281e): select * from dwd.dwd_o_prjshuijin limit 100 41008: 2020-06-05 10:47:33-908 INFO [hiveEngineEngine-Thread-3] org.apache.hadoop.hive.ql.Driver org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1940) execute - Completed executing command(queryId=hadoop_20200605104733_fe8f3d35-b78c-4ad9-ac99-8fdf3ad6281e); Time taken: 0.0 seconds 41008: OK 41008: 2020-06-05 10:47:33-909 INFO [hiveEngineEngine-Thread-3] 
org.apache.hadoop.hive.ql.Driver org.apache.hadoop.hive.ql.session.SessionState$LogHelper.printInfo(SessionState.java:1088) printInfo - OK 41008: 2020-06-05 10:47:33-956 INFO [hiveEngineEngine-Thread-3] com.webank.wedatasphere.linkis.engine.hive.executor.HiveEngineExecutor com.webank.wedatasphere.linkis.engine.hive.executor.HiveEngineExecutor$$anon$1.run(HiveEngineExecutor.scala:145) run - n4:41008 >> Time taken: 395 ms, begin to fetch results. 41008: 2020-06-05 10:47:34-052 INFO [hiveEngineEngine-Thread-3] com.webank.wedatasphere.linkis.storage.utils.FileSystemUtils$ com.webank.wedatasphere.linkis.common.utils.Logging$class.info(Logging.scala:42) info - doesn't need to call setOwner 41008: 2020-06-05 10:47:34-079 INFO [hiveEngineEngine-Thread-3] com.webank.wedatasphere.linkis.storage.utils.FileSystemUtils$ com.webank.wedatasphere.linkis.common.utils.Logging$class.info(Logging.scala:42) info - doesn't need to call setOwner 41008: 2020-06-05 10:47:34-105 INFO [hiveEngineEngine-Thread-3] com.webank.wedatasphere.linkis.storage.resultset.StorageResultSetWriter com.webank.wedatasphere.linkis.common.utils.Logging$class.info(Logging.scala:42) info - Succeed to create a new file:FsPath{path=/tmp/linkis/hadoop/dwc/20200605/IDE/422/_0.dolphin; isDirectory=false; length=0; modification_time=0; access_time=0; owner=null; group=null; permission=null} 41008: 2020-06-05 10:47:34,106 INFO (hiveEngineEngine-Thread-3) INFO deprecation - mapred.input.dir is deprecated. Instead, use mapreduce.input.fileinputformat.inputdir 41008: 2020-06-05 10:47:34-240 WARN [hiveEngineEngine-Thread-3] com.webank.wedatasphere.linkis.engine.hive.executor.HiveDriverProxy com.webank.wedatasphere.linkis.common.utils.Utils$$anonfun$tryAndWarn$1.apply(Utils.scala:84) apply - java.lang.reflect.InvocationTargetException: null 41008: at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_181] 41008: at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:1.8.0_181] 41008: at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_181] 41008: at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_181] 41008: at com.webank.wedatasphere.linkis.engine.hive.executor.HiveDriverProxy$$anonfun$getResults$1.apply$mcZ$sp(HiveEngineExecutor.scala:362) ~[linkis-hive-engine-0.9.3.jar:?] 41008: at com.webank.wedatasphere.linkis.engine.hive.executor.HiveDriverProxy$$anonfun$getResults$1.apply(HiveEngineExecutor.scala:362) ~[linkis-hive-engine-0.9.3.jar:?] 41008: at com.webank.wedatasphere.linkis.engine.hive.executor.HiveDriverProxy$$anonfun$getResults$1.apply(HiveEngineExecutor.scala:362) ~[linkis-hive-engine-0.9.3.jar:?] 41008: at com.webank.wedatasphere.linkis.common.utils.Utils$.tryCatch(Utils.scala:48) [linkis-common-0.9.3.jar:?] 41008: at com.webank.wedatasphere.linkis.common.utils.Utils$.tryAndWarn(Utils.scala:74) [linkis-common-0.9.3.jar:?] 41008: at com.webank.wedatasphere.linkis.engine.hive.executor.HiveDriverProxy.getResults(HiveEngineExecutor.scala:361) [linkis-hive-engine-0.9.3.jar:?] 41008: at com.webank.wedatasphere.linkis.engine.hive.executor.HiveEngineExecutor$$anon$1.run(HiveEngineExecutor.scala:166) [linkis-hive-engine-0.9.3.jar:?] 41008: at com.webank.wedatasphere.linkis.engine.hive.executor.HiveEngineExecutor$$anon$1.run(HiveEngineExecutor.scala:121) [linkis-hive-engine-0.9.3.jar:?] 
41008: at java.security.AccessController.doPrivileged(Native Method) [?:1.8.0_181] 41008: at javax.security.auth.Subject.doAs(Subject.java:422) [?:1.8.0_181] 41008: at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657) [hadoop-common-2.7.2.jar:?] 41008: at com.webank.wedatasphere.linkis.engine.hive.executor.HiveEngineExecutor.executeLine(HiveEngineExecutor.scala:121) [linkis-hive-engine-0.9.3.jar:?] 41008: at com.webank.wedatasphere.linkis.engine.execute.EngineExecutor$$anonfun$execute$1$$anonfun$apply$9$$anonfun$apply$10.apply(EngineExecutor.scala:141) [linkis-ujes-engine-0.9.3.jar:?] 41008: at com.webank.wedatasphere.linkis.engine.execute.EngineExecutor$$anonfun$execute$1$$anonfun$apply$9$$anonfun$apply$10.apply(EngineExecutor.scala:140) [linkis-ujes-engine-0.9.3.jar:?] 41008: at com.webank.wedatasphere.linkis.common.utils.Utils$.tryCatch(Utils.scala:48) [linkis-common-0.9.3.jar:?] 41008: at com.webank.wedatasphere.linkis.engine.execute.EngineExecutor$$anonfun$execute$1$$anonfun$apply$9.apply(EngineExecutor.scala:141) [linkis-ujes-engine-0.9.3.jar:?] 41008: at com.webank.wedatasphere.linkis.engine.execute.EngineExecutor$$anonfun$execute$1$$anonfun$apply$9.apply(EngineExecutor.scala:136) [linkis-ujes-engine-0.9.3.jar:?] 41008: at scala.collection.immutable.Range.foreach(Range.scala:160) [scala-library-2.11.8.jar:?] 41008: at com.webank.wedatasphere.linkis.engine.execute.EngineExecutor$$anonfun$execute$1.apply(EngineExecutor.scala:136) [linkis-ujes-engine-0.9.3.jar:?] 41008: at com.webank.wedatasphere.linkis.engine.execute.EngineExecutor$$anonfun$execute$1.apply(EngineExecutor.scala:118) [linkis-ujes-engine-0.9.3.jar:?] 41008: at com.webank.wedatasphere.linkis.common.utils.Utils$.tryFinally(Utils.scala:62) [linkis-common-0.9.3.jar:?] 41008: at com.webank.wedatasphere.linkis.scheduler.executer.AbstractExecutor.ensureIdle(AbstractExecutor.scala:60) [linkis-scheduler-0.9.3.jar:?] 41008: at com.webank.wedatasphere.linkis.scheduler.executer.AbstractExecutor.ensureIdle(AbstractExecutor.scala:54) [linkis-scheduler-0.9.3.jar:?] 41008: at com.webank.wedatasphere.linkis.engine.execute.EngineExecutor.ensureOp$1(EngineExecutor.scala:117) [linkis-ujes-engine-0.9.3.jar:?] 41008: at com.webank.wedatasphere.linkis.engine.execute.EngineExecutor.execute(EngineExecutor.scala:118) [linkis-ujes-engine-0.9.3.jar:?] 41008: at com.webank.wedatasphere.linkis.scheduler.queue.Job$$anonfun$3.apply(Job.scala:254) [linkis-scheduler-0.9.3.jar:?] 41008: at com.webank.wedatasphere.linkis.scheduler.queue.Job$$anonfun$3.apply(Job.scala:254) [linkis-scheduler-0.9.3.jar:?] 41008: at com.webank.wedatasphere.linkis.common.utils.Utils$.tryCatch(Utils.scala:48) [linkis-common-0.9.3.jar:?] 41008: at com.webank.wedatasphere.linkis.scheduler.queue.Job.run(Job.scala:254) [linkis-scheduler-0.9.3.jar:?] 
41008: at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) [?:1.8.0_181] 41008: at java.util.concurrent.FutureTask.run(FutureTask.java:266) [?:1.8.0_181] 41008: at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) [?:1.8.0_181] 41008: at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) [?:1.8.0_181] 41008: at java.lang.Thread.run(Thread.java:748) [?:1.8.0_181] 41008: Caused by: java.io.IOException: java.io.IOException: java.lang.reflect.InvocationTargetException 41008: at org.apache.hadoop.hive.ql.exec.FetchOperator.getNextRow(FetchOperator.java:521) ~[hive-exec-2.1.1.jar:2.1.1] 41008: at org.apache.hadoop.hive.ql.exec.FetchOperator.pushRow(FetchOperator.java:428) ~[hive-exec-2.1.1.jar:2.1.1] 41008: at org.apache.hadoop.hive.ql.exec.FetchTask.fetch(FetchTask.java:146) ~[hive-exec-2.1.1.jar:2.1.1] 41008: at org.apache.hadoop.hive.ql.Driver.getResults(Driver.java:2098) ~[hive-exec-2.1.1.jar:2.1.1] 41008: ... 38 more 41008: Caused by: java.io.IOException: java.lang.reflect.InvocationTargetException 41008: at org.apache.hadoop.hbase.client.ConnectionFactory.createConnection(ConnectionFactory.java:221) ~[hbase-client-2.1.0.jar:2.1.0] 41008: at org.apache.hadoop.hbase.client.ConnectionFactory.createConnection(ConnectionFactory.java:114) ~[hbase-client-2.1.0.jar:2.1.0] 41008: at org.apache.hadoop.hive.hbase.HiveHBaseTableInputFormat.getSplitsInternal(HiveHBaseTableInputFormat.java:315) ~[hive-hbase-handler.jar:2.1.1-cdh6.3.2] 41008: at org.apache.hadoop.hive.hbase.HiveHBaseTableInputFormat.getSplits(HiveHBaseTableInputFormat.java:302) ~[hive-hbase-handler.jar:2.1.1-cdh6.3.2] 41008: at org.apache.hadoop.hive.ql.exec.FetchOperator.getNextSplits(FetchOperator.java:372) ~[hive-exec-2.1.1.jar:2.1.1] 41008: at org.apache.hadoop.hive.ql.exec.FetchOperator.getRecordReader(FetchOperator.java:304) ~[hive-exec-2.1.1.jar:2.1.1] 41008: at org.apache.hadoop.hive.ql.exec.FetchOperator.getNextRow(FetchOperator.java:459) ~[hive-exec-2.1.1.jar:2.1.1] 41008: at org.apache.hadoop.hive.ql.exec.FetchOperator.pushRow(FetchOperator.java:428) ~[hive-exec-2.1.1.jar:2.1.1] 41008: at org.apache.hadoop.hive.ql.exec.FetchTask.fetch(FetchTask.java:146) ~[hive-exec-2.1.1.jar:2.1.1] 41008: at org.apache.hadoop.hive.ql.Driver.getResults(Driver.java:2098) ~[hive-exec-2.1.1.jar:2.1.1] 41008: ... 
38 more 41008: Caused by: java.lang.reflect.InvocationTargetException 41008: at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) ~[?:1.8.0_181] 41008: at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62) ~[?:1.8.0_181] 41008: at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45) ~[?:1.8.0_181] 41008: at java.lang.reflect.Constructor.newInstance(Constructor.java:423) ~[?:1.8.0_181] 41008: at org.apache.hadoop.hbase.client.ConnectionFactory.createConnection(ConnectionFactory.java:219) ~[hbase-client-2.1.0.jar:2.1.0] 41008: at org.apache.hadoop.hbase.client.ConnectionFactory.createConnection(ConnectionFactory.java:114) ~[hbase-client-2.1.0.jar:2.1.0] 41008: at org.apache.hadoop.hive.hbase.HiveHBaseTableInputFormat.getSplitsInternal(HiveHBaseTableInputFormat.java:315) ~[hive-hbase-handler.jar:2.1.1-cdh6.3.2] 41008: at org.apache.hadoop.hive.hbase.HiveHBaseTableInputFormat.getSplits(HiveHBaseTableInputFormat.java:302) ~[hive-hbase-handler.jar:2.1.1-cdh6.3.2] 41008: at org.apache.hadoop.hive.ql.exec.FetchOperator.getNextSplits(FetchOperator.java:372) ~[hive-exec-2.1.1.jar:2.1.1] 41008: at org.apache.hadoop.hive.ql.exec.FetchOperator.getRecordReader(FetchOperator.java:304) ~[hive-exec-2.1.1.jar:2.1.1] 41008: at org.apache.hadoop.hive.ql.exec.FetchOperator.getNextRow(FetchOperator.java:459) ~[hive-exec-2.1.1.jar:2.1.1] 41008: at org.apache.hadoop.hive.ql.exec.FetchOperator.pushRow(FetchOperator.java:428) ~[hive-exec-2.1.1.jar:2.1.1] 41008: at org.apache.hadoop.hive.ql.exec.FetchTask.fetch(FetchTask.java:146) ~[hive-exec-2.1.1.jar:2.1.1] 41008: at org.apache.hadoop.hive.ql.Driver.getResults(Driver.java:2098) ~[hive-exec-2.1.1.jar:2.1.1] 41008: ... 
38 more 41008: Caused by: java.lang.NullPointerException 41008: at org.apache.hadoop.hbase.client.ConnectionImplementation.close(ConnectionImplementation.java:1920) ~[hbase-client-2.1.0.jar:2.1.0] 41008: at org.apache.hadoop.hbase.client.ConnectionImplementation.(ConnectionImplementation.java:310) ~[hbase-client-2.1.0.jar:2.1.0] 41008: at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) ~[?:1.8.0_181] 41008: at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62) ~[?:1.8.0_181] 41008: at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45) ~[?:1.8.0_181] 41008: at java.lang.reflect.Constructor.newInstance(Constructor.java:423) ~[?:1.8.0_181] 41008: at org.apache.hadoop.hbase.client.ConnectionFactory.createConnection(ConnectionFactory.java:219) ~[hbase-client-2.1.0.jar:2.1.0] 41008: at org.apache.hadoop.hbase.client.ConnectionFactory.createConnection(ConnectionFactory.java:114) ~[hbase-client-2.1.0.jar:2.1.0] 41008: at org.apache.hadoop.hive.hbase.HiveHBaseTableInputFormat.getSplitsInternal(HiveHBaseTableInputFormat.java:315) ~[hive-hbase-handler.jar:2.1.1-cdh6.3.2] 41008: at org.apache.hadoop.hive.hbase.HiveHBaseTableInputFormat.getSplits(HiveHBaseTableInputFormat.java:302) ~[hive-hbase-handler.jar:2.1.1-cdh6.3.2] 41008: at org.apache.hadoop.hive.ql.exec.FetchOperator.getNextSplits(FetchOperator.java:372) ~[hive-exec-2.1.1.jar:2.1.1] 41008: at org.apache.hadoop.hive.ql.exec.FetchOperator.getRecordReader(FetchOperator.java:304) ~[hive-exec-2.1.1.jar:2.1.1] 41008: at org.apache.hadoop.hive.ql.exec.FetchOperator.getNextRow(FetchOperator.java:459) ~[hive-exec-2.1.1.jar:2.1.1] 41008: at org.apache.hadoop.hive.ql.exec.FetchOperator.pushRow(FetchOperator.java:428) ~[hive-exec-2.1.1.jar:2.1.1] 41008: at org.apache.hadoop.hive.ql.exec.FetchTask.fetch(FetchTask.java:146) ~[hive-exec-2.1.1.jar:2.1.1] 41008: at org.apache.hadoop.hive.ql.Driver.getResults(Driver.java:2098) ~[hive-exec-2.1.1.jar:2.1.1] 41008: ... 38 more 41008: 41008: 2020-06-05 10:47:34-314 INFO [hiveEngineEngine-Thread-3] com.webank.wedatasphere.linkis.engine.hive.executor.HiveEngineExecutor com.webank.wedatasphere.linkis.engine.hive.executor.HiveEngineExecutor$$anon$1.run(HiveEngineExecutor.scala:200) run - n4:41008 >> Fetched 10 col(s) : 0 row(s) in hive 41008: 2020-06-05 10:47:34-331 INFO [hiveEngineEngine-Thread-3] com.webank.wedatasphere.linkis.engine.hive.executor.HiveEngineExecutor com.webank.wedatasphere.linkis.common.utils.Logging$class.info(Logging.scala:42) info - com.webank.wedatasphere.linkis.engine.hive.executor.HiveEngineExecutor@2ed9e59b change state Busy => Idle. 41008: 2020-06-05 10:47:34-331 INFO [hiveEngineEngine-Thread-3] com.webank.wedatasphere.linkis.engine.EngineReceiver com.webank.wedatasphere.linkis.common.utils.Logging$class.info(Logging.scala:42) info - broadcast the state of UserWithCreator(hadoop,IDE) from Busy to Idle. 41008: 2020-06-05 10:47:34-340 INFO [hiveEngineEngine-Thread-3] com.webank.wedatasphere.linkis.engine.execute.CommonEngineJob com.webank.wedatasphere.linkis.common.utils.Logging$class.info(Logging.scala:42) info - hiveEngineEngine_1 change state Running => Succeed. 
41008: 2020-06-05 10:47:34-340 INFO [hiveEngineEngine-Thread-3] com.webank.wedatasphere.linkis.engine.log.LogHelper$ com.webank.wedatasphere.linkis.engine.log.LogHelper$.pushAllRemainLogs(LogHelper.scala:39) pushAllRemainLogs - start to push all remain logs, and size is 1 41008: 2020-06-05 10:47:34-371 INFO [hiveEngineEngine-Thread-3] com.webank.wedatasphere.linkis.engine.log.LogHelper$ com.webank.wedatasphere.linkis.engine.log.LogHelper$.pushAllRemainLogs(LogHelper.scala:58) pushAllRemainLogs - end to push all remain logs

This is the complete log. Could you please help take a look?

wForget commented 4 years ago

Creating the HBase connection failed. Check whether the HBase-related configuration is correct, in particular whether the HBase ZooKeeper address is configured.
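For reference, a minimal standalone check of the HBase ZooKeeper setting, written against the same hbase-client 2.1.0 API that appears in the stack trace, could look like the sketch below. The host n1:2181 is taken from the ZooKeeper lines in the log; the class name and the explicitly set property values are only illustrative, the real values should come from your hbase-site.xml.

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.client.Admin;
import org.apache.hadoop.hbase.client.Connection;
import org.apache.hadoop.hbase.client.ConnectionFactory;

// Illustrative standalone check, not Linkis code: run it with the same
// hbase-client 2.1.0 jar and HBase conf directory that the hive engine uses.
public class HBaseConnectionCheck {
    public static void main(String[] args) throws Exception {
        // Picks up hbase-site.xml if it is on the classpath; if the quorum is
        // never set, createConnection fails much like the trace above.
        Configuration conf = HBaseConfiguration.create();
        conf.set("hbase.zookeeper.quorum", "n1");                // host from the log (n1:2181)
        conf.set("hbase.zookeeper.property.clientPort", "2181"); // illustrative value
        try (Connection connection = ConnectionFactory.createConnection(conf);
             Admin admin = connection.getAdmin()) {
            System.out.println("HBase connection OK, table count: "
                    + admin.listTableNames().length);
        }
    }
}
```

If a check like this succeeds on the engine machine but the engine itself still fails, the HBase configuration actually visible to the Linkis hive engine (for example the /etc/hbase/conf entry at the end of the classpath above) is the first thing to compare.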

ittechblog commented 4 years ago

Creating the HBase connection failed. Check whether the HBase-related configuration is correct, in particular whether the HBase ZooKeeper address is configured.

HDFS, HBase, Hive, and Spark are all installed on this machine. Querying the external table directly through the hbase and hive command-line tools works fine. The problem only appears when querying the Hive external table through DSS and Linkis; querying Hive internal tables directly works fine as well.

wForget commented 4 years ago

Set the log level to debug so you can see the detailed error information.
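The usual way is to change the level in the engine's log4j2 configuration file and restart. The classpath above also shows log4j-core 2.10.0, so the level can be raised programmatically as well; a minimal sketch, with package names taken from the stack traces purely as an example:

```java
import org.apache.logging.log4j.Level;
import org.apache.logging.log4j.core.config.Configurator;

// Illustrative helper, assuming log4j 2.x is the active logging backend, as it
// appears to be on the engine classpath above. Editing the engine's log4j2
// configuration file and restarting is the equivalent persistent change.
public class DebugLogging {
    public static void enable() {
        Configurator.setLevel("org.apache.hadoop.hbase", Level.DEBUG);
        Configurator.setLevel("org.apache.hadoop.hive", Level.DEBUG);
        Configurator.setLevel("com.webank.wedatasphere.linkis", Level.DEBUG);
    }
}
```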

bertramlau commented 3 years ago

Creating the HBase connection failed. Check whether the HBase-related configuration is correct, in particular whether the HBase ZooKeeper address is configured.

HDFS, HBase, Hive, and Spark are all installed on this machine. Querying the external table directly through the hbase and hive command-line tools works fine. The problem only appears when querying the Hive external table through DSS and Linkis; querying Hive internal tables directly works fine as well.

How was this problem resolved in the end? Any guidance would be appreciated.