microsoft/azure-tools-for-java

Azure tools for Java, including Azure Toolkits for Eclipse, IntelliJ and related projects.

[IntelliJ] Remote submit on spark 2.3 get errors #1807

Closed: jingyanjingyan closed this issue 6 years ago

jingyanjingyan commented 6 years ago

Build: 1757

Repro Steps:

  1. Remote submit with Spark 2.3, SparkCore_WasbIOTest: error 1
  2. Remote submit with linked Spark 2.3, SparkCore_WasbIOTest: error 2
  3. Remote submit with Spark 2.3, LogQuery: pass
  4. Remote submit with linked Spark 2.3, LogQuery: error 2
  5. Remote submit with linked Spark 2.2, IO job: fail

Results:

## error 1

    
    Package and deploy the job to Spark cluster
    INFO: Begin uploading file C:\Users\v-yajing\IdeaProjects\sbt-spark22-0619\out\artifacts\sbt_spark22_0619_DefaultArtifact\default_artifact.jar to Azure Blob Storage Account wasbs://enhancementtesetltian@enhancementstorage.blob.core.windows.net/SparkSubmission/2018/07/17/f43d6cde-4b69-4753-b8df-47350795a5ea/default_artifact.jar ...
    INFO: Submit file to azure blob 'wasbs://enhancementtesetltian@enhancementstorage.blob.core.windows.net/SparkSubmission/2018/07/17/f43d6cde-4b69-4753-b8df-47350795a5ea/default_artifact.jar' successfully.
    LOG: SLF4J: Class path contains multiple SLF4J bindings.
    LOG: SLF4J: Found binding in [jar:file:/usr/hdp/2.6.5.8-7/spark2/jars/slf4j-log4j12-1.7.16.jar!/org/slf4j/impl/StaticLoggerBinder.class]
    LOG: SLF4J: Found binding in [jar:file:/usr/hdp/2.6.5.8-7/spark_llap/spark-llap-assembly-1.0.0.2.6.5.8-7.jar!/org/slf4j/impl/StaticLoggerBinder.class]
    LOG: SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
    LOG: SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
    LOG: Warning: Master yarn-cluster is deprecated since 2.0. Please use master "yarn" with specified deploy mode instead.
    LOG: 18/07/17 07:34:45 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
    LOG: Warning: Skip remote jar wasbs://enhancementtesetltian@enhancementstorage.blob.core.windows.net/SparkSubmission/2018/07/17/f43d6cde-4b69-4753-b8df-47350795a5ea/default_artifact.jar.
    LOG: 18/07/17 07:34:46 INFO MetricsConfig: loaded properties from hadoop-metrics2-azure-file-system.properties
    LOG: 18/07/17 07:34:46 INFO WasbAzureIaasSink: Init starting.
    LOG: 18/07/17 07:34:46 INFO AzureIaasSink: Init starting. Initializing MdsLogger.
    LOG: 18/07/17 07:34:46 INFO AzureIaasSink: Init completed.
    LOG: 18/07/17 07:34:46 INFO WasbAzureIaasSink: Init completed.
    LOG: 18/07/17 07:34:46 INFO MetricsSinkAdapter: Sink azurefs2 started
    LOG: 18/07/17 07:34:46 INFO RequestHedgingRMFailoverProxyProvider: Looking for the active RM in [rm1, rm2]...
    LOG: 18/07/17 07:34:46 INFO RequestHedgingRMFailoverProxyProvider: Found active RM [rm2]
    LOG: 18/07/17 07:34:46 INFO Client: Requesting a new application from cluster with 2 NodeManagers
    LOG: 18/07/17 07:34:46 INFO Client: Verifying our application has not requested more than the maximum memory capability of the cluster (51200 MB per container)
    LOG: 18/07/17 07:34:46 INFO Client: Will allocate AM container, with 1408 MB memory including 384 MB overhead
    LOG: 18/07/17 07:34:46 INFO Client: Setting up container launch context for our AM
    LOG: 18/07/17 07:34:46 INFO Client: Setting up the launch environment for our AM container
    LOG: 18/07/17 07:34:46 INFO Client: Preparing resources for our AM container
    LOG: 18/07/17 07:34:48 INFO Client: Uploading resource wasbs://enhancementtesetltian@enhancementstorage.blob.core.windows.net/SparkSubmission/2018/07/17/f43d6cde-4b69-4753-b8df-47350795a5ea/default_artifact.jar -> wasb://enhancementtesetltian@enhancementstorage.blob.core.windows.net/user/livy/.sparkStaging/application_1531388713196_0058/default_artifact.jar
    LOG: 18/07/17 07:34:49 INFO SecurityManager: Changing modify acls to: livy
    LOG: 18/07/17 07:34:49 INFO SecurityManager: Changing view acls groups to: 
    LOG: 18/07/17 07:34:49 INFO SecurityManager: Changing modify acls groups to: 
    LOG: 18/07/17 07:34:49 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users  with view permissions: Set(livy); groups with view permissions: Set(); users  with modify permissions: Set(livy); groups with modify permissions: Set()
    LOG: 18/07/17 07:34:49 INFO Client: Submitting application application_1531388713196_0058 to ResourceManager
    LOG: 18/07/17 07:34:49 INFO YarnClientImpl: Submitted application application_1531388713196_0058
    LOG: 18/07/17 07:34:49 INFO Client: Application report for application_1531388713196_0058 (state: ACCEPTED)
    LOG: 18/07/17 07:34:49 INFO Client: 
    LOG:     client token: N/A
    LOG:     diagnostics: [Tue Jul 17 07:34:49 +0000 2018] Application is Activated, waiting for resources to be assigned for AM.  Details : AM Partition = <DEFAULT_PARTITION> ; Partition Resource = <memory:102400, vCores:30> ; Queue's Absolute capacity = 50.0 % ; Queue's Absolute used capacity = 23.5 % ; Queue's Absolute max capacity = 100.0 % ; 
    LOG:     ApplicationMaster host: N/A
    LOG:     ApplicationMaster RPC port: -1
    LOG:     queue: default
    LOG:     start time: 1531812889366
    LOG:     final status: UNDEFINED
    LOG:     tracking URL: http://hn1-enhanc.bwjauwfnc4kufeitxcpn1gw4ga.cx.internal.cloudapp.net:8088/proxy/application_1531388713196_0058/
    LOG:     user: livy
    LOG: 18/07/17 07:34:49 INFO ShutdownHookManager: Shutdown hook called
    LOG: 18/07/17 07:34:49 INFO ShutdownHookManager: Deleting directory /tmp/spark-dbb5f2e2-5879-425d-94c8-12035db05555
    LOG: 18/07/17 07:34:49 INFO ShutdownHookManager: Deleting directory /tmp/spark-36dd469f-d5c9-47f3-8be4-01c65ddd1574
    SLF4J: Class path contains multiple SLF4J bindings.
    SLF4J: Found binding in [jar:file:/usr/hdp/2.6.5.8-7/spark2/jars/slf4j-log4j12-1.7.16.jar!/org/slf4j/impl/StaticLoggerBinder.class]
    SLF4J: Found binding in [jar:file:/usr/hdp/2.6.5.8-7/hadoop/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
    SLF4J: Found binding in [jar:file:/usr/hdp/2.6.5.8-7/spark_llap/spark-llap-assembly-1.0.0.2.6.5.8-7.jar!/org/slf4j/impl/StaticLoggerBinder.class]
    SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
    SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
    18/07/17 07:35:21 INFO SignalUtils: Registered signal handler for TERM
    18/07/17 07:35:21 INFO SignalUtils: Registered signal handler for HUP
    18/07/17 07:35:21 INFO SignalUtils: Registered signal handler for INT
    18/07/17 07:35:21 INFO SecurityManager: Changing view acls to: yarn,livy
    18/07/17 07:35:21 INFO SecurityManager: Changing modify acls to: yarn,livy
    18/07/17 07:35:21 INFO SecurityManager: Changing view acls groups to: 
    18/07/17 07:35:21 INFO SecurityManager: Changing modify acls groups to: 
    18/07/17 07:35:21 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users  with view permissions: Set(yarn, livy); groups with view permissions: Set(); users  with modify permissions: Set(yarn, livy); groups with modify permissions: Set()
    18/07/17 07:35:22 INFO ApplicationMaster: Preparing Local resources
    18/07/17 07:35:22 INFO MetricsConfig: loaded properties from hadoop-metrics2-azure-file-system.properties
    18/07/17 07:35:22 INFO WasbAzureIaasSink: Init starting.
    18/07/17 07:35:22 INFO AzureIaasSink: Init starting. Initializing MdsLogger.
    18/07/17 07:35:22 INFO AzureIaasSink: Init completed.
    18/07/17 07:35:22 INFO WasbAzureIaasSink: Init completed.
    18/07/17 07:35:22 INFO MetricsSinkAdapter: Sink azurefs2 started
    18/07/17 07:35:22 INFO MetricsSystemImpl: Scheduled snapshot period at 60 second(s).
    18/07/17 07:35:22 INFO MetricsSystemImpl: azure-file-system metrics system started
    18/07/17 07:35:23 INFO ApplicationMaster: ApplicationAttemptId: appattempt_1531388713196_0058_000004
    18/07/17 07:35:23 INFO ApplicationMaster: Starting the user application in a separate Thread
    18/07/17 07:35:23 INFO ApplicationMaster: Waiting for spark context initialization...
    18/07/17 07:35:23 INFO SparkContext: Running Spark version 2.3.0.2.6.5.8-7
    18/07/17 07:35:23 INFO SparkContext: Submitted application: SparkCore_WasbIOTest
    18/07/17 07:35:23 INFO SecurityManager: Changing view acls to: yarn,livy
    18/07/17 07:35:23 INFO SecurityManager: Changing modify acls to: yarn,livy
    18/07/17 07:35:23 INFO SecurityManager: Changing view acls groups to: 
    18/07/17 07:35:23 INFO SecurityManager: Changing modify acls groups to: 
    18/07/17 07:35:23 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users  with view permissions: Set(yarn, livy); groups with view permissions: Set(); users  with modify permissions: Set(yarn, livy); groups with modify permissions: Set()
    18/07/17 07:35:23 INFO Utils: Successfully started service 'sparkDriver' on port 33337.
    18/07/17 07:35:23 INFO SparkEnv: Registering MapOutputTracker
    18/07/17 07:35:23 INFO SparkEnv: Registering BlockManagerMaster
    18/07/17 07:35:23 INFO BlockManagerMasterEndpoint: Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information
    18/07/17 07:35:23 INFO BlockManagerMasterEndpoint: BlockManagerMasterEndpoint up
    18/07/17 07:35:23 INFO DiskBlockManager: Created local directory at /mnt/resource/hadoop/yarn/local/usercache/livy/appcache/application_1531388713196_0058/blockmgr-b30f2283-5e64-4d56-932e-9f355b695192
    18/07/17 07:35:23 INFO MemoryStore: MemoryStore started with capacity 434.4 MB
    18/07/17 07:35:23 INFO SparkEnv: Registering OutputCommitCoordinator
    18/07/17 07:35:23 INFO log: Logging initialized @3082ms
    18/07/17 07:35:23 INFO JettyUtils: Adding filter: org.apache.hadoop.yarn.server.webproxy.amfilter.AmIpFilter
    18/07/17 07:35:23 INFO Server: jetty-9.3.z-SNAPSHOT
    18/07/17 07:35:23 INFO Server: Started @3219ms
    18/07/17 07:35:23 INFO AbstractConnector: Started ServerConnector@7a0721a8{HTTP/1.1,[http/1.1]}{0.0.0.0:40073}
    18/07/17 07:35:23 INFO Utils: Successfully started service 'SparkUI' on port 40073.
    18/07/17 07:35:23 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@fa62c14{/jobs,null,AVAILABLE,@Spark}
    18/07/17 07:35:23 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@5b6b2d63{/jobs/json,null,AVAILABLE,@Spark}
    18/07/17 07:35:23 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@704e74be{/jobs/job,null,AVAILABLE,@Spark}
    18/07/17 07:35:23 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@5d345b92{/jobs/job/json,null,AVAILABLE,@Spark}
    18/07/17 07:35:23 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@5fb66562{/stages,null,AVAILABLE,@Spark}
    18/07/17 07:35:23 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@5c5e11ae{/stages/json,null,AVAILABLE,@Spark}
    18/07/17 07:35:23 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@71423bb1{/stages/stage,null,AVAILABLE,@Spark}
    18/07/17 07:35:23 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@4e5b4186{/stages/stage/json,null,AVAILABLE,@Spark}
    18/07/17 07:35:23 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@2ea4744f{/stages/pool,null,AVAILABLE,@Spark}
    18/07/17 07:35:23 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@861944d{/stages/pool/json,null,AVAILABLE,@Spark}
    18/07/17 07:35:23 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@4d785e5b{/storage,null,AVAILABLE,@Spark}
    18/07/17 07:35:23 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@593b7103{/storage/json,null,AVAILABLE,@Spark}
    18/07/17 07:35:23 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@54b07067{/storage/rdd,null,AVAILABLE,@Spark}
    18/07/17 07:35:23 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@50f75540{/storage/rdd/json,null,AVAILABLE,@Spark}
    18/07/17 07:35:23 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@5ff25383{/environment,null,AVAILABLE,@Spark}
    18/07/17 07:35:23 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@5dc39217{/environment/json,null,AVAILABLE,@Spark}
    18/07/17 07:35:23 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@aa7e280{/executors,null,AVAILABLE,@Spark}
    18/07/17 07:35:23 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@6010ebc1{/executors/json,null,AVAILABLE,@Spark}
    18/07/17 07:35:23 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@2d33d20b{/executors/threadDump,null,AVAILABLE,@Spark}
    18/07/17 07:35:23 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@6c21be90{/executors/threadDump/json,null,AVAILABLE,@Spark}
    18/07/17 07:35:23 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@59a7f276{/static,null,AVAILABLE,@Spark}
    18/07/17 07:35:23 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@3e4328ba{/,null,AVAILABLE,@Spark}
    18/07/17 07:35:23 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@3168e311{/api,null,AVAILABLE,@Spark}
    18/07/17 07:35:23 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@6ed5ba6{/jobs/job/kill,null,AVAILABLE,@Spark}
    18/07/17 07:35:23 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@840534e{/stages/stage/kill,null,AVAILABLE,@Spark}
    18/07/17 07:35:23 INFO SparkUI: Bound SparkUI to 0.0.0.0, and started at http://wn0-enhanc.bwjauwfnc4kufeitxcpn1gw4ga.cx.internal.cloudapp.net:40073
    18/07/17 07:35:24 INFO YarnClusterScheduler: Created YarnClusterScheduler
    18/07/17 07:35:24 INFO SchedulerExtensionServices: Starting Yarn extension services with app application_1531388713196_0058 and attemptId Some(appattempt_1531388713196_0058_000004)
    18/07/17 07:35:24 INFO Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 39885.
    18/07/17 07:35:24 INFO NettyBlockTransferService: Server created on wn0-enhanc.bwjauwfnc4kufeitxcpn1gw4ga.cx.internal.cloudapp.net:39885
    18/07/17 07:35:24 INFO BlockManager: Using org.apache.spark.storage.RandomBlockReplicationPolicy for block replication policy
    18/07/17 07:35:24 INFO BlockManagerMaster: Registering BlockManager BlockManagerId(driver, wn0-enhanc.bwjauwfnc4kufeitxcpn1gw4ga.cx.internal.cloudapp.net, 39885, None)
    18/07/17 07:35:24 INFO BlockManagerMasterEndpoint: Registering block manager wn0-enhanc.bwjauwfnc4kufeitxcpn1gw4ga.cx.internal.cloudapp.net:39885 with 434.4 MB RAM, BlockManagerId(driver, wn0-enhanc.bwjauwfnc4kufeitxcpn1gw4ga.cx.internal.cloudapp.net, 39885, None)
    18/07/17 07:35:24 INFO BlockManagerMaster: Registered BlockManager BlockManagerId(driver, wn0-enhanc.bwjauwfnc4kufeitxcpn1gw4ga.cx.internal.cloudapp.net, 39885, None)
    18/07/17 07:35:24 INFO BlockManager: Initialized BlockManager: BlockManagerId(driver, wn0-enhanc.bwjauwfnc4kufeitxcpn1gw4ga.cx.internal.cloudapp.net, 39885, None)
    18/07/17 07:35:24 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@37fbcfef{/metrics/json,null,AVAILABLE,@Spark}
    18/07/17 07:35:24 INFO EventLoggingListener: Logging events to wasb:/hdp/spark2-events/application_1531388713196_0058_4
    18/07/17 07:35:24 INFO SparkContext: Registered listener com.microsoft.hdinsight.spark.metrics.SparkMetricsListener
    18/07/17 07:35:24 INFO ApplicationMaster: 
    ===============================================================================
    YARN executor launch context:
    env:
    CLASSPATH -> {{PWD}}<CPS>{{PWD}}/__spark_conf__<CPS>{{PWD}}/__spark_libs__/*<CPS>/usr/hdp/current/spark2-client/jars/*<CPS>$HADOOP_CONF_DIR<CPS>/usr/hdp/current/hadoop-client/*<CPS>/usr/hdp/current/hadoop-client/lib/*<CPS>/usr/hdp/current/hadoop-hdfs-client/*<CPS>/usr/hdp/current/hadoop-hdfs-client/lib/*<CPS>/usr/hdp/current/hadoop-yarn-client/*<CPS>/usr/hdp/current/hadoop-yarn-client/lib/*<CPS>$PWD/mr-framework/hadoop/share/hadoop/mapreduce/*:$PWD/mr-framework/hadoop/share/hadoop/mapreduce/lib/*:$PWD/mr-framework/hadoop/share/hadoop/common/*:$PWD/mr-framework/hadoop/share/hadoop/common/lib/*:$PWD/mr-framework/hadoop/share/hadoop/yarn/*:$PWD/mr-framework/hadoop/share/hadoop/yarn/lib/*:$PWD/mr-framework/hadoop/share/hadoop/hdfs/*:$PWD/mr-framework/hadoop/share/hadoop/hdfs/lib/*:$PWD/mr-framework/hadoop/share/hadoop/tools/lib/*:/usr/hdp/2.6.5.8-7/hadoop/lib/hadoop-lzo-0.6.0.2.6.5.8-7.jar:/etc/hadoop/conf/secure:/usr/hdp/current/ext/hadoop/*<CPS>:/usr/hdp/current/spark2-client/jars/*:/usr/lib/hdinsight-datalake/*:/usr/hdp/current/spark_llap/*:/usr/hdp/current/spark2-client/conf:<CPS>{{PWD}}/__spark_conf__/__hadoop_conf__
    SPARK_DIST_CLASSPATH -> :/usr/hdp/current/spark2-client/jars/*:/usr/lib/hdinsight-datalake/*:/usr/hdp/current/spark_llap/*:/usr/hdp/current/spark2-client/conf:
    SPARK_YARN_STAGING_DIR -> *********(redacted)
    SPARK_USER -> *********(redacted)
    
    command:
    LD_LIBRARY_PATH="/usr/hdp/current/hadoop-client/lib/native:/usr/hdp/current/hadoop-client/lib/native/Linux-amd64-64:$LD_LIBRARY_PATH" \ 
      {{JAVA_HOME}}/bin/java \ 
      -server \ 
      -Xmx9216m \ 
      '-Dhdp.version=' \ 
      '-Detwlogger.component=sparkexecutor' \ 
      '-DlogFilter.filename=SparkLogFilters.xml' \ 
      '-DpatternGroup.filename=SparkPatternGroups.xml' \ 
      '-Dlog4jspark.root.logger=INFO,console,RFA,ETW,Anonymizer' \ 
      '-Dlog4jspark.log.dir=/var/log/sparkapp/\${user.name}' \ 
      '-Dlog4jspark.log.file=sparkexecutor.log' \ 
      '-Dlog4j.configuration=file:/usr/hdp/current/spark2-client/conf/log4j.properties' \ 
      '-Djavax.xml.parsers.SAXParserFactory=com.sun.org.apache.xerces.internal.jaxp.SAXParserFactoryImpl' \ 
      '-XX:+UseG1GC' \ 
      '-XX:InitiatingHeapOccupancyPercent=45' \ 
      -Djava.io.tmpdir={{PWD}}/tmp \ 
      '-Dspark.history.ui.port=18080' \ 
      -Dspark.yarn.app.container.log.dir=<LOG_DIR> \ 
      -XX:OnOutOfMemoryError='kill %p' \ 
      org.apache.spark.executor.CoarseGrainedExecutorBackend \ 
      --driver-url \ 
      spark://CoarseGrainedScheduler@wn0-enhanc.bwjauwfnc4kufeitxcpn1gw4ga.cx.internal.cloudapp.net:33337 \ 
      --executor-id \ 
      <executorId> \ 
      --hostname \ 
      <hostname> \ 
      --cores \ 
      3 \ 
      --app-id \ 
      application_1531388713196_0058 \ 
      --user-class-path \ 
      file:$PWD/__app__.jar \ 
      1><LOG_DIR>/stdout \ 
      2><LOG_DIR>/stderr
    
    resources:
    __app__.jar -> resource { scheme: "wasb" host: "enhancementstorage.blob.core.windows.net" port: -1 file: "/user/livy/.sparkStaging/application_1531388713196_0058/default_artifact.jar" userInfo: "enhancementtesetltian" } size: 67649 timestamp: 1531812888000 type: FILE visibility: PRIVATE
    __spark_conf__ -> resource { scheme: "wasb" host: "enhancementstorage.blob.core.windows.net" port: -1 file: "/user/livy/.sparkStaging/application_1531388713196_0058/__spark_conf__.zip" userInfo: "enhancementtesetltian" } size: 255150 timestamp: 1531812889000 type: ARCHIVE visibility: PRIVATE

===============================================================================
18/07/17 07:35:24 INFO YarnRMClient: Registering the ApplicationMaster
18/07/17 07:35:24 INFO RequestHedgingRMFailoverProxyProvider: Looking for the active RM in [rm1, rm2]...
18/07/17 07:35:24 INFO RequestHedgingRMFailoverProxyProvider: Found active RM [rm2]
18/07/17 07:35:24 INFO YarnAllocator: Will request 2 executor container(s), each with 3 core(s) and 9600 MB memory (including 384 MB of overhead)
18/07/17 07:35:24 INFO YarnSchedulerBackend$YarnSchedulerEndpoint: ApplicationMaster registered as NettyRpcEndpointRef(spark://YarnAM@wn0-enhanc.bwjauwfnc4kufeitxcpn1gw4ga.cx.internal.cloudapp.net:33337)
18/07/17 07:35:24 INFO YarnAllocator: Submitted 2 unlocalized container requests.
18/07/17 07:35:24 INFO ApplicationMaster: Started progress reporter thread with (heartbeat : 5000, initial allocation : 200) intervals
18/07/17 07:35:26 INFO AMRMClientImpl: Received new token for : wn0-enhanc.bwjauwfnc4kufeitxcpn1gw4ga.cx.internal.cloudapp.net:30050
18/07/17 07:35:26 INFO AMRMClientImpl: Received new token for : wn1-enhanc.bwjauwfnc4kufeitxcpn1gw4ga.cx.internal.cloudapp.net:30050
18/07/17 07:35:26 INFO YarnAllocator: Launching container container_1531388713196_0058_04_000002 on host wn0-enhanc.bwjauwfnc4kufeitxcpn1gw4ga.cx.internal.cloudapp.net for executor with ID 1
18/07/17 07:35:26 INFO YarnAllocator: Launching container container_1531388713196_0058_04_000003 on host wn1-enhanc.bwjauwfnc4kufeitxcpn1gw4ga.cx.internal.cloudapp.net for executor with ID 2
18/07/17 07:35:26 INFO YarnAllocator: Received 2 containers from YARN, launching executors on 2 of them.
18/07/17 07:35:26 INFO ContainerManagementProtocolProxy: yarn.client.max-cached-nodemanagers-proxies : 0
18/07/17 07:35:26 INFO ContainerManagementProtocolProxy: yarn.client.max-cached-nodemanagers-proxies : 0
18/07/17 07:35:26 INFO ContainerManagementProtocolProxy: Opening proxy : wn0-enhanc.bwjauwfnc4kufeitxcpn1gw4ga.cx.internal.cloudapp.net:30050
18/07/17 07:35:26 INFO ContainerManagementProtocolProxy: Opening proxy : wn1-enhanc.bwjauwfnc4kufeitxcpn1gw4ga.cx.internal.cloudapp.net:30050
18/07/17 07:35:28 INFO YarnSchedulerBackend$YarnDriverEndpoint: Registered executor NettyRpcEndpointRef(spark-client://Executor) (10.0.0.14:49860) with ID 1
18/07/17 07:35:28 INFO YarnSchedulerBackend$YarnDriverEndpoint: Registered executor NettyRpcEndpointRef(spark-client://Executor) (10.0.0.11:43952) with ID 2
18/07/17 07:35:28 INFO YarnClusterSchedulerBackend: SchedulerBackend is ready for scheduling beginning after reached minRegisteredResourcesRatio: 0.8
18/07/17 07:35:28 INFO YarnClusterScheduler: YarnClusterScheduler.postStartHook done
18/07/17 07:35:28 INFO BlockManagerMasterEndpoint: Registering block manager wn0-enhanc.bwjauwfnc4kufeitxcpn1gw4ga.cx.internal.cloudapp.net:40399 with 5.2 GB RAM, BlockManagerId(1, wn0-enhanc.bwjauwfnc4kufeitxcpn1gw4ga.cx.internal.cloudapp.net, 40399, None)
18/07/17 07:35:28 INFO BlockManagerMasterEndpoint: Registering block manager wn1-enhanc.bwjauwfnc4kufeitxcpn1gw4ga.cx.internal.cloudapp.net:44721 with 5.2 GB RAM, BlockManagerId(2, wn1-enhanc.bwjauwfnc4kufeitxcpn1gw4ga.cx.internal.cloudapp.net, 44721, None)
18/07/17 07:35:29 INFO MemoryStore: Block broadcast_0 stored as values in memory (estimated size 371.6 KB, free 434.0 MB)
18/07/17 07:35:29 INFO MemoryStore: Block broadcast_0_piece0 stored as bytes in memory (estimated size 38.0 KB, free 434.0 MB)
18/07/17 07:35:29 INFO BlockManagerInfo: Added broadcast_0_piece0 in memory on wn0-enhanc.bwjauwfnc4kufeitxcpn1gw4ga.cx.internal.cloudapp.net:39885 (size: 38.0 KB, free: 434.4 MB)
18/07/17 07:35:29 INFO SparkContext: Created broadcast 0 from textFile at SparkCore_WasbIOTest.scala:14
18/07/17 07:35:29 ERROR ApplicationMaster: User class threw exception: java.lang.NullPointerException
java.lang.NullPointerException
    at org.apache.hadoop.fs.adl.AdlFileSystem.initialize(AdlFileSystem.java:144)
    at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:2796)
    at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:99)
    at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:2830)
    at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:2812)
    at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:390)
    at org.apache.hadoop.fs.Path.getFileSystem(Path.java:295)
    at org.apache.spark.internal.io.SparkHadoopWriterUtils$.createPathFromString(SparkHadoopWriterUtils.scala:55)
    at org.apache.spark.rdd.PairRDDFunctions$$anonfun$saveAsHadoopFile$4.apply$mcV$sp(PairRDDFunctions.scala:1066)
    at org.apache.spark.rdd.PairRDDFunctions$$anonfun$saveAsHadoopFile$4.apply(PairRDDFunctions.scala:1032)
    at org.apache.spark.rdd.PairRDDFunctions$$anonfun$saveAsHadoopFile$4.apply(PairRDDFunctions.scala:1032)
    at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
    at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:112)
    at org.apache.spark.rdd.RDD.withScope(RDD.scala:363)
    at org.apache.spark.rdd.PairRDDFunctions.saveAsHadoopFile(PairRDDFunctions.scala:1032)
    at org.apache.spark.rdd.PairRDDFunctions$$anonfun$saveAsHadoopFile$1.apply$mcV$sp(PairRDDFunctions.scala:958)
    at org.apache.spark.rdd.PairRDDFunctions$$anonfun$saveAsHadoopFile$1.apply(PairRDDFunctions.scala:958)
    at org.apache.spark.rdd.PairRDDFunctions$$anonfun$saveAsHadoopFile$1.apply(PairRDDFunctions.scala:958)
    at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
    at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:112)
    at org.apache.spark.rdd.RDD.withScope(RDD.scala:363)
    at org.apache.spark.rdd.PairRDDFunctions.saveAsHadoopFile(PairRDDFunctions.scala:957)
    at org.apache.spark.rdd.RDD$$anonfun$saveAsTextFile$1.apply$mcV$sp(RDD.scala:1493)
    at org.apache.spark.rdd.RDD$$anonfun$saveAsTextFile$1.apply(RDD.scala:1472)
    at org.apache.spark.rdd.RDD$$anonfun$saveAsTextFile$1.apply(RDD.scala:1472)
    at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
    at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:112)
    at org.apache.spark.rdd.RDD.withScope(RDD.scala:363)
    at org.apache.spark.rdd.RDD.saveAsTextFile(RDD.scala:1472)
    at sample.SparkCore_WasbIOTest$.main(SparkCore_WasbIOTest.scala:19)
    at sample.SparkCore_WasbIOTest.main(SparkCore_WasbIOTest.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.spark.deploy.yarn.ApplicationMaster$$anon$4.run(ApplicationMaster.scala:721)
18/07/17 07:35:29 INFO ApplicationMaster: Final app status: FAILED, exitCode: 15, (reason: User class threw exception: java.lang.NullPointerException
    at org.apache.hadoop.fs.adl.AdlFileSystem.initialize(AdlFileSystem.java:144)
    at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:2796)
    at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:99)
    at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:2830)
    at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:2812)
    at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:390)
    at org.apache.hadoop.fs.Path.getFileSystem(Path.java:295)
    at org.apache.spark.internal.io.SparkHadoopWriterUtils$.createPathFromString(SparkHadoopWriterUtils.scala:55)
    at org.apache.spark.rdd.PairRDDFunctions$$anonfun$saveAsHadoopFile$4.apply$mcV$sp(PairRDDFunctions.scala:1066)
    at org.apache.spark.rdd.PairRDDFunctions$$anonfun$saveAsHadoopFile$4.apply(PairRDDFunctions.scala:1032)
    at org.apache.spark.rdd.PairRDDFunctions$$anonfun$saveAsHadoopFile$4.apply(PairRDDFunctions.scala:1032)
    at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
    at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:112)
    at org.apache.spark.rdd.RDD.withScope(RDD.scala:363)
    at org.apache.spark.rdd.PairRDDFunctions.saveAsHadoopFile(PairRDDFunctions.scala:1032)
    at org.apache.spark.rdd.PairRDDFunctions$$anonfun$saveAsHadoopFile$1.apply$mcV$sp(PairRDDFunctions.scala:958)
    at org.apache.spark.rdd.PairRDDFunctions$$anonfun$saveAsHadoopFile$1.apply(PairRDDFunctions.scala:958)
    at org.apache.spark.rdd.PairRDDFunctions$$anonfun$saveAsHadoopFile$1.apply(PairRDDFunctions.scala:958)
    at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
    at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:112)
    at org.apache.spark.rdd.RDD.withScope(RDD.scala:363)
    at org.apache.spark.rdd.PairRDDFunctions.saveAsHadoopFile(PairRDDFunctions.scala:957)
    at org.apache.spark.rdd.RDD$$anonfun$saveAsTextFile$1.apply$mcV$sp(RDD.scala:1493)
    at org.apache.spark.rdd.RDD$$anonfun$saveAsTextFile$1.apply(RDD.scala:1472)
    at org.apache.spark.rdd.RDD$$anonfun$saveAsTextFile$1.apply(RDD.scala:1472)
    at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
    at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:112)
    at org.apache.spark.rdd.RDD.withScope(RDD.scala:363)
    at org.apache.spark.rdd.RDD.saveAsTextFile(RDD.scala:1472)
    at sample.SparkCore_WasbIOTest$.main(SparkCore_WasbIOTest.scala:19)
    at sample.SparkCore_WasbIOTest.main(SparkCore_WasbIOTest.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.spark.deploy.yarn.ApplicationMaster$$anon$4.run(ApplicationMaster.scala:721)
)
18/07/17 07:35:29 INFO SparkContext: Invoking stop() from shutdown hook
18/07/17 07:35:29 INFO AbstractConnector: Stopped Spark@7a0721a8{HTTP/1.1,[http/1.1]}{0.0.0.0:0}
18/07/17 07:35:29 INFO SparkUI: Stopped Spark web UI at http://wn0-enhanc.bwjauwfnc4kufeitxcpn1gw4ga.cx.internal.cloudapp.net:40073
18/07/17 07:35:29 INFO metrics: type=GAUGE, name=application_1531388713196_0058.driver.BlockManager.disk.diskSpaceUsed_MB, value=0
18/07/17 07:35:29 INFO metrics: type=GAUGE, name=application_1531388713196_0058.driver.BlockManager.memory.maxMem_MB, value=11133
18/07/17 07:35:29 INFO metrics: type=GAUGE, name=application_1531388713196_0058.driver.BlockManager.memory.maxOffHeapMem_MB, value=0
18/07/17 07:35:29 INFO metrics: type=GAUGE, name=application_1531388713196_0058.driver.BlockManager.memory.maxOnHeapMem_MB, value=11133
18/07/17 07:35:29 INFO metrics: type=GAUGE, name=application_1531388713196_0058.driver.BlockManager.memory.memUsed_MB, value=0
18/07/17 07:35:29 INFO metrics: type=GAUGE, name=application_1531388713196_0058.driver.BlockManager.memory.offHeapMemUsed_MB, value=0
18/07/17 07:35:29 INFO metrics: type=GAUGE, name=application_1531388713196_0058.driver.BlockManager.memory.onHeapMemUsed_MB, value=0
18/07/17 07:35:29 INFO metrics: type=GAUGE, name=application_1531388713196_0058.driver.BlockManager.memory.remainingMem_MB, value=11133
18/07/17 07:35:29 INFO metrics: type=GAUGE, name=application_1531388713196_0058.driver.BlockManager.memory.remainingOffHeapMem_MB, value=0
18/07/17 07:35:29 INFO metrics: type=GAUGE, name=application_1531388713196_0058.driver.BlockManager.memory.remainingOnHeapMem_MB, value=11133
18/07/17 07:35:29 INFO metrics: type=GAUGE, name=application_1531388713196_0058.driver.DAGScheduler.job.activeJobs, value=0
18/07/17 07:35:29 INFO metrics: type=GAUGE, name=application_1531388713196_0058.driver.DAGScheduler.job.allJobs, value=0
18/07/17 07:35:29 INFO metrics: type=GAUGE, name=application_1531388713196_0058.driver.DAGScheduler.stage.failedStages, value=0
18/07/17 07:35:29 INFO metrics: type=GAUGE, name=application_1531388713196_0058.driver.DAGScheduler.stage.runningStages, value=0
18/07/17 07:35:29 INFO metrics: type=GAUGE, name=application_1531388713196_0058.driver.DAGScheduler.stage.waitingStages, value=0
18/07/17 07:35:29 INFO metrics: type=GAUGE, name=application_1531388713196_0058.driver.LiveListenerBus.queue.appStatus.size, value=0
18/07/17 07:35:29 INFO metrics: type=GAUGE, name=application_1531388713196_0058.driver.LiveListenerBus.queue.eventLog.size, value=0
18/07/17 07:35:29 INFO metrics: type=GAUGE, name=application_1531388713196_0058.driver.LiveListenerBus.queue.executorManagement.size, value=0
18/07/17 07:35:29 INFO metrics: type=GAUGE, name=application_1531388713196_0058.driver.LiveListenerBus.queue.shared.size, value=0
18/07/17 07:35:29 INFO metrics: type=GAUGE, name=application_1531388713196_0058.driver.jvm.G1-Old-Generation.count, value=0
18/07/17 07:35:29 INFO metrics: type=GAUGE, name=application_1531388713196_0058.driver.jvm.G1-Old-Generation.time, value=0
18/07/17 07:35:29 INFO metrics: type=GAUGE, name=application_1531388713196_0058.driver.jvm.G1-Young-Generation.count, value=4
18/07/17 07:35:29 INFO metrics: type=GAUGE, name=application_1531388713196_0058.driver.jvm.G1-Young-Generation.time, value=61
18/07/17 07:35:29 INFO metrics: type=GAUGE, name=application_1531388713196_0058.driver.jvm.direct.capacity, value=62738
18/07/17 07:35:29 INFO metrics: type=GAUGE, name=application_1531388713196_0058.driver.jvm.direct.count, value=12
18/07/17 07:35:29 INFO metrics: type=GAUGE, name=application_1531388713196_0058.driver.jvm.direct.used, value=62739
18/07/17 07:35:29 INFO metrics: type=GAUGE, name=application_1531388713196_0058.driver.jvm.heap.committed, value=924844032
18/07/17 07:35:29 INFO metrics: type=GAUGE, name=application_1531388713196_0058.driver.jvm.heap.init, value=924844032
18/07/17 07:35:29 INFO metrics: type=GAUGE, name=application_1531388713196_0058.driver.jvm.heap.max, value=1073741824
18/07/17 07:35:29 INFO metrics: type=GAUGE, name=application_1531388713196_0058.driver.jvm.heap.usage, value=0.3549805358052254
18/07/17 07:35:29 INFO metrics: type=GAUGE, name=application_1531388713196_0058.driver.jvm.heap.used, value=381157448
18/07/17 07:35:29 INFO metrics: type=GAUGE, name=application_1531388713196_0058.driver.jvm.mapped.capacity, value=0
18/07/17 07:35:29 INFO metrics: type=GAUGE, name=application_1531388713196_0058.driver.jvm.mapped.count, value=0
18/07/17 07:35:29 INFO metrics: type=GAUGE, name=application_1531388713196_0058.driver.jvm.mapped.used, value=0
18/07/17 07:35:29 INFO metrics: type=GAUGE, name=application_1531388713196_0058.driver.jvm.non-heap.committed, value=70778880
18/07/17 07:35:29 INFO metrics: type=GAUGE, name=application_1531388713196_0058.driver.jvm.non-heap.init, value=2555904
18/07/17 07:35:29 INFO metrics: type=GAUGE, name=application_1531388713196_0058.driver.jvm.non-heap.max, value=-1
18/07/17 07:35:29 INFO metrics: type=GAUGE, name=application_1531388713196_0058.driver.jvm.non-heap.usage, value=-6.8825408E7
18/07/17 07:35:29 INFO metrics: type=GAUGE, name=application_1531388713196_0058.driver.jvm.non-heap.used, value=68829696
18/07/17 07:35:29 INFO metrics: type=GAUGE, name=application_1531388713196_0058.driver.jvm.pools.Code-Cache.committed, value=11010048
18/07/17 07:35:29 INFO metrics: type=GAUGE, name=application_1531388713196_0058.driver.jvm.pools.Code-Cache.init, value=2555904
18/07/17 07:35:29 INFO metrics: type=GAUGE, name=application_1531388713196_0058.driver.jvm.pools.Code-Cache.max, value=251658240
18/07/17 07:35:29 INFO metrics: type=GAUGE, name=application_1531388713196_0058.driver.jvm.pools.Code-Cache.usage, value=0.039081573486328125
18/07/17 07:35:29 INFO metrics: type=GAUGE, name=application_1531388713196_0058.driver.jvm.pools.Code-Cache.used, value=9835200
18/07/17 07:35:29 INFO metrics: type=GAUGE, name=application_1531388713196_0058.driver.jvm.pools.Compressed-Class-Space.committed, value=7208960
18/07/17 07:35:29 INFO metrics: type=GAUGE, name=application_1531388713196_0058.driver.jvm.pools.Compressed-Class-Space.init, value=0
18/07/17 07:35:29 INFO metrics: type=GAUGE, name=application_1531388713196_0058.driver.jvm.pools.Compressed-Class-Space.max, value=1073741824
18/07/17 07:35:29 INFO metrics: type=GAUGE, name=application_1531388713196_0058.driver.jvm.pools.Compressed-Class-Space.usage, value=0.00660165399312973
18/07/17 07:35:29 INFO metrics: type=GAUGE, name=application_1531388713196_0058.driver.jvm.pools.Compressed-Class-Space.used, value=7088472
18/07/17 07:35:29 INFO metrics: type=GAUGE, name=application_1531388713196_0058.driver.jvm.pools.G1-Eden-Space.committed, value=559939584
18/07/17 07:35:29 INFO metrics: type=GAUGE, name=application_1531388713196_0058.driver.jvm.pools.G1-Eden-Space.init, value=49283072
18/07/17 07:35:29 INFO metrics: type=GAUGE, name=application_1531388713196_0058.driver.jvm.pools.G1-Eden-Space.max, value=-1
18/07/17 07:35:29 INFO metrics: type=GAUGE, name=application_1531388713196_0058.driver.jvm.pools.G1-Eden-Space.usage, value=0.5168539325842697
18/07/17 07:35:29 INFO metrics: type=GAUGE, name=application_1531388713196_0058.driver.jvm.pools.G1-Eden-Space.used, value=289406976
18/07/17 07:35:29 INFO metrics: type=GAUGE, name=application_1531388713196_0058.driver.jvm.pools.G1-Old-Gen.committed, value=341835776
18/07/17 07:35:29 INFO metrics: type=GAUGE, name=application_1531388713196_0058.driver.jvm.pools.G1-Old-Gen.init, value=875560960
18/07/17 07:35:29 INFO metrics: type=GAUGE, name=application_1531388713196_0058.driver.jvm.pools.G1-Old-Gen.max, value=1073741824
18/07/17 07:35:29 INFO metrics: type=GAUGE, name=application_1531388713196_0058.driver.jvm.pools.G1-Old-Gen.usage, value=0.06494147330522537
18/07/17 07:35:29 INFO metrics: type=GAUGE, name=application_1531388713196_0058.driver.jvm.pools.G1-Old-Gen.used, value=69730376
18/07/17 07:35:29 INFO metrics: type=GAUGE, name=application_1531388713196_0058.driver.jvm.pools.G1-Survivor-Space.committed, value=23068672
18/07/17 07:35:29 INFO metrics: type=GAUGE, name=application_1531388713196_0058.driver.jvm.pools.G1-Survivor-Space.init, value=0
18/07/17 07:35:29 INFO metrics: type=GAUGE, name=application_1531388713196_0058.driver.jvm.pools.G1-Survivor-Space.max, value=-1
18/07/17 07:35:29 INFO metrics: type=GAUGE, name=application_1531388713196_0058.driver.jvm.pools.G1-Survivor-Space.usage, value=1.0
18/07/17 07:35:29 INFO metrics: type=GAUGE, name=application_1531388713196_0058.driver.jvm.pools.G1-Survivor-Space.used, value=23068672
18/07/17 07:35:29 INFO metrics: type=GAUGE, name=application_1531388713196_0058.driver.jvm.pools.Metaspace.committed, value=52559872
18/07/17 07:35:29 INFO metrics: type=GAUGE, name=application_1531388713196_0058.driver.jvm.pools.Metaspace.init, value=0
18/07/17 07:35:29 INFO metrics: type=GAUGE, name=application_1531388713196_0058.driver.jvm.pools.Metaspace.max, value=-1
18/07/17 07:35:29 INFO metrics: type=GAUGE, name=application_1531388713196_0058.driver.jvm.pools.Metaspace.usage, value=0.987604383815851
18/07/17 07:35:29 INFO metrics: type=GAUGE, name=application_1531388713196_0058.driver.jvm.pools.Metaspace.used, value=51908360
18/07/17 07:35:29 INFO metrics: type=GAUGE, name=application_1531388713196_0058.driver.jvm.total.committed, value=995622912
18/07/17 07:35:29 INFO metrics: type=GAUGE, name=application_1531388713196_0058.driver.jvm.total.init, value=927399936
18/07/17 07:35:29 INFO metrics: type=GAUGE, name=application_1531388713196_0058.driver.jvm.total.max, value=1073741823
18/07/17 07:35:29 INFO metrics: type=GAUGE, name=application_1531388713196_0058.driver.jvm.total.used, value=452118344
18/07/17 07:35:29 INFO metrics: type=COUNTER, name=application_1531388713196_0058.driver.HiveExternalCatalog.fileCacheHits, count=0
18/07/17 07:35:29 INFO metrics: type=COUNTER, name=application_1531388713196_0058.driver.HiveExternalCatalog.filesDiscovered, count=0
18/07/17 07:35:29 INFO metrics: type=COUNTER, name=application_1531388713196_0058.driver.HiveExternalCatalog.hiveClientCalls, count=0
18/07/17 07:35:29 INFO metrics: type=COUNTER, name=application_1531388713196_0058.driver.HiveExternalCatalog.parallelListingJobCount, count=0
18/07/17 07:35:29 INFO metrics: type=COUNTER, name=application_1531388713196_0058.driver.HiveExternalCatalog.partitionsFetched, count=0
18/07/17 07:35:29 INFO metrics: type=COUNTER, name=application_1531388713196_0058.driver.LiveListenerBus.numEventsPosted, count=9
18/07/17 07:35:29 INFO metrics: type=COUNTER, name=application_1531388713196_0058.driver.LiveListenerBus.queue.appStatus.numDroppedEvents, count=0
18/07/17 07:35:29 INFO metrics: type=COUNTER, name=application_1531388713196_0058.driver.LiveListenerBus.queue.eventLog.numDroppedEvents, count=0
18/07/17 07:35:29 INFO metrics: type=COUNTER, name=application_1531388713196_0058.driver.LiveListenerBus.queue.executorManagement.numDroppedEvents, count=0
18/07/17 07:35:29 INFO metrics: type=COUNTER, name=application_1531388713196_0058.driver.LiveListenerBus.queue.shared.numDroppedEvents, count=0
18/07/17 07:35:29 INFO metrics: type=HISTOGRAM, name=application_1531388713196_0058.driver.CodeGenerator.compilationTime, count=0, min=0, max=0, mean=0.0, stddev=0.0, median=0.0, p75=0.0, p95=0.0, p98=0.0, p99=0.0, p999=0.0
18/07/17 07:35:29 INFO metrics: type=HISTOGRAM, name=application_1531388713196_0058.driver.CodeGenerator.generatedClassSize, count=0, min=0, max=0, mean=0.0, stddev=0.0, median=0.0, p75=0.0, p95=0.0, p98=0.0, p99=0.0, p999=0.0
18/07/17 07:35:29 INFO metrics: type=HISTOGRAM, name=application_1531388713196_0058.driver.CodeGenerator.generatedMethodSize, count=0, min=0, max=0, mean=0.0, stddev=0.0, median=0.0, p75=0.0, p95=0.0, p98=0.0, p99=0.0, p999=0.0
18/07/17 07:35:29 INFO metrics: type=HISTOGRAM, name=application_1531388713196_0058.driver.CodeGenerator.sourceCodeSize, count=0, min=0, max=0, mean=0.0, stddev=0.0, median=0.0, p75=0.0, p95=0.0, p98=0.0, p99=0.0, p999=0.0
18/07/17 07:35:29 INFO metrics: type=TIMER, name=application_1531388713196_0058.driver.DAGScheduler.messageProcessingTime, count=2, min=0.006501, max=2.4506259999999997, mean=1.2285635, stddev=1.2220625, median=2.4506259999999997, p75=2.4506259999999997, p95=2.4506259999999997, p98=2.4506259999999997, p99=2.4506259999999997, p999=2.4506259999999997, mean_rate=0.3748669751269318, m1=0.4, m5=0.4, m15=0.4, rate_unit=events/second, duration_unit=milliseconds
18/07/17 07:35:29 INFO metrics: type=TIMER, name=application_1531388713196_0058.driver.LiveListenerBus.listenerProcessingTime.com.microsoft.hdinsight.spark.metrics.SparkMetricsListener, count=9, min=0.191802, max=9.153995, mean=3.0366951903654744, stddev=3.2872781180334467, median=1.824919, p75=6.1115639999999996, p95=9.153995, p98=9.153995, p99=9.153995, p999=9.153995, mean_rate=1.8560718737683126, m1=0.0, m5=0.0, m15=0.0, rate_unit=events/second, duration_unit=milliseconds
18/07/17 07:35:29 INFO metrics: type=TIMER, name=application_1531388713196_0058.driver.LiveListenerBus.listenerProcessingTime.org.apache.spark.HeartbeatReceiver, count=9, min=0.002201, max=7.828281, mean=1.55398767562332, stddev=2.4593976229903687, median=0.117601, p75=2.489426, p95=7.828281, p98=7.828281, p99=7.828281, p999=7.828281, mean_rate=1.6728702640944542, m1=1.4, m5=1.4, m15=1.4, rate_unit=events/second, duration_unit=milliseconds
18/07/17 07:35:29 INFO metrics: type=TIMER, name=application_1531388713196_0058.driver.LiveListenerBus.listenerProcessingTime.org.apache.spark.scheduler.EventLoggingListener, count=9, min=0.22150199999999998, max=18.833496, mean=4.963429587843711, stddev=5.895638174704117, median=2.543227, p75=9.014694, p95=18.833496, p98=18.833496, p99=18.833496, p999=18.833496, mean_rate=1.8520863873436146, m1=0.0, m5=0.0, m15=0.0, rate_unit=events/second, duration_unit=milliseconds
18/07/17 07:35:29 INFO metrics: type=TIMER, name=application_1531388713196_0058.driver.LiveListenerBus.listenerProcessingTime.org.apache.spark.status.AppStatusListener, count=9, min=0.0434, max=14.061446, mean=3.5402707923512073, stddev=4.453888841700679, median=2.105221, p75=5.971562, p95=14.061446, p98=14.061446, p99=14.061446, p999=14.061446, mean_rate=1.4743682026647817, m1=0.6, m5=0.6, m15=0.6, rate_unit=events/second, duration_unit=milliseconds
18/07/17 07:35:29 INFO metrics: type=TIMER, name=application_1531388713196_0058.driver.LiveListenerBus.queue.appStatus.listenerProcessingTime, count=9, min=0.085201, max=14.119147, mean=3.580990352214505, stddev=4.46043537477246, median=2.1568229999999997, p75=5.998762, p95=14.119147, p98=14.119147, p99=14.119147, p999=14.119147, mean_rate=1.4739298568140227, m1=0.6, m5=0.6, m15=0.6, rate_unit=events/second, duration_unit=milliseconds
18/07/17 07:35:29 INFO metrics: type=TIMER, name=application_1531388713196_0058.driver.LiveListenerBus.queue.eventLog.listenerProcessingTime, count=9, min=0.24640199999999998, max=18.875096, mean=5.0007787675778, stddev=5.900012014628203, median=2.583927, p75=9.043994, p95=18.875096, p98=18.875096, p99=18.875096, p999=18.875096, mean_rate=1.851317565792589, m1=0.0, m5=0.0, m15=0.0, rate_unit=events/second, duration_unit=milliseconds
18/07/17 07:35:29 INFO metrics: type=TIMER, name=application_1531388713196_0058.driver.LiveListenerBus.queue.executorManagement.listenerProcessingTime, count=9, min=0.042301, max=7.858782, mean=1.77310921474341, stddev=2.512965825703974, median=0.513605, p75=2.5694269999999997, p95=7.858782, p98=7.858782, p99=7.858782, p999=7.858782, mean_rate=1.6719658040651464, m1=1.4, m5=1.4, m15=1.4, rate_unit=events/second, duration_unit=milliseconds
18/07/17 07:35:29 INFO metrics: type=TIMER, name=application_1531388713196_0058.driver.LiveListenerBus.queue.shared.listenerProcessingTime, count=9, min=0.218102, max=9.184096, mean=3.0768390307763083, stddev=3.28391585413878, median=1.91522, p75=6.140364, p95=9.184096, p98=9.184096, p99=9.184096, p999=9.184096, mean_rate=1.8546195112335164, m1=0.0, m5=0.0, m15=0.0, rate_unit=events/second, duration_unit=milliseconds
18/07/17 07:35:38 INFO YarnAllocator: Driver requested a total number of 0 executor(s).
18/07/17 07:35:38 INFO YarnClusterSchedulerBackend: Shutting down all executors
18/07/17 07:35:38 INFO YarnSchedulerBackend$YarnDriverEndpoint: Asking each executor to shut down
18/07/17 07:35:38 INFO SchedulerExtensionServices: Stopping SchedulerExtensionServices (serviceOption=None, services=List(), started=false)
18/07/17 07:35:38 INFO MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
18/07/17 07:35:38 INFO MemoryStore: MemoryStore cleared
18/07/17 07:35:38 INFO BlockManager: BlockManager stopped
18/07/17 07:35:38 INFO BlockManagerMaster: BlockManagerMaster stopped
18/07/17 07:35:38 INFO OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
18/07/17 07:35:38 INFO SparkContext: Successfully stopped SparkContext
18/07/17 07:35:38 INFO ApplicationMaster: Unregistering ApplicationMaster with FAILED (diag message: User class threw exception: java.lang.NullPointerException
    at org.apache.hadoop.fs.adl.AdlFileSystem.initialize(AdlFileSystem.java:144)
    at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:2796)
    at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:99)
    at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:2830)
    at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:2812)
    at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:390)
    at org.apache.hadoop.fs.Path.getFileSystem(Path.java:295)
    at org.apache.spark.internal.io.SparkHadoopWriterUtils$.createPathFromString(SparkHadoopWriterUtils.scala:55)
    at org.apache.spark.rdd.PairRDDFunctions$$anonfun$saveAsHadoopFile$4.apply$mcV$sp(PairRDDFunctions.scala:1066)
    at org.apache.spark.rdd.PairRDDFunctions$$anonfun$saveAsHadoopFile$4.apply(PairRDDFunctions.scala:1032)
    at org.apache.spark.rdd.PairRDDFunctions$$anonfun$saveAsHadoopFile$4.apply(PairRDDFunctions.scala:1032)
    at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
    at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:112)
    at org.apache.spark.rdd.RDD.withScope(RDD.scala:363)
    at org.apache.spark.rdd.PairRDDFunctions.saveAsHadoopFile(PairRDDFunctions.scala:1032)
    at org.apache.spark.rdd.PairRDDFunctions$$anonfun$saveAsHadoopFile$1.apply$mcV$sp(PairRDDFunctions.scala:958)
    at org.apache.spark.rdd.PairRDDFunctions$$anonfun$saveAsHadoopFile$1.apply(PairRDDFunctions.scala:958)
    at org.apache.spark.rdd.PairRDDFunctions$$anonfun$saveAsHadoopFile$1.apply(PairRDDFunctions.scala:958)
    at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
    at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:112)
    at org.apache.spark.rdd.RDD.withScope(RDD.scala:363)
    at org.apache.spark.rdd.PairRDDFunctions.saveAsHadoopFile(PairRDDFunctions.scala:957)
    at org.apache.spark.rdd.RDD$$anonfun$saveAsTextFile$1.apply$mcV$sp(RDD.scala:1493)
    at org.apache.spark.rdd.RDD$$anonfun$saveAsTextFile$1.apply(RDD.scala:1472)
    at org.apache.spark.rdd.RDD$$anonfun$saveAsTextFile$1.apply(RDD.scala:1472)
    at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
    at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:112)
    at org.apache.spark.rdd.RDD.withScope(RDD.scala:363)
    at org.apache.spark.rdd.RDD.saveAsTextFile(RDD.scala:1472)
    at sample.SparkCore_WasbIOTest$.main(SparkCore_WasbIOTest.scala:19)
    at sample.SparkCore_WasbIOTest.main(SparkCore_WasbIOTest.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.spark.deploy.yarn.ApplicationMaster$$anon$4.run(ApplicationMaster.scala:721)
)
18/07/17 07:35:39 INFO AMRMClientImpl: Waiting for application to be successfully unregistered.
18/07/17 07:35:39 INFO ApplicationMaster: Deleting staging directory wasb://enhancementtesetltian@enhancementstorage.blob.core.windows.net/user/livy/.sparkStaging/application_1531388713196_0058
18/07/17 07:35:39 WARN AzureFileSystemThreadPoolExecutor: Disabling threads for Delete operation as thread count 0 is <= 1
18/07/17 07:35:39 INFO AzureFileSystemThreadPoolExecutor: Time taken for Delete operation is: 15 ms with threads: 0
18/07/17 07:35:39 INFO ShutdownHookManager: Shutdown hook called
18/07/17 07:35:39 INFO ShutdownHookManager: Deleting directory /mnt/resource/hadoop/yarn/local/usercache/livy/appcache/application_1531388713196_0058/spark-1cb54ab6-da7e-4fd9-b4b8-5158e4d98949
18/07/17 07:35:39 INFO MetricsSystemImpl: Stopping azure-file-system metrics system...
18/07/17 07:35:39 INFO MetricsSinkAdapter: azurefs2 thread interrupted.
18/07/17 07:35:39 INFO MetricsSystemImpl: azure-file-system metrics system stopped.
18/07/17 07:35:39 INFO MetricsSystemImpl: azure-file-system metrics system shutdown complete.
INFO:
INFO: ========== RESULT ==========
ERROR: Job state is dead
ERROR: Diagnostics:
    at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:112)
    at org.apache.spark.rdd.RDD.withScope(RDD.scala:363)
    at org.apache.spark.rdd.RDD.saveAsTextFile(RDD.scala:1472)
    at sample.SparkCore_WasbIOTest$.main(SparkCore_WasbIOTest.scala:19)
    at sample.SparkCore_WasbIOTest.main(SparkCore_WasbIOTest.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.spark.deploy.yarn.ApplicationMaster$$anon$4.run(ApplicationMaster.scala:721)
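For reference, the line numbers in the stack trace map directly onto the test job: broadcast 0 is created from `textFile` at SparkCore_WasbIOTest.scala:14, and the NPE fires when `saveAsTextFile` at line 19 asks Hadoop for the output path's `FileSystem` and the URI resolves to `AdlFileSystem`. A minimal sketch of what the test presumably looks like, with placeholder paths (the real test's URIs are not shown in the log):

```scala
package sample

import org.apache.spark.{SparkConf, SparkContext}

// Hypothetical reconstruction of the failing test job, inferred only from the
// stack trace above (textFile at SparkCore_WasbIOTest.scala:14, saveAsTextFile
// at line 19). Input/output URIs are placeholders, not the real test data.
object SparkCore_WasbIOTest {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("SparkCore_WasbIOTest"))

    // The read succeeds: the log shows broadcast 0 created from this call (line 14).
    val lines = sc.textFile("wasb:///example/data/input.txt")

    // The write fails: saveAsTextFile (line 19) resolves the output URI to
    // AdlFileSystem, whose initialize() throws the NullPointerException seen
    // in the log.
    lines.saveAsTextFile("adl://<datalake-account>.azuredatalakestore.net/example/output")

    sc.stop()
  }
}
```

If that reading is right, error 1 is not a submission problem: the job reaches the cluster and runs, and the failure is `AdlFileSystem.initialize` throwing an NPE while opening the output filesystem, which usually points at missing ADLS credentials/configuration on this WASB-backed cluster rather than at the toolkit itself.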


## error 2
Package and deploy the job to Spark cluster
INFO: Get target jar from C:/Users/v-yajing/IdeaProjects/maven220622/out/artifacts/maven220622_DefaultArtifact/default_artifact.jar.
INFO: Create Spark helper interactive session...
ERROR: com.microsoft.azure.hdinsight.sdk.common.livy.interactive.exceptions.SessionNotStartException: Session Helper session to upload /SparkSubmission/2018/07/17/564810b3-3c15-4bba-8030-61022fad50ee/default_artifact.jar is SHUTTING_DOWN. stdout:  ; 
stderr:  ; SLF4J: Class path contains multiple SLF4J bindings. ; SLF4J: Found binding in [jar:file:/usr/hdp/2.6.5.8-7/spark2/jars/slf4j-log4j12-1.7.16.jar!/org/slf4j/impl/StaticLoggerBinder.class] ; SLF4J: Found binding in [jar:file:/usr/hdp/2.6.5.8-7/spark_llap/spark-llap-assembly-1.0.0.2.6.5.8-7.jar!/org/slf4j/impl/StaticLoggerBinder.class] ; SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation. ; SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory] ; Error: Executor cores must be a positive number ; Run with --help for usage help or --verbose for debug output ; 
YARN Diagnostics: 
 stack trace: java.lang.RuntimeException: com.microsoft.azure.hdinsight.sdk.common.livy.interactive.exceptions.SessionNotStartException: Session Helper session to upload /SparkSubmission/2018/07/17/564810b3-3c15-4bba-8030-61022fad50ee/default_artifact.jar is SHUTTING_DOWN. stdout:  ; 
stderr:  ; SLF4J: Class path contains multiple SLF4J bindings. ; SLF4J: Found binding in [jar:file:/usr/hdp/2.6.5.8-7/spark2/jars/slf4j-log4j12-1.7.16.jar!/org/slf4j/impl/StaticLoggerBinder.class] ; SLF4J: Found binding in [jar:file:/usr/hdp/2.6.5.8-7/spark_llap/spark-llap-assembly-1.0.0.2.6.5.8-7.jar!/org/slf4j/impl/StaticLoggerBinder.class] ; SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation. ; SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory] ; Error: Executor cores must be a positive number ; Run with --help for usage help or --verbose for debug output ; 
YARN Diagnostics: 
    at rx.exceptions.Exceptions.propagate(Exceptions.java:58)
    at com.microsoft.azure.hdinsight.sdk.common.livy.interactive.Session.lambda$awaitReady$9(Session.java:344)
    at rx.internal.operators.OnSubscribeReduceSeed$ReduceSeedSubscriber.onNext(OnSubscribeReduceSeed.java:57)
    at rx.internal.operators.OperatorTakeUntilPredicate$ParentSubscriber.onNext(OperatorTakeUntilPredicate.java:61)
    at rx.internal.operators.OnSubscribeRedo$2$1.onNext(OnSubscribeRedo.java:244)
    at rx.internal.operators.OperatorSwitchIfEmpty$ParentSubscriber.onNext(OperatorSwitchIfEmpty.java:90)
    at rx.internal.operators.OnSubscribeMap$MapSubscriber.onNext(OnSubscribeMap.java:77)
    at rx.internal.operators.OnSubscribeMap$MapSubscriber.onNext(OnSubscribeMap.java:77)
    at rx.internal.operators.OperatorMerge$MergeSubscriber.emitScalar(OperatorMerge.java:395)
    at rx.internal.operators.OperatorMerge$MergeSubscriber.tryEmit(OperatorMerge.java:355)
    at rx.internal.operators.OperatorMerge$InnerSubscriber.onNext(OperatorMerge.java:846)
    at rx.observers.Subscribers$5.onNext(Subscribers.java:235)
    at rx.internal.operators.OperatorDoAfterTerminate$1.onNext(OperatorDoAfterTerminate.java:50)
    at rx.internal.util.ScalarSynchronousObservable$WeakSingleProducer.request(ScalarSynchronousObservable.java:276)
    at rx.Subscriber.setProducer(Subscriber.java:211)
    at rx.Subscriber.setProducer(Subscriber.java:205)
    at rx.Subscriber.setProducer(Subscriber.java:205)
    at rx.internal.util.ScalarSynchronousObservable$JustOnSubscribe.call(ScalarSynchronousObservable.java:138)
    at rx.internal.util.ScalarSynchronousObservable$JustOnSubscribe.call(ScalarSynchronousObservable.java:129)
    at rx.internal.operators.OnSubscribeLift.call(OnSubscribeLift.java:48)
    at rx.internal.operators.OnSubscribeLift.call(OnSubscribeLift.java:30)
    at rx.Observable.unsafeSubscribe(Observable.java:10142)
    at rx.internal.operators.OnSubscribeUsing.call(OnSubscribeUsing.java:94)
    at rx.internal.operators.OnSubscribeUsing.call(OnSubscribeUsing.java:32)
    at rx.Observable.unsafeSubscribe(Observable.java:10142)
    at rx.internal.operators.OperatorMerge$MergeSubscriber.onNext(OperatorMerge.java:248)
    at rx.internal.operators.OperatorMerge$MergeSubscriber.onNext(OperatorMerge.java:148)
    at rx.internal.operators.OnSubscribeMap$MapSubscriber.onNext(OnSubscribeMap.java:77)
    at rx.internal.producers.SingleDelayedProducer.emit(SingleDelayedProducer.java:102)
    at rx.internal.producers.SingleDelayedProducer.setValue(SingleDelayedProducer.java:85)
    at rx.internal.operators.OnSubscribeFromCallable.call(OnSubscribeFromCallable.java:48)
    at rx.internal.operators.OnSubscribeFromCallable.call(OnSubscribeFromCallable.java:33)
    at rx.Observable.unsafeSubscribe(Observable.java:10142)
    at rx.internal.operators.OnSubscribeMap.call(OnSubscribeMap.java:48)
    at rx.internal.operators.OnSubscribeMap.call(OnSubscribeMap.java:33)
    at rx.internal.operators.OnSubscribeLift.call(OnSubscribeLift.java:48)
    at rx.internal.operators.OnSubscribeLift.call(OnSubscribeLift.java:30)
    at rx.Observable.unsafeSubscribe(Observable.java:10142)
    at rx.internal.operators.OnSubscribeMap.call(OnSubscribeMap.java:48)
    at rx.internal.operators.OnSubscribeMap.call(OnSubscribeMap.java:33)
    at rx.Observable.unsafeSubscribe(Observable.java:10142)
    at rx.internal.operators.OnSubscribeMap.call(OnSubscribeMap.java:48)
    at rx.internal.operators.OnSubscribeMap.call(OnSubscribeMap.java:33)
    at rx.internal.operators.OnSubscribeLift.call(OnSubscribeLift.java:48)
    at rx.internal.operators.OnSubscribeLift.call(OnSubscribeLift.java:30)
    at rx.Observable.unsafeSubscribe(Observable.java:10142)
    at rx.internal.operators.OnSubscribeRedo$2.call(OnSubscribeRedo.java:273)
    at rx.internal.schedulers.TrampolineScheduler$InnerCurrentThreadScheduler.enqueue(TrampolineScheduler.java:73)
    at rx.internal.schedulers.TrampolineScheduler$InnerCurrentThreadScheduler.schedule(TrampolineScheduler.java:52)
    at rx.internal.operators.OnSubscribeRedo$4$1.onNext(OnSubscribeRedo.java:336)
    at rx.internal.operators.OperatorDelay$1$3.call(OperatorDelay.java:87)
    at rx.internal.schedulers.EventLoopsScheduler$EventLoopWorker$2.call(EventLoopsScheduler.java:189)
    at rx.internal.schedulers.ScheduledAction.run(ScheduledAction.java:55)
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$201(ScheduledThreadPoolExecutor.java:180)
    at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:293)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at java.lang.Thread.run(Thread.java:745)
Caused by: com.microsoft.azure.hdinsight.sdk.common.livy.interactive.exceptions.SessionNotStartException: Session Helper session to upload /SparkSubmission/2018/07/17/564810b3-3c15-4bba-8030-61022fad50ee/default_artifact.jar is SHUTTING_DOWN. stdout:  ; 
stderr:  ; SLF4J: Class path contains multiple SLF4J bindings. ; SLF4J: Found binding in [jar:file:/usr/hdp/2.6.5.8-7/spark2/jars/slf4j-log4j12-1.7.16.jar!/org/slf4j/impl/StaticLoggerBinder.class] ; SLF4J: Found binding in [jar:file:/usr/hdp/2.6.5.8-7/spark_llap/spark-llap-assembly-1.0.0.2.6.5.8-7.jar!/org/slf4j/impl/StaticLoggerBinder.class] ; SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation. ; SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory] ; Error: Executor cores must be a positive number ; Run with --help for usage help or --verbose for debug output ; 
YARN Diagnostics: 
    at com.microsoft.azure.hdinsight.sdk.common.livy.interactive.Session.lambda$awaitReady$9(Session.java:345)
    ... 58 more
Caused by: rx.exceptions.OnErrorThrowable$OnNextValue: OnError while emitting onNext value: com.microsoft.azure.hdinsight.sdk.common.livy.interactive.SparkSession.class
    at rx.exceptions.OnErrorThrowable.addValueAsLastCause(OnErrorThrowable.java:118)
    at rx.internal.operators.OnSubscribeMap$MapSubscriber.onNext(OnSubscribeMap.java:73)
    at rx.observers.Subscribers$5.onNext(Subscribers.java:235)
    at rx.internal.operators.OperatorDoAfterTerminate$1.onNext(OperatorDoAfterTerminate.java:50)
    at rx.internal.operators.OnSubscribeMap$MapSubscriber.onNext(OnSubscribeMap.java:77)
    at rx.internal.operators.OnSubscribeMap$MapSubscriber.onNext(OnSubscribeMap.java:77)
    at rx.internal.operators.OperatorMerge$MergeSubscriber.emitScalar(OperatorMerge.java:395)
    at rx.internal.operators.OperatorMerge$MergeSubscriber.tryEmit(OperatorMerge.java:355)
    at rx.internal.operators.OperatorMerge$InnerSubscriber.onNext(OperatorMerge.java:846)
    at rx.observers.Subscribers$5.onNext(Subscribers.java:235)
    at rx.internal.operators.OperatorDoAfterTerminate$1.onNext(OperatorDoAfterTerminate.java:50)
    at rx.internal.util.ScalarSynchronousObservable$WeakSingleProducer.request(ScalarSynchronousObservable.java:276)
    at rx.Subscriber.setProducer(Subscriber.java:211)
    at rx.Subscriber.setProducer(Subscriber.java:205)
    at rx.Subscriber.setProducer(Subscriber.java:205)
    at rx.internal.util.ScalarSynchronousObservable$JustOnSubscribe.call(ScalarSynchronousObservable.java:138)
    at rx.internal.util.ScalarSynchronousObservable$JustOnSubscribe.call(ScalarSynchronousObservable.java:129)
    at rx.internal.operators.OnSubscribeLift.call(OnSubscribeLift.java:48)
    at rx.internal.operators.OnSubscribeLift.call(OnSubscribeLift.java:30)
    at rx.Observable.unsafeSubscribe(Observable.java:10142)
    at rx.internal.operators.OnSubscribeUsing.call(OnSubscribeUsing.java:94)
    at rx.internal.operators.OnSubscribeUsing.call(OnSubscribeUsing.java:32)
    at rx.Observable.unsafeSubscribe(Observable.java:10142)
    at rx.internal.operators.OperatorMerge$MergeSubscriber.onNext(OperatorMerge.java:248)
    at rx.internal.operators.OperatorMerge$MergeSubscriber.onNext(OperatorMerge.java:148)
    at rx.internal.operators.OnSubscribeMap$MapSubscriber.onNext(OnSubscribeMap.java:77)
    at rx.internal.producers.SingleDelayedProducer.emit(SingleDelayedProducer.java:102)
    at rx.internal.producers.SingleDelayedProducer.setValue(SingleDelayedProducer.java:85)
    at rx.internal.operators.OnSubscribeFromCallable.call(OnSubscribeFromCallable.java:48)
    at rx.internal.operators.OnSubscribeFromCallable.call(OnSubscribeFromCallable.java:33)
    at rx.Observable.unsafeSubscribe(Observable.java:10142)
    at rx.internal.operators.OnSubscribeMap.call(OnSubscribeMap.java:48)
    at rx.internal.operators.OnSubscribeMap.call(OnSubscribeMap.java:33)
    at rx.internal.operators.OnSubscribeLift.call(OnSubscribeLift.java:48)
    at rx.internal.operators.OnSubscribeLift.call(OnSubscribeLift.java:30)
    at rx.Observable.unsafeSubscribe(Observable.java:10142)
    at rx.internal.operators.OnSubscribeMap.call(OnSubscribeMap.java:48)
    at rx.internal.operators.OnSubscribeMap.call(OnSubscribeMap.java:33)
    at rx.Observable.unsafeSubscribe(Observable.java:10142)
    at rx.internal.operators.OnSubscribeMap.call(OnSubscribeMap.java:48)
    at rx.internal.operators.OnSubscribeMap.call(OnSubscribeMap.java:33)
    at rx.internal.operators.OnSubscribeLift.call(OnSubscribeLift.java:48)
    at rx.internal.operators.OnSubscribeLift.call(OnSubscribeLift.java:30)
    at rx.Observable.unsafeSubscribe(Observable.java:10142)
    at rx.internal.operators.OnSubscribeUsing.call(OnSubscribeUsing.java:94)
    at rx.internal.operators.OnSubscribeUsing.call(OnSubscribeUsing.java:32)
    at rx.Observable.unsafeSubscribe(Observable.java:10142)
    at rx.internal.operators.OnSubscribeMap.call(OnSubscribeMap.java:48)
    at rx.internal.operators.OnSubscribeMap.call(OnSubscribeMap.java:33)
    at rx.internal.operators.OnSubscribeLift.call(OnSubscribeLift.java:48)
    at rx.internal.operators.OnSubscribeLift.call(OnSubscribeLift.java:30)
    at rx.Observable.subscribe(Observable.java:10238)
    at rx.Observable.subscribe(Observable.java:10205)
    at rx.observables.BlockingObservable.blockForSingle(BlockingObservable.java:444)
    at rx.observables.BlockingObservable.single(BlockingObservable.java:341)
    at com.microsoft.azure.hdinsight.spark.jobs.JobUtils.uploadFileToHDFS(JobUtils.java:584)
    at com.microsoft.azure.hdinsight.spark.jobs.JobUtils.uploadFileToCluster(JobUtils.java:598)
    at com.microsoft.azure.hdinsight.spark.jobs.JobUtils.lambda$deployArtifact$9(JobUtils.java:630)
    at rx.Single.subscribe(Single.java:1876)
    at rx.internal.operators.SingleOnSubscribeMap.call(SingleOnSubscribeMap.java:45)
    at rx.internal.operators.SingleOnSubscribeMap.call(SingleOnSubscribeMap.java:30)
    at rx.internal.operators.SingleToObservable.call(SingleToObservable.java:39)
    at rx.internal.operators.SingleToObservable.call(SingleToObservable.java:27)
    at rx.Observable.unsafeSubscribe(Observable.java:10142)
    at rx.internal.operators.OperatorSubscribeOn$1.call(OperatorSubscribeOn.java:94)
    at rx.internal.schedulers.ScheduledAction.run(ScheduledAction.java:55)
    at rx.internal.schedulers.ExecutorScheduler$ExecutorSchedulerWorker.run(ExecutorScheduler.java:107)
    at com.microsoft.intellij.rxjava.IdeaSchedulers$1.run(IdeaSchedulers.java:53)
    at com.intellij.openapi.progress.impl.CoreProgressManager$TaskRunnable.run(CoreProgressManager.java:750)
    at com.intellij.openapi.progress.impl.CoreProgressManager.lambda$runProcess$1(CoreProgressManager.java:157)
    at com.intellij.openapi.progress.impl.CoreProgressManager.registerIndicatorAndRun(CoreProgressManager.java:580)
    at com.intellij.openapi.progress.impl.CoreProgressManager.executeProcessUnderProgress(CoreProgressManager.java:525)
    at com.intellij.openapi.progress.impl.ProgressManagerImpl.executeProcessUnderProgress(ProgressManagerImpl.java:85)
    at com.intellij.openapi.progress.impl.CoreProgressManager.runProcess(CoreProgressManager.java:144)
    at com.intellij.openapi.progress.impl.CoreProgressManager$4.run(CoreProgressManager.java:395)
    at com.intellij.openapi.application.impl.ApplicationImpl$1.run(ApplicationImpl.java:305)
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    ... 3 more
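
The session state and stderr quoted in the exception above can also be read straight from Livy's REST API while reproducing this. A minimal sketch, assuming an HDInsight Livy endpoint with basic auth; the host, session id, and credentials below are placeholders, not values from this cluster:

```java
import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.net.HttpURLConnection;
import java.net.URL;
import java.nio.charset.StandardCharsets;
import java.util.Base64;

// Illustration only: fetch a Livy session's log to see why it ended up
// SHUTTING_DOWN. Host, session id, and credentials are placeholders.
public class LivySessionLog {
    public static void main(String[] args) throws Exception {
        int sessionId = 0; // placeholder: id returned by POST /sessions
        URL url = new URL("https://example.azurehdinsight.net/livy/sessions/"
                + sessionId + "/log?from=0&size=100");
        HttpURLConnection conn = (HttpURLConnection) url.openConnection();
        String auth = Base64.getEncoder()
                .encodeToString("admin:password".getBytes(StandardCharsets.UTF_8));
        conn.setRequestProperty("Authorization", "Basic " + auth);

        // The response JSON carries the same stderr lines quoted in the
        // exception above, including the executor-cores validation error.
        try (BufferedReader in = new BufferedReader(
                new InputStreamReader(conn.getInputStream(), StandardCharsets.UTF_8))) {
            String line;
            while ((line = in.readLine()) != null) {
                System.out.println(line);
            }
        }
    }
}
```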
wezhang commented 6 years ago

Error 1 is a cluster backend issue.

wezhang commented 6 years ago

Error 2 root cause: `Error: Executor cores must be a positive number; Run with --help for usage help or --verbose for debug output`
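
That message comes from spark-submit's argument validation, so the Livy session started to upload the artifact must request at least one executor core. As a minimal illustration of a well-formed Livy `POST /sessions` body (not the toolkit's actual code; the host and credentials are placeholder assumptions):

```java
import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import java.nio.charset.StandardCharsets;
import java.util.Base64;

// Illustration only: create a Livy interactive session with a positive
// executorCores value. Host and credentials are placeholders.
public class LivySessionCreate {
    public static void main(String[] args) throws Exception {
        URL url = new URL("https://example.azurehdinsight.net/livy/sessions");
        HttpURLConnection conn = (HttpURLConnection) url.openConnection();
        conn.setRequestMethod("POST");
        conn.setRequestProperty("Content-Type", "application/json");
        String auth = Base64.getEncoder()
                .encodeToString("admin:password".getBytes(StandardCharsets.UTF_8));
        conn.setRequestProperty("Authorization", "Basic " + auth);
        conn.setDoOutput(true);

        // spark-submit rejects executorCores < 1 with
        // "Error: Executor cores must be a positive number".
        String body = "{"
                + "\"kind\": \"spark\","
                + "\"executorCores\": 2,"
                + "\"executorMemory\": \"4g\","
                + "\"numExecutors\": 2"
                + "}";
        try (OutputStream os = conn.getOutputStream()) {
            os.write(body.getBytes(StandardCharsets.UTF_8));
        }
        System.out.println("Livy responded: " + conn.getResponseCode());
    }
}
```

With `executorCores` at 1 or more, spark-submit's validation passes and the session can leave the STARTING state instead of going to SHUTTING_DOWN.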

jingyanjingyan commented 6 years ago

Fixed with 1783.