microsoft / azure-tools-for-java

Azure tools for Java, including Azure Toolkits for Eclipse, IntelliJ and related projects.

[IntelliJ][HDInsight] Submitting an sstream job to HDInsight fails #2366

Closed: jingyanjingyan closed this issue 5 years ago

jingyanjingyan commented 5 years ago

Build: dev 860

Repro Steps:

  1. Submit the following sstream script to an HDInsight Spark cluster:
package sample

import org.apache.spark.sql.SparkSession

object ReadSStream {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder.appName("ReadSStreamDemo").getOrCreate()
    // Load the .ss file through the sstream interop data source
    val streamDf = spark.read.format("sstreaminterop2").load("adl://sandbox-c08.azuredatalakestore.net/users/xnl/test.ss")
    streamDf.createOrReplaceTempView("streamView")
    // Count the rows and write the result out to ADLS
    spark.sql("SELECT COUNT(*) FROM streamView").rdd.saveAsTextFile("adl://sandbox-c08.azuredatalakestore.net/users/rufan/out")
  }
}
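
Note: format("sstreaminterop2") is not a Spark built-in, so the implementing class must be resolvable on the cluster classpath at run time. A minimal fail-fast probe, a sketch only, based on the resolution rule visible in the stack trace below (Spark tries the name itself, then <name>.DefaultSource):

import scala.util.Try

object CheckSStreamProvider {
  // Hedged sketch: probe whether a non-builtin data source name would resolve
  // before calling spark.read.format(...).load(...). The provider name is the
  // one used in the script above; everything else here is illustrative.
  def main(args: Array[String]): Unit = {
    val provider = "sstreaminterop2"
    val resolvable =
      Try(Class.forName(provider)).isSuccess ||
      Try(Class.forName(provider + ".DefaultSource")).isSuccess
    require(resolvable,
      s"Data source '$provider' is not on the classpath; ship the interop jar with the submission")
    println(s"Data source '$provider' resolves on this classpath")
  }
}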
Result: the artifact is packaged and deployed successfully, but the application fails at run time with java.lang.ClassNotFoundException: Failed to find data source: sstream. Full console output:
Package and deploy the job to Spark cluster
INFO: Begin uploading file C:\Users\v-yajing\IdeaProjects\untitled14\out\artifacts\untitled14_DefaultArtifact\default_artifact.jar to Azure Blob Storage Account wasbs://spark23hdinsight-2018-10-29t06-07-10-410z@catestsa.blob.core.windows.net/SparkSubmission/2018/11/20/a10fb5d9-b82c-4a05-9c42-b19aa161557c/default_artifact.jar ...
INFO: Submit file to azure blob 'wasbs://spark23hdinsight-2018-10-29t06-07-10-410z@catestsa.blob.core.windows.net/SparkSubmission/2018/11/20/a10fb5d9-b82c-4a05-9c42-b19aa161557c/default_artifact.jar' successfully.
LOG: SLF4J: Class path contains multiple SLF4J bindings.
LOG: SLF4J: Found binding in [jar:file:/usr/hdp/2.6.5.3003-25/spark2/jars/slf4j-log4j12-1.7.16.jar!/org/slf4j/impl/StaticLoggerBinder.class]
LOG: SLF4J: Found binding in [jar:file:/usr/hdp/2.6.5.3003-25/spark_llap/spark-llap-assembly-1.0.0.2.6.5.3003-25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
LOG: SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
LOG: SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
LOG: Warning: Master yarn-cluster is deprecated since 2.0. Please use master "yarn" with specified deploy mode instead.
LOG: 18/11/20 03:49:42 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
LOG: Warning: Skip remote jar wasbs://spark23hdinsight-2018-10-29t06-07-10-410z@catestsa.blob.core.windows.net/SparkSubmission/2018/11/20/a10fb5d9-b82c-4a05-9c42-b19aa161557c/default_artifact.jar.
LOG: 18/11/20 03:49:43 INFO MetricsConfig: loaded properties from hadoop-metrics2-azure-file-system.properties
LOG: 18/11/20 03:49:43 INFO WasbAzureIaasSink: Init starting.
LOG: 18/11/20 03:49:43 INFO AzureIaasSink: Init starting. Initializing MdsLogger.
LOG: 18/11/20 03:49:43 INFO AzureIaasSink: Init completed.
LOG: 18/11/20 03:49:43 INFO WasbAzureIaasSink: Init completed.
LOG: 18/11/20 03:49:43 INFO MetricsSinkAdapter: Sink azurefs2 started
LOG: 18/11/20 03:49:43 INFO RequestHedgingRMFailoverProxyProvider: Looking for the active RM in [rm1, rm2]...
LOG: 18/11/20 03:49:43 INFO RequestHedgingRMFailoverProxyProvider: Found active RM [rm2]
LOG: 18/11/20 03:49:43 INFO Client: Requesting a new application from cluster with 2 NodeManagers
LOG: 18/11/20 03:49:43 INFO Client: Verifying our application has not requested more than the maximum memory capability of the cluster (51200 MB per container)
LOG: 18/11/20 03:49:43 INFO Client: Will allocate AM container, with 4480 MB memory including 384 MB overhead
LOG: 18/11/20 03:49:43 INFO Client: Setting up container launch context for our AM
LOG: 18/11/20 03:49:43 INFO Client: Setting up the launch environment for our AM container
LOG: 18/11/20 03:49:43 INFO Client: Preparing resources for our AM container
LOG: 18/11/20 03:49:46 INFO SecurityManager: Changing view acls to: livy
LOG: 18/11/20 03:49:46 INFO SecurityManager: Changing modify acls to: livy
LOG: 18/11/20 03:49:46 INFO SecurityManager: Changing view acls groups to: 
LOG: 18/11/20 03:49:46 INFO SecurityManager: Changing modify acls groups to: 
LOG: 18/11/20 03:49:46 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users  with view permissions: Set(livy); groups with view permissions: Set(); users  with modify permissions: Set(livy); groups with modify permissions: Set()
LOG: 18/11/20 03:49:46 INFO Client: Submitting application application_1540794334804_0259 to ResourceManager
LOG: 18/11/20 03:49:46 INFO YarnClientImpl: Submitted application application_1540794334804_0259
LOG: 18/11/20 03:49:46 INFO Client: Application report for application_1540794334804_0259 (state: ACCEPTED)
LOG: 18/11/20 03:49:46 INFO Client: 
LOG:     client token: N/A
LOG:     diagnostics: AM container is launched, waiting for AM container to Register with RM
LOG:     ApplicationMaster host: N/A
LOG:     ApplicationMaster RPC port: -1
LOG:     queue: default
LOG:     start time: 1542685786118
LOG:     final status: UNDEFINED
LOG:     tracking URL: http://hn1-spark2.l01rfxocadnetnrq0tupemyb3b.cx.internal.cloudapp.net:8088/proxy/application_1540794334804_0259/
LOG:     user: livy
LOG: 18/11/20 03:49:46 INFO ShutdownHookManager: Shutdown hook called
LOG: 18/11/20 03:49:46 INFO ShutdownHookManager: Deleting directory /tmp/spark-4624f7a4-167c-490a-a5ac-402261d201c8
LOG: 18/11/20 03:49:46 INFO ShutdownHookManager: Deleting directory /tmp/spark-269c7108-8f90-416e-8045-18829cc29976
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/usr/hdp/2.6.5.3003-25/spark2/jars/slf4j-log4j12-1.7.16.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/hdp/2.6.5.3003-25/hadoop/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/hdp/2.6.5.3003-25/spark_llap/spark-llap-assembly-1.0.0.2.6.5.3003-25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
18/11/20 03:50:00 INFO SignalUtils: Registered signal handler for TERM
18/11/20 03:50:00 INFO SignalUtils: Registered signal handler for HUP
18/11/20 03:50:00 INFO SignalUtils: Registered signal handler for INT
18/11/20 03:50:00 INFO SecurityManager: Changing view acls to: yarn,livy
18/11/20 03:50:00 INFO SecurityManager: Changing modify acls to: yarn,livy
18/11/20 03:50:00 INFO SecurityManager: Changing view acls groups to: 
18/11/20 03:50:00 INFO SecurityManager: Changing modify acls groups to: 
18/11/20 03:50:00 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users  with view permissions: Set(yarn, livy); groups with view permissions: Set(); users  with modify permissions: Set(yarn, livy); groups with modify permissions: Set()
18/11/20 03:50:01 INFO ApplicationMaster: Preparing Local resources
18/11/20 03:50:01 INFO MetricsConfig: loaded properties from hadoop-metrics2-azure-file-system.properties
18/11/20 03:50:01 INFO WasbAzureIaasSink: Init starting.
18/11/20 03:50:01 INFO AzureIaasSink: Init starting. Initializing MdsLogger.
18/11/20 03:50:01 INFO AzureIaasSink: Init completed.
18/11/20 03:50:01 INFO WasbAzureIaasSink: Init completed.
18/11/20 03:50:01 INFO MetricsSinkAdapter: Sink azurefs2 started
18/11/20 03:50:01 INFO MetricsSystemImpl: Scheduled snapshot period at 60 second(s).
18/11/20 03:50:01 INFO MetricsSystemImpl: azure-file-system metrics system started
18/11/20 03:50:02 INFO ApplicationMaster: ApplicationAttemptId: appattempt_1540794334804_0259_000002
18/11/20 03:50:02 INFO ApplicationMaster: Starting the user application in a separate Thread
18/11/20 03:50:02 INFO ApplicationMaster: Waiting for spark context initialization...
18/11/20 03:50:02 INFO SparkContext: Running Spark version 2.3.0.2.6.5.3003-25
18/11/20 03:50:02 INFO SparkContext: Submitted application: ReadSStreamDemo
18/11/20 03:50:02 INFO SecurityManager: Changing view acls to: yarn,livy
18/11/20 03:50:02 INFO SecurityManager: Changing modify acls to: yarn,livy
18/11/20 03:50:02 INFO SecurityManager: Changing view acls groups to: 
18/11/20 03:50:02 INFO SecurityManager: Changing modify acls groups to: 
18/11/20 03:50:02 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users  with view permissions: Set(yarn, livy); groups with view permissions: Set(); users  with modify permissions: Set(yarn, livy); groups with modify permissions: Set()
18/11/20 03:50:02 INFO Utils: Successfully started service 'sparkDriver' on port 34861.
18/11/20 03:50:02 INFO SparkEnv: Registering MapOutputTracker
18/11/20 03:50:02 INFO SparkEnv: Registering BlockManagerMaster
18/11/20 03:50:02 INFO BlockManagerMasterEndpoint: Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information
18/11/20 03:50:02 INFO BlockManagerMasterEndpoint: BlockManagerMasterEndpoint up
18/11/20 03:50:02 INFO DiskBlockManager: Created local directory at /mnt/resource/hadoop/yarn/local/usercache/livy/appcache/application_1540794334804_0259/blockmgr-0181ca88-c473-48cc-af4b-899b6510cf0a
18/11/20 03:50:02 INFO MemoryStore: MemoryStore started with capacity 2004.6 MB
18/11/20 03:50:02 INFO SparkEnv: Registering OutputCommitCoordinator
18/11/20 03:50:02 INFO log: Logging initialized @3329ms
18/11/20 03:50:02 INFO JettyUtils: Adding filter: org.apache.hadoop.yarn.server.webproxy.amfilter.AmIpFilter
18/11/20 03:50:02 INFO Server: jetty-9.3.z-SNAPSHOT
18/11/20 03:50:02 INFO Server: Started @3469ms
18/11/20 03:50:02 INFO AbstractConnector: Started ServerConnector@2b7993c{HTTP/1.1,[http/1.1]}{0.0.0.0:35059}
18/11/20 03:50:02 INFO Utils: Successfully started service 'SparkUI' on port 35059.
18/11/20 03:50:02 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@4f7bef33{/jobs,null,AVAILABLE,@Spark}
18/11/20 03:50:02 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@753c84cc{/jobs/json,null,AVAILABLE,@Spark}
18/11/20 03:50:02 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@25faf1df{/jobs/job,null,AVAILABLE,@Spark}
18/11/20 03:50:02 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@7223d55{/jobs/job/json,null,AVAILABLE,@Spark}
18/11/20 03:50:02 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@16a9d399{/stages,null,AVAILABLE,@Spark}
18/11/20 03:50:02 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@64afac9a{/stages/json,null,AVAILABLE,@Spark}
18/11/20 03:50:02 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@539c4457{/stages/stage,null,AVAILABLE,@Spark}
18/11/20 03:50:02 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@47d918d{/stages/stage/json,null,AVAILABLE,@Spark}
18/11/20 03:50:02 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@35f2d3a9{/stages/pool,null,AVAILABLE,@Spark}
18/11/20 03:50:02 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@607a2bd0{/stages/pool/json,null,AVAILABLE,@Spark}
18/11/20 03:50:02 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@49f73d4{/storage,null,AVAILABLE,@Spark}
18/11/20 03:50:02 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@6c867b33{/storage/json,null,AVAILABLE,@Spark}
18/11/20 03:50:02 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@4f4a9291{/storage/rdd,null,AVAILABLE,@Spark}
18/11/20 03:50:02 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@7edf1403{/storage/rdd/json,null,AVAILABLE,@Spark}
18/11/20 03:50:02 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@121d9df{/environment,null,AVAILABLE,@Spark}
18/11/20 03:50:02 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@5ea1652b{/environment/json,null,AVAILABLE,@Spark}
18/11/20 03:50:02 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@4564ba74{/executors,null,AVAILABLE,@Spark}
18/11/20 03:50:02 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@431dd1d7{/executors/json,null,AVAILABLE,@Spark}
18/11/20 03:50:02 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@4cfd074a{/executors/threadDump,null,AVAILABLE,@Spark}
18/11/20 03:50:03 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@19a0abc3{/executors/threadDump/json,null,AVAILABLE,@Spark}
18/11/20 03:50:03 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@79f70599{/static,null,AVAILABLE,@Spark}
18/11/20 03:50:03 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@19a375c0{/,null,AVAILABLE,@Spark}
18/11/20 03:50:03 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@363eefda{/api,null,AVAILABLE,@Spark}
18/11/20 03:50:03 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@4636ead8{/jobs/job/kill,null,AVAILABLE,@Spark}
18/11/20 03:50:03 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@61deab99{/stages/stage/kill,null,AVAILABLE,@Spark}
18/11/20 03:50:03 INFO SparkUI: Bound SparkUI to 0.0.0.0, and started at http://wn0-spark2.l01rfxocadnetnrq0tupemyb3b.cx.internal.cloudapp.net:35059
18/11/20 03:50:03 INFO YarnClusterScheduler: Created YarnClusterScheduler
18/11/20 03:50:03 INFO SchedulerExtensionServices: Starting Yarn extension services with app application_1540794334804_0259 and attemptId Some(appattempt_1540794334804_0259_000002)
18/11/20 03:50:03 INFO Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 35387.
18/11/20 03:50:03 INFO NettyBlockTransferService: Server created on wn0-spark2.l01rfxocadnetnrq0tupemyb3b.cx.internal.cloudapp.net:35387
18/11/20 03:50:03 INFO BlockManager: Using org.apache.spark.storage.RandomBlockReplicationPolicy for block replication policy
18/11/20 03:50:03 INFO BlockManagerMaster: Registering BlockManager BlockManagerId(driver, wn0-spark2.l01rfxocadnetnrq0tupemyb3b.cx.internal.cloudapp.net, 35387, None)
18/11/20 03:50:03 INFO BlockManagerMasterEndpoint: Registering block manager wn0-spark2.l01rfxocadnetnrq0tupemyb3b.cx.internal.cloudapp.net:35387 with 2004.6 MB RAM, BlockManagerId(driver, wn0-spark2.l01rfxocadnetnrq0tupemyb3b.cx.internal.cloudapp.net, 35387, None)
18/11/20 03:50:03 INFO BlockManagerMaster: Registered BlockManager BlockManagerId(driver, wn0-spark2.l01rfxocadnetnrq0tupemyb3b.cx.internal.cloudapp.net, 35387, None)
18/11/20 03:50:03 INFO BlockManager: Initialized BlockManager: BlockManagerId(driver, wn0-spark2.l01rfxocadnetnrq0tupemyb3b.cx.internal.cloudapp.net, 35387, None)
18/11/20 03:50:03 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@2d63fef6{/metrics/json,null,AVAILABLE,@Spark}
18/11/20 03:50:03 INFO EventLoggingListener: Logging events to wasb:/hdp/spark2-events/application_1540794334804_0259_2
18/11/20 03:50:03 INFO EnhancementSparkListener: Enhancement listener is enabled
18/11/20 03:50:03 INFO SparkContext: Registered listener com.microsoft.hdinsight.spark.metrics.SparkMetricsListener
18/11/20 03:50:03 INFO SparkContext: Registered listener org.apache.spark.sql.scheduler.EnhancementSparkListener
18/11/20 03:50:03 INFO ApplicationMaster: 
===============================================================================
YARN executor launch context:
  env:
    CLASSPATH -> {{PWD}}<CPS>{{PWD}}/__spark_conf__<CPS>{{PWD}}/__spark_libs__/*<CPS>/usr/hdp/current/spark2-client/jars/*<CPS>$HADOOP_CONF_DIR<CPS>/usr/hdp/current/hadoop-client/*<CPS>/usr/hdp/current/hadoop-client/lib/*<CPS>/usr/hdp/current/hadoop-hdfs-client/*<CPS>/usr/hdp/current/hadoop-hdfs-client/lib/*<CPS>/usr/hdp/current/hadoop-yarn-client/*<CPS>/usr/hdp/current/hadoop-yarn-client/lib/*<CPS>$PWD/mr-framework/hadoop/share/hadoop/mapreduce/*:$PWD/mr-framework/hadoop/share/hadoop/mapreduce/lib/*:$PWD/mr-framework/hadoop/share/hadoop/common/*:$PWD/mr-framework/hadoop/share/hadoop/common/lib/*:$PWD/mr-framework/hadoop/share/hadoop/yarn/*:$PWD/mr-framework/hadoop/share/hadoop/yarn/lib/*:$PWD/mr-framework/hadoop/share/hadoop/hdfs/*:$PWD/mr-framework/hadoop/share/hadoop/hdfs/lib/*:$PWD/mr-framework/hadoop/share/hadoop/tools/lib/*:/usr/hdp/2.6.5.3003-25/hadoop/lib/hadoop-lzo-0.6.0.2.6.5.3003-25.jar:/etc/hadoop/conf/secure:/usr/hdp/current/ext/hadoop/*<CPS>:/usr/hdp/current/spark2-client/jars/*:/usr/lib/hdinsight-datalake/*:/usr/hdp/current/spark_llap/*:/usr/hdp/current/spark2-client/conf:<CPS>{{PWD}}/__spark_conf__/__hadoop_conf__
    SPARK_DIST_CLASSPATH -> :/usr/hdp/current/spark2-client/jars/*:/usr/lib/hdinsight-datalake/*:/usr/hdp/current/spark_llap/*:/usr/hdp/current/spark2-client/conf:
    SPARK_YARN_STAGING_DIR -> *********(redacted)
    SPARK_USER -> *********(redacted)

  command:
    LD_LIBRARY_PATH="/usr/hdp/current/hadoop-client/lib/native:/usr/hdp/current/hadoop-client/lib/native/Linux-amd64-64:$LD_LIBRARY_PATH" \ 
      {{JAVA_HOME}}/bin/java \ 
      -server \ 
      -Xmx4096m \ 
      '-Dhdp.version=' \ 
      '-Detwlogger.component=sparkexecutor' \ 
      '-DlogFilter.filename=SparkLogFilters.xml' \ 
      '-DpatternGroup.filename=SparkPatternGroups.xml' \ 
      '-Dlog4jspark.root.logger=INFO,console,RFA,ETW,Anonymizer' \ 
      '-Dlog4jspark.log.dir=/var/log/sparkapp/\${user.name}' \ 
      '-Dlog4jspark.log.file=sparkexecutor.log' \ 
      '-Dlog4j.configuration=file:/usr/hdp/current/spark2-client/conf/log4j.properties' \ 
      '-Djavax.xml.parsers.SAXParserFactory=com.sun.org.apache.xerces.internal.jaxp.SAXParserFactoryImpl' \ 
      '-XX:+UseParallelGC' \ 
      '-XX:+UseParallelOldGC' \ 
      -Djava.io.tmpdir={{PWD}}/tmp \ 
      '-Dspark.history.ui.port=18080' \ 
      -Dspark.yarn.app.container.log.dir=<LOG_DIR> \ 
      -XX:OnOutOfMemoryError='kill %p' \ 
      org.apache.spark.executor.CoarseGrainedExecutorBackend \ 
      --driver-url \ 
      spark://CoarseGrainedScheduler@wn0-spark2.l01rfxocadnetnrq0tupemyb3b.cx.internal.cloudapp.net:34861 \ 
      --executor-id \ 
      <executorId> \ 
      --hostname \ 
      <hostname> \ 
      --cores \ 
      1 \ 
      --app-id \ 
      application_1540794334804_0259 \ 
      --user-class-path \ 
      file:$PWD/__app__.jar \ 
      1><LOG_DIR>/stdout \ 
      2><LOG_DIR>/stderr

  resources:
    __app__.jar -> resource { scheme: "wasb" host: "catestsa.blob.core.windows.net" port: -1 file: "/user/livy/.sparkStaging/application_1540794334804_0259/default_artifact.jar" userInfo: "spark23hdinsight-2018-10-29t06-07-10-410z" } size: 69301 timestamp: 1542685785000 type: FILE visibility: PRIVATE
    __spark_conf__ -> resource { scheme: "wasb" host: "catestsa.blob.core.windows.net" port: -1 file: "/user/livy/.sparkStaging/application_1540794334804_0259/__spark_conf__.zip" userInfo: "spark23hdinsight-2018-10-29t06-07-10-410z" } size: 259876 timestamp: 1542685786000 type: ARCHIVE visibility: PRIVATE

===============================================================================
18/11/20 03:50:03 INFO YarnRMClient: Registering the ApplicationMaster
18/11/20 03:50:03 INFO RequestHedgingRMFailoverProxyProvider: Looking for the active RM in [rm1, rm2]...
18/11/20 03:50:03 INFO RequestHedgingRMFailoverProxyProvider: Found active RM [rm2]
18/11/20 03:50:03 INFO YarnAllocator: Will request 5 executor container(s), each with 1 core(s) and 4480 MB memory (including 384 MB of overhead)
18/11/20 03:50:03 INFO YarnSchedulerBackend$YarnSchedulerEndpoint: ApplicationMaster registered as NettyRpcEndpointRef(spark://YarnAM@wn0-spark2.l01rfxocadnetnrq0tupemyb3b.cx.internal.cloudapp.net:34861)
18/11/20 03:50:03 INFO YarnAllocator: Submitted 5 unlocalized container requests.
18/11/20 03:50:04 INFO ApplicationMaster: Started progress reporter thread with (heartbeat : 5000, initial allocation : 200) intervals
18/11/20 03:50:04 INFO AMRMClientImpl: Received new token for : wn0-spark2.l01rfxocadnetnrq0tupemyb3b.cx.internal.cloudapp.net:30050
18/11/20 03:50:04 INFO AMRMClientImpl: Received new token for : wn1-spark2.l01rfxocadnetnrq0tupemyb3b.cx.internal.cloudapp.net:30050
18/11/20 03:50:04 INFO YarnAllocator: Launching container container_e02_1540794334804_0259_02_000002 on host wn0-spark2.l01rfxocadnetnrq0tupemyb3b.cx.internal.cloudapp.net for executor with ID 1
18/11/20 03:50:04 INFO YarnAllocator: Launching container container_e02_1540794334804_0259_02_000003 on host wn1-spark2.l01rfxocadnetnrq0tupemyb3b.cx.internal.cloudapp.net for executor with ID 2
18/11/20 03:50:04 INFO YarnAllocator: Received 2 containers from YARN, launching executors on 2 of them.
18/11/20 03:50:04 INFO ContainerManagementProtocolProxy: yarn.client.max-cached-nodemanagers-proxies : 0
18/11/20 03:50:04 INFO ContainerManagementProtocolProxy: yarn.client.max-cached-nodemanagers-proxies : 0
18/11/20 03:50:04 INFO ContainerManagementProtocolProxy: Opening proxy : wn1-spark2.l01rfxocadnetnrq0tupemyb3b.cx.internal.cloudapp.net:30050
18/11/20 03:50:04 INFO ContainerManagementProtocolProxy: Opening proxy : wn0-spark2.l01rfxocadnetnrq0tupemyb3b.cx.internal.cloudapp.net:30050
18/11/20 03:50:05 INFO YarnAllocator: Launching container container_e02_1540794334804_0259_02_000004 on host wn0-spark2.l01rfxocadnetnrq0tupemyb3b.cx.internal.cloudapp.net for executor with ID 3
18/11/20 03:50:05 INFO YarnAllocator: Launching container container_e02_1540794334804_0259_02_000005 on host wn1-spark2.l01rfxocadnetnrq0tupemyb3b.cx.internal.cloudapp.net for executor with ID 4
18/11/20 03:50:05 INFO YarnAllocator: Received 2 containers from YARN, launching executors on 2 of them.
18/11/20 03:50:05 INFO ContainerManagementProtocolProxy: yarn.client.max-cached-nodemanagers-proxies : 0
18/11/20 03:50:05 INFO ContainerManagementProtocolProxy: yarn.client.max-cached-nodemanagers-proxies : 0
18/11/20 03:50:05 INFO ContainerManagementProtocolProxy: Opening proxy : wn0-spark2.l01rfxocadnetnrq0tupemyb3b.cx.internal.cloudapp.net:30050
18/11/20 03:50:05 INFO ContainerManagementProtocolProxy: Opening proxy : wn1-spark2.l01rfxocadnetnrq0tupemyb3b.cx.internal.cloudapp.net:30050
18/11/20 03:50:07 INFO YarnAllocator: Launching container container_e02_1540794334804_0259_02_000006 on host wn0-spark2.l01rfxocadnetnrq0tupemyb3b.cx.internal.cloudapp.net for executor with ID 5
18/11/20 03:50:07 INFO YarnAllocator: Received 2 containers from YARN, launching executors on 1 of them.
18/11/20 03:50:07 INFO ContainerManagementProtocolProxy: yarn.client.max-cached-nodemanagers-proxies : 0
18/11/20 03:50:07 INFO ContainerManagementProtocolProxy: Opening proxy : wn0-spark2.l01rfxocadnetnrq0tupemyb3b.cx.internal.cloudapp.net:30050
18/11/20 03:50:07 INFO YarnSchedulerBackend$YarnDriverEndpoint: Registered executor NettyRpcEndpointRef(spark-client://Executor) (10.0.0.12:49954) with ID 2
18/11/20 03:50:07 INFO YarnSchedulerBackend$YarnDriverEndpoint: Registered executor NettyRpcEndpointRef(spark-client://Executor) (10.0.0.6:60862) with ID 1
18/11/20 03:50:07 INFO BlockManagerMasterEndpoint: Registering block manager wn1-spark2.l01rfxocadnetnrq0tupemyb3b.cx.internal.cloudapp.net:41785 with 2004.6 MB RAM, BlockManagerId(2, wn1-spark2.l01rfxocadnetnrq0tupemyb3b.cx.internal.cloudapp.net, 41785, None)
18/11/20 03:50:07 INFO BlockManagerMasterEndpoint: Registering block manager wn0-spark2.l01rfxocadnetnrq0tupemyb3b.cx.internal.cloudapp.net:45809 with 2004.6 MB RAM, BlockManagerId(1, wn0-spark2.l01rfxocadnetnrq0tupemyb3b.cx.internal.cloudapp.net, 45809, None)
18/11/20 03:50:08 INFO YarnSchedulerBackend$YarnDriverEndpoint: Registered executor NettyRpcEndpointRef(spark-client://Executor) (10.0.0.12:49958) with ID 4
18/11/20 03:50:08 INFO BlockManagerMasterEndpoint: Registering block manager wn1-spark2.l01rfxocadnetnrq0tupemyb3b.cx.internal.cloudapp.net:40691 with 2004.6 MB RAM, BlockManagerId(4, wn1-spark2.l01rfxocadnetnrq0tupemyb3b.cx.internal.cloudapp.net, 40691, None)
18/11/20 03:50:08 INFO YarnSchedulerBackend$YarnDriverEndpoint: Registered executor NettyRpcEndpointRef(spark-client://Executor) (10.0.0.6:60868) with ID 3
18/11/20 03:50:08 INFO BlockManagerMasterEndpoint: Registering block manager wn0-spark2.l01rfxocadnetnrq0tupemyb3b.cx.internal.cloudapp.net:39517 with 2004.6 MB RAM, BlockManagerId(3, wn0-spark2.l01rfxocadnetnrq0tupemyb3b.cx.internal.cloudapp.net, 39517, None)
18/11/20 03:50:08 INFO YarnClusterSchedulerBackend: SchedulerBackend is ready for scheduling beginning after reached minRegisteredResourcesRatio: 0.8
18/11/20 03:50:08 INFO YarnClusterScheduler: YarnClusterScheduler.postStartHook done
18/11/20 03:50:08 INFO SharedState: loading hive config file: file:/etc/spark2/2.6.5.3003-25/0/hive-site.xml
18/11/20 03:50:08 INFO SharedState: Setting hive.metastore.warehouse.dir ('null') to the value of spark.sql.warehouse.dir ('/hive/warehouse').
18/11/20 03:50:08 INFO SharedState: Warehouse path is '/hive/warehouse'.
18/11/20 03:50:08 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@350adc29{/SQL,null,AVAILABLE,@Spark}
18/11/20 03:50:08 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@62a93810{/SQL/json,null,AVAILABLE,@Spark}
18/11/20 03:50:08 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@53fa30ab{/SQL/execution,null,AVAILABLE,@Spark}
18/11/20 03:50:08 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@66d7320e{/SQL/execution/json,null,AVAILABLE,@Spark}
18/11/20 03:50:08 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@110beda2{/static/sql,null,AVAILABLE,@Spark}
18/11/20 03:50:08 INFO StateStoreCoordinatorRef: Registered StateStoreCoordinator endpoint
18/11/20 03:50:08 ERROR ApplicationMaster: User class threw exception: java.lang.ClassNotFoundException: Failed to find data source: sstream. Please find packages at http://spark.apache.org/third-party-projects.html
java.lang.ClassNotFoundException: Failed to find data source: sstream. Please find packages at http://spark.apache.org/third-party-projects.html
    at org.apache.spark.sql.execution.datasources.DataSource$.lookupDataSource(DataSource.scala:635)
    at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:190)
    at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:174)
    at sample.ReadSStream$.main(ReadSStream.scala:6)
    at sample.ReadSStream.main(ReadSStream.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.spark.deploy.yarn.ApplicationMaster$$anon$4.run(ApplicationMaster.scala:721)
Caused by: java.lang.ClassNotFoundException: sstream.DefaultSource
    at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
    at org.apache.spark.sql.execution.datasources.DataSource$$anonfun$23$$anonfun$apply$15.apply(DataSource.scala:618)
    at org.apache.spark.sql.execution.datasources.DataSource$$anonfun$23$$anonfun$apply$15.apply(DataSource.scala:618)
    at scala.util.Try$.apply(Try.scala:192)
    at org.apache.spark.sql.execution.datasources.DataSource$$anonfun$23.apply(DataSource.scala:618)
    at org.apache.spark.sql.execution.datasources.DataSource$$anonfun$23.apply(DataSource.scala:618)
    at scala.util.Try.orElse(Try.scala:84)
    at org.apache.spark.sql.execution.datasources.DataSource$.lookupDataSource(DataSource.scala:618)
    ... 9 more
18/11/20 03:50:08 INFO ApplicationMaster: Final app status: FAILED, exitCode: 15, (reason: User class threw exception: java.lang.ClassNotFoundException: Failed to find data source: sstream. Please find packages at http://spark.apache.org/third-party-projects.html
    at org.apache.spark.sql.execution.datasources.DataSource$.lookupDataSource(DataSource.scala:635)
    at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:190)
    at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:174)
    at sample.ReadSStream$.main(ReadSStream.scala:6)
    at sample.ReadSStream.main(ReadSStream.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.spark.deploy.yarn.ApplicationMaster$$anon$4.run(ApplicationMaster.scala:721)
Caused by: java.lang.ClassNotFoundException: sstream.DefaultSource
    at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
    at org.apache.spark.sql.execution.datasources.DataSource$$anonfun$23$$anonfun$apply$15.apply(DataSource.scala:618)
    at org.apache.spark.sql.execution.datasources.DataSource$$anonfun$23$$anonfun$apply$15.apply(DataSource.scala:618)
    at scala.util.Try$.apply(Try.scala:192)
    at org.apache.spark.sql.execution.datasources.DataSource$$anonfun$23.apply(DataSource.scala:618)
    at org.apache.spark.sql.execution.datasources.DataSource$$anonfun$23.apply(DataSource.scala:618)
    at scala.util.Try.orElse(Try.scala:84)
    at org.apache.spark.sql.execution.datasources.DataSource$.lookupDataSource(DataSource.scala:618)
    ... 9 more
)
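The lookup failure above is Spark's data source resolution at work: when the short format name matches no registered provider, Spark 2.3 falls back to loading <name>.DefaultSource, which is why the trace asks for sstream.DefaultSource. A simplified sketch of that fallback, inferred from the frames at DataSource.scala:618 (the real method also consults a ServiceLoader of DataSourceRegister implementations and built-in alias maps first):

import scala.util.Try

object DataSourceLookupSketch {
  // Sketch of the fallback implied by the stack trace: try the provider name
  // as a class, then "<provider>.DefaultSource", else fail with the same
  // "Failed to find data source" message seen in the log.
  def lookupDataSource(provider: String): Class[_] = {
    val loader = Thread.currentThread().getContextClassLoader
    Try(loader.loadClass(provider))
      .orElse(Try(loader.loadClass(provider + ".DefaultSource")))
      .getOrElse(throw new ClassNotFoundException(
        s"Failed to find data source: $provider"))
  }
}

Since neither sstream nor sstream.DefaultSource resolves on the cluster, the user class fails before reading any data, and YARN retries the attempt; the repeated startup sequence below is attempt 000003 of the same application.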
18/11/20 03:50:08 INFO SparkContext: Invoking stop() from shutdown hook
18/11/20 03:50:08 INFO AbstractConnector: Stopped Spark@2b7993c{HTTP/1.1,[http/1.1]}{0.0.0.0:0}
18/11/20 03:50:09 INFO SparkUI: Stopped Spark web UI at http://wn0-spark2.l01rfxocadnetnrq0tupemyb3b.cx.internal.cloudapp.net:35059
18/11/20 03:50:09 INFO metrics: type=GAUGE, name=application_1540794334804_0259.driver.BlockManager.disk.diskSpaceUsed_MB, value=0
18/11/20 03:50:09 INFO metrics: type=GAUGE, name=application_1540794334804_0259.driver.BlockManager.memory.maxMem_MB, value=10022
18/11/20 03:50:09 INFO metrics: type=GAUGE, name=application_1540794334804_0259.driver.BlockManager.memory.maxOffHeapMem_MB, value=0
18/11/20 03:50:09 INFO metrics: type=GAUGE, name=application_1540794334804_0259.driver.BlockManager.memory.maxOnHeapMem_MB, value=10022
18/11/20 03:50:09 INFO metrics: type=GAUGE, name=application_1540794334804_0259.driver.BlockManager.memory.memUsed_MB, value=0
18/11/20 03:50:09 INFO metrics: type=GAUGE, name=application_1540794334804_0259.driver.BlockManager.memory.offHeapMemUsed_MB, value=0
18/11/20 03:50:09 INFO metrics: type=GAUGE, name=application_1540794334804_0259.driver.BlockManager.memory.onHeapMemUsed_MB, value=0
18/11/20 03:50:09 INFO metrics: type=GAUGE, name=application_1540794334804_0259.driver.BlockManager.memory.remainingMem_MB, value=10022
18/11/20 03:50:09 INFO metrics: type=GAUGE, name=application_1540794334804_0259.driver.BlockManager.memory.remainingOffHeapMem_MB, value=0
18/11/20 03:50:09 INFO metrics: type=GAUGE, name=application_1540794334804_0259.driver.BlockManager.memory.remainingOnHeapMem_MB, value=10022
18/11/20 03:50:09 INFO metrics: type=GAUGE, name=application_1540794334804_0259.driver.DAGScheduler.job.activeJobs, value=0
18/11/20 03:50:09 INFO metrics: type=GAUGE, name=application_1540794334804_0259.driver.DAGScheduler.job.allJobs, value=0
18/11/20 03:50:09 INFO metrics: type=GAUGE, name=application_1540794334804_0259.driver.DAGScheduler.stage.failedStages, value=0
18/11/20 03:50:09 INFO metrics: type=GAUGE, name=application_1540794334804_0259.driver.DAGScheduler.stage.runningStages, value=0
18/11/20 03:50:09 INFO metrics: type=GAUGE, name=application_1540794334804_0259.driver.DAGScheduler.stage.waitingStages, value=0
18/11/20 03:50:09 INFO metrics: type=GAUGE, name=application_1540794334804_0259.driver.LiveListenerBus.queue.appStatus.size, value=0
18/11/20 03:50:09 INFO metrics: type=GAUGE, name=application_1540794334804_0259.driver.LiveListenerBus.queue.eventLog.size, value=0
18/11/20 03:50:09 INFO metrics: type=GAUGE, name=application_1540794334804_0259.driver.LiveListenerBus.queue.executorManagement.size, value=0
18/11/20 03:50:09 INFO metrics: type=GAUGE, name=application_1540794334804_0259.driver.LiveListenerBus.queue.shared.size, value=0
18/11/20 03:50:09 INFO metrics: type=GAUGE, name=application_1540794334804_0259.driver.jvm.PS-MarkSweep.count, value=2
18/11/20 03:50:09 INFO metrics: type=GAUGE, name=application_1540794334804_0259.driver.jvm.PS-MarkSweep.time, value=69
18/11/20 03:50:09 INFO metrics: type=GAUGE, name=application_1540794334804_0259.driver.jvm.PS-Scavenge.count, value=5
18/11/20 03:50:09 INFO metrics: type=GAUGE, name=application_1540794334804_0259.driver.jvm.PS-Scavenge.time, value=78
18/11/20 03:50:09 INFO metrics: type=GAUGE, name=application_1540794334804_0259.driver.jvm.direct.capacity, value=86813
18/11/20 03:50:09 INFO metrics: type=GAUGE, name=application_1540794334804_0259.driver.jvm.direct.count, value=18
18/11/20 03:50:09 INFO metrics: type=GAUGE, name=application_1540794334804_0259.driver.jvm.direct.used, value=86814
18/11/20 03:50:09 INFO metrics: type=GAUGE, name=application_1540794334804_0259.driver.jvm.heap.committed, value=851443712
18/11/20 03:50:09 INFO metrics: type=GAUGE, name=application_1540794334804_0259.driver.jvm.heap.init, value=924844032
18/11/20 03:50:09 INFO metrics: type=GAUGE, name=application_1540794334804_0259.driver.jvm.heap.max, value=3817865216
18/11/20 03:50:09 INFO metrics: type=GAUGE, name=application_1540794334804_0259.driver.jvm.heap.usage, value=0.0957124013882422
18/11/20 03:50:09 INFO metrics: type=GAUGE, name=application_1540794334804_0259.driver.jvm.heap.used, value=365417048
18/11/20 03:50:09 INFO metrics: type=GAUGE, name=application_1540794334804_0259.driver.jvm.mapped.capacity, value=0
18/11/20 03:50:09 INFO metrics: type=GAUGE, name=application_1540794334804_0259.driver.jvm.mapped.count, value=0
18/11/20 03:50:09 INFO metrics: type=GAUGE, name=application_1540794334804_0259.driver.jvm.mapped.used, value=0
18/11/20 03:50:09 INFO metrics: type=GAUGE, name=application_1540794334804_0259.driver.jvm.non-heap.committed, value=75644928
18/11/20 03:50:09 INFO metrics: type=GAUGE, name=application_1540794334804_0259.driver.jvm.non-heap.init, value=2555904
18/11/20 03:50:09 INFO metrics: type=GAUGE, name=application_1540794334804_0259.driver.jvm.non-heap.max, value=-1
18/11/20 03:50:09 INFO metrics: type=GAUGE, name=application_1540794334804_0259.driver.jvm.non-heap.usage, value=-7.3350744E7
18/11/20 03:50:09 INFO metrics: type=GAUGE, name=application_1540794334804_0259.driver.jvm.non-heap.used, value=73361080
18/11/20 03:50:09 INFO metrics: type=GAUGE, name=application_1540794334804_0259.driver.jvm.pools.Code-Cache.committed, value=10813440
18/11/20 03:50:09 INFO metrics: type=GAUGE, name=application_1540794334804_0259.driver.jvm.pools.Code-Cache.init, value=2555904
18/11/20 03:50:09 INFO metrics: type=GAUGE, name=application_1540794334804_0259.driver.jvm.pools.Code-Cache.max, value=251658240
18/11/20 03:50:09 INFO metrics: type=GAUGE, name=application_1540794334804_0259.driver.jvm.pools.Code-Cache.usage, value=0.03921839396158854
18/11/20 03:50:09 INFO metrics: type=GAUGE, name=application_1540794334804_0259.driver.jvm.pools.Code-Cache.used, value=9869632
18/11/20 03:50:09 INFO metrics: type=GAUGE, name=application_1540794334804_0259.driver.jvm.pools.Compressed-Class-Space.committed, value=7905280
18/11/20 03:50:09 INFO metrics: type=GAUGE, name=application_1540794334804_0259.driver.jvm.pools.Compressed-Class-Space.init, value=0
18/11/20 03:50:09 INFO metrics: type=GAUGE, name=application_1540794334804_0259.driver.jvm.pools.Compressed-Class-Space.max, value=1073741824
18/11/20 03:50:09 INFO metrics: type=GAUGE, name=application_1540794334804_0259.driver.jvm.pools.Compressed-Class-Space.usage, value=0.0071531012654304504
18/11/20 03:50:09 INFO metrics: type=GAUGE, name=application_1540794334804_0259.driver.jvm.pools.Compressed-Class-Space.used, value=7680584
18/11/20 03:50:09 INFO metrics: type=GAUGE, name=application_1540794334804_0259.driver.jvm.pools.Metaspace.committed, value=56926208
18/11/20 03:50:09 INFO metrics: type=GAUGE, name=application_1540794334804_0259.driver.jvm.pools.Metaspace.init, value=0
18/11/20 03:50:09 INFO metrics: type=GAUGE, name=application_1540794334804_0259.driver.jvm.pools.Metaspace.max, value=-1
18/11/20 03:50:09 INFO metrics: type=GAUGE, name=application_1540794334804_0259.driver.jvm.pools.Metaspace.usage, value=0.9805842679702116
18/11/20 03:50:09 INFO metrics: type=GAUGE, name=application_1540794334804_0259.driver.jvm.pools.Metaspace.used, value=55820944
18/11/20 03:50:09 INFO metrics: type=GAUGE, name=application_1540794334804_0259.driver.jvm.pools.PS-Eden-Space.committed, value=366477312
18/11/20 03:50:09 INFO metrics: type=GAUGE, name=application_1540794334804_0259.driver.jvm.pools.PS-Eden-Space.init, value=231735296
18/11/20 03:50:09 INFO metrics: type=GAUGE, name=application_1540794334804_0259.driver.jvm.pools.PS-Eden-Space.max, value=1336410112
18/11/20 03:50:09 INFO metrics: type=GAUGE, name=application_1540794334804_0259.driver.jvm.pools.PS-Eden-Space.usage, value=0.21682250785004537
18/11/20 03:50:09 INFO metrics: type=GAUGE, name=application_1540794334804_0259.driver.jvm.pools.PS-Eden-Space.used, value=289763792
18/11/20 03:50:09 INFO metrics: type=GAUGE, name=application_1540794334804_0259.driver.jvm.pools.PS-Old-Gen.committed, value=446693376
18/11/20 03:50:09 INFO metrics: type=GAUGE, name=application_1540794334804_0259.driver.jvm.pools.PS-Old-Gen.init, value=616562688
18/11/20 03:50:09 INFO metrics: type=GAUGE, name=application_1540794334804_0259.driver.jvm.pools.PS-Old-Gen.max, value=2863661056
18/11/20 03:50:09 INFO metrics: type=GAUGE, name=application_1540794334804_0259.driver.jvm.pools.PS-Old-Gen.usage, value=0.013485014896958532
18/11/20 03:50:09 INFO metrics: type=GAUGE, name=application_1540794334804_0259.driver.jvm.pools.PS-Old-Gen.used, value=38616512
18/11/20 03:50:09 INFO metrics: type=GAUGE, name=application_1540794334804_0259.driver.jvm.pools.PS-Survivor-Space.committed, value=38273024
18/11/20 03:50:09 INFO metrics: type=GAUGE, name=application_1540794334804_0259.driver.jvm.pools.PS-Survivor-Space.init, value=38273024
18/11/20 03:50:09 INFO metrics: type=GAUGE, name=application_1540794334804_0259.driver.jvm.pools.PS-Survivor-Space.max, value=38273024
18/11/20 03:50:09 INFO metrics: type=GAUGE, name=application_1540794334804_0259.driver.jvm.pools.PS-Survivor-Space.usage, value=0.9996166490528682
18/11/20 03:50:09 INFO metrics: type=GAUGE, name=application_1540794334804_0259.driver.jvm.pools.PS-Survivor-Space.used, value=38258352
18/11/20 03:50:09 INFO metrics: type=GAUGE, name=application_1540794334804_0259.driver.jvm.total.committed, value=927088640
18/11/20 03:50:09 INFO metrics: type=GAUGE, name=application_1540794334804_0259.driver.jvm.total.init, value=927399936
18/11/20 03:50:09 INFO metrics: type=GAUGE, name=application_1540794334804_0259.driver.jvm.total.max, value=3817865215
18/11/20 03:50:09 INFO metrics: type=GAUGE, name=application_1540794334804_0259.driver.jvm.total.used, value=440867560
18/11/20 03:50:09 INFO metrics: type=COUNTER, name=application_1540794334804_0259.driver.HiveExternalCatalog.fileCacheHits, count=0
18/11/20 03:50:09 INFO metrics: type=COUNTER, name=application_1540794334804_0259.driver.HiveExternalCatalog.filesDiscovered, count=0
18/11/20 03:50:09 INFO metrics: type=COUNTER, name=application_1540794334804_0259.driver.HiveExternalCatalog.hiveClientCalls, count=0
18/11/20 03:50:09 INFO metrics: type=COUNTER, name=application_1540794334804_0259.driver.HiveExternalCatalog.parallelListingJobCount, count=0
18/11/20 03:50:09 INFO metrics: type=COUNTER, name=application_1540794334804_0259.driver.HiveExternalCatalog.partitionsFetched, count=0
18/11/20 03:50:09 INFO metrics: type=COUNTER, name=application_1540794334804_0259.driver.LiveListenerBus.numEventsPosted, count=12
18/11/20 03:50:09 INFO metrics: type=COUNTER, name=application_1540794334804_0259.driver.LiveListenerBus.queue.appStatus.numDroppedEvents, count=0
18/11/20 03:50:09 INFO metrics: type=COUNTER, name=application_1540794334804_0259.driver.LiveListenerBus.queue.eventLog.numDroppedEvents, count=0
18/11/20 03:50:09 INFO metrics: type=COUNTER, name=application_1540794334804_0259.driver.LiveListenerBus.queue.executorManagement.numDroppedEvents, count=0
18/11/20 03:50:09 INFO metrics: type=COUNTER, name=application_1540794334804_0259.driver.LiveListenerBus.queue.shared.numDroppedEvents, count=0
18/11/20 03:50:09 INFO metrics: type=HISTOGRAM, name=application_1540794334804_0259.driver.CodeGenerator.compilationTime, count=0, min=0, max=0, mean=0.0, stddev=0.0, median=0.0, p75=0.0, p95=0.0, p98=0.0, p99=0.0, p999=0.0
18/11/20 03:50:09 INFO metrics: type=HISTOGRAM, name=application_1540794334804_0259.driver.CodeGenerator.generatedClassSize, count=0, min=0, max=0, mean=0.0, stddev=0.0, median=0.0, p75=0.0, p95=0.0, p98=0.0, p99=0.0, p999=0.0
18/11/20 03:50:09 INFO metrics: type=HISTOGRAM, name=application_1540794334804_0259.driver.CodeGenerator.generatedMethodSize, count=0, min=0, max=0, mean=0.0, stddev=0.0, median=0.0, p75=0.0, p95=0.0, p98=0.0, p99=0.0, p999=0.0
18/11/20 03:50:09 INFO metrics: type=HISTOGRAM, name=application_1540794334804_0259.driver.CodeGenerator.sourceCodeSize, count=0, min=0, max=0, mean=0.0, stddev=0.0, median=0.0, p75=0.0, p95=0.0, p98=0.0, p99=0.0, p999=0.0
18/11/20 03:50:09 INFO metrics: type=TIMER, name=application_1540794334804_0259.driver.DAGScheduler.messageProcessingTime, count=4, min=0.005399999999999999, max=2.65001, mean=0.6625258865924033, stddev=1.141737595626449, median=0.0081, p75=0.0081, p95=2.65001, p98=2.65001, p99=2.65001, p999=2.65001, mean_rate=0.6703846257226554, m1=0.4, m5=0.4, m15=0.4, rate_unit=events/second, duration_unit=milliseconds
18/11/20 03:50:09 INFO metrics: type=TIMER, name=application_1540794334804_0259.driver.LiveListenerBus.listenerProcessingTime.com.microsoft.hdinsight.spark.metrics.SparkMetricsListener, count=12, min=0.22120099999999998, max=25.471497, mean=3.8357353594749903, stddev=7.354410648988354, median=0.25760099999999997, p75=1.151305, p95=25.471497, p98=25.471497, p99=25.471497, p999=25.471497, mean_rate=2.1864748871352733, m1=2.1999999999999997, m5=2.1999999999999997, m15=2.1999999999999997, rate_unit=events/second, duration_unit=milliseconds
18/11/20 03:50:09 INFO metrics: type=TIMER, name=application_1540794334804_0259.driver.LiveListenerBus.listenerProcessingTime.org.apache.spark.HeartbeatReceiver, count=12, min=0.0026999999999999997, max=24.394392999999997, mean=2.5193418444934226, stddev=6.537848642731393, median=0.012499999999999999, p75=0.1414, p95=24.394392999999997, p98=24.394392999999997, p99=24.394392999999997, p999=24.394392999999997, mean_rate=1.9940312191596172, m1=1.4, m5=1.4, m15=1.4, rate_unit=events/second, duration_unit=milliseconds
18/11/20 03:50:09 INFO metrics: type=TIMER, name=application_1540794334804_0259.driver.LiveListenerBus.listenerProcessingTime.org.apache.spark.scheduler.EventLoggingListener, count=12, min=0.24190099999999998, max=28.816409999999998, mean=6.275276225317461, stddev=9.998441009711161, median=0.379202, p75=4.070116, p95=28.816409999999998, p98=28.816409999999998, p99=28.816409999999998, p999=28.816409999999998, mean_rate=2.1793483892915786, m1=2.1999999999999997, m5=2.1999999999999997, m15=2.1999999999999997, rate_unit=events/second, duration_unit=milliseconds
18/11/20 03:50:09 INFO metrics: type=TIMER, name=application_1540794334804_0259.driver.LiveListenerBus.listenerProcessingTime.org.apache.spark.sql.scheduler.EnhancementSparkListener, count=12, min=0.0018, max=1.149905, mean=0.09316718404280172, stddev=0.30977855089796874, median=0.0023, p75=0.0026999999999999997, p95=1.149905, p98=1.149905, p99=1.149905, p999=1.149905, mean_rate=2.1862419330576275, m1=2.1999999999999997, m5=2.1999999999999997, m15=2.1999999999999997, rate_unit=events/second, duration_unit=milliseconds
18/11/20 03:50:09 INFO metrics: type=TIMER, name=application_1540794334804_0259.driver.LiveListenerBus.listenerProcessingTime.org.apache.spark.status.AppStatusListener, count=12, min=0.0768, max=30.630516999999998, mean=4.487322578668212, stddev=9.266211060280746, median=0.0955, p75=1.590506, p95=30.630516999999998, p98=30.630516999999998, p99=30.630516999999998, p999=30.630516999999998, mean_rate=1.7717777979445464, m1=0.6, m5=0.6, m15=0.6, rate_unit=events/second, duration_unit=milliseconds
18/11/20 03:50:09 INFO metrics: type=TIMER, name=application_1540794334804_0259.driver.LiveListenerBus.queue.appStatus.listenerProcessingTime, count=12, min=0.1168, max=30.668816999999997, mean=4.535635469516158, stddev=9.267528794401072, median=0.165101, p75=1.6418059999999999, p95=30.668816999999997, p98=30.668816999999997, p99=30.668816999999997, p999=30.668816999999997, mean_rate=1.7711038308543416, m1=0.6, m5=0.6, m15=0.6, rate_unit=events/second, duration_unit=milliseconds
18/11/20 03:50:09 INFO metrics: type=TIMER, name=application_1540794334804_0259.driver.LiveListenerBus.queue.eventLog.listenerProcessingTime, count=12, min=0.272401, max=28.848509999999997, mean=6.3166646980680365, stddev=9.998350018456884, median=0.412302, p75=4.138916, p95=28.848509999999997, p98=28.848509999999997, p99=28.848509999999997, p999=28.848509999999997, mean_rate=2.1782392942612727, m1=2.1999999999999997, m5=2.1999999999999997, m15=2.1999999999999997, rate_unit=events/second, duration_unit=milliseconds
18/11/20 03:50:09 INFO metrics: type=TIMER, name=application_1540794334804_0259.driver.LiveListenerBus.queue.executorManagement.listenerProcessingTime, count=12, min=0.039, max=24.456993, mean=2.7318625334114115, stddev=6.612457910131616, median=0.061799999999999994, p75=0.19119999999999998, p95=24.456993, p98=24.456993, p99=24.456993, p999=24.456993, mean_rate=1.9927448620001822, m1=1.4, m5=1.4, m15=1.4, rate_unit=events/second, duration_unit=milliseconds
18/11/20 03:50:09 INFO metrics: type=TIMER, name=application_1540794334804_0259.driver.LiveListenerBus.queue.shared.listenerProcessingTime, count=12, min=0.26920099999999997, max=25.528298, mean=4.138439240370065, stddev=7.3301359954339596, median=0.344902, p75=2.0913079999999997, p95=25.528298, p98=25.528298, p99=25.528298, p999=25.528298, mean_rate=2.184551913798267, m1=2.1999999999999997, m5=2.1999999999999997, m15=2.1999999999999997, rate_unit=events/second, duration_unit=milliseconds
18/11/20 03:50:09 INFO YarnAllocator: Driver requested a total number of 0 executor(s).
18/11/20 03:50:09 INFO YarnClusterSchedulerBackend: Shutting down all executors
18/11/20 03:50:09 INFO YarnSchedulerBackend$YarnDriverEndpoint: Asking each executor to shut down
18/11/20 03:50:09 INFO SchedulerExtensionServices: Stopping SchedulerExtensionServices
(serviceOption=None,
 services=List(),
 started=false)
18/11/20 03:50:09 INFO MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
18/11/20 03:50:09 INFO MemoryStore: MemoryStore cleared
18/11/20 03:50:09 INFO BlockManager: BlockManager stopped
18/11/20 03:50:09 INFO BlockManagerMaster: BlockManagerMaster stopped
18/11/20 03:50:09 INFO OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
18/11/20 03:50:09 INFO SparkContext: Successfully stopped SparkContext
18/11/20 03:50:09 INFO ShutdownHookManager: Shutdown hook called
18/11/20 03:50:09 INFO ShutdownHookManager: Deleting directory /mnt/resource/hadoop/yarn/local/usercache/livy/appcache/application_1540794334804_0259/spark-c0e2bc03-11ee-44eb-bb03-5c7b4d961844
18/11/20 03:50:09 INFO MetricsSystemImpl: Stopping azure-file-system metrics system...
18/11/20 03:50:09 INFO MetricsSinkAdapter: azurefs2 thread interrupted.
18/11/20 03:50:09 INFO MetricsSystemImpl: azure-file-system metrics system stopped.
18/11/20 03:50:09 INFO MetricsSystemImpl: azure-file-system metrics system shutdown complete.
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/usr/hdp/2.6.5.3003-25/spark2/jars/slf4j-log4j12-1.7.16.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/hdp/2.6.5.3003-25/hadoop/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/hdp/2.6.5.3003-25/spark_llap/spark-llap-assembly-1.0.0.2.6.5.3003-25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
18/11/20 03:50:11 INFO SignalUtils: Registered signal handler for TERM
18/11/20 03:50:11 INFO SignalUtils: Registered signal handler for HUP
18/11/20 03:50:11 INFO SignalUtils: Registered signal handler for INT
18/11/20 03:50:12 INFO SecurityManager: Changing view acls to: yarn,livy
18/11/20 03:50:12 INFO SecurityManager: Changing modify acls to: yarn,livy
18/11/20 03:50:12 INFO SecurityManager: Changing view acls groups to: 
18/11/20 03:50:12 INFO SecurityManager: Changing modify acls groups to: 
18/11/20 03:50:12 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users  with view permissions: Set(yarn, livy); groups with view permissions: Set(); users  with modify permissions: Set(yarn, livy); groups with modify permissions: Set()
18/11/20 03:50:12 INFO ApplicationMaster: Preparing Local resources
18/11/20 03:50:13 INFO MetricsConfig: loaded properties from hadoop-metrics2-azure-file-system.properties
18/11/20 03:50:13 INFO WasbAzureIaasSink: Init starting.
18/11/20 03:50:13 INFO AzureIaasSink: Init starting. Initializing MdsLogger.
18/11/20 03:50:13 INFO AzureIaasSink: Init completed.
18/11/20 03:50:13 INFO WasbAzureIaasSink: Init completed.
18/11/20 03:50:13 INFO MetricsSinkAdapter: Sink azurefs2 started
18/11/20 03:50:13 INFO MetricsSystemImpl: Scheduled snapshot period at 60 second(s).
18/11/20 03:50:13 INFO MetricsSystemImpl: azure-file-system metrics system started
18/11/20 03:50:13 INFO ApplicationMaster: ApplicationAttemptId: appattempt_1540794334804_0259_000003
18/11/20 03:50:13 INFO ApplicationMaster: Starting the user application in a separate Thread
18/11/20 03:50:13 INFO ApplicationMaster: Waiting for spark context initialization...
18/11/20 03:50:13 INFO SparkContext: Running Spark version 2.3.0.2.6.5.3003-25
18/11/20 03:50:13 INFO SparkContext: Submitted application: ReadSStreamDemo
18/11/20 03:50:13 INFO SecurityManager: Changing view acls to: yarn,livy
18/11/20 03:50:13 INFO SecurityManager: Changing modify acls to: yarn,livy
18/11/20 03:50:13 INFO SecurityManager: Changing view acls groups to: 
18/11/20 03:50:13 INFO SecurityManager: Changing modify acls groups to: 
18/11/20 03:50:13 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users  with view permissions: Set(yarn, livy); groups with view permissions: Set(); users  with modify permissions: Set(yarn, livy); groups with modify permissions: Set()
18/11/20 03:50:13 INFO Utils: Successfully started service 'sparkDriver' on port 42155.
18/11/20 03:50:13 INFO SparkEnv: Registering MapOutputTracker
18/11/20 03:50:13 INFO SparkEnv: Registering BlockManagerMaster
18/11/20 03:50:13 INFO BlockManagerMasterEndpoint: Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information
18/11/20 03:50:13 INFO BlockManagerMasterEndpoint: BlockManagerMasterEndpoint up
18/11/20 03:50:13 INFO DiskBlockManager: Created local directory at /mnt/resource/hadoop/yarn/local/usercache/livy/appcache/application_1540794334804_0259/blockmgr-a5c696cd-baa2-43de-9f3d-14fce227b8f0
18/11/20 03:50:13 INFO MemoryStore: MemoryStore started with capacity 2004.6 MB
18/11/20 03:50:14 INFO SparkEnv: Registering OutputCommitCoordinator
18/11/20 03:50:14 INFO log: Logging initialized @3227ms
18/11/20 03:50:14 INFO JettyUtils: Adding filter: org.apache.hadoop.yarn.server.webproxy.amfilter.AmIpFilter
18/11/20 03:50:14 INFO Server: jetty-9.3.z-SNAPSHOT
18/11/20 03:50:14 INFO Server: Started @3377ms
18/11/20 03:50:14 INFO AbstractConnector: Started ServerConnector@20231448{HTTP/1.1,[http/1.1]}{0.0.0.0:37365}
18/11/20 03:50:14 INFO Utils: Successfully started service 'SparkUI' on port 37365.
18/11/20 03:50:14 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@2d42cbe9{/jobs,null,AVAILABLE,@Spark}
18/11/20 03:50:14 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@83f3f43{/jobs/json,null,AVAILABLE,@Spark}
18/11/20 03:50:14 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@5714da27{/jobs/job,null,AVAILABLE,@Spark}
18/11/20 03:50:14 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@120ebfb4{/jobs/job/json,null,AVAILABLE,@Spark}
18/11/20 03:50:14 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@633a5f91{/stages,null,AVAILABLE,@Spark}
18/11/20 03:50:14 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@120e6433{/stages/json,null,AVAILABLE,@Spark}
18/11/20 03:50:14 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@3e3371cd{/stages/stage,null,AVAILABLE,@Spark}
18/11/20 03:50:14 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@91e5e17{/stages/stage/json,null,AVAILABLE,@Spark}
18/11/20 03:50:14 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@2c06fbe0{/stages/pool,null,AVAILABLE,@Spark}
18/11/20 03:50:14 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@4911065a{/stages/pool/json,null,AVAILABLE,@Spark}
18/11/20 03:50:14 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@25ab59ba{/storage,null,AVAILABLE,@Spark}
18/11/20 03:50:14 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@5ebe55fe{/storage/json,null,AVAILABLE,@Spark}
18/11/20 03:50:14 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@45fc6c32{/storage/rdd,null,AVAILABLE,@Spark}
18/11/20 03:50:14 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@49e9101{/storage/rdd/json,null,AVAILABLE,@Spark}
18/11/20 03:50:14 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@7b877ea1{/environment,null,AVAILABLE,@Spark}
18/11/20 03:50:14 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@57bac58a{/environment/json,null,AVAILABLE,@Spark}
18/11/20 03:50:14 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@7181aeb3{/executors,null,AVAILABLE,@Spark}
18/11/20 03:50:14 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@1673f1b{/executors/json,null,AVAILABLE,@Spark}
18/11/20 03:50:14 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@41553be0{/executors/threadDump,null,AVAILABLE,@Spark}
18/11/20 03:50:14 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@404230d5{/executors/threadDump/json,null,AVAILABLE,@Spark}
18/11/20 03:50:14 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@3c4aea58{/static,null,AVAILABLE,@Spark}
18/11/20 03:50:14 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@27b1a866{/,null,AVAILABLE,@Spark}
18/11/20 03:50:14 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@7b41e97b{/api,null,AVAILABLE,@Spark}
18/11/20 03:50:14 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@7a66f7b{/jobs/job/kill,null,AVAILABLE,@Spark}
18/11/20 03:50:14 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@6a4bb293{/stages/stage/kill,null,AVAILABLE,@Spark}
18/11/20 03:50:14 INFO SparkUI: Bound SparkUI to 0.0.0.0, and started at http://wn1-spark2.l01rfxocadnetnrq0tupemyb3b.cx.internal.cloudapp.net:37365
18/11/20 03:50:14 INFO YarnClusterScheduler: Created YarnClusterScheduler
18/11/20 03:50:14 INFO SchedulerExtensionServices: Starting Yarn extension services with app application_1540794334804_0259 and attemptId Some(appattempt_1540794334804_0259_000003)
18/11/20 03:50:14 INFO Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 33179.
18/11/20 03:50:14 INFO NettyBlockTransferService: Server created on wn1-spark2.l01rfxocadnetnrq0tupemyb3b.cx.internal.cloudapp.net:33179
18/11/20 03:50:14 INFO BlockManager: Using org.apache.spark.storage.RandomBlockReplicationPolicy for block replication policy
18/11/20 03:50:14 INFO BlockManagerMaster: Registering BlockManager BlockManagerId(driver, wn1-spark2.l01rfxocadnetnrq0tupemyb3b.cx.internal.cloudapp.net, 33179, None)
18/11/20 03:50:14 INFO BlockManagerMasterEndpoint: Registering block manager wn1-spark2.l01rfxocadnetnrq0tupemyb3b.cx.internal.cloudapp.net:33179 with 2004.6 MB RAM, BlockManagerId(driver, wn1-spark2.l01rfxocadnetnrq0tupemyb3b.cx.internal.cloudapp.net, 33179, None)
18/11/20 03:50:14 INFO BlockManagerMaster: Registered BlockManager BlockManagerId(driver, wn1-spark2.l01rfxocadnetnrq0tupemyb3b.cx.internal.cloudapp.net, 33179, None)
18/11/20 03:50:14 INFO BlockManager: Initialized BlockManager: BlockManagerId(driver, wn1-spark2.l01rfxocadnetnrq0tupemyb3b.cx.internal.cloudapp.net, 33179, None)
18/11/20 03:50:14 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@1fd98f8a{/metrics/json,null,AVAILABLE,@Spark}
18/11/20 03:50:14 INFO EventLoggingListener: Logging events to wasb:/hdp/spark2-events/application_1540794334804_0259_3
18/11/20 03:50:14 INFO EnhancementSparkListener: Enhancement listener is enabled
18/11/20 03:50:14 INFO SparkContext: Registered listener com.microsoft.hdinsight.spark.metrics.SparkMetricsListener
18/11/20 03:50:14 INFO SparkContext: Registered listener org.apache.spark.sql.scheduler.EnhancementSparkListener
18/11/20 03:50:15 INFO ApplicationMaster: 
===============================================================================
YARN executor launch context:
  env:
    CLASSPATH -> {{PWD}}<CPS>{{PWD}}/__spark_conf__<CPS>{{PWD}}/__spark_libs__/*<CPS>/usr/hdp/current/spark2-client/jars/*<CPS>$HADOOP_CONF_DIR<CPS>/usr/hdp/current/hadoop-client/*<CPS>/usr/hdp/current/hadoop-client/lib/*<CPS>/usr/hdp/current/hadoop-hdfs-client/*<CPS>/usr/hdp/current/hadoop-hdfs-client/lib/*<CPS>/usr/hdp/current/hadoop-yarn-client/*<CPS>/usr/hdp/current/hadoop-yarn-client/lib/*<CPS>$PWD/mr-framework/hadoop/share/hadoop/mapreduce/*:$PWD/mr-framework/hadoop/share/hadoop/mapreduce/lib/*:$PWD/mr-framework/hadoop/share/hadoop/common/*:$PWD/mr-framework/hadoop/share/hadoop/common/lib/*:$PWD/mr-framework/hadoop/share/hadoop/yarn/*:$PWD/mr-framework/hadoop/share/hadoop/yarn/lib/*:$PWD/mr-framework/hadoop/share/hadoop/hdfs/*:$PWD/mr-framework/hadoop/share/hadoop/hdfs/lib/*:$PWD/mr-framework/hadoop/share/hadoop/tools/lib/*:/usr/hdp/2.6.5.3003-25/hadoop/lib/hadoop-lzo-0.6.0.2.6.5.3003-25.jar:/etc/hadoop/conf/secure:/usr/hdp/current/ext/hadoop/*<CPS>:/usr/hdp/current/spark2-client/jars/*:/usr/lib/hdinsight-datalake/*:/usr/hdp/current/spark_llap/*:/usr/hdp/current/spark2-client/conf:<CPS>{{PWD}}/__spark_conf__/__hadoop_conf__
    SPARK_DIST_CLASSPATH -> :/usr/hdp/current/spark2-client/jars/*:/usr/lib/hdinsight-datalake/*:/usr/hdp/current/spark_llap/*:/usr/hdp/current/spark2-client/conf:
    SPARK_YARN_STAGING_DIR -> *********(redacted)
    SPARK_USER -> *********(redacted)

  command:
    LD_LIBRARY_PATH="/usr/hdp/current/hadoop-client/lib/native:/usr/hdp/current/hadoop-client/lib/native/Linux-amd64-64:$LD_LIBRARY_PATH" \ 
      {{JAVA_HOME}}/bin/java \ 
      -server \ 
      -Xmx4096m \ 
      '-Dhdp.version=' \ 
      '-Detwlogger.component=sparkexecutor' \ 
      '-DlogFilter.filename=SparkLogFilters.xml' \ 
      '-DpatternGroup.filename=SparkPatternGroups.xml' \ 
      '-Dlog4jspark.root.logger=INFO,console,RFA,ETW,Anonymizer' \ 
      '-Dlog4jspark.log.dir=/var/log/sparkapp/\${user.name}' \ 
      '-Dlog4jspark.log.file=sparkexecutor.log' \ 
      '-Dlog4j.configuration=file:/usr/hdp/current/spark2-client/conf/log4j.properties' \ 
      '-Djavax.xml.parsers.SAXParserFactory=com.sun.org.apache.xerces.internal.jaxp.SAXParserFactoryImpl' \ 
      '-XX:+UseParallelGC' \ 
      '-XX:+UseParallelOldGC' \ 
      -Djava.io.tmpdir={{PWD}}/tmp \ 
      '-Dspark.history.ui.port=18080' \ 
      -Dspark.yarn.app.container.log.dir=<LOG_DIR> \ 
      -XX:OnOutOfMemoryError='kill %p' \ 
      org.apache.spark.executor.CoarseGrainedExecutorBackend \ 
      --driver-url \ 
      spark://CoarseGrainedScheduler@wn1-spark2.l01rfxocadnetnrq0tupemyb3b.cx.internal.cloudapp.net:42155 \ 
      --executor-id \ 
      <executorId> \ 
      --hostname \ 
      <hostname> \ 
      --cores \ 
      1 \ 
      --app-id \ 
      application_1540794334804_0259 \ 
      --user-class-path \ 
      file:$PWD/__app__.jar \ 
      1><LOG_DIR>/stdout \ 
      2><LOG_DIR>/stderr

  resources:
    __app__.jar -> resource { scheme: "wasb" host: "catestsa.blob.core.windows.net" port: -1 file: "/user/livy/.sparkStaging/application_1540794334804_0259/default_artifact.jar" userInfo: "spark23hdinsight-2018-10-29t06-07-10-410z" } size: 69301 timestamp: 1542685785000 type: FILE visibility: PRIVATE
    __spark_conf__ -> resource { scheme: "wasb" host: "catestsa.blob.core.windows.net" port: -1 file: "/user/livy/.sparkStaging/application_1540794334804_0259/__spark_conf__.zip" userInfo: "spark23hdinsight-2018-10-29t06-07-10-410z" } size: 259876 timestamp: 1542685786000 type: ARCHIVE visibility: PRIVATE

===============================================================================
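Worth noting from the launch context above: the executor's user classpath is just the submitted artifact (`--user-class-path file:$PWD/__app__.jar`) on top of cluster-wide directories such as `/usr/lib/hdinsight-datalake/*`, so a custom data-source jar has to land in one of those places to be loadable. A quick sketch (assumptions: a live SparkSession `spark` inside the submitted job, Spark 2.3 on Java 8, where executor classloaders are `URLClassLoader`s) to confirm what the executors can actually see:

```scala
// Sketch only: dump what is actually on each executor's classpath, which is
// useful for checking whether a third-party data-source jar (e.g. an sstream
// interop jar) was shipped with the job. Not part of the repro script.
val classpathEntries = spark.sparkContext
  .parallelize(1 to 2, 2)
  .mapPartitions { _ =>
    Thread.currentThread().getContextClassLoader match {
      case u: java.net.URLClassLoader => u.getURLs.map(_.toString).iterator
      case _ => Iterator(System.getProperty("java.class.path"))
    }
  }
  .distinct()
  .collect()
classpathEntries.foreach(println)
```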
18/11/20 03:50:15 INFO YarnRMClient: Registering the ApplicationMaster
18/11/20 03:50:15 INFO RequestHedgingRMFailoverProxyProvider: Looking for the active RM in [rm1, rm2]...
18/11/20 03:50:15 INFO RequestHedgingRMFailoverProxyProvider: Found active RM [rm2]
18/11/20 03:50:15 INFO YarnAllocator: Will request 5 executor container(s), each with 1 core(s) and 4480 MB memory (including 384 MB of overhead)
18/11/20 03:50:15 INFO YarnSchedulerBackend$YarnSchedulerEndpoint: ApplicationMaster registered as NettyRpcEndpointRef(spark://YarnAM@wn1-spark2.l01rfxocadnetnrq0tupemyb3b.cx.internal.cloudapp.net:42155)
18/11/20 03:50:15 INFO YarnAllocator: Submitted 5 unlocalized container requests.
18/11/20 03:50:15 INFO ApplicationMaster: Started progress reporter thread with (heartbeat : 5000, initial allocation : 200) intervals
18/11/20 03:50:15 INFO AMRMClientImpl: Received new token for : wn1-spark2.l01rfxocadnetnrq0tupemyb3b.cx.internal.cloudapp.net:30050
18/11/20 03:50:15 INFO AMRMClientImpl: Received new token for : wn0-spark2.l01rfxocadnetnrq0tupemyb3b.cx.internal.cloudapp.net:30050
18/11/20 03:50:15 INFO YarnAllocator: Launching container container_e02_1540794334804_0259_03_000002 on host wn1-spark2.l01rfxocadnetnrq0tupemyb3b.cx.internal.cloudapp.net for executor with ID 1
18/11/20 03:50:15 INFO YarnAllocator: Launching container container_e02_1540794334804_0259_03_000003 on host wn0-spark2.l01rfxocadnetnrq0tupemyb3b.cx.internal.cloudapp.net for executor with ID 2
18/11/20 03:50:15 INFO YarnAllocator: Received 2 containers from YARN, launching executors on 2 of them.
18/11/20 03:50:15 INFO ContainerManagementProtocolProxy: yarn.client.max-cached-nodemanagers-proxies : 0
18/11/20 03:50:15 INFO ContainerManagementProtocolProxy: yarn.client.max-cached-nodemanagers-proxies : 0
18/11/20 03:50:16 INFO ContainerManagementProtocolProxy: Opening proxy : wn1-spark2.l01rfxocadnetnrq0tupemyb3b.cx.internal.cloudapp.net:30050
18/11/20 03:50:16 INFO ContainerManagementProtocolProxy: Opening proxy : wn0-spark2.l01rfxocadnetnrq0tupemyb3b.cx.internal.cloudapp.net:30050
18/11/20 03:50:18 INFO YarnAllocator: Launching container container_e02_1540794334804_0259_03_000004 on host wn1-spark2.l01rfxocadnetnrq0tupemyb3b.cx.internal.cloudapp.net for executor with ID 3
18/11/20 03:50:18 INFO YarnAllocator: Launching container container_e02_1540794334804_0259_03_000005 on host wn0-spark2.l01rfxocadnetnrq0tupemyb3b.cx.internal.cloudapp.net for executor with ID 4
18/11/20 03:50:18 INFO YarnAllocator: Launching container container_e02_1540794334804_0259_03_000006 on host wn1-spark2.l01rfxocadnetnrq0tupemyb3b.cx.internal.cloudapp.net for executor with ID 5
18/11/20 03:50:18 INFO ContainerManagementProtocolProxy: yarn.client.max-cached-nodemanagers-proxies : 0
18/11/20 03:50:18 INFO YarnAllocator: Received 3 containers from YARN, launching executors on 3 of them.
18/11/20 03:50:18 INFO ContainerManagementProtocolProxy: yarn.client.max-cached-nodemanagers-proxies : 0
18/11/20 03:50:18 INFO ContainerManagementProtocolProxy: yarn.client.max-cached-nodemanagers-proxies : 0
18/11/20 03:50:18 INFO ContainerManagementProtocolProxy: Opening proxy : wn1-spark2.l01rfxocadnetnrq0tupemyb3b.cx.internal.cloudapp.net:30050
18/11/20 03:50:18 INFO ContainerManagementProtocolProxy: Opening proxy : wn1-spark2.l01rfxocadnetnrq0tupemyb3b.cx.internal.cloudapp.net:30050
18/11/20 03:50:18 INFO ContainerManagementProtocolProxy: Opening proxy : wn0-spark2.l01rfxocadnetnrq0tupemyb3b.cx.internal.cloudapp.net:30050
18/11/20 03:50:18 INFO YarnSchedulerBackend$YarnDriverEndpoint: Registered executor NettyRpcEndpointRef(spark-client://Executor) (10.0.0.6:51960) with ID 2
18/11/20 03:50:18 INFO YarnSchedulerBackend$YarnDriverEndpoint: Registered executor NettyRpcEndpointRef(spark-client://Executor) (10.0.0.12:53986) with ID 1
18/11/20 03:50:18 INFO BlockManagerMasterEndpoint: Registering block manager wn0-spark2.l01rfxocadnetnrq0tupemyb3b.cx.internal.cloudapp.net:36959 with 2004.6 MB RAM, BlockManagerId(2, wn0-spark2.l01rfxocadnetnrq0tupemyb3b.cx.internal.cloudapp.net, 36959, None)
18/11/20 03:50:18 INFO BlockManagerMasterEndpoint: Registering block manager wn1-spark2.l01rfxocadnetnrq0tupemyb3b.cx.internal.cloudapp.net:38707 with 2004.6 MB RAM, BlockManagerId(1, wn1-spark2.l01rfxocadnetnrq0tupemyb3b.cx.internal.cloudapp.net, 38707, None)
18/11/20 03:50:21 INFO YarnSchedulerBackend$YarnDriverEndpoint: Registered executor NettyRpcEndpointRef(spark-client://Executor) (10.0.0.6:51978) with ID 4
18/11/20 03:50:21 INFO BlockManagerMasterEndpoint: Registering block manager wn0-spark2.l01rfxocadnetnrq0tupemyb3b.cx.internal.cloudapp.net:44937 with 2004.6 MB RAM, BlockManagerId(4, wn0-spark2.l01rfxocadnetnrq0tupemyb3b.cx.internal.cloudapp.net, 44937, None)
18/11/20 03:50:21 INFO YarnSchedulerBackend$YarnDriverEndpoint: Registered executor NettyRpcEndpointRef(spark-client://Executor) (10.0.0.12:54016) with ID 5
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/usr/hdp/2.6.5.3003-25/spark2/jars/slf4j-log4j12-1.7.16.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/hdp/2.6.5.3003-25/hadoop/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/hdp/2.6.5.3003-25/spark_llap/spark-llap-assembly-1.0.0.2.6.5.3003-25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
18/11/20 03:50:24 INFO SignalUtils: Registered signal handler for TERM
18/11/20 03:50:24 INFO SignalUtils: Registered signal handler for HUP
18/11/20 03:50:24 INFO SignalUtils: Registered signal handler for INT
18/11/20 03:50:24 INFO SecurityManager: Changing view acls to: yarn,livy
18/11/20 03:50:24 INFO SecurityManager: Changing modify acls to: yarn,livy
18/11/20 03:50:24 INFO SecurityManager: Changing view acls groups to: 
18/11/20 03:50:24 INFO SecurityManager: Changing modify acls groups to: 
18/11/20 03:50:24 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users  with view permissions: Set(yarn, livy); groups with view permissions: Set(); users  with modify permissions: Set(yarn, livy); groups with modify permissions: Set()
18/11/20 03:50:25 INFO ApplicationMaster: Preparing Local resources
18/11/20 03:50:26 INFO MetricsConfig: loaded properties from hadoop-metrics2-azure-file-system.properties
18/11/20 03:50:26 INFO WasbAzureIaasSink: Init starting.
18/11/20 03:50:26 INFO AzureIaasSink: Init starting. Initializing MdsLogger.
18/11/20 03:50:26 INFO AzureIaasSink: Init completed.
18/11/20 03:50:26 INFO WasbAzureIaasSink: Init completed.
18/11/20 03:50:26 INFO MetricsSinkAdapter: Sink azurefs2 started
18/11/20 03:50:26 INFO MetricsSystemImpl: Scheduled snapshot period at 60 second(s).
18/11/20 03:50:26 INFO MetricsSystemImpl: azure-file-system metrics system started
18/11/20 03:50:26 INFO ApplicationMaster: ApplicationAttemptId: appattempt_1540794334804_0259_000004
18/11/20 03:50:26 INFO ApplicationMaster: Starting the user application in a separate Thread
18/11/20 03:50:26 INFO ApplicationMaster: Waiting for spark context initialization...
18/11/20 03:50:26 INFO SparkContext: Running Spark version 2.3.0.2.6.5.3003-25
18/11/20 03:50:26 INFO SparkContext: Submitted application: ReadSStreamDemo
18/11/20 03:50:26 INFO SecurityManager: Changing view acls to: yarn,livy
18/11/20 03:50:26 INFO SecurityManager: Changing modify acls to: yarn,livy
18/11/20 03:50:26 INFO SecurityManager: Changing view acls groups to: 
18/11/20 03:50:26 INFO SecurityManager: Changing modify acls groups to: 
18/11/20 03:50:26 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users  with view permissions: Set(yarn, livy); groups with view permissions: Set(); users  with modify permissions: Set(yarn, livy); groups with modify permissions: Set()
18/11/20 03:50:26 INFO Utils: Successfully started service 'sparkDriver' on port 45061.
18/11/20 03:50:26 INFO SparkEnv: Registering MapOutputTracker
18/11/20 03:50:26 INFO SparkEnv: Registering BlockManagerMaster
18/11/20 03:50:26 INFO BlockManagerMasterEndpoint: Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information
18/11/20 03:50:26 INFO BlockManagerMasterEndpoint: BlockManagerMasterEndpoint up
18/11/20 03:50:26 INFO DiskBlockManager: Created local directory at /mnt/resource/hadoop/yarn/local/usercache/livy/appcache/application_1540794334804_0259/blockmgr-0ee9b165-5872-41e7-8847-4630e411f51a
18/11/20 03:50:26 INFO MemoryStore: MemoryStore started with capacity 2004.6 MB
18/11/20 03:50:26 INFO SparkEnv: Registering OutputCommitCoordinator
18/11/20 03:50:27 INFO log: Logging initialized @3275ms
18/11/20 03:50:27 INFO JettyUtils: Adding filter: org.apache.hadoop.yarn.server.webproxy.amfilter.AmIpFilter
18/11/20 03:50:27 INFO Server: jetty-9.3.z-SNAPSHOT
18/11/20 03:50:27 INFO Server: Started @3408ms
18/11/20 03:50:27 INFO AbstractConnector: Started ServerConnector@66281876{HTTP/1.1,[http/1.1]}{0.0.0.0:34009}
18/11/20 03:50:27 INFO Utils: Successfully started service 'SparkUI' on port 34009.
18/11/20 03:50:27 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@1ff766bc{/jobs,null,AVAILABLE,@Spark}
18/11/20 03:50:27 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@55757592{/jobs/json,null,AVAILABLE,@Spark}
18/11/20 03:50:27 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@5925621e{/jobs/job,null,AVAILABLE,@Spark}
18/11/20 03:50:27 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@4cbce37a{/jobs/job/json,null,AVAILABLE,@Spark}
18/11/20 03:50:27 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@329bc69a{/stages,null,AVAILABLE,@Spark}
18/11/20 03:50:27 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@40dc6745{/stages/json,null,AVAILABLE,@Spark}
18/11/20 03:50:27 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@7a9ce8a6{/stages/stage,null,AVAILABLE,@Spark}
18/11/20 03:50:27 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@1e166eb2{/stages/stage/json,null,AVAILABLE,@Spark}
18/11/20 03:50:27 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@3d27c0e{/stages/pool,null,AVAILABLE,@Spark}
18/11/20 03:50:27 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@518de24b{/stages/pool/json,null,AVAILABLE,@Spark}
18/11/20 03:50:27 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@727dfb38{/storage,null,AVAILABLE,@Spark}
18/11/20 03:50:27 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@5f33683b{/storage/json,null,AVAILABLE,@Spark}
18/11/20 03:50:27 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@4f115ddf{/storage/rdd,null,AVAILABLE,@Spark}
18/11/20 03:50:27 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@713071cc{/storage/rdd/json,null,AVAILABLE,@Spark}
18/11/20 03:50:27 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@6c09f0e9{/environment,null,AVAILABLE,@Spark}
18/11/20 03:50:27 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@28bf3fe3{/environment/json,null,AVAILABLE,@Spark}
18/11/20 03:50:27 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@6d85708e{/executors,null,AVAILABLE,@Spark}
18/11/20 03:50:27 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@1f49d2e3{/executors/json,null,AVAILABLE,@Spark}
18/11/20 03:50:27 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@3ce4e75b{/executors/threadDump,null,AVAILABLE,@Spark}
18/11/20 03:50:27 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@6df58703{/executors/threadDump/json,null,AVAILABLE,@Spark}
18/11/20 03:50:27 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@2bb28b33{/static,null,AVAILABLE,@Spark}
18/11/20 03:50:27 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@4a813a8b{/,null,AVAILABLE,@Spark}
18/11/20 03:50:27 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@6c7237b0{/api,null,AVAILABLE,@Spark}
18/11/20 03:50:27 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@4bb7f2ff{/jobs/job/kill,null,AVAILABLE,@Spark}
18/11/20 03:50:27 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@6f49ed1d{/stages/stage/kill,null,AVAILABLE,@Spark}
18/11/20 03:50:27 INFO SparkUI: Bound SparkUI to 0.0.0.0, and started at http://wn0-spark2.l01rfxocadnetnrq0tupemyb3b.cx.internal.cloudapp.net:34009
18/11/20 03:50:27 INFO YarnClusterScheduler: Created YarnClusterScheduler
18/11/20 03:50:27 INFO SchedulerExtensionServices: Starting Yarn extension services with app application_1540794334804_0259 and attemptId Some(appattempt_1540794334804_0259_000004)
18/11/20 03:50:27 INFO Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 35131.
18/11/20 03:50:27 INFO NettyBlockTransferService: Server created on wn0-spark2.l01rfxocadnetnrq0tupemyb3b.cx.internal.cloudapp.net:35131
18/11/20 03:50:27 INFO BlockManager: Using org.apache.spark.storage.RandomBlockReplicationPolicy for block replication policy
18/11/20 03:50:27 INFO BlockManagerMaster: Registering BlockManager BlockManagerId(driver, wn0-spark2.l01rfxocadnetnrq0tupemyb3b.cx.internal.cloudapp.net, 35131, None)
18/11/20 03:50:27 INFO BlockManagerMasterEndpoint: Registering block manager wn0-spark2.l01rfxocadnetnrq0tupemyb3b.cx.internal.cloudapp.net:35131 with 2004.6 MB RAM, BlockManagerId(driver, wn0-spark2.l01rfxocadnetnrq0tupemyb3b.cx.internal.cloudapp.net, 35131, None)
18/11/20 03:50:27 INFO BlockManagerMaster: Registered BlockManager BlockManagerId(driver, wn0-spark2.l01rfxocadnetnrq0tupemyb3b.cx.internal.cloudapp.net, 35131, None)
18/11/20 03:50:27 INFO BlockManager: Initialized BlockManager: BlockManagerId(driver, wn0-spark2.l01rfxocadnetnrq0tupemyb3b.cx.internal.cloudapp.net, 35131, None)
18/11/20 03:50:27 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@6a713da5{/metrics/json,null,AVAILABLE,@Spark}
18/11/20 03:50:27 INFO EventLoggingListener: Logging events to wasb:/hdp/spark2-events/application_1540794334804_0259_4
18/11/20 03:50:27 INFO EnhancementSparkListener: Enhancement listener is enabled
18/11/20 03:50:27 INFO SparkContext: Registered listener com.microsoft.hdinsight.spark.metrics.SparkMetricsListener
18/11/20 03:50:27 INFO SparkContext: Registered listener org.apache.spark.sql.scheduler.EnhancementSparkListener
18/11/20 03:50:28 INFO ApplicationMaster: 
===============================================================================
YARN executor launch context:
  env:
    CLASSPATH -> {{PWD}}<CPS>{{PWD}}/__spark_conf__<CPS>{{PWD}}/__spark_libs__/*<CPS>/usr/hdp/current/spark2-client/jars/*<CPS>$HADOOP_CONF_DIR<CPS>/usr/hdp/current/hadoop-client/*<CPS>/usr/hdp/current/hadoop-client/lib/*<CPS>/usr/hdp/current/hadoop-hdfs-client/*<CPS>/usr/hdp/current/hadoop-hdfs-client/lib/*<CPS>/usr/hdp/current/hadoop-yarn-client/*<CPS>/usr/hdp/current/hadoop-yarn-client/lib/*<CPS>$PWD/mr-framework/hadoop/share/hadoop/mapreduce/*:$PWD/mr-framework/hadoop/share/hadoop/mapreduce/lib/*:$PWD/mr-framework/hadoop/share/hadoop/common/*:$PWD/mr-framework/hadoop/share/hadoop/common/lib/*:$PWD/mr-framework/hadoop/share/hadoop/yarn/*:$PWD/mr-framework/hadoop/share/hadoop/yarn/lib/*:$PWD/mr-framework/hadoop/share/hadoop/hdfs/*:$PWD/mr-framework/hadoop/share/hadoop/hdfs/lib/*:$PWD/mr-framework/hadoop/share/hadoop/tools/lib/*:/usr/hdp/2.6.5.3003-25/hadoop/lib/hadoop-lzo-0.6.0.2.6.5.3003-25.jar:/etc/hadoop/conf/secure:/usr/hdp/current/ext/hadoop/*<CPS>:/usr/hdp/current/spark2-client/jars/*:/usr/lib/hdinsight-datalake/*:/usr/hdp/current/spark_llap/*:/usr/hdp/current/spark2-client/conf:<CPS>{{PWD}}/__spark_conf__/__hadoop_conf__
    SPARK_DIST_CLASSPATH -> :/usr/hdp/current/spark2-client/jars/*:/usr/lib/hdinsight-datalake/*:/usr/hdp/current/spark_llap/*:/usr/hdp/current/spark2-client/conf:
    SPARK_YARN_STAGING_DIR -> *********(redacted)
    SPARK_USER -> *********(redacted)

  command:
    LD_LIBRARY_PATH="/usr/hdp/current/hadoop-client/lib/native:/usr/hdp/current/hadoop-client/lib/native/Linux-amd64-64:$LD_LIBRARY_PATH" \ 
      {{JAVA_HOME}}/bin/java \ 
      -server \ 
      -Xmx4096m \ 
      '-Dhdp.version=' \ 
      '-Detwlogger.component=sparkexecutor' \ 
      '-DlogFilter.filename=SparkLogFilters.xml' \ 
      '-DpatternGroup.filename=SparkPatternGroups.xml' \ 
      '-Dlog4jspark.root.logger=INFO,console,RFA,ETW,Anonymizer' \ 
      '-Dlog4jspark.log.dir=/var/log/sparkapp/\${user.name}' \ 
      '-Dlog4jspark.log.file=sparkexecutor.log' \ 
      '-Dlog4j.configuration=file:/usr/hdp/current/spark2-client/conf/log4j.properties' \ 
      '-Djavax.xml.parsers.SAXParserFactory=com.sun.org.apache.xerces.internal.jaxp.SAXParserFactoryImpl' \ 
      '-XX:+UseParallelGC' \ 
      '-XX:+UseParallelOldGC' \ 
      -Djava.io.tmpdir={{PWD}}/tmp \ 
      '-Dspark.history.ui.port=18080' \ 
      -Dspark.yarn.app.container.log.dir=<LOG_DIR> \ 
      -XX:OnOutOfMemoryError='kill %p' \ 
      org.apache.spark.executor.CoarseGrainedExecutorBackend \ 
      --driver-url \ 
      spark://CoarseGrainedScheduler@wn0-spark2.l01rfxocadnetnrq0tupemyb3b.cx.internal.cloudapp.net:45061 \ 
      --executor-id \ 
      <executorId> \ 
      --hostname \ 
      <hostname> \ 
      --cores \ 
      1 \ 
      --app-id \ 
      application_1540794334804_0259 \ 
      --user-class-path \ 
      file:$PWD/__app__.jar \ 
      1><LOG_DIR>/stdout \ 
      2><LOG_DIR>/stderr

  resources:
    __app__.jar -> resource { scheme: "wasb" host: "catestsa.blob.core.windows.net" port: -1 file: "/user/livy/.sparkStaging/application_1540794334804_0259/default_artifact.jar" userInfo: "spark23hdinsight-2018-10-29t06-07-10-410z" } size: 69301 timestamp: 1542685785000 type: FILE visibility: PRIVATE
    __spark_conf__ -> resource { scheme: "wasb" host: "catestsa.blob.core.windows.net" port: -1 file: "/user/livy/.sparkStaging/application_1540794334804_0259/__spark_conf__.zip" userInfo: "spark23hdinsight-2018-10-29t06-07-10-410z" } size: 259876 timestamp: 1542685786000 type: ARCHIVE visibility: PRIVATE

===============================================================================
18/11/20 03:50:28 INFO YarnRMClient: Registering the ApplicationMaster
18/11/20 03:50:28 INFO RequestHedgingRMFailoverProxyProvider: Looking for the active RM in [rm1, rm2]...
18/11/20 03:50:28 INFO RequestHedgingRMFailoverProxyProvider: Found active RM [rm2]
18/11/20 03:50:28 INFO YarnAllocator: Will request 5 executor container(s), each with 1 core(s) and 4480 MB memory (including 384 MB of overhead)
18/11/20 03:50:28 INFO YarnSchedulerBackend$YarnSchedulerEndpoint: ApplicationMaster registered as NettyRpcEndpointRef(spark://YarnAM@wn0-spark2.l01rfxocadnetnrq0tupemyb3b.cx.internal.cloudapp.net:45061)
18/11/20 03:50:28 INFO YarnAllocator: Submitted 5 unlocalized container requests.
18/11/20 03:50:28 INFO ApplicationMaster: Started progress reporter thread with (heartbeat : 5000, initial allocation : 200) intervals
18/11/20 03:50:28 INFO AMRMClientImpl: Received new token for : wn0-spark2.l01rfxocadnetnrq0tupemyb3b.cx.internal.cloudapp.net:30050
18/11/20 03:50:28 INFO AMRMClientImpl: Received new token for : wn1-spark2.l01rfxocadnetnrq0tupemyb3b.cx.internal.cloudapp.net:30050
18/11/20 03:50:29 INFO YarnAllocator: Launching container container_e02_1540794334804_0259_04_000002 on host wn0-spark2.l01rfxocadnetnrq0tupemyb3b.cx.internal.cloudapp.net for executor with ID 1
18/11/20 03:50:29 INFO YarnAllocator: Launching container container_e02_1540794334804_0259_04_000003 on host wn1-spark2.l01rfxocadnetnrq0tupemyb3b.cx.internal.cloudapp.net for executor with ID 2
18/11/20 03:50:29 INFO YarnAllocator: Received 2 containers from YARN, launching executors on 2 of them.
18/11/20 03:50:29 INFO ContainerManagementProtocolProxy: yarn.client.max-cached-nodemanagers-proxies : 0
18/11/20 03:50:29 INFO ContainerManagementProtocolProxy: yarn.client.max-cached-nodemanagers-proxies : 0
18/11/20 03:50:29 INFO ContainerManagementProtocolProxy: Opening proxy : wn1-spark2.l01rfxocadnetnrq0tupemyb3b.cx.internal.cloudapp.net:30050
18/11/20 03:50:29 INFO ContainerManagementProtocolProxy: Opening proxy : wn0-spark2.l01rfxocadnetnrq0tupemyb3b.cx.internal.cloudapp.net:30050
18/11/20 03:50:29 INFO YarnAllocator: Launching container container_e02_1540794334804_0259_04_000004 on host wn0-spark2.l01rfxocadnetnrq0tupemyb3b.cx.internal.cloudapp.net for executor with ID 3
18/11/20 03:50:29 INFO YarnAllocator: Launching container container_e02_1540794334804_0259_04_000005 on host wn1-spark2.l01rfxocadnetnrq0tupemyb3b.cx.internal.cloudapp.net for executor with ID 4
18/11/20 03:50:29 INFO ContainerManagementProtocolProxy: yarn.client.max-cached-nodemanagers-proxies : 0
18/11/20 03:50:29 INFO YarnAllocator: Received 2 containers from YARN, launching executors on 2 of them.
18/11/20 03:50:29 INFO ContainerManagementProtocolProxy: yarn.client.max-cached-nodemanagers-proxies : 0
18/11/20 03:50:29 INFO ContainerManagementProtocolProxy: Opening proxy : wn0-spark2.l01rfxocadnetnrq0tupemyb3b.cx.internal.cloudapp.net:30050
18/11/20 03:50:29 INFO ContainerManagementProtocolProxy: Opening proxy : wn1-spark2.l01rfxocadnetnrq0tupemyb3b.cx.internal.cloudapp.net:30050
18/11/20 03:50:31 INFO YarnAllocator: Launching container container_e02_1540794334804_0259_04_000006 on host wn0-spark2.l01rfxocadnetnrq0tupemyb3b.cx.internal.cloudapp.net for executor with ID 5
18/11/20 03:50:31 INFO YarnAllocator: Received 2 containers from YARN, launching executors on 1 of them.
18/11/20 03:50:31 INFO ContainerManagementProtocolProxy: yarn.client.max-cached-nodemanagers-proxies : 0
18/11/20 03:50:31 INFO ContainerManagementProtocolProxy: Opening proxy : wn0-spark2.l01rfxocadnetnrq0tupemyb3b.cx.internal.cloudapp.net:30050
18/11/20 03:50:31 INFO YarnSchedulerBackend$YarnDriverEndpoint: Registered executor NettyRpcEndpointRef(spark-client://Executor) (10.0.0.12:35514) with ID 2
18/11/20 03:50:31 INFO YarnSchedulerBackend$YarnDriverEndpoint: Registered executor NettyRpcEndpointRef(spark-client://Executor) (10.0.0.6:51376) with ID 1
18/11/20 03:50:31 INFO BlockManagerMasterEndpoint: Registering block manager wn1-spark2.l01rfxocadnetnrq0tupemyb3b.cx.internal.cloudapp.net:42619 with 2004.6 MB RAM, BlockManagerId(2, wn1-spark2.l01rfxocadnetnrq0tupemyb3b.cx.internal.cloudapp.net, 42619, None)
18/11/20 03:50:31 INFO BlockManagerMasterEndpoint: Registering block manager wn0-spark2.l01rfxocadnetnrq0tupemyb3b.cx.internal.cloudapp.net:33393 with 2004.6 MB RAM, BlockManagerId(1, wn0-spark2.l01rfxocadnetnrq0tupemyb3b.cx.internal.cloudapp.net, 33393, None)
18/11/20 03:50:32 INFO YarnSchedulerBackend$YarnDriverEndpoint: Registered executor NettyRpcEndpointRef(spark-client://Executor) (10.0.0.12:35518) with ID 4
18/11/20 03:50:32 INFO YarnSchedulerBackend$YarnDriverEndpoint: Registered executor NettyRpcEndpointRef(spark-client://Executor) (10.0.0.6:51382) with ID 3
18/11/20 03:50:32 INFO BlockManagerMasterEndpoint: Registering block manager wn1-spark2.l01rfxocadnetnrq0tupemyb3b.cx.internal.cloudapp.net:45229 with 2004.6 MB RAM, BlockManagerId(4, wn1-spark2.l01rfxocadnetnrq0tupemyb3b.cx.internal.cloudapp.net, 45229, None)
18/11/20 03:50:32 INFO YarnClusterSchedulerBackend: SchedulerBackend is ready for scheduling beginning after reached minRegisteredResourcesRatio: 0.8
18/11/20 03:50:32 INFO YarnClusterScheduler: YarnClusterScheduler.postStartHook done
18/11/20 03:50:32 INFO BlockManagerMasterEndpoint: Registering block manager wn0-spark2.l01rfxocadnetnrq0tupemyb3b.cx.internal.cloudapp.net:36889 with 2004.6 MB RAM, BlockManagerId(3, wn0-spark2.l01rfxocadnetnrq0tupemyb3b.cx.internal.cloudapp.net, 36889, None)
18/11/20 03:50:32 INFO SharedState: loading hive config file: file:/etc/spark2/2.6.5.3003-25/0/hive-site.xml
18/11/20 03:50:32 INFO SharedState: Setting hive.metastore.warehouse.dir ('null') to the value of spark.sql.warehouse.dir ('/hive/warehouse').
18/11/20 03:50:32 INFO SharedState: Warehouse path is '/hive/warehouse'.
18/11/20 03:50:32 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@75b01f7a{/SQL,null,AVAILABLE,@Spark}
18/11/20 03:50:32 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@5f1dea69{/SQL/json,null,AVAILABLE,@Spark}
18/11/20 03:50:32 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@7fa4836d{/SQL/execution,null,AVAILABLE,@Spark}
18/11/20 03:50:32 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@3c09906c{/SQL/execution/json,null,AVAILABLE,@Spark}
18/11/20 03:50:32 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@5f7f3423{/static/sql,null,AVAILABLE,@Spark}
18/11/20 03:50:33 INFO StateStoreCoordinatorRef: Registered StateStoreCoordinator endpoint
18/11/20 03:50:33 ERROR ApplicationMaster: User class threw exception: java.lang.ClassNotFoundException: Failed to find data source: sstream. Please find packages at http://spark.apache.org/third-party-projects.html
java.lang.ClassNotFoundException: Failed to find data source: sstream. Please find packages at http://spark.apache.org/third-party-projects.html
    at org.apache.spark.sql.execution.datasources.DataSource$.lookupDataSource(DataSource.scala:635)
    at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:190)
    at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:174)
    at sample.ReadSStream$.main(ReadSStream.scala:6)
    at sample.ReadSStream.main(ReadSStream.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.spark.deploy.yarn.ApplicationMaster$$anon$4.run(ApplicationMaster.scala:721)
Caused by: java.lang.ClassNotFoundException: sstream.DefaultSource
    at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
    at org.apache.spark.sql.execution.datasources.DataSource$$anonfun$23$$anonfun$apply$15.apply(DataSource.scala:618)
    at org.apache.spark.sql.execution.datasources.DataSource$$anonfun$23$$anonfun$apply$15.apply(DataSource.scala:618)
    at scala.util.Try$.apply(Try.scala:192)
    at org.apache.spark.sql.execution.datasources.DataSource$$anonfun$23.apply(DataSource.scala:618)
    at org.apache.spark.sql.execution.datasources.DataSource$$anonfun$23.apply(DataSource.scala:618)
    at scala.util.Try.orElse(Try.scala:84)
    at org.apache.spark.sql.execution.datasources.DataSource$.lookupDataSource(DataSource.scala:618)
    ... 9 more
18/11/20 03:50:33 INFO ApplicationMaster: Final app status: FAILED, exitCode: 15, (reason: User class threw exception: java.lang.ClassNotFoundException: Failed to find data source: sstream. Please find packages at http://spark.apache.org/third-party-projects.html
    at org.apache.spark.sql.execution.datasources.DataSource$.lookupDataSource(DataSource.scala:635)
    at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:190)
    at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:174)
    at sample.ReadSStream$.main(ReadSStream.scala:6)
    at sample.ReadSStream.main(ReadSStream.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.spark.deploy.yarn.ApplicationMaster$$anon$4.run(ApplicationMaster.scala:721)
Caused by: java.lang.ClassNotFoundException: sstream.DefaultSource
    at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
    at org.apache.spark.sql.execution.datasources.DataSource$$anonfun$23$$anonfun$apply$15.apply(DataSource.scala:618)
    at org.apache.spark.sql.execution.datasources.DataSource$$anonfun$23$$anonfun$apply$15.apply(DataSource.scala:618)
    at scala.util.Try$.apply(Try.scala:192)
    at org.apache.spark.sql.execution.datasources.DataSource$$anonfun$23.apply(DataSource.scala:618)
    at org.apache.spark.sql.execution.datasources.DataSource$$anonfun$23.apply(DataSource.scala:618)
    at scala.util.Try.orElse(Try.scala:84)
    at org.apache.spark.sql.execution.datasources.DataSource$.lookupDataSource(DataSource.scala:618)
    ... 9 more
)
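The failure itself is clear from the trace above: `DataSource.lookupDataSource` cannot map the short name `sstream` to any registered provider and falls back to loading `sstream.DefaultSource`, which is missing from this attempt's classpath. Roughly, Spark 2.3 resolves a format name like this (a simplified sketch, not the exact implementation — the real lookup also matches fully-qualified class names and several suffix variants):

```scala
import java.util.ServiceLoader
import scala.collection.JavaConverters._
import org.apache.spark.sql.sources.DataSourceRegister

// Simplified sketch of Spark 2.3's DataSource.lookupDataSource: try the
// registered short names first, then fall back to "<name>.DefaultSource".
// When neither resolves, this is the ClassNotFoundException logged above.
def lookupDataSource(provider: String): Class[_] = {
  val loader = Thread.currentThread().getContextClassLoader
  val byShortName = ServiceLoader
    .load(classOf[DataSourceRegister], loader)
    .asScala
    .find(_.shortName().equalsIgnoreCase(provider))
  byShortName match {
    case Some(register) => register.getClass
    case None =>
      try Class.forName(provider + ".DefaultSource", true, loader)
      catch {
        case _: ClassNotFoundException =>
          throw new ClassNotFoundException(
            s"Failed to find data source: $provider. Please find packages at " +
              "http://spark.apache.org/third-party-projects.html")
      }
  }
}
```

So the likely fix is to make the interop jar visible to the YARN attempt — for example by listing it in the job's referenced jars (Livy's `jars` field), so YARN localizes it next to `__app__.jar` — rather than assuming it is already under `/usr/lib/hdinsight-datalake/*` or the other cluster-wide classpath directories shown in the launch context.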
18/11/20 03:50:33 INFO SparkContext: Invoking stop() from shutdown hook
18/11/20 03:50:33 INFO AbstractConnector: Stopped Spark@66281876{HTTP/1.1,[http/1.1]}{0.0.0.0:0}
18/11/20 03:50:33 INFO SparkUI: Stopped Spark web UI at http://wn0-spark2.l01rfxocadnetnrq0tupemyb3b.cx.internal.cloudapp.net:34009
18/11/20 03:50:33 INFO metrics: type=GAUGE, name=application_1540794334804_0259.driver.BlockManager.disk.diskSpaceUsed_MB, value=0
18/11/20 03:50:33 INFO metrics: type=GAUGE, name=application_1540794334804_0259.driver.BlockManager.memory.maxMem_MB, value=10022
18/11/20 03:50:33 INFO metrics: type=GAUGE, name=application_1540794334804_0259.driver.BlockManager.memory.maxOffHeapMem_MB, value=0
18/11/20 03:50:33 INFO metrics: type=GAUGE, name=application_1540794334804_0259.driver.BlockManager.memory.maxOnHeapMem_MB, value=10022
18/11/20 03:50:33 INFO metrics: type=GAUGE, name=application_1540794334804_0259.driver.BlockManager.memory.memUsed_MB, value=0
18/11/20 03:50:33 INFO metrics: type=GAUGE, name=application_1540794334804_0259.driver.BlockManager.memory.offHeapMemUsed_MB, value=0
18/11/20 03:50:33 INFO metrics: type=GAUGE, name=application_1540794334804_0259.driver.BlockManager.memory.onHeapMemUsed_MB, value=0
18/11/20 03:50:33 INFO metrics: type=GAUGE, name=application_1540794334804_0259.driver.BlockManager.memory.remainingMem_MB, value=10022
18/11/20 03:50:33 INFO metrics: type=GAUGE, name=application_1540794334804_0259.driver.BlockManager.memory.remainingOffHeapMem_MB, value=0
18/11/20 03:50:33 INFO metrics: type=GAUGE, name=application_1540794334804_0259.driver.BlockManager.memory.remainingOnHeapMem_MB, value=10022
18/11/20 03:50:33 INFO metrics: type=GAUGE, name=application_1540794334804_0259.driver.DAGScheduler.job.activeJobs, value=0
18/11/20 03:50:33 INFO metrics: type=GAUGE, name=application_1540794334804_0259.driver.DAGScheduler.job.allJobs, value=0
18/11/20 03:50:33 INFO metrics: type=GAUGE, name=application_1540794334804_0259.driver.DAGScheduler.stage.failedStages, value=0
18/11/20 03:50:33 INFO metrics: type=GAUGE, name=application_1540794334804_0259.driver.DAGScheduler.stage.runningStages, value=0
18/11/20 03:50:33 INFO metrics: type=GAUGE, name=application_1540794334804_0259.driver.DAGScheduler.stage.waitingStages, value=0
18/11/20 03:50:33 INFO metrics: type=GAUGE, name=application_1540794334804_0259.driver.LiveListenerBus.queue.appStatus.size, value=0
18/11/20 03:50:33 INFO metrics: type=GAUGE, name=application_1540794334804_0259.driver.LiveListenerBus.queue.eventLog.size, value=0
18/11/20 03:50:33 INFO metrics: type=GAUGE, name=application_1540794334804_0259.driver.LiveListenerBus.queue.executorManagement.size, value=0
18/11/20 03:50:33 INFO metrics: type=GAUGE, name=application_1540794334804_0259.driver.LiveListenerBus.queue.shared.size, value=0
18/11/20 03:50:33 INFO metrics: type=GAUGE, name=application_1540794334804_0259.driver.jvm.PS-MarkSweep.count, value=2
18/11/20 03:50:33 INFO metrics: type=GAUGE, name=application_1540794334804_0259.driver.jvm.PS-MarkSweep.time, value=51
18/11/20 03:50:33 INFO metrics: type=GAUGE, name=application_1540794334804_0259.driver.jvm.PS-Scavenge.count, value=5
18/11/20 03:50:33 INFO metrics: type=GAUGE, name=application_1540794334804_0259.driver.jvm.PS-Scavenge.time, value=82
18/11/20 03:50:33 INFO metrics: type=GAUGE, name=application_1540794334804_0259.driver.jvm.direct.capacity, value=74337
18/11/20 03:50:33 INFO metrics: type=GAUGE, name=application_1540794334804_0259.driver.jvm.direct.count, value=16
18/11/20 03:50:33 INFO metrics: type=GAUGE, name=application_1540794334804_0259.driver.jvm.direct.used, value=74338
18/11/20 03:50:33 INFO metrics: type=GAUGE, name=application_1540794334804_0259.driver.jvm.heap.committed, value=825229312
18/11/20 03:50:33 INFO metrics: type=GAUGE, name=application_1540794334804_0259.driver.jvm.heap.init, value=924844032
18/11/20 03:50:33 INFO metrics: type=GAUGE, name=application_1540794334804_0259.driver.jvm.heap.max, value=3817865216
18/11/20 03:50:33 INFO metrics: type=GAUGE, name=application_1540794334804_0259.driver.jvm.heap.usage, value=0.09233195517816835
18/11/20 03:50:33 INFO metrics: type=GAUGE, name=application_1540794334804_0259.driver.jvm.heap.used, value=352510960
18/11/20 03:50:33 INFO metrics: type=GAUGE, name=application_1540794334804_0259.driver.jvm.mapped.capacity, value=0
18/11/20 03:50:33 INFO metrics: type=GAUGE, name=application_1540794334804_0259.driver.jvm.mapped.count, value=0
18/11/20 03:50:33 INFO metrics: type=GAUGE, name=application_1540794334804_0259.driver.jvm.mapped.used, value=0
18/11/20 03:50:33 INFO metrics: type=GAUGE, name=application_1540794334804_0259.driver.jvm.non-heap.committed, value=75579392
18/11/20 03:50:33 INFO metrics: type=GAUGE, name=application_1540794334804_0259.driver.jvm.non-heap.init, value=2555904
18/11/20 03:50:33 INFO metrics: type=GAUGE, name=application_1540794334804_0259.driver.jvm.non-heap.max, value=-1
18/11/20 03:50:33 INFO metrics: type=GAUGE, name=application_1540794334804_0259.driver.jvm.non-heap.usage, value=-7.32548E7
18/11/20 03:50:33 INFO metrics: type=GAUGE, name=application_1540794334804_0259.driver.jvm.non-heap.used, value=73255336
18/11/20 03:50:33 INFO metrics: type=GAUGE, name=application_1540794334804_0259.driver.jvm.pools.Code-Cache.committed, value=10747904
18/11/20 03:50:33 INFO metrics: type=GAUGE, name=application_1540794334804_0259.driver.jvm.pools.Code-Cache.init, value=2555904
18/11/20 03:50:33 INFO metrics: type=GAUGE, name=application_1540794334804_0259.driver.jvm.pools.Code-Cache.max, value=251658240
18/11/20 03:50:33 INFO metrics: type=GAUGE, name=application_1540794334804_0259.driver.jvm.pools.Code-Cache.usage, value=0.03895238240559896
18/11/20 03:50:33 INFO metrics: type=GAUGE, name=application_1540794334804_0259.driver.jvm.pools.Code-Cache.used, value=9802688
18/11/20 03:50:33 INFO metrics: type=GAUGE, name=application_1540794334804_0259.driver.jvm.pools.Compressed-Class-Space.committed, value=7905280
18/11/20 03:50:33 INFO metrics: type=GAUGE, name=application_1540794334804_0259.driver.jvm.pools.Compressed-Class-Space.init, value=0
18/11/20 03:50:33 INFO metrics: type=GAUGE, name=application_1540794334804_0259.driver.jvm.pools.Compressed-Class-Space.max, value=1073741824
18/11/20 03:50:33 INFO metrics: type=GAUGE, name=application_1540794334804_0259.driver.jvm.pools.Compressed-Class-Space.usage, value=0.007152199745178223
18/11/20 03:50:33 INFO metrics: type=GAUGE, name=application_1540794334804_0259.driver.jvm.pools.Compressed-Class-Space.used, value=7679616
18/11/20 03:50:33 INFO metrics: type=GAUGE, name=application_1540794334804_0259.driver.jvm.pools.Metaspace.committed, value=56926208
18/11/20 03:50:33 INFO metrics: type=GAUGE, name=application_1540794334804_0259.driver.jvm.pools.Metaspace.init, value=0
18/11/20 03:50:33 INFO metrics: type=GAUGE, name=application_1540794334804_0259.driver.jvm.pools.Metaspace.max, value=-1
18/11/20 03:50:33 INFO metrics: type=GAUGE, name=application_1540794334804_0259.driver.jvm.pools.Metaspace.usage, value=0.979972809711829
18/11/20 03:50:33 INFO metrics: type=GAUGE, name=application_1540794334804_0259.driver.jvm.pools.Metaspace.used, value=55787384
18/11/20 03:50:33 INFO metrics: type=GAUGE, name=application_1540794334804_0259.driver.jvm.pools.PS-Eden-Space.committed, value=384303104
18/11/20 03:50:33 INFO metrics: type=GAUGE, name=application_1540794334804_0259.driver.jvm.pools.PS-Eden-Space.init, value=231735296
18/11/20 03:50:33 INFO metrics: type=GAUGE, name=application_1540794334804_0259.driver.jvm.pools.PS-Eden-Space.max, value=1323302912
18/11/20 03:50:33 INFO metrics: type=GAUGE, name=application_1540794334804_0259.driver.jvm.pools.PS-Eden-Space.usage, value=0.19502428783289794
18/11/20 03:50:33 INFO metrics: type=GAUGE, name=application_1540794334804_0259.driver.jvm.pools.PS-Eden-Space.used, value=258076208
18/11/20 03:50:33 INFO metrics: type=GAUGE, name=application_1540794334804_0259.driver.jvm.pools.PS-Old-Gen.committed, value=402653184
18/11/20 03:50:33 INFO metrics: type=GAUGE, name=application_1540794334804_0259.driver.jvm.pools.PS-Old-Gen.init, value=616562688
18/11/20 03:50:33 INFO metrics: type=GAUGE, name=application_1540794334804_0259.driver.jvm.pools.PS-Old-Gen.max, value=2863661056
18/11/20 03:50:33 INFO metrics: type=GAUGE, name=application_1540794334804_0259.driver.jvm.pools.PS-Old-Gen.usage, value=0.020038086518574354
18/11/20 03:50:33 INFO metrics: type=GAUGE, name=application_1540794334804_0259.driver.jvm.pools.PS-Old-Gen.used, value=57382288
18/11/20 03:50:33 INFO metrics: type=GAUGE, name=application_1540794334804_0259.driver.jvm.pools.PS-Survivor-Space.committed, value=38273024
18/11/20 03:50:33 INFO metrics: type=GAUGE, name=application_1540794334804_0259.driver.jvm.pools.PS-Survivor-Space.init, value=38273024
18/11/20 03:50:33 INFO metrics: type=GAUGE, name=application_1540794334804_0259.driver.jvm.pools.PS-Survivor-Space.max, value=38273024
18/11/20 03:50:33 INFO metrics: type=GAUGE, name=application_1540794334804_0259.driver.jvm.pools.PS-Survivor-Space.usage, value=0.999831108197774
18/11/20 03:50:33 INFO metrics: type=GAUGE, name=application_1540794334804_0259.driver.jvm.pools.PS-Survivor-Space.used, value=38266560
18/11/20 03:50:33 INFO metrics: type=GAUGE, name=application_1540794334804_0259.driver.jvm.total.committed, value=900808704
18/11/20 03:50:33 INFO metrics: type=GAUGE, name=application_1540794334804_0259.driver.jvm.total.init, value=927399936
18/11/20 03:50:33 INFO metrics: type=GAUGE, name=application_1540794334804_0259.driver.jvm.total.max, value=3817865215
18/11/20 03:50:33 INFO metrics: type=GAUGE, name=application_1540794334804_0259.driver.jvm.total.used, value=427900864
18/11/20 03:50:33 INFO metrics: type=COUNTER, name=application_1540794334804_0259.driver.HiveExternalCatalog.fileCacheHits, count=0
18/11/20 03:50:33 INFO metrics: type=COUNTER, name=application_1540794334804_0259.driver.HiveExternalCatalog.filesDiscovered, count=0
18/11/20 03:50:33 INFO metrics: type=COUNTER, name=application_1540794334804_0259.driver.HiveExternalCatalog.hiveClientCalls, count=0
18/11/20 03:50:33 INFO metrics: type=COUNTER, name=application_1540794334804_0259.driver.HiveExternalCatalog.parallelListingJobCount, count=0
18/11/20 03:50:33 INFO metrics: type=COUNTER, name=application_1540794334804_0259.driver.HiveExternalCatalog.partitionsFetched, count=0
18/11/20 03:50:33 INFO metrics: type=COUNTER, name=application_1540794334804_0259.driver.LiveListenerBus.numEventsPosted, count=12
18/11/20 03:50:33 INFO metrics: type=COUNTER, name=application_1540794334804_0259.driver.LiveListenerBus.queue.appStatus.numDroppedEvents, count=0
18/11/20 03:50:33 INFO metrics: type=COUNTER, name=application_1540794334804_0259.driver.LiveListenerBus.queue.eventLog.numDroppedEvents, count=0
18/11/20 03:50:33 INFO metrics: type=COUNTER, name=application_1540794334804_0259.driver.LiveListenerBus.queue.executorManagement.numDroppedEvents, count=0
18/11/20 03:50:33 INFO metrics: type=COUNTER, name=application_1540794334804_0259.driver.LiveListenerBus.queue.shared.numDroppedEvents, count=0
18/11/20 03:50:33 INFO metrics: type=HISTOGRAM, name=application_1540794334804_0259.driver.CodeGenerator.compilationTime, count=0, min=0, max=0, mean=0.0, stddev=0.0, median=0.0, p75=0.0, p95=0.0, p98=0.0, p99=0.0, p999=0.0
18/11/20 03:50:33 INFO metrics: type=HISTOGRAM, name=application_1540794334804_0259.driver.CodeGenerator.generatedClassSize, count=0, min=0, max=0, mean=0.0, stddev=0.0, median=0.0, p75=0.0, p95=0.0, p98=0.0, p99=0.0, p999=0.0
18/11/20 03:50:33 INFO metrics: type=HISTOGRAM, name=application_1540794334804_0259.driver.CodeGenerator.generatedMethodSize, count=0, min=0, max=0, mean=0.0, stddev=0.0, median=0.0, p75=0.0, p95=0.0, p98=0.0, p99=0.0, p999=0.0
18/11/20 03:50:33 INFO metrics: type=HISTOGRAM, name=application_1540794334804_0259.driver.CodeGenerator.sourceCodeSize, count=0, min=0, max=0, mean=0.0, stddev=0.0, median=0.0, p75=0.0, p95=0.0, p98=0.0, p99=0.0, p999=0.0
18/11/20 03:50:33 INFO metrics: type=TIMER, name=application_1540794334804_0259.driver.DAGScheduler.messageProcessingTime, count=4, min=0.0042, max=4.330515999999999, mean=1.0785740594670652, stddev=1.8681222804739996, median=0.0063, p75=0.0063, p95=4.330515999999999, p98=4.330515999999999, p99=4.330515999999999, p999=4.330515999999999, mean_rate=0.6700803622587382, m1=0.4, m5=0.4, m15=0.4, rate_unit=events/second, duration_unit=milliseconds
18/11/20 03:50:33 INFO metrics: type=TIMER, name=application_1540794334804_0259.driver.LiveListenerBus.listenerProcessingTime.com.microsoft.hdinsight.spark.metrics.SparkMetricsListener, count=12, min=0.211001, max=23.379589, mean=2.698954001808535, stddev=6.163973331168745, median=0.268601, p75=1.783107, p95=23.379589, p98=23.379589, p99=23.379589, p999=23.379589, mean_rate=2.2161491756671907, m1=2.1999999999999997, m5=2.1999999999999997, m15=2.1999999999999997, rate_unit=events/second, duration_unit=milliseconds
18/11/20 03:50:33 INFO metrics: type=TIMER, name=application_1540794334804_0259.driver.LiveListenerBus.listenerProcessingTime.org.apache.spark.HeartbeatReceiver, count=12, min=0.0031, max=14.260753999999999, mean=1.3960531365564102, stddev=3.813565904793036, median=0.0121, p75=0.123, p95=14.260753999999999, p98=14.260753999999999, p99=14.260753999999999, p999=14.260753999999999, mean_rate=1.9930145653299465, m1=1.4, m5=1.4, m15=1.4, rate_unit=events/second, duration_unit=milliseconds
18/11/20 03:50:33 INFO metrics: type=TIMER, name=application_1540794334804_0259.driver.LiveListenerBus.listenerProcessingTime.org.apache.spark.scheduler.EventLoggingListener, count=12, min=0.26770099999999997, max=25.668598, mean=5.0566926077075465, stddev=8.716174717553924, median=0.330101, p75=3.6961139999999997, p95=25.668598, p98=25.668598, p99=25.668598, p999=25.668598, mean_rate=2.209271469594465, m1=2.1999999999999997, m5=2.1999999999999997, m15=2.1999999999999997, rate_unit=events/second, duration_unit=milliseconds
18/11/20 03:50:33 INFO metrics: type=TIMER, name=application_1540794334804_0259.driver.LiveListenerBus.listenerProcessingTime.org.apache.spark.sql.scheduler.EnhancementSparkListener, count=12, min=0.0019, max=1.000504, mean=0.08401916802913656, stddev=0.26864335752095003, median=0.0023, p75=0.0026999999999999997, p95=1.000504, p98=1.000504, p99=1.000504, p999=1.000504, mean_rate=2.2159939885032145, m1=2.1999999999999997, m5=2.1999999999999997, m15=2.1999999999999997, rate_unit=events/second, duration_unit=milliseconds
18/11/20 03:50:33 INFO metrics: type=TIMER, name=application_1540794334804_0259.driver.LiveListenerBus.listenerProcessingTime.org.apache.spark.status.AppStatusListener, count=12, min=0.0812, max=41.671259, mean=4.464243009282402, stddev=11.204265350873184, median=0.1011, p75=0.686202, p95=41.671259, p98=41.671259, p99=41.671259, p999=41.671259, mean_rate=1.7715497897735524, m1=0.6, m5=0.6, m15=0.6, rate_unit=events/second, duration_unit=milliseconds
18/11/20 03:50:33 INFO metrics: type=TIMER, name=application_1540794334804_0259.driver.LiveListenerBus.queue.appStatus.listenerProcessingTime, count=12, min=0.1175, max=41.761659, mean=4.540388913781883, stddev=11.206402667792057, median=0.158301, p75=1.004804, p95=41.761659, p98=41.761659, p99=41.761659, p999=41.761659, mean_rate=1.7708204119549984, m1=0.6, m5=0.6, m15=0.6, rate_unit=events/second, duration_unit=milliseconds
18/11/20 03:50:33 INFO metrics: type=TIMER, name=application_1540794334804_0259.driver.LiveListenerBus.queue.eventLog.listenerProcessingTime, count=12, min=0.295101, max=25.761798, mean=5.096317469040512, stddev=8.730299095627391, median=0.368402, p75=3.728014, p95=25.761798, p98=25.761798, p99=25.761798, p999=25.761798, mean_rate=2.2082565461651824, m1=2.1999999999999997, m5=2.1999999999999997, m15=2.1999999999999997, rate_unit=events/second, duration_unit=milliseconds
18/11/20 03:50:33 INFO metrics: type=TIMER, name=application_1540794334804_0259.driver.LiveListenerBus.queue.executorManagement.listenerProcessingTime, count=12, min=0.050501, max=16.651963, mean=1.672426505818177, stddev=4.426608385897497, median=0.075901, p75=0.653602, p95=16.651963, p98=16.651963, p99=16.651963, p999=16.651963, mean_rate=1.9918775320910067, m1=1.4, m5=1.4, m15=1.4, rate_unit=events/second, duration_unit=milliseconds
18/11/20 03:50:33 INFO metrics: type=TIMER, name=application_1540794334804_0259.driver.LiveListenerBus.queue.shared.listenerProcessingTime, count=12, min=0.263501, max=23.444689999999998, mean=2.8438458199746366, stddev=6.180094718664762, median=0.335401, p75=1.835907, p95=23.444689999999998, p98=23.444689999999998, p99=23.444689999999998, p999=23.444689999999998, mean_rate=2.2143999936697685, m1=2.1999999999999997, m5=2.1999999999999997, m15=2.1999999999999997, rate_unit=events/second, duration_unit=milliseconds
18/11/20 03:50:33 INFO YarnAllocator: Driver requested a total number of 0 executor(s).
18/11/20 03:50:33 INFO YarnClusterSchedulerBackend: Shutting down all executors
18/11/20 03:50:33 INFO YarnSchedulerBackend$YarnDriverEndpoint: Asking each executor to shut down
18/11/20 03:50:33 INFO SchedulerExtensionServices: Stopping SchedulerExtensionServices
(serviceOption=None,
 services=List(),
 started=false)
18/11/20 03:50:33 INFO MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
18/11/20 03:50:33 INFO MemoryStore: MemoryStore cleared
18/11/20 03:50:33 INFO BlockManager: BlockManager stopped
18/11/20 03:50:33 INFO BlockManagerMaster: BlockManagerMaster stopped
18/11/20 03:50:33 INFO OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
18/11/20 03:50:33 INFO SparkContext: Successfully stopped SparkContext
18/11/20 03:50:33 INFO ShutdownHookManager: Shutdown hook called
18/11/20 03:50:33 INFO ShutdownHookManager: Deleting directory /mnt/resource/hadoop/yarn/local/usercache/livy/appcache/application_1540794334804_0259/spark-32973863-df31-4ff4-a7de-2e5118f23670
18/11/20 03:50:33 INFO MetricsSystemImpl: Stopping azure-file-system metrics system...
18/11/20 03:50:33 INFO MetricsSinkAdapter: azurefs2 thread interrupted.
18/11/20 03:50:33 INFO MetricsSystemImpl: azure-file-system metrics system stopped.
18/11/20 03:50:33 INFO MetricsSystemImpl: azure-file-system metrics system shutdown complete.
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/usr/hdp/2.6.5.3003-25/spark2/jars/slf4j-log4j12-1.7.16.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/hdp/2.6.5.3003-25/hadoop/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/hdp/2.6.5.3003-25/spark_llap/spark-llap-assembly-1.0.0.2.6.5.3003-25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
18/11/20 03:50:36 INFO SignalUtils: Registered signal handler for TERM
18/11/20 03:50:36 INFO SignalUtils: Registered signal handler for HUP
18/11/20 03:50:36 INFO SignalUtils: Registered signal handler for INT
18/11/20 03:50:36 INFO SecurityManager: Changing view acls to: yarn,livy
18/11/20 03:50:36 INFO SecurityManager: Changing modify acls to: yarn,livy
18/11/20 03:50:36 INFO SecurityManager: Changing view acls groups to: 
18/11/20 03:50:36 INFO SecurityManager: Changing modify acls groups to: 
18/11/20 03:50:36 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users  with view permissions: Set(yarn, livy); groups with view permissions: Set(); users  with modify permissions: Set(yarn, livy); groups with modify permissions: Set()
18/11/20 03:50:36 INFO ApplicationMaster: Preparing Local resources
18/11/20 03:50:37 INFO MetricsConfig: loaded properties from hadoop-metrics2-azure-file-system.properties
18/11/20 03:50:37 INFO WasbAzureIaasSink: Init starting.
18/11/20 03:50:37 INFO AzureIaasSink: Init starting. Initializing MdsLogger.
18/11/20 03:50:37 INFO AzureIaasSink: Init completed.
18/11/20 03:50:37 INFO WasbAzureIaasSink: Init completed.
18/11/20 03:50:37 INFO MetricsSinkAdapter: Sink azurefs2 started
18/11/20 03:50:37 INFO MetricsSystemImpl: Scheduled snapshot period at 60 second(s).
18/11/20 03:50:37 INFO MetricsSystemImpl: azure-file-system metrics system started
18/11/20 03:50:37 INFO ApplicationMaster: ApplicationAttemptId: appattempt_1540794334804_0259_000005
18/11/20 03:50:37 INFO ApplicationMaster: Starting the user application in a separate Thread
18/11/20 03:50:37 INFO ApplicationMaster: Waiting for spark context initialization...
18/11/20 03:50:37 INFO SparkContext: Running Spark version 2.3.0.2.6.5.3003-25
18/11/20 03:50:37 INFO SparkContext: Submitted application: ReadSStreamDemo
18/11/20 03:50:37 INFO SecurityManager: Changing view acls to: yarn,livy
18/11/20 03:50:37 INFO SecurityManager: Changing modify acls to: yarn,livy
18/11/20 03:50:37 INFO SecurityManager: Changing view acls groups to: 
18/11/20 03:50:37 INFO SecurityManager: Changing modify acls groups to: 
18/11/20 03:50:37 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users  with view permissions: Set(yarn, livy); groups with view permissions: Set(); users  with modify permissions: Set(yarn, livy); groups with modify permissions: Set()
18/11/20 03:50:38 INFO Utils: Successfully started service 'sparkDriver' on port 33285.
18/11/20 03:50:38 INFO SparkEnv: Registering MapOutputTracker
18/11/20 03:50:38 INFO SparkEnv: Registering BlockManagerMaster
18/11/20 03:50:38 INFO BlockManagerMasterEndpoint: Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information
18/11/20 03:50:38 INFO BlockManagerMasterEndpoint: BlockManagerMasterEndpoint up
18/11/20 03:50:38 INFO DiskBlockManager: Created local directory at /mnt/resource/hadoop/yarn/local/usercache/livy/appcache/application_1540794334804_0259/blockmgr-a40d749b-6e1f-4da1-8de2-f7024f97c2df
18/11/20 03:50:38 INFO MemoryStore: MemoryStore started with capacity 2004.6 MB
18/11/20 03:50:38 INFO SparkEnv: Registering OutputCommitCoordinator
18/11/20 03:50:38 INFO log: Logging initialized @3220ms
18/11/20 03:50:38 INFO JettyUtils: Adding filter: org.apache.hadoop.yarn.server.webproxy.amfilter.AmIpFilter
18/11/20 03:50:38 INFO Server: jetty-9.3.z-SNAPSHOT
18/11/20 03:50:38 INFO Server: Started @3367ms
18/11/20 03:50:38 INFO AbstractConnector: Started ServerConnector@17f4f6cf{HTTP/1.1,[http/1.1]}{0.0.0.0:39053}
18/11/20 03:50:38 INFO Utils: Successfully started service 'SparkUI' on port 39053.
18/11/20 03:50:38 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@57264874{/jobs,null,AVAILABLE,@Spark}
18/11/20 03:50:38 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@210de84d{/jobs/json,null,AVAILABLE,@Spark}
18/11/20 03:50:38 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@53b78a69{/jobs/job,null,AVAILABLE,@Spark}
18/11/20 03:50:38 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@63fc1c38{/jobs/job/json,null,AVAILABLE,@Spark}
18/11/20 03:50:38 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@2dfddf8a{/stages,null,AVAILABLE,@Spark}
18/11/20 03:50:38 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@4276fc9e{/stages/json,null,AVAILABLE,@Spark}
18/11/20 03:50:38 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@3ccad78d{/stages/stage,null,AVAILABLE,@Spark}
18/11/20 03:50:38 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@9f1191f{/stages/stage/json,null,AVAILABLE,@Spark}
18/11/20 03:50:38 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@63edc113{/stages/pool,null,AVAILABLE,@Spark}
18/11/20 03:50:38 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@7c29970f{/stages/pool/json,null,AVAILABLE,@Spark}
18/11/20 03:50:38 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@34372ee5{/storage,null,AVAILABLE,@Spark}
18/11/20 03:50:38 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@350ff09d{/storage/json,null,AVAILABLE,@Spark}
18/11/20 03:50:38 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@38674a76{/storage/rdd,null,AVAILABLE,@Spark}
18/11/20 03:50:38 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@846239a{/storage/rdd/json,null,AVAILABLE,@Spark}
18/11/20 03:50:38 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@58b6471{/environment,null,AVAILABLE,@Spark}
18/11/20 03:50:38 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@4f4ae745{/environment/json,null,AVAILABLE,@Spark}
18/11/20 03:50:38 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@4dfc3020{/executors,null,AVAILABLE,@Spark}
18/11/20 03:50:38 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@749f90f6{/executors/json,null,AVAILABLE,@Spark}
18/11/20 03:50:38 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@2ae9daf8{/executors/threadDump,null,AVAILABLE,@Spark}
18/11/20 03:50:38 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@3201702f{/executors/threadDump/json,null,AVAILABLE,@Spark}
18/11/20 03:50:38 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@1ed03b7f{/static,null,AVAILABLE,@Spark}
18/11/20 03:50:38 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@c84bfa9{/,null,AVAILABLE,@Spark}
18/11/20 03:50:38 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@624d4cbd{/api,null,AVAILABLE,@Spark}
18/11/20 03:50:38 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@7ff0c685{/jobs/job/kill,null,AVAILABLE,@Spark}
18/11/20 03:50:38 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@6f916cbc{/stages/stage/kill,null,AVAILABLE,@Spark}
18/11/20 03:50:38 INFO SparkUI: Bound SparkUI to 0.0.0.0, and started at http://wn1-spark2.l01rfxocadnetnrq0tupemyb3b.cx.internal.cloudapp.net:39053
18/11/20 03:50:38 INFO YarnClusterScheduler: Created YarnClusterScheduler
18/11/20 03:50:38 INFO SchedulerExtensionServices: Starting Yarn extension services with app application_1540794334804_0259 and attemptId Some(appattempt_1540794334804_0259_000005)
18/11/20 03:50:38 INFO Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 33125.
18/11/20 03:50:38 INFO NettyBlockTransferService: Server created on wn1-spark2.l01rfxocadnetnrq0tupemyb3b.cx.internal.cloudapp.net:33125
18/11/20 03:50:38 INFO BlockManager: Using org.apache.spark.storage.RandomBlockReplicationPolicy for block replication policy
18/11/20 03:50:38 INFO BlockManagerMaster: Registering BlockManager BlockManagerId(driver, wn1-spark2.l01rfxocadnetnrq0tupemyb3b.cx.internal.cloudapp.net, 33125, None)
18/11/20 03:50:38 INFO BlockManagerMasterEndpoint: Registering block manager wn1-spark2.l01rfxocadnetnrq0tupemyb3b.cx.internal.cloudapp.net:33125 with 2004.6 MB RAM, BlockManagerId(driver, wn1-spark2.l01rfxocadnetnrq0tupemyb3b.cx.internal.cloudapp.net, 33125, None)
18/11/20 03:50:38 INFO BlockManagerMaster: Registered BlockManager BlockManagerId(driver, wn1-spark2.l01rfxocadnetnrq0tupemyb3b.cx.internal.cloudapp.net, 33125, None)
18/11/20 03:50:38 INFO BlockManager: Initialized BlockManager: BlockManagerId(driver, wn1-spark2.l01rfxocadnetnrq0tupemyb3b.cx.internal.cloudapp.net, 33125, None)
18/11/20 03:50:38 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@2fb6fb63{/metrics/json,null,AVAILABLE,@Spark}
18/11/20 03:50:39 INFO EventLoggingListener: Logging events to wasb:/hdp/spark2-events/application_1540794334804_0259_5
18/11/20 03:50:39 INFO EnhancementSparkListener: Enhancement listener is enabled
18/11/20 03:50:39 INFO SparkContext: Registered listener com.microsoft.hdinsight.spark.metrics.SparkMetricsListener
18/11/20 03:50:39 INFO SparkContext: Registered listener org.apache.spark.sql.scheduler.EnhancementSparkListener
18/11/20 03:50:39 INFO ApplicationMaster: 
===============================================================================
YARN executor launch context:
  env:
    CLASSPATH -> {{PWD}}<CPS>{{PWD}}/__spark_conf__<CPS>{{PWD}}/__spark_libs__/*<CPS>/usr/hdp/current/spark2-client/jars/*<CPS>$HADOOP_CONF_DIR<CPS>/usr/hdp/current/hadoop-client/*<CPS>/usr/hdp/current/hadoop-client/lib/*<CPS>/usr/hdp/current/hadoop-hdfs-client/*<CPS>/usr/hdp/current/hadoop-hdfs-client/lib/*<CPS>/usr/hdp/current/hadoop-yarn-client/*<CPS>/usr/hdp/current/hadoop-yarn-client/lib/*<CPS>$PWD/mr-framework/hadoop/share/hadoop/mapreduce/*:$PWD/mr-framework/hadoop/share/hadoop/mapreduce/lib/*:$PWD/mr-framework/hadoop/share/hadoop/common/*:$PWD/mr-framework/hadoop/share/hadoop/common/lib/*:$PWD/mr-framework/hadoop/share/hadoop/yarn/*:$PWD/mr-framework/hadoop/share/hadoop/yarn/lib/*:$PWD/mr-framework/hadoop/share/hadoop/hdfs/*:$PWD/mr-framework/hadoop/share/hadoop/hdfs/lib/*:$PWD/mr-framework/hadoop/share/hadoop/tools/lib/*:/usr/hdp/2.6.5.3003-25/hadoop/lib/hadoop-lzo-0.6.0.2.6.5.3003-25.jar:/etc/hadoop/conf/secure:/usr/hdp/current/ext/hadoop/*<CPS>:/usr/hdp/current/spark2-client/jars/*:/usr/lib/hdinsight-datalake/*:/usr/hdp/current/spark_llap/*:/usr/hdp/current/spark2-client/conf:<CPS>{{PWD}}/__spark_conf__/__hadoop_conf__
    SPARK_DIST_CLASSPATH -> :/usr/hdp/current/spark2-client/jars/*:/usr/lib/hdinsight-datalake/*:/usr/hdp/current/spark_llap/*:/usr/hdp/current/spark2-client/conf:
    SPARK_YARN_STAGING_DIR -> *********(redacted)
    SPARK_USER -> *********(redacted)

  command:
    LD_LIBRARY_PATH="/usr/hdp/current/hadoop-client/lib/native:/usr/hdp/current/hadoop-client/lib/native/Linux-amd64-64:$LD_LIBRARY_PATH" \ 
      {{JAVA_HOME}}/bin/java \ 
      -server \ 
      -Xmx4096m \ 
      '-Dhdp.version=' \ 
      '-Detwlogger.component=sparkexecutor' \ 
      '-DlogFilter.filename=SparkLogFilters.xml' \ 
      '-DpatternGroup.filename=SparkPatternGroups.xml' \ 
      '-Dlog4jspark.root.logger=INFO,console,RFA,ETW,Anonymizer' \ 
      '-Dlog4jspark.log.dir=/var/log/sparkapp/\${user.name}' \ 
      '-Dlog4jspark.log.file=sparkexecutor.log' \ 
      '-Dlog4j.configuration=file:/usr/hdp/current/spark2-client/conf/log4j.properties' \ 
      '-Djavax.xml.parsers.SAXParserFactory=com.sun.org.apache.xerces.internal.jaxp.SAXParserFactoryImpl' \ 
      '-XX:+UseParallelGC' \ 
      '-XX:+UseParallelOldGC' \ 
      -Djava.io.tmpdir={{PWD}}/tmp \ 
      '-Dspark.history.ui.port=18080' \ 
      -Dspark.yarn.app.container.log.dir=<LOG_DIR> \ 
      -XX:OnOutOfMemoryError='kill %p' \ 
      org.apache.spark.executor.CoarseGrainedExecutorBackend \ 
      --driver-url \ 
      spark://CoarseGrainedScheduler@wn1-spark2.l01rfxocadnetnrq0tupemyb3b.cx.internal.cloudapp.net:33285 \ 
      --executor-id \ 
      <executorId> \ 
      --hostname \ 
      <hostname> \ 
      --cores \ 
      1 \ 
      --app-id \ 
      application_1540794334804_0259 \ 
      --user-class-path \ 
      file:$PWD/__app__.jar \ 
      1><LOG_DIR>/stdout \ 
      2><LOG_DIR>/stderr

  resources:
    __app__.jar -> resource { scheme: "wasb" host: "catestsa.blob.core.windows.net" port: -1 file: "/user/livy/.sparkStaging/application_1540794334804_0259/default_artifact.jar" userInfo: "spark23hdinsight-2018-10-29t06-07-10-410z" } size: 69301 timestamp: 1542685785000 type: FILE visibility: PRIVATE
    __spark_conf__ -> resource { scheme: "wasb" host: "catestsa.blob.core.windows.net" port: -1 file: "/user/livy/.sparkStaging/application_1540794334804_0259/__spark_conf__.zip" userInfo: "spark23hdinsight-2018-10-29t06-07-10-410z" } size: 259876 timestamp: 1542685786000 type: ARCHIVE visibility: PRIVATE

===============================================================================
18/11/20 03:50:39 INFO YarnRMClient: Registering the ApplicationMaster
18/11/20 03:50:39 INFO RequestHedgingRMFailoverProxyProvider: Looking for the active RM in [rm1, rm2]...
18/11/20 03:50:39 INFO RequestHedgingRMFailoverProxyProvider: Found active RM [rm2]
18/11/20 03:50:39 INFO YarnAllocator: Will request 5 executor container(s), each with 1 core(s) and 4480 MB memory (including 384 MB of overhead)
18/11/20 03:50:39 INFO YarnSchedulerBackend$YarnSchedulerEndpoint: ApplicationMaster registered as NettyRpcEndpointRef(spark://YarnAM@wn1-spark2.l01rfxocadnetnrq0tupemyb3b.cx.internal.cloudapp.net:33285)
18/11/20 03:50:39 INFO YarnAllocator: Submitted 5 unlocalized container requests.
18/11/20 03:50:39 INFO ApplicationMaster: Started progress reporter thread with (heartbeat : 5000, initial allocation : 200) intervals
18/11/20 03:50:40 INFO AMRMClientImpl: Received new token for : wn1-spark2.l01rfxocadnetnrq0tupemyb3b.cx.internal.cloudapp.net:30050
18/11/20 03:50:40 INFO AMRMClientImpl: Received new token for : wn0-spark2.l01rfxocadnetnrq0tupemyb3b.cx.internal.cloudapp.net:30050
18/11/20 03:50:40 INFO YarnAllocator: Launching container container_e02_1540794334804_0259_05_000002 on host wn1-spark2.l01rfxocadnetnrq0tupemyb3b.cx.internal.cloudapp.net for executor with ID 1
18/11/20 03:50:40 INFO YarnAllocator: Launching container container_e02_1540794334804_0259_05_000003 on host wn0-spark2.l01rfxocadnetnrq0tupemyb3b.cx.internal.cloudapp.net for executor with ID 2
18/11/20 03:50:40 INFO YarnAllocator: Received 2 containers from YARN, launching executors on 2 of them.
18/11/20 03:50:40 INFO ContainerManagementProtocolProxy: yarn.client.max-cached-nodemanagers-proxies : 0
18/11/20 03:50:40 INFO ContainerManagementProtocolProxy: yarn.client.max-cached-nodemanagers-proxies : 0
18/11/20 03:50:40 INFO ContainerManagementProtocolProxy: Opening proxy : wn1-spark2.l01rfxocadnetnrq0tupemyb3b.cx.internal.cloudapp.net:30050
18/11/20 03:50:40 INFO ContainerManagementProtocolProxy: Opening proxy : wn0-spark2.l01rfxocadnetnrq0tupemyb3b.cx.internal.cloudapp.net:30050
18/11/20 03:50:41 INFO YarnAllocator: Launching container container_e02_1540794334804_0259_05_000004 on host wn1-spark2.l01rfxocadnetnrq0tupemyb3b.cx.internal.cloudapp.net for executor with ID 3
18/11/20 03:50:41 INFO YarnAllocator: Launching container container_e02_1540794334804_0259_05_000005 on host wn0-spark2.l01rfxocadnetnrq0tupemyb3b.cx.internal.cloudapp.net for executor with ID 4
18/11/20 03:50:41 INFO YarnAllocator: Received 2 containers from YARN, launching executors on 2 of them.
18/11/20 03:50:41 INFO ContainerManagementProtocolProxy: yarn.client.max-cached-nodemanagers-proxies : 0
18/11/20 03:50:41 INFO ContainerManagementProtocolProxy: yarn.client.max-cached-nodemanagers-proxies : 0
18/11/20 03:50:41 INFO ContainerManagementProtocolProxy: Opening proxy : wn1-spark2.l01rfxocadnetnrq0tupemyb3b.cx.internal.cloudapp.net:30050
18/11/20 03:50:41 INFO ContainerManagementProtocolProxy: Opening proxy : wn0-spark2.l01rfxocadnetnrq0tupemyb3b.cx.internal.cloudapp.net:30050
18/11/20 03:50:42 INFO YarnAllocator: Launching container container_e02_1540794334804_0259_05_000006 on host wn1-spark2.l01rfxocadnetnrq0tupemyb3b.cx.internal.cloudapp.net for executor with ID 5
18/11/20 03:50:42 INFO YarnAllocator: Received 2 containers from YARN, launching executors on 1 of them.
18/11/20 03:50:42 INFO ContainerManagementProtocolProxy: yarn.client.max-cached-nodemanagers-proxies : 0
18/11/20 03:50:42 INFO ContainerManagementProtocolProxy: Opening proxy : wn1-spark2.l01rfxocadnetnrq0tupemyb3b.cx.internal.cloudapp.net:30050
18/11/20 03:50:43 INFO YarnSchedulerBackend$YarnDriverEndpoint: Registered executor NettyRpcEndpointRef(spark-client://Executor) (10.0.0.12:59594) with ID 1
18/11/20 03:50:43 INFO YarnSchedulerBackend$YarnDriverEndpoint: Registered executor NettyRpcEndpointRef(spark-client://Executor) (10.0.0.6:33186) with ID 2
18/11/20 03:50:43 INFO BlockManagerMasterEndpoint: Registering block manager wn0-spark2.l01rfxocadnetnrq0tupemyb3b.cx.internal.cloudapp.net:39977 with 2004.6 MB RAM, BlockManagerId(2, wn0-spark2.l01rfxocadnetnrq0tupemyb3b.cx.internal.cloudapp.net, 39977, None)
18/11/20 03:50:43 INFO BlockManagerMasterEndpoint: Registering block manager wn1-spark2.l01rfxocadnetnrq0tupemyb3b.cx.internal.cloudapp.net:40569 with 2004.6 MB RAM, BlockManagerId(1, wn1-spark2.l01rfxocadnetnrq0tupemyb3b.cx.internal.cloudapp.net, 40569, None)
18/11/20 03:50:43 INFO YarnSchedulerBackend$YarnDriverEndpoint: Registered executor NettyRpcEndpointRef(spark-client://Executor) (10.0.0.6:33200) with ID 4
18/11/20 03:50:44 INFO YarnSchedulerBackend$YarnDriverEndpoint: Registered executor NettyRpcEndpointRef(spark-client://Executor) (10.0.0.12:59602) with ID 3
18/11/20 03:50:44 INFO BlockManagerMasterEndpoint: Registering block manager wn0-spark2.l01rfxocadnetnrq0tupemyb3b.cx.internal.cloudapp.net:37031 with 2004.6 MB RAM, BlockManagerId(4, wn0-spark2.l01rfxocadnetnrq0tupemyb3b.cx.internal.cloudapp.net, 37031, None)
18/11/20 03:50:44 INFO BlockManagerMasterEndpoint: Registering block manager wn1-spark2.l01rfxocadnetnrq0tupemyb3b.cx.internal.cloudapp.net:36035 with 2004.6 MB RAM, BlockManagerId(3, wn1-spark2.l01rfxocadnetnrq0tupemyb3b.cx.internal.cloudapp.net, 36035, None)
18/11/20 03:50:44 INFO YarnClusterSchedulerBackend: SchedulerBackend is ready for scheduling beginning after reached minRegisteredResourcesRatio: 0.8
18/11/20 03:50:44 INFO YarnClusterScheduler: YarnClusterScheduler.postStartHook done
18/11/20 03:50:44 INFO SharedState: loading hive config file: file:/etc/spark2/2.6.5.3003-25/0/hive-site.xml
18/11/20 03:50:44 INFO SharedState: Setting hive.metastore.warehouse.dir ('null') to the value of spark.sql.warehouse.dir ('/hive/warehouse').
18/11/20 03:50:44 INFO SharedState: Warehouse path is '/hive/warehouse'.
18/11/20 03:50:44 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@7aca369b{/SQL,null,AVAILABLE,@Spark}
18/11/20 03:50:44 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@79a52ace{/SQL/json,null,AVAILABLE,@Spark}
18/11/20 03:50:44 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@1a6e5fab{/SQL/execution,null,AVAILABLE,@Spark}
18/11/20 03:50:44 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@5786b793{/SQL/execution/json,null,AVAILABLE,@Spark}
18/11/20 03:50:44 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@5962c0f{/static/sql,null,AVAILABLE,@Spark}
18/11/20 03:50:44 INFO StateStoreCoordinatorRef: Registered StateStoreCoordinator endpoint
18/11/20 03:50:44 ERROR ApplicationMaster: User class threw exception: java.lang.ClassNotFoundException: Failed to find data source: sstream. Please find packages at http://spark.apache.org/third-party-projects.html
java.lang.ClassNotFoundException: Failed to find data source: sstream. Please find packages at http://spark.apache.org/third-party-projects.html
    at org.apache.spark.sql.execution.datasources.DataSource$.lookupDataSource(DataSource.scala:635)
    at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:190)
    at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:174)
    at sample.ReadSStream$.main(ReadSStream.scala:6)
    at sample.ReadSStream.main(ReadSStream.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.spark.deploy.yarn.ApplicationMaster$$anon$4.run(ApplicationMaster.scala:721)
Caused by: java.lang.ClassNotFoundException: sstream.DefaultSource
    at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
    at org.apache.spark.sql.execution.datasources.DataSource$$anonfun$23$$anonfun$apply$15.apply(DataSource.scala:618)
    at org.apache.spark.sql.execution.datasources.DataSource$$anonfun$23$$anonfun$apply$15.apply(DataSource.scala:618)
    at scala.util.Try$.apply(Try.scala:192)
    at org.apache.spark.sql.execution.datasources.DataSource$$anonfun$23.apply(DataSource.scala:618)
    at org.apache.spark.sql.execution.datasources.DataSource$$anonfun$23.apply(DataSource.scala:618)
    at scala.util.Try.orElse(Try.scala:84)
    at org.apache.spark.sql.execution.datasources.DataSource$.lookupDataSource(DataSource.scala:618)
    ... 9 more
18/11/20 03:50:44 INFO ApplicationMaster: Final app status: FAILED, exitCode: 15, (reason: User class threw exception: java.lang.ClassNotFoundException: Failed to find data source: sstream. Please find packages at http://spark.apache.org/third-party-projects.html
    at org.apache.spark.sql.execution.datasources.DataSource$.lookupDataSource(DataSource.scala:635)
    at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:190)
    at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:174)
    at sample.ReadSStream$.main(ReadSStream.scala:6)
    at sample.ReadSStream.main(ReadSStream.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.spark.deploy.yarn.ApplicationMaster$$anon$4.run(ApplicationMaster.scala:721)
Caused by: java.lang.ClassNotFoundException: sstream.DefaultSource
    at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
    at org.apache.spark.sql.execution.datasources.DataSource$$anonfun$23$$anonfun$apply$15.apply(DataSource.scala:618)
    at org.apache.spark.sql.execution.datasources.DataSource$$anonfun$23$$anonfun$apply$15.apply(DataSource.scala:618)
    at scala.util.Try$.apply(Try.scala:192)
    at org.apache.spark.sql.execution.datasources.DataSource$$anonfun$23.apply(DataSource.scala:618)
    at org.apache.spark.sql.execution.datasources.DataSource$$anonfun$23.apply(DataSource.scala:618)
    at scala.util.Try.orElse(Try.scala:84)
    at org.apache.spark.sql.execution.datasources.DataSource$.lookupDataSource(DataSource.scala:618)
    ... 9 more
)
18/11/20 03:50:44 INFO SparkContext: Invoking stop() from shutdown hook
18/11/20 03:50:44 INFO AbstractConnector: Stopped Spark@17f4f6cf{HTTP/1.1,[http/1.1]}{0.0.0.0:0}
18/11/20 03:50:44 INFO SparkUI: Stopped Spark web UI at http://wn1-spark2.l01rfxocadnetnrq0tupemyb3b.cx.internal.cloudapp.net:39053
18/11/20 03:50:44 INFO metrics: type=GAUGE, name=application_1540794334804_0259.driver.BlockManager.disk.diskSpaceUsed_MB, value=0
18/11/20 03:50:44 INFO metrics: type=GAUGE, name=application_1540794334804_0259.driver.BlockManager.memory.maxMem_MB, value=10022
18/11/20 03:50:44 INFO metrics: type=GAUGE, name=application_1540794334804_0259.driver.BlockManager.memory.maxOffHeapMem_MB, value=0
18/11/20 03:50:44 INFO metrics: type=GAUGE, name=application_1540794334804_0259.driver.BlockManager.memory.maxOnHeapMem_MB, value=10022
18/11/20 03:50:44 INFO metrics: type=GAUGE, name=application_1540794334804_0259.driver.BlockManager.memory.memUsed_MB, value=0
18/11/20 03:50:44 INFO metrics: type=GAUGE, name=application_1540794334804_0259.driver.BlockManager.memory.offHeapMemUsed_MB, value=0
18/11/20 03:50:44 INFO metrics: type=GAUGE, name=application_1540794334804_0259.driver.BlockManager.memory.onHeapMemUsed_MB, value=0
18/11/20 03:50:44 INFO metrics: type=GAUGE, name=application_1540794334804_0259.driver.BlockManager.memory.remainingMem_MB, value=10022
18/11/20 03:50:44 INFO metrics: type=GAUGE, name=application_1540794334804_0259.driver.BlockManager.memory.remainingOffHeapMem_MB, value=0
18/11/20 03:50:44 INFO metrics: type=GAUGE, name=application_1540794334804_0259.driver.BlockManager.memory.remainingOnHeapMem_MB, value=10022
18/11/20 03:50:44 INFO metrics: type=GAUGE, name=application_1540794334804_0259.driver.DAGScheduler.job.activeJobs, value=0
18/11/20 03:50:44 INFO metrics: type=GAUGE, name=application_1540794334804_0259.driver.DAGScheduler.job.allJobs, value=0
18/11/20 03:50:44 INFO metrics: type=GAUGE, name=application_1540794334804_0259.driver.DAGScheduler.stage.failedStages, value=0
18/11/20 03:50:44 INFO metrics: type=GAUGE, name=application_1540794334804_0259.driver.DAGScheduler.stage.runningStages, value=0
18/11/20 03:50:44 INFO metrics: type=GAUGE, name=application_1540794334804_0259.driver.DAGScheduler.stage.waitingStages, value=0
18/11/20 03:50:44 INFO metrics: type=GAUGE, name=application_1540794334804_0259.driver.LiveListenerBus.queue.appStatus.size, value=0
18/11/20 03:50:44 INFO metrics: type=GAUGE, name=application_1540794334804_0259.driver.LiveListenerBus.queue.eventLog.size, value=0
18/11/20 03:50:44 INFO metrics: type=GAUGE, name=application_1540794334804_0259.driver.LiveListenerBus.queue.executorManagement.size, value=0
18/11/20 03:50:44 INFO metrics: type=GAUGE, name=application_1540794334804_0259.driver.LiveListenerBus.queue.shared.size, value=0
18/11/20 03:50:44 INFO metrics: type=GAUGE, name=application_1540794334804_0259.driver.jvm.PS-MarkSweep.count, value=2
18/11/20 03:50:44 INFO metrics: type=GAUGE, name=application_1540794334804_0259.driver.jvm.PS-MarkSweep.time, value=55
18/11/20 03:50:44 INFO metrics: type=GAUGE, name=application_1540794334804_0259.driver.jvm.PS-Scavenge.count, value=5
18/11/20 03:50:44 INFO metrics: type=GAUGE, name=application_1540794334804_0259.driver.jvm.PS-Scavenge.time, value=119
18/11/20 03:50:44 INFO metrics: type=GAUGE, name=application_1540794334804_0259.driver.jvm.direct.capacity, value=74337
18/11/20 03:50:44 INFO metrics: type=GAUGE, name=application_1540794334804_0259.driver.jvm.direct.count, value=16
18/11/20 03:50:44 INFO metrics: type=GAUGE, name=application_1540794334804_0259.driver.jvm.direct.used, value=74338
18/11/20 03:50:44 INFO metrics: type=GAUGE, name=application_1540794334804_0259.driver.jvm.heap.committed, value=819462144
18/11/20 03:50:44 INFO metrics: type=GAUGE, name=application_1540794334804_0259.driver.jvm.heap.init, value=924844032
18/11/20 03:50:44 INFO metrics: type=GAUGE, name=application_1540794334804_0259.driver.jvm.heap.max, value=3817865216
18/11/20 03:50:44 INFO metrics: type=GAUGE, name=application_1540794334804_0259.driver.jvm.heap.usage, value=0.09404473643943327
18/11/20 03:50:44 INFO metrics: type=GAUGE, name=application_1540794334804_0259.driver.jvm.heap.used, value=359050128
18/11/20 03:50:44 INFO metrics: type=GAUGE, name=application_1540794334804_0259.driver.jvm.mapped.capacity, value=0
18/11/20 03:50:44 INFO metrics: type=GAUGE, name=application_1540794334804_0259.driver.jvm.mapped.count, value=0
18/11/20 03:50:44 INFO metrics: type=GAUGE, name=application_1540794334804_0259.driver.jvm.mapped.used, value=0
18/11/20 03:50:44 INFO metrics: type=GAUGE, name=application_1540794334804_0259.driver.jvm.non-heap.committed, value=75448320
18/11/20 03:50:44 INFO metrics: type=GAUGE, name=application_1540794334804_0259.driver.jvm.non-heap.init, value=2555904
18/11/20 03:50:44 INFO metrics: type=GAUGE, name=application_1540794334804_0259.driver.jvm.non-heap.max, value=-1
18/11/20 03:50:44 INFO metrics: type=GAUGE, name=application_1540794334804_0259.driver.jvm.non-heap.usage, value=-7.3384976E7
18/11/20 03:50:44 INFO metrics: type=GAUGE, name=application_1540794334804_0259.driver.jvm.non-heap.used, value=73387600
18/11/20 03:50:44 INFO metrics: type=GAUGE, name=application_1540794334804_0259.driver.jvm.pools.Code-Cache.committed, value=10878976
18/11/20 03:50:44 INFO metrics: type=GAUGE, name=application_1540794334804_0259.driver.jvm.pools.Code-Cache.init, value=2555904
18/11/20 03:50:44 INFO metrics: type=GAUGE, name=application_1540794334804_0259.driver.jvm.pools.Code-Cache.max, value=251658240
18/11/20 03:50:44 INFO metrics: type=GAUGE, name=application_1540794334804_0259.driver.jvm.pools.Code-Cache.usage, value=0.039348602294921875
18/11/20 03:50:44 INFO metrics: type=GAUGE, name=application_1540794334804_0259.driver.jvm.pools.Code-Cache.used, value=9902400
18/11/20 03:50:44 INFO metrics: type=GAUGE, name=application_1540794334804_0259.driver.jvm.pools.Compressed-Class-Space.committed, value=7905280
18/11/20 03:50:44 INFO metrics: type=GAUGE, name=application_1540794334804_0259.driver.jvm.pools.Compressed-Class-Space.init, value=0
18/11/20 03:50:44 INFO metrics: type=GAUGE, name=application_1540794334804_0259.driver.jvm.pools.Compressed-Class-Space.max, value=1073741824
18/11/20 03:50:44 INFO metrics: type=GAUGE, name=application_1540794334804_0259.driver.jvm.pools.Compressed-Class-Space.usage, value=0.007155425846576691
18/11/20 03:50:44 INFO metrics: type=GAUGE, name=application_1540794334804_0259.driver.jvm.pools.Compressed-Class-Space.used, value=7683080
18/11/20 03:50:44 INFO metrics: type=GAUGE, name=application_1540794334804_0259.driver.jvm.pools.Metaspace.committed, value=56664064
18/11/20 03:50:44 INFO metrics: type=GAUGE, name=application_1540794334804_0259.driver.jvm.pools.Metaspace.init, value=0
18/11/20 03:50:44 INFO metrics: type=GAUGE, name=application_1540794334804_0259.driver.jvm.pools.Metaspace.max, value=-1
18/11/20 03:50:44 INFO metrics: type=GAUGE, name=application_1540794334804_0259.driver.jvm.pools.Metaspace.usage, value=0.9849376140758277
18/11/20 03:50:44 INFO metrics: type=GAUGE, name=application_1540794334804_0259.driver.jvm.pools.Metaspace.used, value=55811816
18/11/20 03:50:44 INFO metrics: type=GAUGE, name=application_1540794334804_0259.driver.jvm.pools.PS-Eden-Space.committed, value=385351680
18/11/20 03:50:44 INFO metrics: type=GAUGE, name=application_1540794334804_0259.driver.jvm.pools.PS-Eden-Space.init, value=231735296
18/11/20 03:50:44 INFO metrics: type=GAUGE, name=application_1540794334804_0259.driver.jvm.pools.PS-Eden-Space.max, value=1324351488
18/11/20 03:50:44 INFO metrics: type=GAUGE, name=application_1540794334804_0259.driver.jvm.pools.PS-Eden-Space.usage, value=0.2009682795025485
18/11/20 03:50:44 INFO metrics: type=GAUGE, name=application_1540794334804_0259.driver.jvm.pools.PS-Eden-Space.used, value=266152640
18/11/20 03:50:44 INFO metrics: type=GAUGE, name=application_1540794334804_0259.driver.jvm.pools.PS-Old-Gen.committed, value=395837440
18/11/20 03:50:44 INFO metrics: type=GAUGE, name=application_1540794334804_0259.driver.jvm.pools.PS-Old-Gen.init, value=616562688
18/11/20 03:50:44 INFO metrics: type=GAUGE, name=application_1540794334804_0259.driver.jvm.pools.PS-Old-Gen.max, value=2863661056
18/11/20 03:50:44 INFO metrics: type=GAUGE, name=application_1540794334804_0259.driver.jvm.pools.PS-Old-Gen.usage, value=0.01951105207892313
18/11/20 03:50:44 INFO metrics: type=GAUGE, name=application_1540794334804_0259.driver.jvm.pools.PS-Old-Gen.used, value=55873040
18/11/20 03:50:44 INFO metrics: type=GAUGE, name=application_1540794334804_0259.driver.jvm.pools.PS-Survivor-Space.committed, value=38273024
18/11/20 03:50:44 INFO metrics: type=GAUGE, name=application_1540794334804_0259.driver.jvm.pools.PS-Survivor-Space.init, value=38273024
18/11/20 03:50:44 INFO metrics: type=GAUGE, name=application_1540794334804_0259.driver.jvm.pools.PS-Survivor-Space.max, value=38273024
18/11/20 03:50:44 INFO metrics: type=GAUGE, name=application_1540794334804_0259.driver.jvm.pools.PS-Survivor-Space.usage, value=0.9991885668610874
18/11/20 03:50:44 INFO metrics: type=GAUGE, name=application_1540794334804_0259.driver.jvm.pools.PS-Survivor-Space.used, value=38241968
18/11/20 03:50:44 INFO metrics: type=GAUGE, name=application_1540794334804_0259.driver.jvm.total.committed, value=894910464
18/11/20 03:50:44 INFO metrics: type=GAUGE, name=application_1540794334804_0259.driver.jvm.total.init, value=927399936
18/11/20 03:50:44 INFO metrics: type=GAUGE, name=application_1540794334804_0259.driver.jvm.total.max, value=3817865215
18/11/20 03:50:44 INFO metrics: type=GAUGE, name=application_1540794334804_0259.driver.jvm.total.used, value=434538656
18/11/20 03:50:44 INFO metrics: type=COUNTER, name=application_1540794334804_0259.driver.HiveExternalCatalog.fileCacheHits, count=0
18/11/20 03:50:44 INFO metrics: type=COUNTER, name=application_1540794334804_0259.driver.HiveExternalCatalog.filesDiscovered, count=0
18/11/20 03:50:44 INFO metrics: type=COUNTER, name=application_1540794334804_0259.driver.HiveExternalCatalog.hiveClientCalls, count=0
18/11/20 03:50:44 INFO metrics: type=COUNTER, name=application_1540794334804_0259.driver.HiveExternalCatalog.parallelListingJobCount, count=0
18/11/20 03:50:44 INFO metrics: type=COUNTER, name=application_1540794334804_0259.driver.HiveExternalCatalog.partitionsFetched, count=0
18/11/20 03:50:44 INFO metrics: type=COUNTER, name=application_1540794334804_0259.driver.LiveListenerBus.numEventsPosted, count=12
18/11/20 03:50:44 INFO metrics: type=COUNTER, name=application_1540794334804_0259.driver.LiveListenerBus.queue.appStatus.numDroppedEvents, count=0
18/11/20 03:50:44 INFO metrics: type=COUNTER, name=application_1540794334804_0259.driver.LiveListenerBus.queue.eventLog.numDroppedEvents, count=0
18/11/20 03:50:44 INFO metrics: type=COUNTER, name=application_1540794334804_0259.driver.LiveListenerBus.queue.executorManagement.numDroppedEvents, count=0
18/11/20 03:50:44 INFO metrics: type=COUNTER, name=application_1540794334804_0259.driver.LiveListenerBus.queue.shared.numDroppedEvents, count=0
18/11/20 03:50:44 INFO metrics: type=HISTOGRAM, name=application_1540794334804_0259.driver.CodeGenerator.compilationTime, count=0, min=0, max=0, mean=0.0, stddev=0.0, median=0.0, p75=0.0, p95=0.0, p98=0.0, p99=0.0, p999=0.0
18/11/20 03:50:44 INFO metrics: type=HISTOGRAM, name=application_1540794334804_0259.driver.CodeGenerator.generatedClassSize, count=0, min=0, max=0, mean=0.0, stddev=0.0, median=0.0, p75=0.0, p95=0.0, p98=0.0, p99=0.0, p999=0.0
18/11/20 03:50:44 INFO metrics: type=HISTOGRAM, name=application_1540794334804_0259.driver.CodeGenerator.generatedMethodSize, count=0, min=0, max=0, mean=0.0, stddev=0.0, median=0.0, p75=0.0, p95=0.0, p98=0.0, p99=0.0, p999=0.0
18/11/20 03:50:44 INFO metrics: type=HISTOGRAM, name=application_1540794334804_0259.driver.CodeGenerator.sourceCodeSize, count=0, min=0, max=0, mean=0.0, stddev=0.0, median=0.0, p75=0.0, p95=0.0, p98=0.0, p99=0.0, p999=0.0
18/11/20 03:50:44 INFO metrics: type=TIMER, name=application_1540794334804_0259.driver.DAGScheduler.messageProcessingTime, count=4, min=0.0033, max=8.455435, mean=2.1110152655606265, stddev=3.6537613775890128, median=0.0086, p75=0.0086, p95=8.455435, p98=8.455435, p99=8.455435, p999=8.455435, mean_rate=0.6515542189405801, m1=0.4, m5=0.4, m15=0.4, rate_unit=events/second, duration_unit=milliseconds
18/11/20 03:50:44 INFO metrics: type=TIMER, name=application_1540794334804_0259.driver.LiveListenerBus.listenerProcessingTime.com.microsoft.hdinsight.spark.metrics.SparkMetricsListener, count=12, min=0.17029999999999998, max=9.63664, mean=1.5834650894290851, stddev=2.600511557393628, median=0.23560099999999998, p75=1.268206, p95=9.63664, p98=9.63664, p99=9.63664, p999=9.63664, mean_rate=2.1582389290474553, m1=2.1999999999999997, m5=2.1999999999999997, m15=2.1999999999999997, rate_unit=events/second, duration_unit=milliseconds
18/11/20 03:50:44 INFO metrics: type=TIMER, name=application_1540794334804_0259.driver.LiveListenerBus.listenerProcessingTime.org.apache.spark.HeartbeatReceiver, count=12, min=0.0017, max=4.090617, mean=0.769623812643681, stddev=1.3476271585074593, median=0.012199999999999999, p75=0.1162, p95=4.090617, p98=4.090617, p99=4.090617, p999=4.090617, mean_rate=1.9371237443232558, m1=1.4, m5=1.4, m15=1.4, rate_unit=events/second, duration_unit=milliseconds
18/11/20 03:50:44 INFO metrics: type=TIMER, name=application_1540794334804_0259.driver.LiveListenerBus.listenerProcessingTime.org.apache.spark.scheduler.EventLoggingListener, count=12, min=0.224801, max=45.088887, mean=5.743430829955767, stddev=12.116908691523218, median=0.305601, p75=3.942516, p95=45.088887, p98=45.088887, p99=45.088887, p999=45.088887, mean_rate=2.1522273183770655, m1=2.1999999999999997, m5=2.1999999999999997, m15=2.1999999999999997, rate_unit=events/second, duration_unit=milliseconds
18/11/20 03:50:44 INFO metrics: type=TIMER, name=application_1540794334804_0259.driver.LiveListenerBus.listenerProcessingTime.org.apache.spark.sql.scheduler.EnhancementSparkListener, count=12, min=0.0017, max=0.962204, mean=0.07815455697116246, stddev=0.25933464410785684, median=0.002, p75=0.0021999999999999997, p95=0.962204, p98=0.962204, p99=0.962204, p999=0.962204, mean_rate=2.1579230843987034, m1=2.1999999999999997, m5=2.1999999999999997, m15=2.1999999999999997, rate_unit=events/second, duration_unit=milliseconds
18/11/20 03:50:44 INFO metrics: type=TIMER, name=application_1540794334804_0259.driver.LiveListenerBus.listenerProcessingTime.org.apache.spark.status.AppStatusListener, count=12, min=0.048799999999999996, max=46.370191999999996, mean=4.971501241903418, stddev=12.419724727822398, median=0.110801, p75=2.58521, p95=46.370191999999996, p98=46.370191999999996, p99=46.370191999999996, p999=46.370191999999996, mean_rate=1.7269294497150691, m1=0.6, m5=0.6, m15=0.6, rate_unit=events/second, duration_unit=milliseconds
18/11/20 03:50:44 INFO metrics: type=TIMER, name=application_1540794334804_0259.driver.LiveListenerBus.queue.appStatus.listenerProcessingTime, count=12, min=0.08400099999999999, max=46.444092, mean=5.017910572122519, stddev=12.426800904844692, median=0.156101, p75=2.63361, p95=46.444092, p98=46.444092, p99=46.444092, p999=46.444092, mean_rate=1.7261629917969175, m1=0.6, m5=0.6, m15=0.6, rate_unit=events/second, duration_unit=milliseconds
18/11/20 03:50:44 INFO metrics: type=TIMER, name=application_1540794334804_0259.driver.LiveListenerBus.queue.eventLog.listenerProcessingTime, count=12, min=0.255401, max=45.123886999999996, mean=5.855951116882547, stddev=12.089558815948724, median=0.45860199999999995, p75=3.981016, p95=45.123886999999996, p98=45.123886999999996, p99=45.123886999999996, p999=45.123886999999996, mean_rate=2.151074697237433, m1=2.1999999999999997, m5=2.1999999999999997, m15=2.1999999999999997, rate_unit=events/second, duration_unit=milliseconds
18/11/20 03:50:44 INFO metrics: type=TIMER, name=application_1540794334804_0259.driver.LiveListenerBus.queue.executorManagement.listenerProcessingTime, count=12, min=0.0346, max=5.939824, mean=0.9574356949912326, stddev=1.7342478781283521, median=0.08160099999999999, p75=0.1538, p95=5.939824, p98=5.939824, p99=5.939824, p999=5.939824, mean_rate=1.9358085235418623, m1=1.4, m5=1.4, m15=1.4, rate_unit=events/second, duration_unit=milliseconds
18/11/20 03:50:44 INFO metrics: type=TIMER, name=application_1540794334804_0259.driver.LiveListenerBus.queue.shared.listenerProcessingTime, count=12, min=0.231101, max=9.725539999999999, mean=1.7238401638644045, stddev=2.6667077518486058, median=0.31390199999999996, p75=1.352406, p95=9.725539999999999, p98=9.725539999999999, p99=9.725539999999999, p999=9.725539999999999, mean_rate=2.156218002937709, m1=2.1999999999999997, m5=2.1999999999999997, m15=2.1999999999999997, rate_unit=events/second, duration_unit=milliseconds
INFO: 
INFO: ========== RESULT ==========
ERROR: Job state is dead
ERROR: Diagnostics:     at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
    at org.apache.spark.sql.execution.datasources.DataSource$$anonfun$23$$anonfun$apply$15.apply(DataSource.scala:618)
    at org.apache.spark.sql.execution.datasources.DataSource$$anonfun$23$$anonfun$apply$15.apply(DataSource.scala:618)
    at scala.util.Try$.apply(Try.scala:192)
    at org.apache.spark.sql.execution.datasources.DataSource$$anonfun$23.apply(DataSource.scala:618)
    at org.apache.spark.sql.execution.datasources.DataSource$$anonfun$23.apply(DataSource.scala:618)
    at scala.util.Try.orElse(Try.scala:84)
    at org.apache.spark.sql.execution.datasources.DataSource$.lookupDataSource(DataSource.scala:618)
    ... 9 more
18/11/20 03:50:45 INFO YarnAllocator: Driver requested a total number of 0 executor(s).
18/11/20 03:50:45 INFO YarnClusterSchedulerBackend: Shutting down all executors
18/11/20 03:50:45 INFO YarnSchedulerBackend$YarnDriverEndpoint: Asking each executor to shut down
18/11/20 03:50:45 INFO SchedulerExtensionServices: Stopping SchedulerExtensionServices
(serviceOption=None,
 services=List(),
 started=false)
18/11/20 03:50:45 INFO MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
18/11/20 03:50:45 INFO MemoryStore: MemoryStore cleared
18/11/20 03:50:45 INFO BlockManager: BlockManager stopped
18/11/20 03:50:45 INFO BlockManagerMaster: BlockManagerMaster stopped
18/11/20 03:50:45 INFO OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
18/11/20 03:50:45 INFO SparkContext: Successfully stopped SparkContext
18/11/20 03:50:45 INFO ApplicationMaster: Unregistering ApplicationMaster with FAILED (diag message: User class threw exception: java.lang.ClassNotFoundException: Failed to find data source: sstream. Please find packages at http://spark.apache.org/third-party-projects.html
    at org.apache.spark.sql.execution.datasources.DataSource$.lookupDataSource(DataSource.scala:635)
    at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:190)
    at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:174)
    at sample.ReadSStream$.main(ReadSStream.scala:6)
    at sample.ReadSStream.main(ReadSStream.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.spark.deploy.yarn.ApplicationMaster$$anon$4.run(ApplicationMaster.scala:721)
Caused by: java.lang.ClassNotFoundException: sstream.DefaultSource
    at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
    at org.apache.spark.sql.execution.datasources.DataSource$$anonfun$23$$anonfun$apply$15.apply(DataSource.scala:618)
    at org.apache.spark.sql.execution.datasources.DataSource$$anonfun$23$$anonfun$apply$15.apply(DataSource.scala:618)
    at scala.util.Try$.apply(Try.scala:192)
    at org.apache.spark.sql.execution.datasources.DataSource$$anonfun$23.apply(DataSource.scala:618)
    at org.apache.spark.sql.execution.datasources.DataSource$$anonfun$23.apply(DataSource.scala:618)
    at scala.util.Try.orElse(Try.scala:84)
    at org.apache.spark.sql.execution.datasources.DataSource$.lookupDataSource(DataSource.scala:618)
    ... 9 more
)
18/11/20 03:50:45 INFO AMRMClientImpl: Waiting for application to be successfully unregistered.
18/11/20 03:50:45 INFO ApplicationMaster: Deleting staging directory wasb://spark23hdinsight-2018-10-29t06-07-10-410z@catestsa.blob.core.windows.net/user/livy/.sparkStaging/application_1540794334804_0259
18/11/20 03:50:45 WARN AzureFileSystemThreadPoolExecutor: Disabling threads for Delete operation as thread count 0 is <= 1
18/11/20 03:50:45 INFO AzureFileSystemThreadPoolExecutor: Time taken for Delete operation is: 15 ms with threads: 0
18/11/20 03:50:45 INFO ShutdownHookManager: Shutdown hook called
18/11/20 03:50:45 INFO ShutdownHookManager: Deleting directory /mnt/resource/hadoop/yarn/local/usercache/livy/appcache/application_1540794334804_0259/spark-fb45b7a8-a04c-439f-b486-e51195d7ff86
18/11/20 03:50:45 INFO MetricsSystemImpl: Stopping azure-file-system metrics system...
18/11/20 03:50:45 INFO MetricsSinkAdapter: azurefs2 thread interrupted.
18/11/20 03:50:45 INFO MetricsSystemImpl: azure-file-system metrics system stopped.
18/11/20 03:50:45 INFO MetricsSystemImpl: azure-file-system metrics system shutdown complete.
wezhang commented 5 years ago

@jingyanjingyan, you can't access ADLS from a WASB HDInsight cluster.
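
For context: a WASB-backed cluster has no credentials for adl:// paths out of the box; reading them requires the hadoop-azure-datalake connector plus service-principal OAuth settings. A minimal sketch, assuming such a principal exists (all IDs, keys, and paths below are placeholders, not values from this issue):

import org.apache.spark.sql.SparkSession

object AdlsAccessSketch {
  def main(args: Array[String]): Unit = {
    // Sketch only: the fs.adl.oauth2.* keys come from the hadoop-azure-datalake
    // connector; every value below is a placeholder for a real service principal.
    val spark = SparkSession.builder
      .appName("AdlsAccessSketch")
      .config("spark.hadoop.fs.adl.oauth2.access.token.provider.type", "ClientCredential")
      .config("spark.hadoop.fs.adl.oauth2.client.id", "<service-principal-app-id>")
      .config("spark.hadoop.fs.adl.oauth2.credential", "<service-principal-key>")
      .config("spark.hadoop.fs.adl.oauth2.refresh.url", "https://login.microsoftonline.com/<tenant-id>/oauth2/token")
      .getOrCreate()
    // Read a placeholder adl:// path to confirm the credentials work.
    val df = spark.read.text("adl://<store>.azuredatalakestore.net/<path>")
    println(s"read ${df.count()} lines from ADLS")
  }
}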

jingyanjingyan commented 5 years ago

Switched to an ADLS cluster and got the same error; I've sent a mail to Rui.

t-rufang commented 5 years ago

Synced offline with @konjac; we found that sstream is not supported by HDInsight out of the box, which means the sstream-related jars should be specified via the submission argument spark.jars. @jingyanjingyan Can you retry submitting this job to the HDInsight cluster with the sstream lib specified in spark.jars? A rough sketch of what that submission amounts to at the Livy level follows below.
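
The toolkit's "Referenced Jars" field effectively populates the jars array of a Livy batch request; a minimal sketch of such a request, assuming basic-auth access through the cluster gateway (cluster name, credentials, and storage paths are placeholders, not values from this issue):

import java.net.{HttpURLConnection, URL}
import java.nio.charset.StandardCharsets
import java.util.Base64

object SubmitWithSStreamJars {
  def main(args: Array[String]): Unit = {
    // Livy batch payload: "file" is the job artifact, "jars" ships the
    // sstream library alongside it so sstream.DefaultSource is on the classpath.
    val payload =
      """{
        |  "file": "wasbs://<container>@<account>.blob.core.windows.net/SparkSubmission/default_artifact.jar",
        |  "className": "sample.ReadSStream",
        |  "jars": ["wasbs://<container>@<account>.blob.core.windows.net/sstream/lib/structurestreamforspark_2.11-1.1.3.jar"]
        |}""".stripMargin

    val conn = new URL("https://<cluster>.azurehdinsight.net/livy/batches")
      .openConnection().asInstanceOf[HttpURLConnection]
    conn.setRequestMethod("POST")
    conn.setRequestProperty("Content-Type", "application/json")
    // Livy rejects POSTs without this header when CSRF protection is enabled.
    conn.setRequestProperty("X-Requested-By", "admin")
    val auth = Base64.getEncoder.encodeToString("admin:<password>".getBytes(StandardCharsets.UTF_8))
    conn.setRequestProperty("Authorization", s"Basic $auth")
    conn.setDoOutput(true)
    val out = conn.getOutputStream
    out.write(payload.getBytes(StandardCharsets.UTF_8))
    out.close()
    println(s"Livy responded with HTTP ${conn.getResponseCode}")
  }
}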

jingyanjingyan commented 5 years ago

Submitted to the spark2-1adls cluster with the parameter "Referenced Jars = adl://devtooltelemetryadls.azuredatalakestore.net/tmp/structurestreamforspark_2.11-1.0.10.jar" and it still fails.

t-rufang commented 5 years ago

Retried with the config below and it finally succeeded.

Cluster: Spark23-hdi4-yan (Spark version >= 2.3.1)

Referenced Jars: wasbs://spark23-hdi4-yan-2018-12-25t02-43-25-251z@zhwespk21.blob.core.windows.net/sstream/lib/structurestreamforspark_2.11-1.1.3.jar

Class file:

package sample

import org.apache.spark.sql.SparkSession

object ReadSStream {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder.appName("ReadSStreamDemo").getOrCreate()
    // "sstream" resolves to the DefaultSource class shipped in the referenced
    // structurestreamforspark jar, so that jar must travel with the job.
    val streamDf = spark.read.format("sstream").load("wasbs://spark23-hdi4-yan-2018-12-25t02-43-25-251z@zhwespk21.blob.core.windows.net/sstream/input/input.ss")
    streamDf.createOrReplaceTempView("streamView")
    // Suffix the output path with a timestamp so reruns do not collide.
    val timestamp = System.currentTimeMillis.toString
    spark.sql("SELECT COUNT(*) FROM streamView").rdd.saveAsTextFile(s"wasbs://spark23-hdi4-yan-2018-12-25t02-43-25-251z@zhwespk21.blob.core.windows.net/sstream/output/$timestamp")
  }
}

Remember to use sstream rather than sstreaminterop2 as the read format: the latter is designed for the Windows platform, while our cluster runs on Linux. A quick way to check that the source resolves is sketched below.
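
As the stack trace above shows, Spark resolves an unregistered format name by loading a class named <format>.DefaultSource, so a wrong format string or a missing jar surfaces as ClassNotFoundException. A minimal probe (e.g. in spark-shell, where a SparkSession named spark is in scope; the path is a placeholder):

import scala.util.{Failure, Success, Try}

// Sketch: check whether the "sstream" source can be resolved before running
// the full job; a missing jar or wrong format name (e.g. "sstreaminterop2"
// on a Linux cluster) fails here with ClassNotFoundException.
Try(spark.read.format("sstream").load("wasbs://<container>@<account>.blob.core.windows.net/sstream/input/input.ss")) match {
  case Success(df) => println(s"sstream source resolved; schema:\n${df.schema.treeString}")
  case Failure(e)  => println(s"sstream source not available: ${e.getMessage}")
}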

jingyanjingyan commented 5 years ago

Verified as fixed with the market build on 1/11/2019.