microsoft / azure-tools-for-java

Azure tools for Java, including Azure Toolkits for Eclipse, IntelliJ and related projects.

[IntelliJ][Spark on ADL] Fail to submit job, warning no cluster #1751

Closed: jingyanjingyan closed this issue 6 years ago

jingyanjingyan commented 6 years ago

Build azure-toolkit-for-intellij-2018.1.develop.659.06-29-2018

Repro Steps:

  1. Sign in and set the remote submit settings
  2. Make sure the cluster is valid
  3. Open a script and submit remotely

Result: Fails to submit (see attached screenshot)
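For reference, the submitted script is an IO smoke test of roughly the following shape. This is a hedged sketch, not the actual test source: only the class name sample.SparkCore_WasbIOTest, the application name, and the saveAsTextFile call at SparkCore_WasbIOTest.scala:19 are confirmed by the log and stack trace below; the adl:// paths are placeholders.

    // Hedged reconstruction of the failing test script; only the class
    // name and the saveAsTextFile call are confirmed by the stack trace.
    package sample

    import org.apache.spark.{SparkConf, SparkContext}

    object SparkCore_WasbIOTest {
      def main(args: Array[String]): Unit = {
        val conf = new SparkConf().setAppName("SparkCore_WasbIOTest")
        val sc = new SparkContext(conf)
        // Read from and write back to the attached storage account;
        // the job dies inside RDD.saveAsTextFile per the trace below.
        val lines = sc.textFile("adl://<account>.azuredatalakestore.net/<input>")  // placeholder path
        lines.saveAsTextFile("adl://<account>.azuredatalakestore.net/<output>")    // placeholder path
        sc.stop()
      }
    }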

jingyanjingyan commented 6 years ago
Submitting the IO script to catest gets the following error:

Package and deploy the job to Spark cluster
INFO: Begin uploading file C:/Users/v-yajing/IdeaProjects/maven220622/out/artifacts/maven220622_DefaultArtifact/default_artifact.jar to Azure Datalake store adl://adlpspark03.caboaccountdogfood.net/SparkSubmission/2018/06/29/0fa9076e-2b60-4c12-9cfd-141781944d91/default_artifact.jar ...
INFO: Upload to Azure Datalake store 67646 bytes successfully.
LOG: 3485685
LOG: "D:\data\yarnnm\local\usercache\350a102a-d910-4538-8e7e-4ce6b905fa3f\appcache\application_1530225782231_0102\container_e4860_1530225782231_0102_01_000002\wd\archive-java-spark-livy-2.3.0.zip\jdk1.8.0_144\bin\java" -Xmx128m -cp ""D:\data\yarnnm\local\usercache\350a102a-d910-4538-8e7e-4ce6b905fa3f\appcache\application_1530225782231_0102\container_e4860_1530225782231_0102_01_000002\wd\archive-java-spark-livy-2.3.0.zip\spark\bin\..\jars"\*" org.apache.spark.launcher.Main org.apache.spark.deploy.SparkSubmit --class sample.SparkCore_WasbIOTest --conf spark.master=spark://BN4PHY101161605:4163 adl://adlpspark03.caboaccountdogfood.net/SparkSubmission/2018/06/29/0fa9076e-2b60-4c12-9cfd-141781944d91/default_artifact.jar
LOG: Invalid switch - "data".
LOG: log4j:WARN Continuable parsing error 2 and column 82
LOG: log4j:WARN Document root element "log4j:configuration", must match DOCTYPE root "null".
LOG: log4j:WARN Continuable parsing error 2 and column 82
LOG: log4j:WARN Document is invalid: no grammar found.
LOG: log4j: reset attribute= "".
LOG: log4j: Threshold ="".
LOG: log4j: Reading configuration from URL jar:file:/D:/data/yarnnm/local/usercache/350a102a-d910-4538-8e7e-4ce6b905fa3f/appcache/application_1530225782231_0102/container_e4860_1530225782231_0102_01_000002/wd/archive-java-spark-livy-2.3.0.zip/spark/jars/spark-core_2.11-2.3.0.jar!/org/apache/spark/log4j-defaults.properties
LOG: log4j: Parsing for [root] with value=[INFO, console].
LOG: log4j: Level token is [INFO].
LOG: log4j: Category root set to INFO
LOG: log4j: Parsing appender named "console".
LOG: log4j: Parsing layout options for "console".
LOG: log4j: Setting property [conversionPattern] to [%d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n].
LOG: log4j: End of parsing for "console".
LOG: log4j: Setting property [target] to [System.err].
LOG: log4j: Parsed "console" options.
LOG: log4j: Parsing for [org.spark_project.jetty] with value=[WARN].
LOG: log4j: Level token is [WARN].
LOG: log4j: Category org.spark_project.jetty set to WARN
LOG: log4j: Handling log4j.additivity.org.spark_project.jetty=[null]
LOG: log4j: Parsing for [parquet.CorruptStatistics] with value=[ERROR].
LOG: log4j: Level token is [ERROR].
LOG: log4j: Category parquet.CorruptStatistics set to ERROR
LOG: log4j: Handling log4j.additivity.parquet.CorruptStatistics=[null]
LOG: log4j: Parsing for [org.apache.hadoop.hive.metastore.RetryingHMSHandler] with value=[FATAL].
LOG: log4j: Level token is [FATAL].
LOG: log4j: Category org.apache.hadoop.hive.metastore.RetryingHMSHandler set to FATAL
LOG: log4j: Handling log4j.additivity.org.apache.hadoop.hive.metastore.RetryingHMSHandler=[null]
LOG: log4j: Parsing for [org.spark_project.jetty.util.component.AbstractLifeCycle] with value=[ERROR].
LOG: log4j: Level token is [ERROR].
LOG: log4j: Category org.spark_project.jetty.util.component.AbstractLifeCycle set to ERROR
LOG: log4j: Handling log4j.additivity.org.spark_project.jetty.util.component.AbstractLifeCycle=[null]
LOG: log4j: Parsing for [org.apache.spark.repl.SparkILoop$SparkILoopInterpreter] with value=[INFO].
LOG: log4j: Level token is [INFO].
LOG: log4j: Category org.apache.spark.repl.SparkILoop$SparkILoopInterpreter set to INFO
LOG: log4j: Handling log4j.additivity.org.apache.spark.repl.SparkILoop$SparkILoopInterpreter=[null]
LOG: log4j: Parsing for [org.apache.parquet.CorruptStatistics] with value=[ERROR].
LOG: log4j: Level token is [ERROR].
LOG: log4j: Category org.apache.parquet.CorruptStatistics set to ERROR
LOG: log4j: Handling log4j.additivity.org.apache.parquet.CorruptStatistics=[null]
LOG: log4j: Parsing for [org.apache.spark.repl.Main] with value=[WARN].
LOG: log4j: Level token is [WARN].
LOG: log4j: Category org.apache.spark.repl.Main set to WARN
LOG: log4j: Handling log4j.additivity.org.apache.spark.repl.Main=[null]
LOG: log4j: Parsing for [org.apache.spark.repl.SparkIMain$exprTyper] with value=[INFO].
LOG: log4j: Level token is [INFO].
LOG: log4j: Category org.apache.spark.repl.SparkIMain$exprTyper set to INFO
LOG: log4j: Handling log4j.additivity.org.apache.spark.repl.SparkIMain$exprTyper=[null]
LOG: log4j: Parsing for [org.apache.hadoop.hive.ql.exec.FunctionRegistry] with value=[ERROR].
LOG: log4j: Level token is [ERROR].
LOG: log4j: Category org.apache.hadoop.hive.ql.exec.FunctionRegistry set to ERROR
LOG: log4j: Handling log4j.additivity.org.apache.hadoop.hive.ql.exec.FunctionRegistry=[null]
LOG: log4j: Finished configuring.
LOG: 18/06/29 03:48:58 WARN SparkConf: The configuration key 'spark.executor.port' has been deprecated as of Spark 2.0.0 and may be removed in the future. Not used any more
LOG: i,06/29/2018 03:48:59.435,Common,CommonLibMain,SrcFile="main.cpp" SrcFunc="apsdk::CommonInit" SrcLine="86" Pid="56956" Tid="54776" TS="0x01D40F96C91F124A" String1="CommonInit called at 2018-06-29T03:48:59.435L+7h, applicationDir=(null), defaultConfigName=cosmos.ini.flattened.ini, bootstrapConfigDir=(null)"
LOG: i,06/29/2018 03:48:59.435,Common,Configuration,SrcFile="configuration.cpp" SrcFunc="apsdk::configuration::Configuration::PreInitialize" SrcLine="138" Pid="56956" Tid="54776" TS="0x01D40F96C91F124A" String1="CConfiguration::Preinitialize called at 2018-06-29T03:48:59.435L+7h"
LOG: i,06/29/2018 03:48:59.436,Common,ConfigurationManager,SrcFile="configurationmanager.cpp" SrcFunc="apsdk::configuration::ConfigurationManager::ProcessOverrides" SrcLine="228" Pid="56956" Tid="54776" TS="0x01D40F96C91F393F" String1="Processing overrides at 2018-06-29T03:48:59.436L+7h"
LOG: i,06/29/2018 03:48:59.436,Common,ConfigurationCollection,SrcFile="configurationcollection.cpp" SrcFunc="apsdk::configuration::internal::ConfigurationCollection::Init" SrcLine="110" Pid="56956" Tid="54776" TS="0x01D40F96C91F393F" String1="unable to read configuration file D:\data\yarnnm\logs\application_1530225782231_0102\container_e4860_1530225782231_0102_01_000002\data\brs\brs.ini"
LOG: i,06/29/2018 03:48:59.436,Common,ConfigurationCollection,SrcFile="configurationcollection.cpp" SrcFunc="apsdk::configuration::internal::ConfigurationCollection::Init" SrcLine="110" Pid="56956" Tid="54776" TS="0x01D40F96C91F393F" String1="unable to read configuration file D:\data\yarnnm\logs\application_1530225782231_0102\container_e4860_1530225782231_0102_01_000002\data\brs\GlobalBrs.ini"
LOG: i,06/29/2018 03:48:59.437,Common,ConfigurationCollection,SrcFile="configurationcollection.cpp" SrcFunc="apsdk::configuration::internal::ConfigurationCollection::Init" SrcLine="110" Pid="56956" Tid="54776" TS="0x01D40F96C91F6056" String1="unable to read configuration file D:\data\yarnnm\local\usercache\350a102a-d910-4538-8e7e-4ce6b905fa3f\appcache\application_1530225782231_0102\container_e4860_1530225782231_0102_01_000002\wd\cosmos.ini.flattened.ini"
LOG: w,06/29/2018 03:48:59.437,Common,ConfigurationManager,SrcFile="configurationmanager.cpp" SrcFunc="apsdk::configuration::ConfigurationManager::RegisterBasicConfigurations" SrcLine="383" Pid="56956" Tid="54776" TS="0x01D40F96C91F6056" String1="default configuration D:\data\yarnnm\local\usercache\350a102a-d910-4538-8e7e-4ce6b905fa3f\appcache\application_1530225782231_0102\container_e4860_1530225782231_0102_01_000002\wd\cosmos.ini.flattened.ini not found -- assuming empty"
LOG: i,06/29/2018 03:48:59.437,Common,ChangeNotify,SrcFile="changenotify.cpp" SrcFunc="apsdk::ChangeNotifier::Init" SrcLine="71" Pid="56956" Tid="54776" TS="0x01D40F96C91F6056" String1="First ChangeNotifier::Init called at 2018-06-29T03:48:59.437L+7h; ScanIntervalMSec=1000, MainLoopSleepIntervalMSec=100"
LOG: i,06/29/2018 03:48:59.437,Common,Configuration,SrcFile="configuration.cpp" SrcFunc="apsdk::configuration::Configuration::CompleteInitialize" SrcLine="181" Pid="56956" Tid="54776" TS="0x01D40F96C91F6056" String1="Configuration::CompleteInitialize called at 2018-06-29T03:48:59.437L+7h"
LOG: i,06/29/2018 03:48:59.437,Common,ConfigurationCollection,SrcFile="configurationcollection.cpp" SrcFunc="apsdk::configuration::internal::ConfigurationCollection::Init" SrcLine="110" Pid="56956" Tid="54776" TS="0x01D40F96C91F6056" String1="unable to read configuration file D:\data\yarnnm\logs\application_1530225782231_0102\container_e4860_1530225782231_0102_01_000002\data\autopilotData\shared.ini"
LOG: i,06/29/2018 03:48:59.437,Common,ConfigurationCollection,SrcFile="configurationcollection.cpp" SrcFunc="apsdk::configuration::internal::ConfigurationCollection::Init" SrcLine="110" Pid="56956" Tid="54776" TS="0x01D40F96C91F6056" String1="unable to read configuration file D:\data\yarnnm\logs\application_1530225782231_0102\container_e4860_1530225782231_0102_01_000002\data\autopilotData\Environment.ini"
LOG: i,06/29/2018 03:48:59.437,Common,SysInfo::Init,SrcFile="sysinfo.cpp" SrcFunc="apsdk::SysInfo::Init" SrcLine="404" Pid="56956" Tid="54776" TS="0x01D40F96C91F6056" String1="Initializing SysInfo at 2018-06-29T03:48:59.437L+7h"
LOG: i,06/29/2018 03:48:59.441,Counters,Percentiles,SrcFile="counters.cpp" SrcFunc="apsdk::Counters::ReadTruePercentilesCountersConfiguration" SrcLine="2327" Pid="56956" Tid="54776" TS="0x01D40F96C91FFC6E" String1="support true percentiles = 0, counters = , counters created = 0"
LOG: log4j:WARN No appenders could be found for logger (org.apache.spark.SparkConf).
LOG: log4j:WARN Please initialize the log4j system properly.
LOG: log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
LOG: log4j: Reading configuration from URL jar:file:/D:/data/yarnnm/local/usercache/350a102a-d910-4538-8e7e-4ce6b905fa3f/appcache/application_1530225782231_0102/container_e4860_1530225782231_0102_01_000002/wd/archive-java-spark-livy-2.3.0.zip/spark/jars/spark-core_2.11-2.3.0.jar!/org/apache/spark/log4j-defaults.properties
LOG: log4j: Parsing for [root] with value=[INFO, console].
LOG: log4j: Level token is [INFO].
LOG: log4j: Category root set to INFO
LOG: log4j: Parsing appender named "console".
LOG: log4j: Parsing layout options for "console".
LOG: log4j: Setting property [conversionPattern] to [%d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n].
LOG: log4j: End of parsing for "console".
LOG: log4j: Setting property [target] to [System.err].
LOG: log4j: Parsed "console" options.
LOG: log4j: Parsing for [org.spark_project.jetty] with value=[WARN].
LOG: log4j: Level token is [WARN].
LOG: log4j: Category org.spark_project.jetty set to WARN
LOG: log4j: Handling log4j.additivity.org.spark_project.jetty=[null]
LOG: log4j: Parsing for [parquet.CorruptStatistics] with value=[ERROR].
LOG: log4j: Level token is [ERROR].
LOG: log4j: Category parquet.CorruptStatistics set to ERROR
LOG: log4j: Handling log4j.additivity.parquet.CorruptStatistics=[null]
LOG: log4j: Parsing for [org.apache.hadoop.hive.metastore.RetryingHMSHandler] with value=[FATAL].
LOG: log4j: Level token is [FATAL].
LOG: log4j: Category org.apache.hadoop.hive.metastore.RetryingHMSHandler set to FATAL
LOG: log4j: Handling log4j.additivity.org.apache.hadoop.hive.metastore.RetryingHMSHandler=[null]
LOG: log4j: Parsing for [org.spark_project.jetty.util.component.AbstractLifeCycle] with value=[ERROR].
LOG: log4j: Level token is [ERROR].
LOG: log4j: Category org.spark_project.jetty.util.component.AbstractLifeCycle set to ERROR
LOG: log4j: Handling log4j.additivity.org.spark_project.jetty.util.component.AbstractLifeCycle=[null]
LOG: log4j: Parsing for [org.apache.spark.repl.SparkILoop$SparkILoopInterpreter] with value=[INFO].
LOG: log4j: Level token is [INFO].
LOG: log4j: Category org.apache.spark.repl.SparkILoop$SparkILoopInterpreter set to INFO
LOG: log4j: Handling log4j.additivity.org.apache.spark.repl.SparkILoop$SparkILoopInterpreter=[null]
LOG: log4j: Parsing for [org.apache.parquet.CorruptStatistics] with value=[ERROR].
LOG: log4j: Level token is [ERROR].
LOG: log4j: Category org.apache.parquet.CorruptStatistics set to ERROR
LOG: log4j: Handling log4j.additivity.org.apache.parquet.CorruptStatistics=[null]
LOG: log4j: Parsing for [org.apache.spark.repl.Main] with value=[WARN].
LOG: log4j: Level token is [WARN].
LOG: log4j: Category org.apache.spark.repl.Main set to WARN
LOG: log4j: Handling log4j.additivity.org.apache.spark.repl.Main=[null]
LOG: log4j: Parsing for [org.apache.spark.repl.SparkIMain$exprTyper] with value=[INFO].
LOG: log4j: Level token is [INFO].
LOG: log4j: Category org.apache.spark.repl.SparkIMain$exprTyper set to INFO
LOG: log4j: Handling log4j.additivity.org.apache.spark.repl.SparkIMain$exprTyper=[null]
LOG: log4j: Parsing for [org.apache.hadoop.hive.ql.exec.FunctionRegistry] with value=[ERROR].
LOG: log4j: Level token is [ERROR].
LOG: log4j: Category org.apache.hadoop.hive.ql.exec.FunctionRegistry set to ERROR
LOG: log4j: Handling log4j.additivity.org.apache.hadoop.hive.ql.exec.FunctionRegistry=[null]
LOG: log4j: Finished configuring.
LOG: Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
LOG: 18/06/29 03:48:59 INFO SparkContext: Running Spark version 2.3.0
LOG: 18/06/29 03:48:59 INFO SparkContext: Submitted application: SparkCore_WasbIOTest
LOG: 18/06/29 03:48:59 INFO SparkContext: Cosmos environment not set.
LOG: 18/06/29 03:49:00 INFO SecurityManager: Changing view acls to: YarnppNMUser
LOG: 18/06/29 03:49:00 INFO SecurityManager: Changing modify acls to: YarnppNMUser
LOG: 18/06/29 03:49:00 INFO SecurityManager: Changing view acls groups to: 
LOG: 18/06/29 03:49:00 INFO SecurityManager: Changing modify acls groups to: 
LOG: 18/06/29 03:49:00 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users  with view permissions: Set(YarnppNMUser); groups with view permissions: Set(); users  with modify permissions: Set(YarnppNMUser); groups with modify permissions: Set()
LOG: 18/06/29 03:49:00 INFO Utils: Successfully started service 'sparkDriver' on port 4613.
LOG: 18/06/29 03:49:00 INFO SparkEnv: Registering MapOutputTracker
LOG: 18/06/29 03:49:00 INFO SparkEnv: Registering BlockManagerMaster
LOG: 18/06/29 03:49:00 INFO BlockManagerMasterEndpoint: Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information
LOG: 18/06/29 03:49:00 INFO BlockManagerMasterEndpoint: BlockManagerMasterEndpoint up
LOG: 18/06/29 03:49:00 INFO DiskBlockManager: Created local directory at D:\data\yarnnm\local\usercache\350a102a-d910-4538-8e7e-4ce6b905fa3f\appcache\application_1530225782231_0102\blockmgr-3ef80189-7ed6-47d4-960d-5989440cb419
LOG: 18/06/29 03:49:00 INFO MemoryStore: MemoryStore started with capacity 366.3 MB
LOG: 18/06/29 03:49:00 INFO SparkEnv: Registering OutputCommitCoordinator
LOG: 18/06/29 03:49:00 WARN Utils: The configured local directories are not expected to be URIs; however, got suspicious values [D:/data/yarnnm/local/usercache/350a102a-d910-4538-8e7e-4ce6b905fa3f/appcache/application_1530225782231_0102]. Please check your configured local directories.
LOG: 18/06/29 03:49:00 INFO Utils: Successfully started service 'SparkUI' on port 4040.
LOG: 18/06/29 03:49:00 INFO SparkUI: Bound SparkUI to 0.0.0.0, and started at http://BN4PHY101161605:4040
LOG: 18/06/29 03:49:00 INFO SparkContext: Added JAR adl://adlpspark03.caboaccountdogfood.net/SparkSubmission/2018/06/29/0fa9076e-2b60-4c12-9cfd-141781944d91/default_artifact.jar at adl://adlpspark03.caboaccountdogfood.net/SparkSubmission/2018/06/29/0fa9076e-2b60-4c12-9cfd-141781944d91/default_artifact.jar with timestamp 1530269340997
LOG:    at org.apache.spark.rdd.RDD.withScope(RDD.scala:363)
LOG:    at org.apache.spark.rdd.PairRDDFunctions.saveAsHadoopDataset(PairRDDFunctions.scala:1094)
LOG:    at org.apache.spark.rdd.PairRDDFunctions$$anonfun$saveAsHadoopFile$4.apply$mcV$sp(PairRDDFunctions.scala:1067)
LOG:    at org.apache.spark.rdd.PairRDDFunctions$$anonfun$saveAsHadoopFile$4.apply(PairRDDFunctions.scala:1032)
LOG:    at org.apache.spark.rdd.PairRDDFunctions$$anonfun$saveAsHadoopFile$4.apply(PairRDDFunctions.scala:1032)
LOG:    at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
LOG:    at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:112)
LOG:    at org.apache.spark.rdd.RDD.withScope(RDD.scala:363)
LOG:    at org.apache.spark.rdd.PairRDDFunctions.saveAsHadoopFile(PairRDDFunctions.scala:1032)
LOG:    at org.apache.spark.rdd.PairRDDFunctions$$anonfun$saveAsHadoopFile$1.apply$mcV$sp(PairRDDFunctions.scala:958)
LOG:    at org.apache.spark.rdd.PairRDDFunctions$$anonfun$saveAsHadoopFile$1.apply(PairRDDFunctions.scala:958)
LOG:    at org.apache.spark.rdd.PairRDDFunctions$$anonfun$saveAsHadoopFile$1.apply(PairRDDFunctions.scala:958)
LOG:    at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
LOG:    at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:112)
LOG:    at org.apache.spark.rdd.RDD.withScope(RDD.scala:363)
LOG:    at org.apache.spark.rdd.PairRDDFunctions.saveAsHadoopFile(PairRDDFunctions.scala:957)
LOG:    at org.apache.spark.rdd.RDD$$anonfun$saveAsTextFile$1.apply$mcV$sp(RDD.scala:1493)
LOG:    at org.apache.spark.rdd.RDD$$anonfun$saveAsTextFile$1.apply(RDD.scala:1472)
LOG:    at org.apache.spark.rdd.RDD$$anonfun$saveAsTextFile$1.apply(RDD.scala:1472)
LOG:    at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
LOG:    at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:112)
LOG:    at org.apache.spark.rdd.RDD.withScope(RDD.scala:363)
LOG:    at org.apache.spark.rdd.RDD.saveAsTextFile(RDD.scala:1472)
LOG:    at sample.SparkCore_WasbIOTest$.main(SparkCore_WasbIOTest.scala:19)
LOG:    at sample.SparkCore_WasbIOTest.main(SparkCore_WasbIOTest.scala)
LOG:    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
LOG:    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
LOG:    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
LOG:    at java.lang.reflect.Method.invoke(Method.java:498)
LOG:    at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
LOG:    at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:879)
LOG:    at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:197)
LOG:    at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:227)
LOG:    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:136)
LOG:    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
LOG: 18/06/29 03:49:03 INFO SparkContext: Invoking stop() from shutdown hook
LOG: 18/06/29 03:49:03 INFO CosmosFileSystem: Shim=container_e4860_1530225782231_0102_01_000002,cf653f8c-ae7c-4f75-87cf-ff3a72af3e2c,2a1a3f16-8add-45ab-a57e-3ca1b23cc563: flush called
LOG: 18/06/29 03:49:03 INFO CosmosFileSystem: Shim=container_e4860_1530225782231_0102_01_000002,cf653f8c-ae7c-4f75-87cf-ff3a72af3e2c,31ceba4e-5ff3-4920-b5c8-97e1bdc75e10: FsWrite starting at 70776
LOG: 18/06/29 03:49:03 INFO SparkUI: Stopped Spark web UI at http://BN4PHY101161605:4040
LOG: 18/06/29 03:49:03 INFO CosmosFileSystem: Shim=container_e4860_1530225782231_0102_01_000002,cf653f8c-ae7c-4f75-87cf-ff3a72af3e2c,eb16da23-c3e3-4368-a706-51fa40ff47c2: doFlush called
LOG: 18/06/29 03:49:03 INFO CosmosFileSystem: Shim=container_e4860_1530225782231_0102_01_000002,cf653f8c-ae7c-4f75-87cf-ff3a72af3e2c,996ca287-e8a4-4ad9-b7e7-6c6f0ecd9a0a: flush called
LOG: 18/06/29 03:49:03 INFO CosmosFileSystem: Shim=container_e4860_1530225782231_0102_01_000002,cf653f8c-ae7c-4f75-87cf-ff3a72af3e2c,c1037b0c-89aa-4bbc-b54d-b5049f40886c: FsWrite starting at 70843
LOG: 18/06/29 03:49:04 INFO CosmosFileSystem: Shim=container_e4860_1530225782231_0102_01_000002,cf653f8c-ae7c-4f75-87cf-ff3a72af3e2c,70d21b56-4e18-4534-b666-ac498ca88bd1: doFlush called
LOG: 18/06/29 03:49:04 INFO CosmosFileSystem: Shim=container_e4860_1530225782231_0102_01_000002,cf653f8c-ae7c-4f75-87cf-ff3a72af3e2c,8d0e3861-144f-4418-b90b-9b59b7cbd7b0: flush called
LOG: 18/06/29 03:49:04 INFO CosmosFileSystem: Shim=container_e4860_1530225782231_0102_01_000002,cf653f8c-ae7c-4f75-87cf-ff3a72af3e2c,f3574617-81e9-412b-b593-483cf8c00d81: FsWrite starting at 70843
LOG: 18/06/29 03:49:04 INFO CosmosFileSystem: Shim=container_e4860_1530225782231_0102_01_000002,cf653f8c-ae7c-4f75-87cf-ff3a72af3e2c,598ce219-5563-4a1d-8785-ba8fd93a3157: doFlush called
LOG: 18/06/29 03:49:04 INFO CosmosFileSystem: Shim=container_e4860_1530225782231_0102_01_000002,cf653f8c-ae7c-4f75-87cf-ff3a72af3e2c,62f4dff2-a23c-4dfb-9d9d-93c5d1d94be9: close called
LOG: 18/06/29 03:49:04 INFO CosmosFileSystem: Shim=container_e4860_1530225782231_0102_01_000002,cf653f8c-ae7c-4f75-87cf-ff3a72af3e2c,56a78b90-4fb5-4c75-b9c6-278473cd7df7: FsWrite starting at 70843
LOG: 18/06/29 03:49:04 INFO CosmosFileSystem: Shim=container_e4860_1530225782231_0102_01_000002,f56536cf-2771-46b4-86dc-f2a00cf1bf58: Cosmos Baseclass getFileStatus got: adl://adlpspark03.caboaccountdogfood.net/spark-events/app-20180629034901-0001 
LOG: 18/06/29 03:49:04 INFO CosmosFileSystem: Shim=container_e4860_1530225782231_0102_01_000002,55e39a1f-82b9-46ad-b4e6-d2bf31022d01: ADL doGetFileStatus called with path: adl://adlpspark03.caboaccountdogfood.net/spark-events/app-20180629034901-0001, makeQualified returned: adl://adlpspark03.caboaccountdogfood.net/spark-events/app-20180629034901-0001
LOG: 18/06/29 03:49:04 INFO CoarseGrainedSchedulerBackend$DriverEndpoint: Registered executor NettyRpcEndpointRef(spark-client://Executor) (25.72.10.26:13463) with ID 1
LOG: 18/06/29 03:49:04 INFO CoarseGrainedSchedulerBackend$DriverEndpoint: Registered executor NettyRpcEndpointRef(spark-client://Executor) (25.77.114.88:52020) with ID 0
LOG: 18/06/29 03:49:04 INFO BlockManagerMasterEndpoint: Registering block manager 25.72.10.26:3163 with 366.3 MB RAM, BlockManagerId(1, 25.72.10.26, 3163, None)
LOG: 18/06/29 03:49:04 INFO BlockManagerMasterEndpoint: Registering block manager 25.77.114.88:3163 with 366.3 MB RAM, BlockManagerId(0, 25.77.114.88, 3163, None)
LOG: 18/06/29 03:49:04 INFO CosmosFileSystem: Shim=container_e4860_1530225782231_0102_01_000002,98ec3e3f-66c6-4f72-bfc3-a4a4d8fa018d: Rename called with src: adl://adlpspark03.caboaccountdogfood.net/spark-events/app-20180629034901-0001.inprogress, dst: adl://adlpspark03.caboaccountdogfood.net/spark-events/app-20180629034901-0001
LOG: 18/06/29 03:49:04 INFO CosmosFileSystem: Shim=container_e4860_1530225782231_0102_01_000002,462a2cc8-e29c-44f1-a29f-ddb427182e5c: Cosmos Baseclass getFileStatus got: adl://adlpspark03.caboaccountdogfood.net/spark-events/app-20180629034901-0001 
LOG: 18/06/29 03:49:04 INFO CosmosFileSystem: Shim=container_e4860_1530225782231_0102_01_000002,7433f923-6488-4d76-beb8-0e5ba584fc41: ADL doGetFileStatus called with path: adl://adlpspark03.caboaccountdogfood.net/spark-events/app-20180629034901-0001, makeQualified returned: adl://adlpspark03.caboaccountdogfood.net/spark-events/app-20180629034901-0001
ERROR: The Spark job failed to start due to     at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:1988)
    at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1.apply$mcV$sp(ShutdownHookManager.scala:188)
    at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1.apply(ShutdownHookManager.scala:188)
    at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1.apply(ShutdownHookManager.scala:188)
    at scala.util.Try$.apply(Try.scala:192)
    at org.apache.spark.util.SparkShutdownHookManager.runAll(ShutdownHookManager.scala:188)
    at org.apache.spark.util.SparkShutdownHookManager$$anon$2.run(ShutdownHookManager.scala:178)
    at org.apache.hadoop.util.ShutdownHookManager$1.run(ShutdownHookManager.java:54)
18/06/29 03:49:08 INFO ShutdownHookManager: Deleting directory D:\data\yarnnm\local\usercache\350a102a-d910-4538-8e7e-4ce6b905fa3f\appcache\application_1530225782231_0102\spark-778bdd34-12c8-47aa-8990-1e1bba253602

stderr: 
 stack trace: com.microsoft.azure.hdinsight.spark.common.SparkJobException: The Spark job failed to start due to    at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:1988)
    at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1.apply$mcV$sp(ShutdownHookManager.scala:188)
    at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1.apply(ShutdownHookManager.scala:188)
    at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1.apply(ShutdownHookManager.scala:188)
    at scala.util.Try$.apply(Try.scala:192)
    at org.apache.spark.util.SparkShutdownHookManager.runAll(ShutdownHookManager.scala:188)
    at org.apache.spark.util.SparkShutdownHookManager$$anon$2.run(ShutdownHookManager.scala:178)
    at org.apache.hadoop.util.ShutdownHookManager$1.run(ShutdownHookManager.java:54)
18/06/29 03:49:08 INFO ShutdownHookManager: Deleting directory D:\data\yarnnm\local\usercache\350a102a-d910-4538-8e7e-4ce6b905fa3f\appcache\application_1530225782231_0102\spark-778bdd34-12c8-47aa-8990-1e1bba253602

stderr: 
    at com.microsoft.azure.hdinsight.spark.common.SparkBatchJob.lambda$awaitStarted$49(SparkBatchJob.java:1125)
    at rx.internal.operators.OnSubscribeMap$MapSubscriber.onNext(OnSubscribeMap.java:69)
    at rx.internal.operators.OnSubscribeFilter$FilterSubscriber.onNext(OnSubscribeFilter.java:76)
    at rx.internal.operators.OperatorTakeUntilPredicate$ParentSubscriber.onNext(OperatorTakeUntilPredicate.java:61)
    at rx.internal.operators.OnSubscribeRedo$2$1.onNext(OnSubscribeRedo.java:244)
    at rx.internal.operators.OnSubscribeRedo$2$1.onNext(OnSubscribeRedo.java:244)
    at rx.internal.operators.OnSubscribeMap$MapSubscriber.onNext(OnSubscribeMap.java:77)
    at rx.internal.producers.SingleDelayedProducer.emit(SingleDelayedProducer.java:102)
    at rx.internal.producers.SingleDelayedProducer.setValue(SingleDelayedProducer.java:85)
    at rx.internal.operators.OnSubscribeFromCallable.call(OnSubscribeFromCallable.java:48)
    at rx.internal.operators.OnSubscribeFromCallable.call(OnSubscribeFromCallable.java:33)
    at rx.Observable.unsafeSubscribe(Observable.java:10142)
    at rx.internal.operators.OnSubscribeMap.call(OnSubscribeMap.java:48)
    at rx.internal.operators.OnSubscribeMap.call(OnSubscribeMap.java:33)
    at rx.Observable.unsafeSubscribe(Observable.java:10142)
    at rx.internal.operators.OnSubscribeRedo$2.call(OnSubscribeRedo.java:273)
    at rx.internal.schedulers.TrampolineScheduler$InnerCurrentThreadScheduler.enqueue(TrampolineScheduler.java:73)
    at rx.internal.schedulers.TrampolineScheduler$InnerCurrentThreadScheduler.schedule(TrampolineScheduler.java:52)
    at rx.internal.operators.OnSubscribeRedo$5.request(OnSubscribeRedo.java:361)
    at rx.internal.producers.ProducerArbiter.setProducer(ProducerArbiter.java:126)
    at rx.internal.operators.OnSubscribeRedo$2$1.setProducer(OnSubscribeRedo.java:267)
    at rx.internal.operators.OnSubscribeRedo.call(OnSubscribeRedo.java:353)
    at rx.internal.operators.OnSubscribeRedo.call(OnSubscribeRedo.java:47)
    at rx.Observable.unsafeSubscribe(Observable.java:10142)
    at rx.internal.operators.OnSubscribeRedo$2.call(OnSubscribeRedo.java:273)
    at rx.internal.schedulers.TrampolineScheduler$InnerCurrentThreadScheduler.enqueue(TrampolineScheduler.java:73)
    at rx.internal.schedulers.TrampolineScheduler$InnerCurrentThreadScheduler.schedule(TrampolineScheduler.java:52)
    at rx.internal.operators.OnSubscribeRedo$5.request(OnSubscribeRedo.java:361)
    at rx.Subscriber.setProducer(Subscriber.java:211)
    at rx.internal.operators.OnSubscribeRedo.call(OnSubscribeRedo.java:353)
    at rx.internal.operators.OnSubscribeRedo.call(OnSubscribeRedo.java:47)
    at rx.internal.operators.OnSubscribeLift.call(OnSubscribeLift.java:48)
    at rx.internal.operators.OnSubscribeLift.call(OnSubscribeLift.java:30)
    at rx.Observable.unsafeSubscribe(Observable.java:10142)
    at rx.internal.operators.OnSubscribeFilter.call(OnSubscribeFilter.java:45)
    at rx.internal.operators.OnSubscribeFilter.call(OnSubscribeFilter.java:30)
    at rx.Observable.unsafeSubscribe(Observable.java:10142)
    at rx.internal.operators.OnSubscribeMap.call(OnSubscribeMap.java:48)
    at rx.internal.operators.OnSubscribeMap.call(OnSubscribeMap.java:33)
    at rx.internal.operators.OnSubscribeLift.call(OnSubscribeLift.java:48)
    at rx.internal.operators.OnSubscribeLift.call(OnSubscribeLift.java:30)
    at rx.Observable.unsafeSubscribe(Observable.java:10142)
    at rx.internal.operators.OnSubscribeMap.call(OnSubscribeMap.java:48)
    at rx.internal.operators.OnSubscribeMap.call(OnSubscribeMap.java:33)
    at rx.internal.operators.OnSubscribeLift.call(OnSubscribeLift.java:48)
    at rx.internal.operators.OnSubscribeLift.call(OnSubscribeLift.java:30)
    at rx.Observable.unsafeSubscribe(Observable.java:10142)
    at rx.internal.operators.OnSubscribeMap.call(OnSubscribeMap.java:48)
    at rx.internal.operators.OnSubscribeMap.call(OnSubscribeMap.java:33)
    at rx.Observable.unsafeSubscribe(Observable.java:10142)
    at rx.internal.operators.OnSubscribeMap.call(OnSubscribeMap.java:48)
    at rx.internal.operators.OnSubscribeMap.call(OnSubscribeMap.java:33)
    at rx.Observable.unsafeSubscribe(Observable.java:10142)
    at rx.internal.operators.OperatorMerge$MergeSubscriber.onNext(OperatorMerge.java:248)
    at rx.internal.operators.OperatorMerge$MergeSubscriber.onNext(OperatorMerge.java:148)
    at rx.internal.operators.OnSubscribeMap$MapSubscriber.onNext(OnSubscribeMap.java:77)
    at rx.internal.operators.OperatorMerge$MergeSubscriber.emitScalar(OperatorMerge.java:395)
    at rx.internal.operators.OperatorMerge$MergeSubscriber.tryEmit(OperatorMerge.java:355)
    at rx.internal.operators.OperatorMerge$InnerSubscriber.onNext(OperatorMerge.java:846)
    at rx.internal.operators.OnSubscribeDoOnEach$DoOnEachSubscriber.onNext(OnSubscribeDoOnEach.java:101)
    at rx.internal.operators.OperatorMerge$MergeSubscriber.emitScalar(OperatorMerge.java:395)
    at rx.internal.operators.OperatorMerge$MergeSubscriber.tryEmit(OperatorMerge.java:355)
    at rx.internal.operators.OperatorMerge$InnerSubscriber.onNext(OperatorMerge.java:846)
    at rx.internal.operators.OnSubscribeMap$MapSubscriber.onNext(OnSubscribeMap.java:77)
    at rx.internal.producers.SingleProducer.request(SingleProducer.java:65)
    at rx.Subscriber.setProducer(Subscriber.java:211)
    at rx.internal.operators.OnSubscribeMap$MapSubscriber.setProducer(OnSubscribeMap.java:102)
    at rx.internal.operators.OperatorSingle$ParentSubscriber.onCompleted(OperatorSingle.java:110)
    at rx.internal.operators.DeferredScalarSubscriber.complete(DeferredScalarSubscriber.java:102)
    at rx.internal.operators.OnSubscribeTakeLastOne$TakeLastOneSubscriber.onCompleted(OnSubscribeTakeLastOne.java:57)
    at rx.internal.operators.OnSubscribeDoOnEach$DoOnEachSubscriber.onCompleted(OnSubscribeDoOnEach.java:70)
    at rx.internal.operators.OnSubscribeDoOnEach$DoOnEachSubscriber.onCompleted(OnSubscribeDoOnEach.java:70)
    at com.microsoft.azure.hdinsight.spark.common.SparkBatchJob.lambda$getSubmissionLog$29(SparkBatchJob.java:817)
    at rx.Observable.unsafeSubscribe(Observable.java:10142)
    at rx.internal.operators.OnSubscribeDoOnEach.call(OnSubscribeDoOnEach.java:41)
    at rx.internal.operators.OnSubscribeDoOnEach.call(OnSubscribeDoOnEach.java:30)
    at rx.Observable.unsafeSubscribe(Observable.java:10142)
    at rx.internal.operators.OnSubscribeDoOnEach.call(OnSubscribeDoOnEach.java:41)
    at rx.internal.operators.OnSubscribeDoOnEach.call(OnSubscribeDoOnEach.java:30)
    at rx.Observable.unsafeSubscribe(Observable.java:10142)
    at rx.internal.operators.DeferredScalarSubscriber.subscribeTo(DeferredScalarSubscriber.java:153)
    at rx.internal.operators.OnSubscribeTakeLastOne.call(OnSubscribeTakeLastOne.java:32)
    at rx.internal.operators.OnSubscribeTakeLastOne.call(OnSubscribeTakeLastOne.java:22)
    at rx.internal.operators.OnSubscribeLift.call(OnSubscribeLift.java:48)
    at rx.internal.operators.OnSubscribeLift.call(OnSubscribeLift.java:30)
    at rx.Observable.unsafeSubscribe(Observable.java:10142)
    at rx.internal.operators.OnSubscribeMap.call(OnSubscribeMap.java:48)
    at rx.internal.operators.OnSubscribeMap.call(OnSubscribeMap.java:33)
    at rx.Observable.unsafeSubscribe(Observable.java:10142)
    at rx.internal.operators.OperatorMerge$MergeSubscriber.onNext(OperatorMerge.java:248)
    at rx.internal.operators.OperatorMerge$MergeSubscriber.onNext(OperatorMerge.java:148)
    at rx.internal.operators.OnSubscribeMap$MapSubscriber.onNext(OnSubscribeMap.java:77)
    at rx.internal.operators.OperatorSubscribeOn$1$1.onNext(OperatorSubscribeOn.java:53)
    at rx.internal.producers.SingleDelayedProducer.emit(SingleDelayedProducer.java:102)
    at rx.internal.producers.SingleDelayedProducer.setValue(SingleDelayedProducer.java:85)
    at rx.internal.operators.OnSubscribeFromCallable.call(OnSubscribeFromCallable.java:48)
    at rx.internal.operators.OnSubscribeFromCallable.call(OnSubscribeFromCallable.java:33)
    at rx.Observable.unsafeSubscribe(Observable.java:10142)
    at rx.internal.operators.OperatorSubscribeOn$1.call(OperatorSubscribeOn.java:94)
    at rx.internal.schedulers.ScheduledAction.run(ScheduledAction.java:55)
    at rx.internal.schedulers.ExecutorScheduler$ExecutorSchedulerWorker.run(ExecutorScheduler.java:107)
    at com.microsoft.intellij.rxjava.IdeaSchedulers$1.run(IdeaSchedulers.java:53)
    at com.intellij.openapi.progress.impl.CoreProgressManager$TaskRunnable.run(CoreProgressManager.java:750)
    at com.intellij.openapi.progress.impl.CoreProgressManager.lambda$runProcess$1(CoreProgressManager.java:157)
    at com.intellij.openapi.progress.impl.CoreProgressManager.registerIndicatorAndRun(CoreProgressManager.java:580)
    at com.intellij.openapi.progress.impl.CoreProgressManager.executeProcessUnderProgress(CoreProgressManager.java:525)
    at com.intellij.openapi.progress.impl.ProgressManagerImpl.executeProcessUnderProgress(ProgressManagerImpl.java:85)
    at com.intellij.openapi.progress.impl.CoreProgressManager.runProcess(CoreProgressManager.java:144)
    at com.intellij.openapi.progress.impl.CoreProgressManager$4.run(CoreProgressManager.java:395)
    at com.intellij.openapi.application.impl.ApplicationImpl$1.run(ApplicationImpl.java:305)
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at java.lang.Thread.run(Thread.java:745)
jingyanjingyan commented 6 years ago

Fixed.