yahoo / TensorFlowOnSpark

TensorFlowOnSpark brings TensorFlow programs to Apache Spark clusters.

Mnist example stuck in "Waiting for model to be ready" in Yarn with Spark 1.6 #206

Closed amantrac closed 6 years ago

amantrac commented 6 years ago

This seems related to the closed issue here: https://github.com/yahoo/TensorFlowOnSpark/issues/119

In a nutshell, the worker nodes keep printing the following and stay blocked there:

```
INFO:tensorflow:Waiting for model to be ready. Ready_for_local_init_op: None, ready: Variables not initialized: hid_w, hid_b, sm_w, sm_b, Variable, hid_w/Adagrad, hid_b/Adagrad, sm_w/Adagrad, sm_b/Adagrad
2018-01-16 00:11:34,049 INFO (MainThread-110524) Waiting for model to be ready. Ready_for_local_init_op: None, ready: Variables not initialized: hid_w, hid_b, sm_w, sm_b, Variable, hid_w/Adagrad, hid_b/Adagrad, sm_w/Adagrad, sm_b/Adagrad
2018-01-16 00:12:04.089953: I tensorflow/core/distributed_runtime/master_session.cc:999] Start master session 3885418072df8dbf with config:
INFO:tensorflow:Waiting for model to be ready. Ready_for_local_init_op: None, ready: Variables not initialized: hid_w, hid_b, sm_w, sm_b, Variable, hid_w/Adagrad, hid_b/Adagrad, sm_w/Adagrad, sm_b/Adagrad
2018-01-16 00:12:04,115 INFO (MainThread-110524) Waiting for model to be ready. Ready_for_local_init_op: None, ready: Variables not initialized: hid_w, hid_b, sm_w, sm_b, Variable, hid_w/Adagrad, hid_b/Adagrad, sm_w/Adagrad, sm_b/Adagrad
```
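
As far as I understand it, in between-graph replication only the chief worker (task_index 0) runs the variable init op; the non-chief workers just poll the variables on the ps task and keep logging "Waiting for model to be ready ... Variables not initialized: ..." until they exist. So presumably in my case the chief either never runs the init op, or the workers cannot see the variables on the ps. A minimal single-process sketch of that chief/non-chief split (not the actual mnist_dist.py code; the variable name only mirrors the example):

```python
import tensorflow as tf  # TF 1.x API, as used by the MNIST example

# Stand-in for the example's variables (hid_w, hid_b, sm_w, sm_b, ...).
hid_w = tf.Variable(tf.zeros([784, 128]), name="hid_w")
global_step = tf.Variable(0, name="global_step", trainable=False)
init_op = tf.global_variables_initializer()

# is_chief=True  -> this task runs init_op itself.
# is_chief=False -> the task skips init and loops in wait_for_session(),
#                   printing "Waiting for model to be ready ..." until the
#                   chief has initialized the variables on the ps task.
sv = tf.train.Supervisor(is_chief=True,
                         logdir=None,   # no checkpoint/summary dir in this toy sketch
                         init_op=init_op,
                         global_step=global_step)

with sv.managed_session("") as sess:    # "" = plain in-process session, no cluster here
    print(sess.run(global_step))
```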

I am running Spark 1.6.1, and I just pulled the latest version of TensorFlowOnSpark from Git.

I am trying to run TensorFlowOnSpark/examples/mnist/spark/mnist_dist.py.

Here are the parameters I am using:

```bash
export HADOOP_HOME=/usr/hdp/current/hadoop-client/
export HADOOP_PREFIX=/usr/hdp/current/hadoop-client/
export HADOOP_HDFS_HOME=/usr/hdp/current/hadoop-hdfs-client
export HADOOP_CONF_DIR=/etc/hadoop/conf
export HADOOP_COMMON_HOME=/usr/hdp/current/hadoop-client/
export HADOOP_MAPRED_HOME=/usr/hdp/current/hadoop-client/
export HADOOP_LIBEXEC_DIR=/usr/hdp/current/hadoop-client/libexec
export HADOOP_PREFIX=/usr/hdp/current/hadoop-client
export LIB_HDFS=/usr/hdp/2.4.2.0-258/usr/lib
export PYTHON_ROOT=./Python

export LD_LIBRARY_PATH=${PATH}

export PYSPARK_PYTHON=${PYTHON_ROOT}/bin/python
export SPARK_YARN_USER_ENV="PYSPARK_PYTHON=/usr/bin/python"
export PATH=${PYTHON_ROOT}/bin/:$PATH
export QUEUE=default
export LIB_JVM=/usr/lib/jvm/java-7-openjdk-amd64/jre/lib/amd64/server
export LIB_HDFS=/usr/hdp/2.4.2.0-258/usr/lib
export SPARK_HOME=/usr/hdp/current/spark-client
export PATH=${PATH}:${HADOOP_HOME}/bin:${SPARK_HOME}/bin
```

Run command:

```bash
${SPARK_HOME}/bin/spark-submit \
  --master yarn \
  --deploy-mode cluster \
  --queue ${QUEUE} \
  --num-executors 4 \
  --executor-memory 2G \
  --py-files TensorFlowOnSpark/tfspark.zip,TensorFlowOnSpark/examples/mnist/spark/mnist_dist.py \
  --conf spark.dynamicAllocation.enabled=false \
  --conf spark.yarn.maxAppAttempts=1 \
  --archives hdfs:///user/${USER}/Python.zip#Python \
  --conf spark.executorEnv.LD_LIBRARY_PATH="/usr/local/cuda/lib64:$JAVA_HOME/jre/lib/amd64/server:$HADOOP_HOME/lib/native:$LIB_HDFS" \
  --conf spark.executorEnv.JAVA_HOME="$JAVA_HOME" \
  --conf spark.executorEnv.HADOOP_HDFS_HOME=”$HADOOP_HOME” \
  --conf spark.executorEnv.HADOOP_PREFIX="$HADOOP_PREFIX" \
  --conf spark.executorEnv.CLASSPATH="$($HADOOP_HOME/bin/hadoop classpath --glob):${CLASSPATH}" \
  TensorFlowOnSpark/examples/mnist/spark/mnist_spark.py \
  --images mnist/csv/train/images \
  --labels mnist/csv/train/labels \
  --mode train \
  --model mnist_model
```
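
In case it helps with debugging: since the spark.executorEnv.* settings above are what the TensorFlow processes on the executors actually see, this is the kind of throwaway snippet I would ship through the same spark-submit to confirm what reaches the executors (the helper name is just for illustration, not part of the example):

```python
# Hypothetical debugging helper (not part of the example): report the Python
# interpreter and a few environment variables as seen on each executor.
import os
import sys

def dump_env(_):
    keys = ["PYSPARK_PYTHON", "LD_LIBRARY_PATH", "CLASSPATH", "JAVA_HOME", "HADOOP_HDFS_HOME"]
    yield (sys.executable, {k: os.environ.get(k) for k in keys})

# Usage from a PySpark driver, e.g.:
#   sc.parallelize(range(4), 4).mapPartitions(dump_env).collect()
```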


I tried setting logDir = None as suggested in another post, but this did not solve the issue.

Driver stdout:

```
args: Namespace(batch_size=100, cluster_size=4, epochs=1, format='csv', images='mnist/csv/train/images', labels='mnist/csv/train/labels', mode='train', model='mnist_model', output='predictions', rdma=False, readers=1, steps=1000, tensorboard=False)
2018-01-16T00:10:54.030925 ===== Start zipping images and labels
2018-01-16 00:10:54,700 INFO (MainThread-52644) Reserving TFSparkNodes
2018-01-16 00:10:54,703 INFO (MainThread-52644) listening for reservations at ('10.4.117.3', 39909)
2018-01-16 00:10:54,703 INFO (MainThread-52644) Starting TensorFlow on executors
2018-01-16 00:10:54,714 INFO (MainThread-52644) Waiting for TFSparkNodes to start
2018-01-16 00:10:54,715 INFO (MainThread-52644) waiting for 4 reservations
2018-01-16 00:10:55,716 INFO (MainThread-52644) waiting for 4 reservations
2018-01-16 00:10:56,717 INFO (MainThread-52644) all reservations completed
2018-01-16 00:10:56,717 INFO (MainThread-52644) All TFSparkNodes started
2018-01-16 00:10:56,717 INFO (MainThread-52644) {'addr': '/tmp/pymp-zf55NH/listener-A8vko2', 'task_index': 2, 'port': 38280, 'authkey': 'u(CW\xd0\xe1O\x13\x98\x8er\xdb\xcc\x91sY', 'worker_num': 3, 'host': '10.4.100.8', 'ppid': 66809, 'job_name': 'worker', 'tb_pid': 0, 'tb_port': 0}
2018-01-16 00:10:56,717 INFO (MainThread-52644) {'addr': '/tmp/pymp-IPJQ1U/listener-GpK5sS', 'task_index': 1, 'port': 53601, 'authkey': '\x04\xdf\x87_X\x8eM\x1e\xb0|2\x95\xcf\x8c\xa6L', 'worker_num': 2, 'host': '10.4.97.115', 'ppid': 110354, 'job_name': 'worker', 'tb_pid': 0, 'tb_port': 0}
2018-01-16 00:10:56,717 INFO (MainThread-52644) {'addr': '/tmp/pymp-X8WJpQ/listener-9MvGAO', 'task_index': 0, 'port': 36831, 'authkey': "\x90\xde\x1d\xb5~;@\xe4\x939.\x93\t\xe5\x8d'", 'worker_num': 1, 'host': '10.4.116.175', 'ppid': 74380, 'job_name': 'worker', 'tb_pid': 0, 'tb_port': 0}
2018-01-16 00:10:56,717 INFO (MainThread-52644) {'addr': ('10.4.111.25', 35673), 'task_index': 0, 'port': 55327, 'authkey': '@\x93\xfc\xf1u!D\xc3\xa8jk\x1d\xad\xc7E,', 'worker_num': 0, 'host': '10.4.111.25', 'ppid': 33624, 'job_name': 'ps', 'tb_pid': 0, 'tb_port': 0}
2018-01-16 00:10:56,718 INFO (MainThread-52644) Feeding training data
```
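
For what it is worth, the reservations above look sane: four executors registered, one as 'ps' and three as 'worker'. A quick throwaway script (just my reading of the log, not TFoS internals) to turn those reservation dicts into the ps/worker mapping TensorFlow would be given:

```python
# Rebuild the ps/worker mapping from the reservation dicts printed in the
# driver stdout above (hosts and ports copied from that log).
reservations = [
    {"job_name": "worker", "task_index": 2, "host": "10.4.100.8",   "port": 38280},
    {"job_name": "worker", "task_index": 1, "host": "10.4.97.115",  "port": 53601},
    {"job_name": "worker", "task_index": 0, "host": "10.4.116.175", "port": 36831},
    {"job_name": "ps",     "task_index": 0, "host": "10.4.111.25",  "port": 55327},
]

cluster = {}
for r in sorted(reservations, key=lambda r: (r["job_name"], r["task_index"])):
    cluster.setdefault(r["job_name"], []).append("{host}:{port}".format(**r))

print(cluster)
# {'ps': ['10.4.111.25:55327'],
#  'worker': ['10.4.116.175:36831', '10.4.97.115:53601', '10.4.100.8:38280']}
```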

Driver stderr:

`SLF4J: Class path contains multiple SLF4J bindings. SLF4J: Found binding in [jar:file:/data7/hadoop/yarn/local/usercache/amantrach/filecache/1122/spark-assembly-1.6.1.2.4.2.0-258-hadoop2.7.1.2.4.2.0-258.jar!/org/slf4j/impl/StaticLoggerBinder.class] SLF4J: Found binding in [jar:file:/usr/hdp/2.4.2.0-258/hadoop/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class] SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation. SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory] 18/01/16 00:10:33 INFO ApplicationMaster: Registered signal handlers for [TERM, HUP, INT] 18/01/16 00:10:33 INFO ApplicationMaster: ApplicationAttemptId: appattempt_1515444508016_276923_000001 18/01/16 00:10:34 INFO SecurityManager: Changing view acls to: yarn,amantrach 18/01/16 00:10:34 INFO SecurityManager: Changing modify acls to: yarn,amantrach 18/01/16 00:10:34 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(yarn, amantrach); users with modify permissions: Set(yarn, amantrach) 18/01/16 00:10:35 INFO ApplicationMaster: Starting the user application in a separate Thread 18/01/16 00:10:35 INFO ApplicationMaster: Waiting for spark context initialization 18/01/16 00:10:35 INFO ApplicationMaster: Waiting for spark context initialization ... 18/01/16 00:10:37 INFO SparkContext: Running Spark version 1.6.1 18/01/16 00:10:37 INFO SecurityManager: Changing view acls to: yarn,amantrach 18/01/16 00:10:37 INFO SecurityManager: Changing modify acls to: yarn,amantrach 18/01/16 00:10:37 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(yarn, amantrach); users with modify permissions: Set(yarn, amantrach) 18/01/16 00:10:37 INFO Utils: Successfully started service 'sparkDriver' on port 47482. 18/01/16 00:10:37 INFO Slf4jLogger: Slf4jLogger started 18/01/16 00:10:37 INFO Remoting: Starting remoting 18/01/16 00:10:37 INFO Remoting: Remoting started; listening on addresses :[akka.tcp://sparkDriverActorSystem@10.4.117.3:42366] 18/01/16 00:10:37 INFO Utils: Successfully started service 'sparkDriverActorSystem' on port 42366. 
18/01/16 00:10:37 INFO SparkEnv: Registering MapOutputTracker 18/01/16 00:10:37 INFO SparkEnv: Registering BlockManagerMaster 18/01/16 00:10:37 INFO DiskBlockManager: Created local directory at /data5/hadoop/yarn/local/usercache/amantrach/appcache/application_1515444508016_276923/blockmgr-8701b7d3-9ccf-41a1-97be-60a5b507b8a0 18/01/16 00:10:37 INFO DiskBlockManager: Created local directory at /data7/hadoop/yarn/local/usercache/amantrach/appcache/application_1515444508016_276923/blockmgr-1c518c84-f33a-4660-8241-7a62e724323f 18/01/16 00:10:37 INFO DiskBlockManager: Created local directory at /data3/hadoop/yarn/local/usercache/amantrach/appcache/application_1515444508016_276923/blockmgr-81c16125-37ee-40b5-8046-00e6197aa655 18/01/16 00:10:37 INFO DiskBlockManager: Created local directory at /data4/hadoop/yarn/local/usercache/amantrach/appcache/application_1515444508016_276923/blockmgr-5ec657ec-8c8a-4120-af46-94e1ddba1af9 18/01/16 00:10:37 INFO DiskBlockManager: Created local directory at /data6/hadoop/yarn/local/usercache/amantrach/appcache/application_1515444508016_276923/blockmgr-92309597-ba37-434d-a74b-31ffde1f65c6 18/01/16 00:10:37 INFO DiskBlockManager: Created local directory at /data8/hadoop/yarn/local/usercache/amantrach/appcache/application_1515444508016_276923/blockmgr-ce6566c2-34b6-4951-b5ae-99d81f703d7d 18/01/16 00:10:38 INFO DiskBlockManager: Created local directory at /data9/hadoop/yarn/local/usercache/amantrach/appcache/application_1515444508016_276923/blockmgr-72aebdd9-2eb7-4fbf-9ea9-ea55fb22816e 18/01/16 00:10:38 INFO DiskBlockManager: Created local directory at /data1/hadoop/yarn/local/usercache/amantrach/appcache/application_1515444508016_276923/blockmgr-d049f00b-c42a-4474-adac-7433b92906a1 18/01/16 00:10:38 INFO DiskBlockManager: Created local directory at /data2/hadoop/yarn/local/usercache/amantrach/appcache/application_1515444508016_276923/blockmgr-9bda8d12-6278-4982-ae2e-c78f8e98edb0 18/01/16 00:10:38 INFO DiskBlockManager: Created local directory at /data10/hadoop/yarn/local/usercache/amantrach/appcache/application_1515444508016_276923/blockmgr-17f79160-515e-4765-8051-862fb11397ea 18/01/16 00:10:38 INFO MemoryStore: MemoryStore started with capacity 457.9 MB 18/01/16 00:10:38 INFO SparkEnv: Registering OutputCommitCoordinator 18/01/16 00:10:38 INFO JettyUtils: Adding filter: org.apache.hadoop.yarn.server.webproxy.amfilter.AmIpFilter 18/01/16 00:10:38 INFO Server: jetty-8.y.z-SNAPSHOT 18/01/16 00:10:38 INFO AbstractConnector: Started SelectChannelConnector@0.0.0.0:56057 18/01/16 00:10:38 INFO Utils: Successfully started service 'SparkUI' on port 56057. 18/01/16 00:10:38 INFO SparkUI: Bound SparkUI to 0.0.0.0, and started at http://10.4.117.3:56057 18/01/16 00:10:38 INFO YarnClusterScheduler: Created YarnClusterScheduler 18/01/16 00:10:38 INFO SchedulerExtensionServices: Starting Yarn extension services with app application_1515444508016_276923 and attemptId Some(appattempt_1515444508016_276923_000001) 18/01/16 00:10:38 INFO Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 48485. 
18/01/16 00:10:38 INFO NettyBlockTransferService: Server created on 48485 18/01/16 00:10:38 INFO BlockManagerMaster: Trying to register BlockManager 18/01/16 00:10:38 INFO BlockManagerMasterEndpoint: Registering block manager 10.4.117.3:48485 with 457.9 MB RAM, BlockManagerId(driver, 10.4.117.3, 48485) 18/01/16 00:10:38 INFO BlockManagerMaster: Registered BlockManager 18/01/16 00:10:38 INFO EventLoggingListener: Logging events to hdfs:///spark-history/application_1515444508016_276923_1 18/01/16 00:10:38 INFO YarnSchedulerBackend$YarnSchedulerEndpoint: ApplicationMaster registered as NettyRpcEndpointRef(spark://YarnAM@10.4.117.3:47482) 18/01/16 00:10:38 INFO YarnRMClient: Registering the ApplicationMaster 18/01/16 00:10:38 INFO YarnAllocator: Will request 4 executor containers, each with 1 cores and 2432 MB memory including 384 MB overhead 18/01/16 00:10:38 INFO YarnAllocator: Container request (host: Any, capability: <memory:2432, vCores:1>) 18/01/16 00:10:38 INFO YarnAllocator: Container request (host: Any, capability: <memory:2432, vCores:1>) 18/01/16 00:10:38 INFO YarnAllocator: Container request (host: Any, capability: <memory:2432, vCores:1>) 18/01/16 00:10:38 INFO YarnAllocator: Container request (host: Any, capability: <memory:2432, vCores:1>) 18/01/16 00:10:38 INFO ApplicationMaster: Started progress reporter thread with (heartbeat : 5000, initial allocation : 200) intervals 18/01/16 00:10:38 INFO AMRMClientImpl: Received new token for : hdpdatanode28:45454 18/01/16 00:10:38 INFO YarnAllocator: Launching container container_e82_1515444508016_276923_01_000002 for on host hdpdatanode28 18/01/16 00:10:38 INFO YarnAllocator: Launching ExecutorRunnable. driverUrl: spark://CoarseGrainedScheduler@10.4.117.3:47482, executorHostname: hdpdatanode28 18/01/16 00:10:38 INFO YarnAllocator: Received 1 containers from YARN, launching executors on 1 of them. 
18/01/16 00:10:38 INFO ExecutorRunnable: Starting Executor Container 18/01/16 00:10:38 INFO ContainerManagementProtocolProxy: yarn.client.max-cached-nodemanagers-proxies : 0 18/01/16 00:10:38 INFO ExecutorRunnable: Setting up ContainerLaunchContext 18/01/16 00:10:38 INFO ExecutorRunnable: Preparing Local resources 18/01/16 00:10:39 INFO ExecutorRunnable: Prepared Local resources Map(spark.jar -> resource { scheme: "hdfs" host: "kandula" port: -1 file: "/user/amantrach/.sparkStaging/application_1515444508016_276923/spark-assembly-1.6.1.2.4.2.0-258-hadoop2.7.1.2.4.2.0-258.jar" } size: 185971180 timestamp: 1516061423167 type: FILE visibility: PRIVATE, pyspark.zip -> resource { scheme: "hdfs" host: "kandula" port: -1 file: "/user/amantrach/.sparkStaging/application_1515444508016_276923/pyspark.zip" } size: 357163 timestamp: 1516061423332 type: FILE visibility: PRIVATE, py4j-0.9-src.zip -> resource { scheme: "hdfs" host: "kandula" port: -1 file: "/user/amantrach/.sparkStaging/application_1515444508016_276923/py4j-0.9-src.zip" } size: 44846 timestamp: 1516061423383 type: FILE visibility: PRIVATE, spark_conf -> resource { scheme: "hdfs" host: "kandula" port: -1 file: "/user/amantrach/.sparkStaging/application_1515444508016_276923/spark_conf3358051839168329638.zip" } size: 132754 timestamp: 1516061423590 type: ARCHIVE visibility: PRIVATE, pyfiles/mnist_dist.py -> resource { scheme: "hdfs" host: "kandula" port: -1 file: "/user/amantrach/.sparkStaging/application_1515444508016_276923/mnist_dist.py" } size: 6409 timestamp: 1516061423491 type: FILE visibility: PRIVATE, tfspark.zip -> resource { scheme: "hdfs" host: "kandula" port: -1 file: "/user/amantrach/.sparkStaging/application_1515444508016_276923/tfspark.zip" } size: 31147 timestamp: 1516061423441 type: FILE visibility: PRIVATE, Python -> resource { scheme: "hdfs" host: "kandula" port: -1 file: "/user/amantrach/Python.zip" } size: 189601770 timestamp: 1516058786174 type: ARCHIVE visibility: PUBLIC) 18/01/16 00:10:39 WARN Client: No spark assembly jar for HDP on HDFS, defaultSparkAssembly:hdfs://kandula/hdp/apps/2.4.2.0-258/spark/spark-hdp-assembly.jar 18/01/16 00:10:39 INFO ExecutorRunnable:

YARN executor launch context: env: SPARK_YARN_CACHE_ARCHIVES -> hdfs://kandula/user/amantrach/Python.zip#Python,hdfs://kandula/user/amantrach/.sparkStaging/application_1515444508016_276923/spark_conf3358051839168329638.zip#spark_conf CLASSPATH -> {{PWD}}{{PWD}}/spark_conf{{PWD}}/spark__.jar$HADOOP_CONF_DIR/usr/hdp/current/hadoop-client//usr/hdp/current/hadoop-client/lib//usr/hdp/current/hadoop-hdfs-client//usr/hdp/current/hadoop-hdfs-client/lib//usr/hdp/current/hadoop-yarn-client//usr/hdp/current/hadoop-yarn-client/lib//usr/hdp/current/oozie-client/lib/$PWD/mr-framework/hadoop/share/hadoop/mapreduce/:$PWD/mr-framework/hadoop/share/hadoop/mapreduce/lib/:$PWD/mr-framework/hadoop/share/hadoop/common/:$PWD/mr-framework/hadoop/share/hadoop/common/lib/:$PWD/mr-framework/hadoop/share/hadoop/yarn/:$PWD/mr-framework/hadoop/share/hadoop/yarn/lib/:$PWD/mr-framework/hadoop/share/hadoop/hdfs/:$PWD/mr-framework/hadoop/share/hadoop/hdfs/lib/:$PWD/mr-framework/hadoop/share/hadoop/tools/lib/:/usr/hdp/2.4.2.0-258/hadoop/lib/hadoop-lzo-0.6.0.2.4.2.0-258.jar:/etc/hadoop/conf/secure/usr/hdp/2.4.2.0-258/hadoop/conf:/usr/hdp/2.4.2.0-258/hadoop/lib/commons-beanutils-1.7.0.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/jettison-1.1.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/jackson-core-2.2.3.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/commons-compress-1.4.1.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/asm-3.2.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/avro-1.7.4.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/jersey-json-1.9.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/ranger-plugin-classloader-0.5.0.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/log4j-1.2.17.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/curator-framework-2.7.1.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/spark-yarn-shuffle.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/commons-io-2.4.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/commons-httpclient-3.1.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/htrace-core-3.1.0-incubating.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/jetty-util-6.1.26.hwx.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/mockito-all-1.8.5.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/snappy-java-1.0.4.1.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/ojdbc6.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/commons-cli-1.2.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/activation-1.1.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/jersey-server-1.9.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/xz-1.0.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/hadoop-lzo-0.6.0.2.4.2.0-258-sources.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/jaxb-impl-2.2.3-1.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/microsoft-windowsazure-storage-sdk-0.6.0.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/commons-configuration-1.6.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/azure-storage-2.2.0.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/servlet-api-2.5.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/commons-net-3.1.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/java-xmlbuilder-0.4.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/api-util-1.0.0-M20.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/curator-client-2.7.1.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/commons-math3-3.1.1.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/gson-2.2.4.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/ranger-hdfs-plugin-shim-0.5.0.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/jets3t-0.9.0.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/api-asn1-api-1.0.0-M20.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/jsr305-3.0.0.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/jaxb-api-2.2.2.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/jersey-core-1.9.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/commons-collections-3.2.2.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/junit-4.11.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/slf4j-api-1.7.10.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/netty-3.6.2.Final.j
ar:/usr/hdp/2.4.2.0-258/hadoop/lib/protobuf-java-2.5.0.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/apacheds-i18n-2.0.0-M15.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/httpcore-4.2.5.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/ranger-yarn-plugin-shim-0.5.0.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/zookeeper-3.4.6.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/paranamer-2.3.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/commons-logging-1.1.3.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/hadoop-lzo-0.6.0.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/hadoop-lzo-0.6.0.2.4.2.0-258-javadoc.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/jsp-api-2.1.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/commons-digester-1.8.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/commons-lang-2.6.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/xmlenc-0.52.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/stax-api-1.0-2.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/commons-beanutils-core-1.8.0.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/guava-11.0.2.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/jackson-mapper-asl-1.9.13.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/aws-java-sdk-1.7.4.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/jackson-databind-2.2.3.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/jsch-0.1.42.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/jackson-xc-1.9.13.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/httpclient-4.2.5.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/jackson-core-asl-1.9.13.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/commons-codec-1.4.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/jackson-jaxrs-1.9.13.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/jetty-6.1.26.hwx.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/hamcrest-core-1.3.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/slf4j-log4j12-1.7.10.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/curator-recipes-2.7.1.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/jackson-annotations-2.2.3.jar:/usr/hdp/2.4.2.0-258/hadoop/.//hadoop-common.jar:/usr/hdp/2.4.2.0-258/hadoop/.//hadoop-aws-2.7.1.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/hadoop/.//hadoop-auth.jar:/usr/hdp/2.4.2.0-258/hadoop/.//hadoop-auth-2.7.1.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/hadoop/.//hadoop-common-2.7.1.2.4.2.0-258-tests.jar:/usr/hdp/2.4.2.0-258/hadoop/.//hadoop-aws.jar:/usr/hdp/2.4.2.0-258/hadoop/.//hadoop-nfs.jar:/usr/hdp/2.4.2.0-258/hadoop/.//hadoop-nfs-2.7.1.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/hadoop/.//hadoop-common-2.7.1.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/hadoop/.//hadoop-azure-2.7.1.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/hadoop/.//hadoop-annotations-2.7.1.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/hadoop/.//hadoop-annotations.jar:/usr/hdp/2.4.2.0-258/hadoop/.//hadoop-common-tests.jar:/usr/hdp/2.4.2.0-258/hadoop/.//hadoop-azure.jar:/usr/hdp/2.4.2.0-258/hadoop-hdfs/./:/usr/hdp/2.4.2.0-258/hadoop-hdfs/lib/asm-3.2.jar:/usr/hdp/2.4.2.0-258/hadoop-hdfs/lib/log4j-1.2.17.jar:/usr/hdp/2.4.2.0-258/hadoop-hdfs/lib/commons-io-2.4.jar:/usr/hdp/2.4.2.0-258/hadoop-hdfs/lib/htrace-core-3.1.0-incubating.jar:/usr/hdp/2.4.2.0-258/hadoop-hdfs/lib/jetty-util-6.1.26.hwx.jar:/usr/hdp/2.4.2.0-258/hadoop-hdfs/lib/commons-daemon-1.0.13.jar:/usr/hdp/2.4.2.0-258/hadoop-hdfs/lib/commons-cli-1.2.jar:/usr/hdp/2.4.2.0-258/hadoop-hdfs/lib/jersey-server-1.9.jar:/usr/hdp/2.4.2.0-258/hadoop-hdfs/lib/servlet-api-2.5.jar:/usr/hdp/2.4.2.0-258/hadoop-hdfs/lib/xercesImpl-2.9.1.jar:/usr/hdp/2.4.2.0-258/hadoop-hdfs/lib/okio-1.4.0.jar:/usr/hdp/2.4.2.0-258/hadoop-hdfs/lib/jsr305-3.0.0.jar:/usr/hdp/2.4.2.0-258/hadoop-hdfs/lib/jersey-core-1.9.jar:/usr/hdp/2.4.2.0-258/hadoop-hdfs/lib/netty-3.6.2.Final.jar:/usr/hdp/2.4.2.0-258/hadoop-hdfs/lib/protobuf-java-2.5.0.jar:/usr/hdp/2.4.2.0-258/hadoop-hdfs/lib/commons-logg
ing-1.1.3.jar:/usr/hdp/2.4.2.0-258/hadoop-hdfs/lib/xml-apis-1.3.04.jar:/usr/hdp/2.4.2.0-258/hadoop-hdfs/lib/commons-lang-2.6.jar:/usr/hdp/2.4.2.0-258/hadoop-hdfs/lib/xmlenc-0.52.jar:/usr/hdp/2.4.2.0-258/hadoop-hdfs/lib/netty-all-4.0.23.Final.jar:/usr/hdp/2.4.2.0-258/hadoop-hdfs/lib/leveldbjni-all-1.8.jar:/usr/hdp/2.4.2.0-258/hadoop-hdfs/lib/guava-11.0.2.jar:/usr/hdp/2.4.2.0-258/hadoop-hdfs/lib/jackson-mapper-asl-1.9.13.jar:/usr/hdp/2.4.2.0-258/hadoop-hdfs/lib/okhttp-2.4.0.jar:/usr/hdp/2.4.2.0-258/hadoop-hdfs/lib/jackson-core-asl-1.9.13.jar:/usr/hdp/2.4.2.0-258/hadoop-hdfs/lib/commons-codec-1.4.jar:/usr/hdp/2.4.2.0-258/hadoop-hdfs/lib/jetty-6.1.26.hwx.jar:/usr/hdp/2.4.2.0-258/hadoop-hdfs/.//hadoop-hdfs-tests.jar:/usr/hdp/2.4.2.0-258/hadoop-hdfs/.//hadoop-hdfs.jar:/usr/hdp/2.4.2.0-258/hadoop-hdfs/.//hadoop-hdfs-2.7.1.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/hadoop-hdfs/.//hadoop-hdfs-nfs-2.7.1.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/hadoop-hdfs/.//hadoop-hdfs-nfs.jar:/usr/hdp/2.4.2.0-258/hadoop-hdfs/.//hadoop-hdfs-2.7.1.2.4.2.0-258-tests.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/commons-beanutils-1.7.0.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/aopalliance-1.0.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/jettison-1.1.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/jackson-core-2.2.3.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/commons-compress-1.4.1.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/asm-3.2.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/jersey-client-1.9.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/avro-1.7.4.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/jersey-json-1.9.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/log4j-1.2.17.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/curator-framework-2.7.1.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/commons-io-2.4.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/commons-httpclient-3.1.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/htrace-core-3.1.0-incubating.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/jetty-util-6.1.26.hwx.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/objenesis-2.1.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/snappy-java-1.0.4.1.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/commons-cli-1.2.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/activation-1.1.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/jersey-server-1.9.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/xz-1.0.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/jaxb-impl-2.2.3-1.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/microsoft-windowsazure-storage-sdk-0.6.0.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/jersey-guice-1.9.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/commons-configuration-1.6.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/servlet-api-2.5.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/commons-net-3.1.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/java-xmlbuilder-0.4.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/api-util-1.0.0-M20.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/fst-2.24.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/javassist-3.18.1-GA.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/curator-client-2.7.1.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/commons-math3-3.1.1.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/gson-2.2.4.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/jets3t-0.9.0.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/api-asn1-api-1.0.0-M20.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/jsr305-3.0.0.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/jaxb-api-2.2.2.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/jersey-core-1.9.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/commons-collections-3.2.2.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/guice-3.0.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/netty-3.6.2.Final.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn
/lib/javax.inject-1.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/protobuf-java-2.5.0.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/apacheds-i18n-2.0.0-M15.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/httpcore-4.2.5.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/zookeeper-3.4.6.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/paranamer-2.3.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/zookeeper-3.4.6.2.4.2.0-258-tests.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/commons-logging-1.1.3.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/jsp-api-2.1.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/commons-digester-1.8.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/commons-lang-2.6.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/xmlenc-0.52.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/stax-api-1.0-2.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/leveldbjni-all-1.8.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/commons-beanutils-core-1.8.0.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/guava-11.0.2.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/jackson-mapper-asl-1.9.13.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/jackson-databind-2.2.3.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/jsch-0.1.42.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/jackson-xc-1.9.13.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/httpclient-4.2.5.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/jackson-core-asl-1.9.13.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/commons-codec-1.4.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/jackson-jaxrs-1.9.13.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/jetty-6.1.26.hwx.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/guice-servlet-3.0.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/curator-recipes-2.7.1.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/jackson-annotations-2.2.3.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/.//hadoop-yarn-server-tests-2.7.1.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/.//hadoop-yarn-applications-distributedshell.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/.//hadoop-yarn-common.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/.//hadoop-yarn-server-applicationhistoryservice-2.7.1.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/.//hadoop-yarn-server-web-proxy.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/.//hadoop-yarn-server-timeline-plugins.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/.//hadoop-yarn-client-2.7.1.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/.//hadoop-yarn-server-common.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/.//hadoop-yarn-server-sharedcachemanager-2.7.1.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/.//hadoop-yarn-server-sharedcachemanager.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/.//hadoop-yarn-common-2.7.1.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/.//hadoop-yarn-server-nodemanager-2.7.1.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/.//hadoop-yarn-registry.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/.//hadoop-yarn-api-2.7.1.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/.//hadoop-yarn-server-applicationhistoryservice.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/.//hadoop-yarn-applications-unmanaged-am-launcher.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/.//hadoop-yarn-client.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/.//hadoop-yarn-registry-2.7.1.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/.//hadoop-yarn-applications-unmanaged-am-launcher-2.7.1.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/.//hadoop-yarn-applications-distributedshell-2.7.1.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/.//hadoop-yarn-server-nodemanager.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/.//hadoop-yarn-server-timeline-plugins-2.7.1.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/.//hadoop
-yarn-server-tests.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/.//hadoop-yarn-server-resourcemanager.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/.//hadoop-yarn-server-web-proxy-2.7.1.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/.//hadoop-yarn-server-common-2.7.1.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/.//hadoop-yarn-api.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/.//hadoop-yarn-server-resourcemanager-2.7.1.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/lib/aopalliance-1.0.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/lib/commons-compress-1.4.1.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/lib/asm-3.2.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/lib/avro-1.7.4.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/lib/log4j-1.2.17.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/lib/commons-io-2.4.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/lib/snappy-java-1.0.4.1.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/lib/jersey-server-1.9.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/lib/xz-1.0.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/lib/jersey-guice-1.9.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/lib/jersey-core-1.9.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/lib/guice-3.0.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/lib/junit-4.11.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/lib/netty-3.6.2.Final.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/lib/javax.inject-1.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/lib/protobuf-java-2.5.0.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/lib/paranamer-2.3.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/lib/leveldbjni-all-1.8.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/lib/jackson-mapper-asl-1.9.13.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/lib/jackson-core-asl-1.9.13.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/lib/hamcrest-core-1.3.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/lib/guice-servlet-3.0.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//hadoop-openstack-2.7.1.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//commons-beanutils-1.7.0.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//hadoop-sls.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//hadoop-distcp-2.7.1.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//jettison-1.1.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//jackson-core-2.2.3.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//commons-compress-1.4.1.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//asm-3.2.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//avro-1.7.4.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//hadoop-mapreduce-client-app.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//jersey-json-1.9.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//hadoop-extras-2.7.1.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//hadoop-mapreduce-client-jobclient-tests.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//log4j-1.2.17.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//hadoop-ant.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//curator-framework-2.7.1.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//commons-io-2.4.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//commons-httpclient-3.1.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//hadoop-mapreduce-client-shuffle-2.7.1.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//hadoop-streaming-2.7.1.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//htrace-core-3.1.0-incubating.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//jetty-util-6.1.26.hwx.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//hadoop-openstack.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//hadoop-mapreduce-client-hs.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//hadoop-extras.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//hadoop-distcp.jar:/usr/hdp/2.4
.2.0-258/hadoop-mapreduce/.//mockito-all-1.8.5.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//snappy-java-1.0.4.1.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//commons-cli-1.2.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//activation-1.1.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//hadoop-mapreduce-client-hs-2.7.1.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//jersey-server-1.9.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//xz-1.0.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//hadoop-auth.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//jaxb-impl-2.2.3-1.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//microsoft-windowsazure-storage-sdk-0.6.0.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//commons-configuration-1.6.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//hadoop-auth-2.7.1.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//hadoop-mapreduce-client-jobclient.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//hadoop-datajoin-2.7.1.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//servlet-api-2.5.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//hadoop-mapreduce-client-jobclient-2.7.1.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//commons-net-3.1.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//hadoop-archives-2.7.1.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//java-xmlbuilder-0.4.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//hadoop-rumen.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//api-util-1.0.0-M20.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//hadoop-mapreduce-client-app-2.7.1.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//curator-client-2.7.1.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//commons-math3-3.1.1.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//hadoop-mapreduce-examples-2.7.1.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//gson-2.2.4.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//jets3t-0.9.0.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//hadoop-ant-2.7.1.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//hadoop-mapreduce-client-core.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//hadoop-mapreduce-client-core-2.7.1.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//hadoop-mapreduce-client-hs-plugins-2.7.1.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//api-asn1-api-1.0.0-M20.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//jsr305-3.0.0.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//jaxb-api-2.2.2.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//jersey-core-1.9.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//commons-collections-3.2.2.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//junit-4.11.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//netty-3.6.2.Final.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//protobuf-java-2.5.0.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//hadoop-archives.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//hadoop-datajoin.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//apacheds-i18n-2.0.0-M15.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//httpcore-4.2.5.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//zookeeper-3.4.6.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//apacheds-kerberos-codec-2.0.0-M15.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//paranamer-2.3.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//hadoop-streaming.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//hadoop-gridmix-2.7.1.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//commons-logging-1.1.3.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//hadoop-mapreduce-examples.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//hadoop-rumen-2.7.1.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce
/.//metrics-core-3.0.1.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//jsp-api-2.1.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//commons-digester-1.8.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//commons-lang-2.6.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//hadoop-mapreduce-client-shuffle.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//xmlenc-0.52.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//hadoop-mapreduce-client-hs-plugins.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//stax-api-1.0-2.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//commons-beanutils-core-1.8.0.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//guava-11.0.2.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//jackson-mapper-asl-1.9.13.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//jsch-0.1.42.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//jackson-xc-1.9.13.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//hadoop-mapreduce-client-jobclient-2.7.1.2.4.2.0-258-tests.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//httpclient-4.2.5.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//jackson-core-asl-1.9.13.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//commons-codec-1.4.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//hadoop-mapreduce-client-common.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//commons-lang3-3.3.2.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//jackson-jaxrs-1.9.13.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//hadoop-sls-2.7.1.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//jetty-6.1.26.hwx.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//joda-time-2.9.3.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//hamcrest-core-1.3.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//hadoop-gridmix.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//hadoop-mapreduce-client-common-2.7.1.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//curator-recipes-2.7.1.jar:.:/etc/hadoop/conf:/etc/oozie/conf:/usr/hdp/current/oozie-client/lib/.jar:/usr/hdp/current/hadoop-client/lib/.jar:jdbc-mysql.jar:mysql-connector-java-5.1.37-bin.jar:mysql-connector-java.jar:/usr/hdp/current/oozie-client/lib/commons-beanutils-1.7.0.jar:/usr/hdp/current/oozie-client/lib/activemq-client-5.8.0.jar:/usr/hdp/current/oozie-client/lib/jettison-1.1.jar:/usr/hdp/current/oozie-client/lib/jackson-core-2.2.3.jar:/usr/hdp/current/oozie-client/lib/commons-compress-1.4.1.jar:/usr/hdp/current/oozie-client/lib/avro-1.7.4.jar:/usr/hdp/current/oozie-client/lib/jersey-json-1.9.jar:/usr/hdp/current/oozie-client/lib/curator-framework-2.7.1.jar:/usr/hdp/current/oozie-client/lib/commons-io-2.4.jar:/usr/hdp/current/oozie-client/lib/commons-httpclient-3.1.jar:/usr/hdp/current/oozie-client/lib/htrace-core-3.1.0-incubating.jar:/usr/hdp/current/oozie-client/lib/jetty-util-6.1.26.hwx.jar:/usr/hdp/current/oozie-client/lib/hadoop-aws-2.7.1.2.4.2.0-258.jar:/usr/hdp/current/oozie-client/lib/snappy-java-1.0.4.1.jar:/usr/hdp/current/oozie-client/lib/commons-cli-1.2.jar:/usr/hdp/current/oozie-client/lib/activation-1.1.jar:/usr/hdp/current/oozie-client/lib/jersey-server-1.9.jar:/usr/hdp/current/oozie-client/lib/xz-1.0.jar:/usr/hdp/current/oozie-client/lib/jaxb-impl-2.2.3-1.jar:/usr/hdp/current/oozie-client/lib/microsoft-windowsazure-storage-sdk-0.6.0.jar:/usr/hdp/current/oozie-client/lib/jackson-jaxrs-1.8.3.jar:/usr/hdp/current/oozie-client/lib/joda-time-2.1.jar:/usr/hdp/current/oozie-client/lib/log4j-1.2.16.jar:/usr/hdp/current/oozie-client/lib/commons-configuration-1.6.jar:/usr/hdp/current/oozie-client/lib/hadoop-auth-2.7.1.2.4.2.0-258.jar:/usr/hdp/current/oozie-client/lib/azure-storage-2.2.0.jar:/usr/hdp/current/oozie-client/lib/servlet-api-2.5.jar:/
usr/hdp/current/oozie-client/lib/commons-net-3.1.jar:/usr/hdp/current/oozie-client/lib/java-xmlbuilder-0.4.jar:/usr/hdp/current/oozie-client/lib/api-util-1.0.0-M20.jar:/usr/hdp/current/oozie-client/lib/curator-client-2.7.1.jar:/usr/hdp/current/oozie-client/lib/commons-math3-3.1.1.jar:/usr/hdp/current/oozie-client/lib/json-simple-1.1.jar:/usr/hdp/current/oozie-client/lib/gson-2.2.4.jar:/usr/hdp/current/oozie-client/lib/jets3t-0.9.0.jar:/usr/hdp/current/oozie-client/lib/geronimo-j2ee-management_1.1_spec-1.0.1.jar:/usr/hdp/current/oozie-client/lib/api-asn1-api-1.0.0-M20.jar:/usr/hdp/current/oozie-client/lib/jaxb-api-2.2.2.jar:/usr/hdp/current/oozie-client/lib/jersey-core-1.9.jar:/usr/hdp/current/oozie-client/lib/curator-recipes-2.5.0.jar:/usr/hdp/current/oozie-client/lib/commons-collections-3.2.2.jar:/usr/hdp/current/oozie-client/lib/hadoop-common-2.7.1.2.4.2.0-258.jar:/usr/hdp/current/oozie-client/lib/xercesImpl-2.10.0.jar:/usr/hdp/current/oozie-client/lib/hadoop-azure-2.7.1.2.4.2.0-258.jar:/usr/hdp/current/oozie-client/lib/protobuf-java-2.5.0.jar:/usr/hdp/current/oozie-client/lib/hadoop-annotations-2.7.1.2.4.2.0-258.jar:/usr/hdp/current/oozie-client/lib/hawtbuf-1.9.jar:/usr/hdp/current/oozie-client/lib/apacheds-i18n-2.0.0-M15.jar:/usr/hdp/current/oozie-client/lib/zookeeper-3.4.6.2.4.2.0-258.jar:/usr/hdp/current/oozie-client/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/hdp/current/oozie-client/lib/paranamer-2.3.jar:/usr/hdp/current/oozie-client/lib/jsr305-1.3.9.jar:/usr/hdp/current/oozie-client/lib/jetty-6.1.14.jar:/usr/hdp/current/oozie-client/lib/oozie-client-4.2.0.2.4.2.0-258.jar:/usr/hdp/current/oozie-client/lib/commons-logging-1.1.jar:/usr/hdp/current/oozie-client/lib/httpclient-4.3.jar:/usr/hdp/current/oozie-client/lib/commons-digester-1.8.jar:/usr/hdp/current/oozie-client/lib/xmlenc-0.52.jar:/usr/hdp/current/oozie-client/lib/httpcore-4.3.jar:/usr/hdp/current/oozie-client/lib/jline-2.12.jar:/usr/hdp/current/oozie-client/lib/jdk.tools-1.7.jar:/usr/hdp/current/oozie-client/lib/stax-api-1.0-2.jar:/usr/hdp/current/oozie-client/lib/commons-beanutils-core-1.8.0.jar:/usr/hdp/current/oozie-client/lib/guava-11.0.2.jar:/usr/hdp/current/oozie-client/lib/jackson-mapper-asl-1.9.13.jar:/usr/hdp/current/oozie-client/lib/servlet-api-2.5-6.1.14.jar:/usr/hdp/current/oozie-client/lib/aws-java-sdk-1.7.4.jar:/usr/hdp/current/oozie-client/lib/geronimo-jms_1.1_spec-1.1.1.jar:/usr/hdp/current/oozie-client/lib/jackson-databind-2.2.3.jar:/usr/hdp/current/oozie-client/lib/jsch-0.1.42.jar:/usr/hdp/current/oozie-client/lib/jackson-xc-1.8.3.jar:/usr/hdp/current/oozie-client/lib/jackson-core-asl-1.9.13.jar:/usr/hdp/current/oozie-client/lib/commons-codec-1.4.jar:/usr/hdp/current/oozie-client/lib/commons-lang3-3.3.2.jar:/usr/hdp/current/oozie-client/lib/slf4j-api-1.6.6.jar:/usr/hdp/current/oozie-client/lib/asm-3.1.jar:/usr/hdp/current/oozie-client/lib/commons-lang-2.4.jar:/usr/hdp/current/oozie-client/lib/xml-apis-1.4.01.jar:/usr/hdp/current/oozie-client/lib/jackson-annotations-2.2.3.jar:/usr/hdp/current/oozie-client/lib/netty-3.7.0.Final.jar:/usr/hdp/current/oozie-client/lib/oozie-hadoop-auth-hadoop-2-4.2.0.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/tez/tez-yarn-timeline-history-with-fs-0.7.0.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/tez/tez-yarn-timeline-cache-plugin-0.7.0.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/tez/tez-runtime-library-0.7.0.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/tez/tez-dag-0.7.0.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/tez/tez-yarn-timeline-history-with-acls-0.7.0.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/tez/tez-api-0.7
.0.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/tez/tez-examples-0.7.0.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/tez/tez-mapreduce-0.7.0.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/tez/tez-history-parser-0.7.0.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/tez/tez-runtime-internals-0.7.0.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/tez/tez-tests-0.7.0.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/tez/tez-yarn-timeline-history-0.7.0.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/tez/tez-common-0.7.0.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/tez/lib/commons-collections4-4.1.jar:/usr/hdp/2.4.2.0-258/tez/lib/jersey-client-1.9.jar:/usr/hdp/2.4.2.0-258/tez/lib/jersey-json-1.9.jar:/usr/hdp/2.4.2.0-258/tez/lib/commons-io-2.4.jar:/usr/hdp/2.4.2.0-258/tez/lib/jetty-util-6.1.26.hwx.jar:/usr/hdp/2.4.2.0-258/tez/lib/jettison-1.3.4.jar:/usr/hdp/2.4.2.0-258/tez/lib/hadoop-aws-2.7.1.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/tez/lib/jsr305-2.0.3.jar:/usr/hdp/2.4.2.0-258/tez/lib/slf4j-api-1.7.5.jar:/usr/hdp/2.4.2.0-258/tez/lib/commons-cli-1.2.jar:/usr/hdp/2.4.2.0-258/tez/lib/servlet-api-2.5.jar:/usr/hdp/2.4.2.0-258/tez/lib/commons-math3-3.1.1.jar:/usr/hdp/2.4.2.0-258/tez/lib/hadoop-mapreduce-client-core-2.7.1.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/tez/lib/commons-collections-3.2.2.jar:/usr/hdp/2.4.2.0-258/tez/lib/hadoop-azure-2.7.1.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/tez/lib/protobuf-java-2.5.0.jar:/usr/hdp/2.4.2.0-258/tez/lib/hadoop-annotations-2.7.1.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/tez/lib/commons-lang-2.6.jar:/usr/hdp/2.4.2.0-258/tez/lib/hadoop-yarn-server-timeline-plugins-2.7.1.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/tez/lib/guava-11.0.2.jar:/usr/hdp/2.4.2.0-258/tez/lib/hadoop-yarn-server-web-proxy-2.7.1.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/tez/lib/commons-codec-1.4.jar:/usr/hdp/2.4.2.0-258/tez/lib/jetty-6.1.26.hwx.jar:/usr/hdp/2.4.2.0-258/tez/lib/hadoop-mapreduce-client-common-2.7.1.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/tez/conf:.:/etc/hadoop/conf:/etc/oozie/conf:/usr/hdp/current/oozie-client/lib/activation-1.1.jar:/usr/hdp/current/oozie-client/lib/activemq-client-5.8.0.jar:/usr/hdp/current/oozie-client/lib/apacheds-i18n-2.0.0-M15.jar:/usr/hdp/current/oozie-client/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/hdp/current/oozie-client/lib/api-asn1-api-1.0.0-M20.jar:/usr/hdp/current/oozie-client/lib/api-util-1.0.0-M20.jar:/usr/hdp/current/oozie-client/lib/asm-3.1.jar:/usr/hdp/current/oozie-client/lib/avro-1.7.4.jar:/usr/hdp/current/oozie-client/lib/aws-java-sdk-1.7.4.jar:/usr/hdp/current/oozie-client/lib/azure-storage-2.2.0.jar:/usr/hdp/current/oozie-client/lib/commons-beanutils-1.7.0.jar:/usr/hdp/current/oozie-client/lib/commons-beanutils-core-1.8.0.jar:/usr/hdp/current/oozie-client/lib/commons-cli-1.2.jar:/usr/hdp/current/oozie-client/lib/commons-codec-1.4.jar:/usr/hdp/current/oozie-client/lib/commons-collections-3.2.2.jar:/usr/hdp/current/oozie-client/lib/commons-compress-1.4.1.jar:/usr/hdp/current/oozie-client/lib/comm SPARK_YARN_USER_ENV -> PYSPARK_PYTHON=/usr/bin/python SPARK_LOG_URL_STDERR -> http://hdpdatanode28:8042/node/containerlogs/container_e82_1515444508016_276923_01_000002/amantrach/stderr?start=-4096 SPARK_YARN_STAGING_DIR -> .sparkStaging/application_1515444508016_276923 SPARK_YARN_CACHE_FILES_FILE_SIZES -> 185971180,357163,44846,31147,6409 SPARK_YARN_CACHE_FILES_VISIBILITIES -> PRIVATE,PRIVATE,PRIVATE,PRIVATE,PRIVATE SPARK_YARN_CACHE_ARCHIVES_FILE_SIZES -> 189601770,132754 SPARK_USER -> amantrach HADOOP_HDFS_HOME -> ”/usr/hdp/current/hadoop-client/” HADOOP_PREFIX -> /usr/hdp/current/hadoop-client SPARK_YARN_CACHE_ARCHIVES_TIME_STAMPS -> 
1516058786174,1516061423590 SPARK_YARN_MODE -> true SPARK_YARN_CACHE_FILES_TIME_STAMPS -> 1516061423167,1516061423332,1516061423383,1516061423441,1516061423491 JAVA_HOME -> /usr/lib/jvm/java-7-openjdk-amd64 PYTHONPATH -> {{PWD}}/pyfiles{{PWD}}/pyspark.zip{{PWD}}/py4j-0.9-src.zip{{PWD}}/tfspark.zip LD_LIBRARY_PATH -> /usr/local/cuda/lib64:/usr/lib/jvm/java-7-openjdk-amd64/jre/lib/amd64/server:/usr/hdp/current/hadoop-client//lib/native:/usr/hdp/2.4.2.0-258/usr/lib SPARK_LOG_URL_STDOUT -> http://hdpdatanode28:8042/node/containerlogs/container_e82_1515444508016_276923_01_000002/amantrach/stdout?start=-4096 PYSPARK_PYTHON -> /usr/bin/python SPARK_YARN_CACHE_ARCHIVES_VISIBILITIES -> PUBLIC,PRIVATE SPARK_YARN_CACHE_FILES -> hdfs://kandula/user/amantrach/.sparkStaging/application_1515444508016_276923/spark-assembly-1.6.1.2.4.2.0-258-hadoop2.7.1.2.4.2.0-258.jar#spark.jar,hdfs://kandula/user/amantrach/.sparkStaging/application_1515444508016_276923/pyspark.zip#pyspark.zip,hdfs://kandula/user/amantrach/.sparkStaging/application_1515444508016_276923/py4j-0.9-src.zip#py4j-0.9-src.zip,hdfs://kandula/user/amantrach/.sparkStaging/application_1515444508016_276923/tfspark.zip#tfspark.zip,hdfs://kandula/user/amantrach/.sparkStaging/application_1515444508016_276923/mnist_dist.py#pyfiles__/mnist_dist.py

command: {{JAVA_HOME}}/bin/java -server -XX:OnOutOfMemoryError='kill %p' -Xms2048m -Xmx2048m -Djava.io.tmpdir={{PWD}}/tmp '-Dspark.history.ui.port=18080' '-Dspark.driver.port=47482' '-Dspark.ui.port=0' -Dspark.yarn.app.container.log.dir= -XX:MaxPermSize=256m org.apache.spark.executor.CoarseGrainedExecutorBackend --driver-url spark://CoarseGrainedScheduler@10.4.117.3:47482 --executor-id 1 --hostname hdpdatanode28 --cores 1 --app-id application_1515444508016_276923 --user-class-path file:$PWD/app.jar 1> /stdout 2> /stderr

18/01/16 00:10:39 INFO ContainerManagementProtocolProxy: Opening proxy : hdpdatanode28:45454 18/01/16 00:10:39 INFO AMRMClientImpl: Received new token for : hdpdatanode33:45454 18/01/16 00:10:39 INFO AMRMClientImpl: Received new token for : hdpdatanode59:45454 18/01/16 00:10:39 INFO AMRMClientImpl: Received new token for : hdpdatanode40:45454 18/01/16 00:10:39 INFO YarnAllocator: Launching container container_e82_1515444508016_276923_01_000003 for on host hdpdatanode33 18/01/16 00:10:39 INFO YarnAllocator: Launching ExecutorRunnable. driverUrl: spark://CoarseGrainedScheduler@10.4.117.3:47482, executorHostname: hdpdatanode33 18/01/16 00:10:39 INFO YarnAllocator: Launching container container_e82_1515444508016_276923_01_000004 for on host hdpdatanode59 18/01/16 00:10:39 INFO ExecutorRunnable: Starting Executor Container 18/01/16 00:10:39 INFO YarnAllocator: Launching ExecutorRunnable. driverUrl: spark://CoarseGrainedScheduler@10.4.117.3:47482, executorHostname: hdpdatanode59 18/01/16 00:10:39 INFO ContainerManagementProtocolProxy: yarn.client.max-cached-nodemanagers-proxies : 0 18/01/16 00:10:39 INFO ExecutorRunnable: Starting Executor Container 18/01/16 00:10:39 INFO YarnAllocator: Launching container container_e82_1515444508016_276923_01_000005 for on host hdpdatanode40 18/01/16 00:10:39 INFO ExecutorRunnable: Setting up ContainerLaunchContext 18/01/16 00:10:39 INFO ExecutorRunnable: Preparing Local resources 18/01/16 00:10:39 INFO ContainerManagementProtocolProxy: yarn.client.max-cached-nodemanagers-proxies : 0 18/01/16 00:10:39 INFO ExecutorRunnable: Setting up ContainerLaunchContext 18/01/16 00:10:39 INFO ExecutorRunnable: Preparing Local resources 18/01/16 00:10:39 INFO YarnAllocator: Launching ExecutorRunnable. driverUrl: spark://CoarseGrainedScheduler@10.4.117.3:47482, executorHostname: hdpdatanode40 18/01/16 00:10:39 INFO YarnAllocator: Received 3 containers from YARN, launching executors on 3 of them. 
18/01/16 00:10:39 INFO ExecutorRunnable: Starting Executor Container 18/01/16 00:10:39 INFO ContainerManagementProtocolProxy: yarn.client.max-cached-nodemanagers-proxies : 0 18/01/16 00:10:39 INFO ExecutorRunnable: Setting up ContainerLaunchContext 18/01/16 00:10:39 INFO ExecutorRunnable: Preparing Local resources 18/01/16 00:10:39 INFO ExecutorRunnable: Prepared Local resources Map(spark.jar -> resource { scheme: "hdfs" host: "kandula" port: -1 file: "/user/amantrach/.sparkStaging/application_1515444508016_276923/spark-assembly-1.6.1.2.4.2.0-258-hadoop2.7.1.2.4.2.0-258.jar" } size: 185971180 timestamp: 1516061423167 type: FILE visibility: PRIVATE, pyspark.zip -> resource { scheme: "hdfs" host: "kandula" port: -1 file: "/user/amantrach/.sparkStaging/application_1515444508016_276923/pyspark.zip" } size: 357163 timestamp: 1516061423332 type: FILE visibility: PRIVATE, py4j-0.9-src.zip -> resource { scheme: "hdfs" host: "kandula" port: -1 file: "/user/amantrach/.sparkStaging/application_1515444508016_276923/py4j-0.9-src.zip" } size: 44846 timestamp: 1516061423383 type: FILE visibility: PRIVATE, spark_conf -> resource { scheme: "hdfs" host: "kandula" port: -1 file: "/user/amantrach/.sparkStaging/application_1515444508016_276923/spark_conf3358051839168329638.zip" } size: 132754 timestamp: 1516061423590 type: ARCHIVE visibility: PRIVATE, pyfiles/mnist_dist.py -> resource { scheme: "hdfs" host: "kandula" port: -1 file: "/user/amantrach/.sparkStaging/application_1515444508016_276923/mnist_dist.py" } size: 6409 timestamp: 1516061423491 type: FILE visibility: PRIVATE, tfspark.zip -> resource { scheme: "hdfs" host: "kandula" port: -1 file: "/user/amantrach/.sparkStaging/application_1515444508016_276923/tfspark.zip" } size: 31147 timestamp: 1516061423441 type: FILE visibility: PRIVATE, Python -> resource { scheme: "hdfs" host: "kandula" port: -1 file: "/user/amantrach/Python.zip" } size: 189601770 timestamp: 1516058786174 type: ARCHIVE visibility: PUBLIC) 18/01/16 00:10:39 WARN Client: No spark assembly jar for HDP on HDFS, defaultSparkAssembly:hdfs://kandula/hdp/apps/2.4.2.0-258/spark/spark-hdp-assembly.jar 18/01/16 00:10:39 INFO ExecutorRunnable: Prepared Local resources Map(spark.jar -> resource { scheme: "hdfs" host: "kandula" port: -1 file: "/user/amantrach/.sparkStaging/application_1515444508016_276923/spark-assembly-1.6.1.2.4.2.0-258-hadoop2.7.1.2.4.2.0-258.jar" } size: 185971180 timestamp: 1516061423167 type: FILE visibility: PRIVATE, pyspark.zip -> resource { scheme: "hdfs" host: "kandula" port: -1 file: "/user/amantrach/.sparkStaging/application_1515444508016_276923/pyspark.zip" } size: 357163 timestamp: 1516061423332 type: FILE visibility: PRIVATE, py4j-0.9-src.zip -> resource { scheme: "hdfs" host: "kandula" port: -1 file: "/user/amantrach/.sparkStaging/application_1515444508016_276923/py4j-0.9-src.zip" } size: 44846 timestamp: 1516061423383 type: FILE visibility: PRIVATE, spark_conf -> resource { scheme: "hdfs" host: "kandula" port: -1 file: "/user/amantrach/.sparkStaging/application_1515444508016_276923/spark_conf3358051839168329638.zip" } size: 132754 timestamp: 1516061423590 type: ARCHIVE visibility: PRIVATE, pyfiles/mnist_dist.py -> resource { scheme: "hdfs" host: "kandula" port: -1 file: "/user/amantrach/.sparkStaging/application_1515444508016_276923/mnist_dist.py" } size: 6409 timestamp: 1516061423491 type: FILE visibility: PRIVATE, tfspark.zip -> resource { scheme: "hdfs" host: "kandula" port: -1 file: "/user/amantrach/.sparkStaging/application_1515444508016_276923/tfspark.zip" } 
size: 31147 timestamp: 1516061423441 type: FILE visibility: PRIVATE, Python -> resource { scheme: "hdfs" host: "kandula" port: -1 file: "/user/amantrach/Python.zip" } size: 189601770 timestamp: 1516058786174 type: ARCHIVE visibility: PUBLIC) 18/01/16 00:10:39 INFO ExecutorRunnable: Prepared Local resources Map(spark.jar -> resource { scheme: "hdfs" host: "kandula" port: -1 file: "/user/amantrach/.sparkStaging/application_1515444508016_276923/spark-assembly-1.6.1.2.4.2.0-258-hadoop2.7.1.2.4.2.0-258.jar" } size: 185971180 timestamp: 1516061423167 type: FILE visibility: PRIVATE, pyspark.zip -> resource { scheme: "hdfs" host: "kandula" port: -1 file: "/user/amantrach/.sparkStaging/application_1515444508016_276923/pyspark.zip" } size: 357163 timestamp: 1516061423332 type: FILE visibility: PRIVATE, py4j-0.9-src.zip -> resource { scheme: "hdfs" host: "kandula" port: -1 file: "/user/amantrach/.sparkStaging/application_1515444508016_276923/py4j-0.9-src.zip" } size: 44846 timestamp: 1516061423383 type: FILE visibility: PRIVATE, spark_conf -> resource { scheme: "hdfs" host: "kandula" port: -1 file: "/user/amantrach/.sparkStaging/application_1515444508016_276923/spark_conf3358051839168329638.zip" } size: 132754 timestamp: 1516061423590 type: ARCHIVE visibility: PRIVATE, pyfiles/mnist_dist.py -> resource { scheme: "hdfs" host: "kandula" port: -1 file: "/user/amantrach/.sparkStaging/application_1515444508016_276923/mnist_dist.py" } size: 6409 timestamp: 1516061423491 type: FILE visibility: PRIVATE, tfspark.zip -> resource { scheme: "hdfs" host: "kandula" port: -1 file: "/user/amantrach/.sparkStaging/application_1515444508016_276923/tfspark.zip" } size: 31147 timestamp: 1516061423441 type: FILE visibility: PRIVATE, Python -> resource { scheme: "hdfs" host: "kandula" port: -1 file: "/user/amantrach/Python.zip" } size: 189601770 timestamp: 1516058786174 type: ARCHIVE visibility: PUBLIC) 18/01/16 00:10:39 WARN Client: No spark assembly jar for HDP on HDFS, defaultSparkAssembly:hdfs://kandula/hdp/apps/2.4.2.0-258/spark/spark-hdp-assembly.jar 18/01/16 00:10:39 INFO ExecutorRunnable:

YARN executor launch context: env: SPARK_YARN_CACHE_ARCHIVES -> hdfs://kandula/user/amantrach/Python.zip#Python,hdfs://kandula/user/amantrach/.sparkStaging/application_1515444508016_276923/spark_conf3358051839168329638.zip#spark_conf CLASSPATH -> {{PWD}}{{PWD}}/spark_conf{{PWD}}/spark__.jar$HADOOP_CONF_DIR/usr/hdp/current/hadoop-client//usr/hdp/current/hadoop-client/lib//usr/hdp/current/hadoop-hdfs-client//usr/hdp/current/hadoop-hdfs-client/lib//usr/hdp/current/hadoop-yarn-client//usr/hdp/current/hadoop-yarn-client/lib//usr/hdp/current/oozie-client/lib/$PWD/mr-framework/hadoop/share/hadoop/mapreduce/:$PWD/mr-framework/hadoop/share/hadoop/mapreduce/lib/:$PWD/mr-framework/hadoop/share/hadoop/common/:$PWD/mr-framework/hadoop/share/hadoop/common/lib/:$PWD/mr-framework/hadoop/share/hadoop/yarn/:$PWD/mr-framework/hadoop/share/hadoop/yarn/lib/:$PWD/mr-framework/hadoop/share/hadoop/hdfs/:$PWD/mr-framework/hadoop/share/hadoop/hdfs/lib/:$PWD/mr-framework/hadoop/share/hadoop/tools/lib/:/usr/hdp/2.4.2.0-258/hadoop/lib/hadoop-lzo-0.6.0.2.4.2.0-258.jar:/etc/hadoop/conf/secure/usr/hdp/2.4.2.0-258/hadoop/conf:/usr/hdp/2.4.2.0-258/hadoop/lib/commons-beanutils-1.7.0.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/jettison-1.1.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/jackson-core-2.2.3.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/commons-compress-1.4.1.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/asm-3.2.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/avro-1.7.4.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/jersey-json-1.9.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/ranger-plugin-classloader-0.5.0.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/log4j-1.2.17.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/curator-framework-2.7.1.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/spark-yarn-shuffle.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/commons-io-2.4.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/commons-httpclient-3.1.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/htrace-core-3.1.0-incubating.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/jetty-util-6.1.26.hwx.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/mockito-all-1.8.5.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/snappy-java-1.0.4.1.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/ojdbc6.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/commons-cli-1.2.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/activation-1.1.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/jersey-server-1.9.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/xz-1.0.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/hadoop-lzo-0.6.0.2.4.2.0-258-sources.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/jaxb-impl-2.2.3-1.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/microsoft-windowsazure-storage-sdk-0.6.0.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/commons-configuration-1.6.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/azure-storage-2.2.0.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/servlet-api-2.5.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/commons-net-3.1.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/java-xmlbuilder-0.4.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/api-util-1.0.0-M20.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/curator-client-2.7.1.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/commons-math3-3.1.1.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/gson-2.2.4.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/ranger-hdfs-plugin-shim-0.5.0.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/jets3t-0.9.0.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/api-asn1-api-1.0.0-M20.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/jsr305-3.0.0.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/jaxb-api-2.2.2.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/jersey-core-1.9.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/commons-collections-3.2.2.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/junit-4.11.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/slf4j-api-1.7.10.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/netty-3.6.2.Final.j
ar:/usr/hdp/2.4.2.0-258/hadoop/lib/protobuf-java-2.5.0.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/apacheds-i18n-2.0.0-M15.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/httpcore-4.2.5.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/ranger-yarn-plugin-shim-0.5.0.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/zookeeper-3.4.6.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/paranamer-2.3.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/commons-logging-1.1.3.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/hadoop-lzo-0.6.0.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/hadoop-lzo-0.6.0.2.4.2.0-258-javadoc.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/jsp-api-2.1.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/commons-digester-1.8.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/commons-lang-2.6.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/xmlenc-0.52.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/stax-api-1.0-2.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/commons-beanutils-core-1.8.0.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/guava-11.0.2.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/jackson-mapper-asl-1.9.13.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/aws-java-sdk-1.7.4.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/jackson-databind-2.2.3.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/jsch-0.1.42.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/jackson-xc-1.9.13.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/httpclient-4.2.5.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/jackson-core-asl-1.9.13.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/commons-codec-1.4.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/jackson-jaxrs-1.9.13.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/jetty-6.1.26.hwx.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/hamcrest-core-1.3.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/slf4j-log4j12-1.7.10.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/curator-recipes-2.7.1.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/jackson-annotations-2.2.3.jar:/usr/hdp/2.4.2.0-258/hadoop/.//hadoop-common.jar:/usr/hdp/2.4.2.0-258/hadoop/.//hadoop-aws-2.7.1.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/hadoop/.//hadoop-auth.jar:/usr/hdp/2.4.2.0-258/hadoop/.//hadoop-auth-2.7.1.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/hadoop/.//hadoop-common-2.7.1.2.4.2.0-258-tests.jar:/usr/hdp/2.4.2.0-258/hadoop/.//hadoop-aws.jar:/usr/hdp/2.4.2.0-258/hadoop/.//hadoop-nfs.jar:/usr/hdp/2.4.2.0-258/hadoop/.//hadoop-nfs-2.7.1.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/hadoop/.//hadoop-common-2.7.1.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/hadoop/.//hadoop-azure-2.7.1.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/hadoop/.//hadoop-annotations-2.7.1.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/hadoop/.//hadoop-annotations.jar:/usr/hdp/2.4.2.0-258/hadoop/.//hadoop-common-tests.jar:/usr/hdp/2.4.2.0-258/hadoop/.//hadoop-azure.jar:/usr/hdp/2.4.2.0-258/hadoop-hdfs/./:/usr/hdp/2.4.2.0-258/hadoop-hdfs/lib/asm-3.2.jar:/usr/hdp/2.4.2.0-258/hadoop-hdfs/lib/log4j-1.2.17.jar:/usr/hdp/2.4.2.0-258/hadoop-hdfs/lib/commons-io-2.4.jar:/usr/hdp/2.4.2.0-258/hadoop-hdfs/lib/htrace-core-3.1.0-incubating.jar:/usr/hdp/2.4.2.0-258/hadoop-hdfs/lib/jetty-util-6.1.26.hwx.jar:/usr/hdp/2.4.2.0-258/hadoop-hdfs/lib/commons-daemon-1.0.13.jar:/usr/hdp/2.4.2.0-258/hadoop-hdfs/lib/commons-cli-1.2.jar:/usr/hdp/2.4.2.0-258/hadoop-hdfs/lib/jersey-server-1.9.jar:/usr/hdp/2.4.2.0-258/hadoop-hdfs/lib/servlet-api-2.5.jar:/usr/hdp/2.4.2.0-258/hadoop-hdfs/lib/xercesImpl-2.9.1.jar:/usr/hdp/2.4.2.0-258/hadoop-hdfs/lib/okio-1.4.0.jar:/usr/hdp/2.4.2.0-258/hadoop-hdfs/lib/jsr305-3.0.0.jar:/usr/hdp/2.4.2.0-258/hadoop-hdfs/lib/jersey-core-1.9.jar:/usr/hdp/2.4.2.0-258/hadoop-hdfs/lib/netty-3.6.2.Final.jar:/usr/hdp/2.4.2.0-258/hadoop-hdfs/lib/protobuf-java-2.5.0.jar:/usr/hdp/2.4.2.0-258/hadoop-hdfs/lib/commons-logg
ing-1.1.3.jar:/usr/hdp/2.4.2.0-258/hadoop-hdfs/lib/xml-apis-1.3.04.jar:/usr/hdp/2.4.2.0-258/hadoop-hdfs/lib/commons-lang-2.6.jar:/usr/hdp/2.4.2.0-258/hadoop-hdfs/lib/xmlenc-0.52.jar:/usr/hdp/2.4.2.0-258/hadoop-hdfs/lib/netty-all-4.0.23.Final.jar:/usr/hdp/2.4.2.0-258/hadoop-hdfs/lib/leveldbjni-all-1.8.jar:/usr/hdp/2.4.2.0-258/hadoop-hdfs/lib/guava-11.0.2.jar:/usr/hdp/2.4.2.0-258/hadoop-hdfs/lib/jackson-mapper-asl-1.9.13.jar:/usr/hdp/2.4.2.0-258/hadoop-hdfs/lib/okhttp-2.4.0.jar:/usr/hdp/2.4.2.0-258/hadoop-hdfs/lib/jackson-core-asl-1.9.13.jar:/usr/hdp/2.4.2.0-258/hadoop-hdfs/lib/commons-codec-1.4.jar:/usr/hdp/2.4.2.0-258/hadoop-hdfs/lib/jetty-6.1.26.hwx.jar:/usr/hdp/2.4.2.0-258/hadoop-hdfs/.//hadoop-hdfs-tests.jar:/usr/hdp/2.4.2.0-258/hadoop-hdfs/.//hadoop-hdfs.jar:/usr/hdp/2.4.2.0-258/hadoop-hdfs/.//hadoop-hdfs-2.7.1.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/hadoop-hdfs/.//hadoop-hdfs-nfs-2.7.1.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/hadoop-hdfs/.//hadoop-hdfs-nfs.jar:/usr/hdp/2.4.2.0-258/hadoop-hdfs/.//hadoop-hdfs-2.7.1.2.4.2.0-258-tests.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/commons-beanutils-1.7.0.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/aopalliance-1.0.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/jettison-1.1.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/jackson-core-2.2.3.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/commons-compress-1.4.1.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/asm-3.2.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/jersey-client-1.9.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/avro-1.7.4.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/jersey-json-1.9.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/log4j-1.2.17.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/curator-framework-2.7.1.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/commons-io-2.4.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/commons-httpclient-3.1.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/htrace-core-3.1.0-incubating.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/jetty-util-6.1.26.hwx.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/objenesis-2.1.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/snappy-java-1.0.4.1.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/commons-cli-1.2.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/activation-1.1.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/jersey-server-1.9.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/xz-1.0.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/jaxb-impl-2.2.3-1.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/microsoft-windowsazure-storage-sdk-0.6.0.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/jersey-guice-1.9.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/commons-configuration-1.6.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/servlet-api-2.5.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/commons-net-3.1.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/java-xmlbuilder-0.4.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/api-util-1.0.0-M20.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/fst-2.24.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/javassist-3.18.1-GA.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/curator-client-2.7.1.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/commons-math3-3.1.1.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/gson-2.2.4.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/jets3t-0.9.0.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/api-asn1-api-1.0.0-M20.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/jsr305-3.0.0.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/jaxb-api-2.2.2.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/jersey-core-1.9.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/commons-collections-3.2.2.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/guice-3.0.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/netty-3.6.2.Final.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn
/lib/javax.inject-1.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/protobuf-java-2.5.0.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/apacheds-i18n-2.0.0-M15.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/httpcore-4.2.5.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/zookeeper-3.4.6.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/paranamer-2.3.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/zookeeper-3.4.6.2.4.2.0-258-tests.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/commons-logging-1.1.3.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/jsp-api-2.1.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/commons-digester-1.8.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/commons-lang-2.6.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/xmlenc-0.52.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/stax-api-1.0-2.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/leveldbjni-all-1.8.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/commons-beanutils-core-1.8.0.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/guava-11.0.2.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/jackson-mapper-asl-1.9.13.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/jackson-databind-2.2.3.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/jsch-0.1.42.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/jackson-xc-1.9.13.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/httpclient-4.2.5.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/jackson-core-asl-1.9.13.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/commons-codec-1.4.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/jackson-jaxrs-1.9.13.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/jetty-6.1.26.hwx.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/guice-servlet-3.0.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/curator-recipes-2.7.1.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/jackson-annotations-2.2.3.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/.//hadoop-yarn-server-tests-2.7.1.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/.//hadoop-yarn-applications-distributedshell.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/.//hadoop-yarn-common.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/.//hadoop-yarn-server-applicationhistoryservice-2.7.1.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/.//hadoop-yarn-server-web-proxy.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/.//hadoop-yarn-server-timeline-plugins.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/.//hadoop-yarn-client-2.7.1.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/.//hadoop-yarn-server-common.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/.//hadoop-yarn-server-sharedcachemanager-2.7.1.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/.//hadoop-yarn-server-sharedcachemanager.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/.//hadoop-yarn-common-2.7.1.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/.//hadoop-yarn-server-nodemanager-2.7.1.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/.//hadoop-yarn-registry.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/.//hadoop-yarn-api-2.7.1.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/.//hadoop-yarn-server-applicationhistoryservice.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/.//hadoop-yarn-applications-unmanaged-am-launcher.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/.//hadoop-yarn-client.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/.//hadoop-yarn-registry-2.7.1.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/.//hadoop-yarn-applications-unmanaged-am-launcher-2.7.1.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/.//hadoop-yarn-applications-distributedshell-2.7.1.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/.//hadoop-yarn-server-nodemanager.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/.//hadoop-yarn-server-timeline-plugins-2.7.1.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/.//hadoop
-yarn-server-tests.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/.//hadoop-yarn-server-resourcemanager.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/.//hadoop-yarn-server-web-proxy-2.7.1.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/.//hadoop-yarn-server-common-2.7.1.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/.//hadoop-yarn-api.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/.//hadoop-yarn-server-resourcemanager-2.7.1.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/lib/aopalliance-1.0.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/lib/commons-compress-1.4.1.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/lib/asm-3.2.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/lib/avro-1.7.4.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/lib/log4j-1.2.17.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/lib/commons-io-2.4.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/lib/snappy-java-1.0.4.1.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/lib/jersey-server-1.9.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/lib/xz-1.0.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/lib/jersey-guice-1.9.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/lib/jersey-core-1.9.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/lib/guice-3.0.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/lib/junit-4.11.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/lib/netty-3.6.2.Final.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/lib/javax.inject-1.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/lib/protobuf-java-2.5.0.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/lib/paranamer-2.3.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/lib/leveldbjni-all-1.8.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/lib/jackson-mapper-asl-1.9.13.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/lib/jackson-core-asl-1.9.13.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/lib/hamcrest-core-1.3.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/lib/guice-servlet-3.0.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//hadoop-openstack-2.7.1.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//commons-beanutils-1.7.0.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//hadoop-sls.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//hadoop-distcp-2.7.1.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//jettison-1.1.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//jackson-core-2.2.3.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//commons-compress-1.4.1.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//asm-3.2.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//avro-1.7.4.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//hadoop-mapreduce-client-app.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//jersey-json-1.9.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//hadoop-extras-2.7.1.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//hadoop-mapreduce-client-jobclient-tests.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//log4j-1.2.17.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//hadoop-ant.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//curator-framework-2.7.1.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//commons-io-2.4.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//commons-httpclient-3.1.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//hadoop-mapreduce-client-shuffle-2.7.1.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//hadoop-streaming-2.7.1.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//htrace-core-3.1.0-incubating.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//jetty-util-6.1.26.hwx.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//hadoop-openstack.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//hadoop-mapreduce-client-hs.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//hadoop-extras.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//hadoop-distcp.jar:/usr/hdp/2.4
.2.0-258/hadoop-mapreduce/.//mockito-all-1.8.5.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//snappy-java-1.0.4.1.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//commons-cli-1.2.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//activation-1.1.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//hadoop-mapreduce-client-hs-2.7.1.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//jersey-server-1.9.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//xz-1.0.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//hadoop-auth.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//jaxb-impl-2.2.3-1.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//microsoft-windowsazure-storage-sdk-0.6.0.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//commons-configuration-1.6.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//hadoop-auth-2.7.1.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//hadoop-mapreduce-client-jobclient.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//hadoop-datajoin-2.7.1.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//servlet-api-2.5.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//hadoop-mapreduce-client-jobclient-2.7.1.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//commons-net-3.1.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//hadoop-archives-2.7.1.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//java-xmlbuilder-0.4.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//hadoop-rumen.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//api-util-1.0.0-M20.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//hadoop-mapreduce-client-app-2.7.1.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//curator-client-2.7.1.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//commons-math3-3.1.1.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//hadoop-mapreduce-examples-2.7.1.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//gson-2.2.4.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//jets3t-0.9.0.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//hadoop-ant-2.7.1.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//hadoop-mapreduce-client-core.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//hadoop-mapreduce-client-core-2.7.1.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//hadoop-mapreduce-client-hs-plugins-2.7.1.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//api-asn1-api-1.0.0-M20.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//jsr305-3.0.0.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//jaxb-api-2.2.2.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//jersey-core-1.9.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//commons-collections-3.2.2.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//junit-4.11.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//netty-3.6.2.Final.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//protobuf-java-2.5.0.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//hadoop-archives.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//hadoop-datajoin.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//apacheds-i18n-2.0.0-M15.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//httpcore-4.2.5.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//zookeeper-3.4.6.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//apacheds-kerberos-codec-2.0.0-M15.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//paranamer-2.3.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//hadoop-streaming.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//hadoop-gridmix-2.7.1.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//commons-logging-1.1.3.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//hadoop-mapreduce-examples.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//hadoop-rumen-2.7.1.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce
/.//metrics-core-3.0.1.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//jsp-api-2.1.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//commons-digester-1.8.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//commons-lang-2.6.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//hadoop-mapreduce-client-shuffle.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//xmlenc-0.52.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//hadoop-mapreduce-client-hs-plugins.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//stax-api-1.0-2.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//commons-beanutils-core-1.8.0.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//guava-11.0.2.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//jackson-mapper-asl-1.9.13.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//jsch-0.1.42.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//jackson-xc-1.9.13.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//hadoop-mapreduce-client-jobclient-2.7.1.2.4.2.0-258-tests.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//httpclient-4.2.5.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//jackson-core-asl-1.9.13.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//commons-codec-1.4.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//hadoop-mapreduce-client-common.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//commons-lang3-3.3.2.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//jackson-jaxrs-1.9.13.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//hadoop-sls-2.7.1.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//jetty-6.1.26.hwx.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//joda-time-2.9.3.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//hamcrest-core-1.3.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//hadoop-gridmix.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//hadoop-mapreduce-client-common-2.7.1.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//curator-recipes-2.7.1.jar:.:/etc/hadoop/conf:/etc/oozie/conf:/usr/hdp/current/oozie-client/lib/.jar:/usr/hdp/current/hadoop-client/lib/.jar:jdbc-mysql.jar:mysql-connector-java-5.1.37-bin.jar:mysql-connector-java.jar:/usr/hdp/current/oozie-client/lib/commons-beanutils-1.7.0.jar:/usr/hdp/current/oozie-client/lib/activemq-client-5.8.0.jar:/usr/hdp/current/oozie-client/lib/jettison-1.1.jar:/usr/hdp/current/oozie-client/lib/jackson-core-2.2.3.jar:/usr/hdp/current/oozie-client/lib/commons-compress-1.4.1.jar:/usr/hdp/current/oozie-client/lib/avro-1.7.4.jar:/usr/hdp/current/oozie-client/lib/jersey-json-1.9.jar:/usr/hdp/current/oozie-client/lib/curator-framework-2.7.1.jar:/usr/hdp/current/oozie-client/lib/commons-io-2.4.jar:/usr/hdp/current/oozie-client/lib/commons-httpclient-3.1.jar:/usr/hdp/current/oozie-client/lib/htrace-core-3.1.0-incubating.jar:/usr/hdp/current/oozie-client/lib/jetty-util-6.1.26.hwx.jar:/usr/hdp/current/oozie-client/lib/hadoop-aws-2.7.1.2.4.2.0-258.jar:/usr/hdp/current/oozie-client/lib/snappy-java-1.0.4.1.jar:/usr/hdp/current/oozie-client/lib/commons-cli-1.2.jar:/usr/hdp/current/oozie-client/lib/activation-1.1.jar:/usr/hdp/current/oozie-client/lib/jersey-server-1.9.jar:/usr/hdp/current/oozie-client/lib/xz-1.0.jar:/usr/hdp/current/oozie-client/lib/jaxb-impl-2.2.3-1.jar:/usr/hdp/current/oozie-client/lib/microsoft-windowsazure-storage-sdk-0.6.0.jar:/usr/hdp/current/oozie-client/lib/jackson-jaxrs-1.8.3.jar:/usr/hdp/current/oozie-client/lib/joda-time-2.1.jar:/usr/hdp/current/oozie-client/lib/log4j-1.2.16.jar:/usr/hdp/current/oozie-client/lib/commons-configuration-1.6.jar:/usr/hdp/current/oozie-client/lib/hadoop-auth-2.7.1.2.4.2.0-258.jar:/usr/hdp/current/oozie-client/lib/azure-storage-2.2.0.jar:/usr/hdp/current/oozie-client/lib/servlet-api-2.5.jar:/
usr/hdp/current/oozie-client/lib/commons-net-3.1.jar:/usr/hdp/current/oozie-client/lib/java-xmlbuilder-0.4.jar:/usr/hdp/current/oozie-client/lib/api-util-1.0.0-M20.jar:/usr/hdp/current/oozie-client/lib/curator-client-2.7.1.jar:/usr/hdp/current/oozie-client/lib/commons-math3-3.1.1.jar:/usr/hdp/current/oozie-client/lib/json-simple-1.1.jar:/usr/hdp/current/oozie-client/lib/gson-2.2.4.jar:/usr/hdp/current/oozie-client/lib/jets3t-0.9.0.jar:/usr/hdp/current/oozie-client/lib/geronimo-j2ee-management_1.1_spec-1.0.1.jar:/usr/hdp/current/oozie-client/lib/api-asn1-api-1.0.0-M20.jar:/usr/hdp/current/oozie-client/lib/jaxb-api-2.2.2.jar:/usr/hdp/current/oozie-client/lib/jersey-core-1.9.jar:/usr/hdp/current/oozie-client/lib/curator-recipes-2.5.0.jar:/usr/hdp/current/oozie-client/lib/commons-collections-3.2.2.jar:/usr/hdp/current/oozie-client/lib/hadoop-common-2.7.1.2.4.2.0-258.jar:/usr/hdp/current/oozie-client/lib/xercesImpl-2.10.0.jar:/usr/hdp/current/oozie-client/lib/hadoop-azure-2.7.1.2.4.2.0-258.jar:/usr/hdp/current/oozie-client/lib/protobuf-java-2.5.0.jar:/usr/hdp/current/oozie-client/lib/hadoop-annotations-2.7.1.2.4.2.0-258.jar:/usr/hdp/current/oozie-client/lib/hawtbuf-1.9.jar:/usr/hdp/current/oozie-client/lib/apacheds-i18n-2.0.0-M15.jar:/usr/hdp/current/oozie-client/lib/zookeeper-3.4.6.2.4.2.0-258.jar:/usr/hdp/current/oozie-client/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/hdp/current/oozie-client/lib/paranamer-2.3.jar:/usr/hdp/current/oozie-client/lib/jsr305-1.3.9.jar:/usr/hdp/current/oozie-client/lib/jetty-6.1.14.jar:/usr/hdp/current/oozie-client/lib/oozie-client-4.2.0.2.4.2.0-258.jar:/usr/hdp/current/oozie-client/lib/commons-logging-1.1.jar:/usr/hdp/current/oozie-client/lib/httpclient-4.3.jar:/usr/hdp/current/oozie-client/lib/commons-digester-1.8.jar:/usr/hdp/current/oozie-client/lib/xmlenc-0.52.jar:/usr/hdp/current/oozie-client/lib/httpcore-4.3.jar:/usr/hdp/current/oozie-client/lib/jline-2.12.jar:/usr/hdp/current/oozie-client/lib/jdk.tools-1.7.jar:/usr/hdp/current/oozie-client/lib/stax-api-1.0-2.jar:/usr/hdp/current/oozie-client/lib/commons-beanutils-core-1.8.0.jar:/usr/hdp/current/oozie-client/lib/guava-11.0.2.jar:/usr/hdp/current/oozie-client/lib/jackson-mapper-asl-1.9.13.jar:/usr/hdp/current/oozie-client/lib/servlet-api-2.5-6.1.14.jar:/usr/hdp/current/oozie-client/lib/aws-java-sdk-1.7.4.jar:/usr/hdp/current/oozie-client/lib/geronimo-jms_1.1_spec-1.1.1.jar:/usr/hdp/current/oozie-client/lib/jackson-databind-2.2.3.jar:/usr/hdp/current/oozie-client/lib/jsch-0.1.42.jar:/usr/hdp/current/oozie-client/lib/jackson-xc-1.8.3.jar:/usr/hdp/current/oozie-client/lib/jackson-core-asl-1.9.13.jar:/usr/hdp/current/oozie-client/lib/commons-codec-1.4.jar:/usr/hdp/current/oozie-client/lib/commons-lang3-3.3.2.jar:/usr/hdp/current/oozie-client/lib/slf4j-api-1.6.6.jar:/usr/hdp/current/oozie-client/lib/asm-3.1.jar:/usr/hdp/current/oozie-client/lib/commons-lang-2.4.jar:/usr/hdp/current/oozie-client/lib/xml-apis-1.4.01.jar:/usr/hdp/current/oozie-client/lib/jackson-annotations-2.2.3.jar:/usr/hdp/current/oozie-client/lib/netty-3.7.0.Final.jar:/usr/hdp/current/oozie-client/lib/oozie-hadoop-auth-hadoop-2-4.2.0.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/tez/tez-yarn-timeline-history-with-fs-0.7.0.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/tez/tez-yarn-timeline-cache-plugin-0.7.0.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/tez/tez-runtime-library-0.7.0.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/tez/tez-dag-0.7.0.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/tez/tez-yarn-timeline-history-with-acls-0.7.0.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/tez/tez-api-0.7
.0.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/tez/tez-examples-0.7.0.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/tez/tez-mapreduce-0.7.0.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/tez/tez-history-parser-0.7.0.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/tez/tez-runtime-internals-0.7.0.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/tez/tez-tests-0.7.0.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/tez/tez-yarn-timeline-history-0.7.0.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/tez/tez-common-0.7.0.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/tez/lib/commons-collections4-4.1.jar:/usr/hdp/2.4.2.0-258/tez/lib/jersey-client-1.9.jar:/usr/hdp/2.4.2.0-258/tez/lib/jersey-json-1.9.jar:/usr/hdp/2.4.2.0-258/tez/lib/commons-io-2.4.jar:/usr/hdp/2.4.2.0-258/tez/lib/jetty-util-6.1.26.hwx.jar:/usr/hdp/2.4.2.0-258/tez/lib/jettison-1.3.4.jar:/usr/hdp/2.4.2.0-258/tez/lib/hadoop-aws-2.7.1.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/tez/lib/jsr305-2.0.3.jar:/usr/hdp/2.4.2.0-258/tez/lib/slf4j-api-1.7.5.jar:/usr/hdp/2.4.2.0-258/tez/lib/commons-cli-1.2.jar:/usr/hdp/2.4.2.0-258/tez/lib/servlet-api-2.5.jar:/usr/hdp/2.4.2.0-258/tez/lib/commons-math3-3.1.1.jar:/usr/hdp/2.4.2.0-258/tez/lib/hadoop-mapreduce-client-core-2.7.1.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/tez/lib/commons-collections-3.2.2.jar:/usr/hdp/2.4.2.0-258/tez/lib/hadoop-azure-2.7.1.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/tez/lib/protobuf-java-2.5.0.jar:/usr/hdp/2.4.2.0-258/tez/lib/hadoop-annotations-2.7.1.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/tez/lib/commons-lang-2.6.jar:/usr/hdp/2.4.2.0-258/tez/lib/hadoop-yarn-server-timeline-plugins-2.7.1.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/tez/lib/guava-11.0.2.jar:/usr/hdp/2.4.2.0-258/tez/lib/hadoop-yarn-server-web-proxy-2.7.1.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/tez/lib/commons-codec-1.4.jar:/usr/hdp/2.4.2.0-258/tez/lib/jetty-6.1.26.hwx.jar:/usr/hdp/2.4.2.0-258/tez/lib/hadoop-mapreduce-client-common-2.7.1.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/tez/conf:.:/etc/hadoop/conf:/etc/oozie/conf:/usr/hdp/current/oozie-client/lib/activation-1.1.jar:/usr/hdp/current/oozie-client/lib/activemq-client-5.8.0.jar:/usr/hdp/current/oozie-client/lib/apacheds-i18n-2.0.0-M15.jar:/usr/hdp/current/oozie-client/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/hdp/current/oozie-client/lib/api-asn1-api-1.0.0-M20.jar:/usr/hdp/current/oozie-client/lib/api-util-1.0.0-M20.jar:/usr/hdp/current/oozie-client/lib/asm-3.1.jar:/usr/hdp/current/oozie-client/lib/avro-1.7.4.jar:/usr/hdp/current/oozie-client/lib/aws-java-sdk-1.7.4.jar:/usr/hdp/current/oozie-client/lib/azure-storage-2.2.0.jar:/usr/hdp/current/oozie-client/lib/commons-beanutils-1.7.0.jar:/usr/hdp/current/oozie-client/lib/commons-beanutils-core-1.8.0.jar:/usr/hdp/current/oozie-client/lib/commons-cli-1.2.jar:/usr/hdp/current/oozie-client/lib/commons-codec-1.4.jar:/usr/hdp/current/oozie-client/lib/commons-collections-3.2.2.jar:/usr/hdp/current/oozie-client/lib/commons-compress-1.4.1.jar:/usr/hdp/current/oozie-client/lib/comm SPARK_YARN_USER_ENV -> PYSPARK_PYTHON=/usr/bin/python SPARK_LOG_URL_STDERR -> http://hdpdatanode59:8042/node/containerlogs/container_e82_1515444508016_276923_01_000004/amantrach/stderr?start=-4096 SPARK_YARN_STAGING_DIR -> .sparkStaging/application_1515444508016_276923 SPARK_YARN_CACHE_FILES_FILE_SIZES -> 185971180,357163,44846,31147,6409 SPARK_YARN_CACHE_FILES_VISIBILITIES -> PRIVATE,PRIVATE,PRIVATE,PRIVATE,PRIVATE SPARK_YARN_CACHE_ARCHIVES_FILE_SIZES -> 189601770,132754 SPARK_USER -> amantrach HADOOP_HDFS_HOME -> ”/usr/hdp/current/hadoop-client/” HADOOP_PREFIX -> /usr/hdp/current/hadoop-client SPARK_YARN_CACHE_ARCHIVES_TIME_STAMPS -> 
1516058786174,1516061423590 SPARK_YARN_MODE -> true SPARK_YARN_CACHE_FILES_TIME_STAMPS -> 1516061423167,1516061423332,1516061423383,1516061423441,1516061423491 JAVA_HOME -> /usr/lib/jvm/java-7-openjdk-amd64 PYTHONPATH -> {{PWD}}/pyfiles{{PWD}}/pyspark.zip{{PWD}}/py4j-0.9-src.zip{{PWD}}/tfspark.zip LD_LIBRARY_PATH -> /usr/local/cuda/lib64:/usr/lib/jvm/java-7-openjdk-amd64/jre/lib/amd64/server:/usr/hdp/current/hadoop-client//lib/native:/usr/hdp/2.4.2.0-258/usr/lib SPARK_LOG_URL_STDOUT -> http://hdpdatanode59:8042/node/containerlogs/container_e82_1515444508016_276923_01_000004/amantrach/stdout?start=-4096 PYSPARK_PYTHON -> /usr/bin/python SPARK_YARN_CACHE_ARCHIVES_VISIBILITIES -> PUBLIC,PRIVATE SPARK_YARN_CACHE_FILES -> hdfs://kandula/user/amantrach/.sparkStaging/application_1515444508016_276923/spark-assembly-1.6.1.2.4.2.0-258-hadoop2.7.1.2.4.2.0-258.jar#spark.jar,hdfs://kandula/user/amantrach/.sparkStaging/application_1515444508016_276923/pyspark.zip#pyspark.zip,hdfs://kandula/user/amantrach/.sparkStaging/application_1515444508016_276923/py4j-0.9-src.zip#py4j-0.9-src.zip,hdfs://kandula/user/amantrach/.sparkStaging/application_1515444508016_276923/tfspark.zip#tfspark.zip,hdfs://kandula/user/amantrach/.sparkStaging/application_1515444508016_276923/mnist_dist.py#pyfiles__/mnist_dist.py

command: {{JAVA_HOME}}/bin/java -server -XX:OnOutOfMemoryError='kill %p' -Xms2048m -Xmx2048m -Djava.io.tmpdir={{PWD}}/tmp '-Dspark.history.ui.port=18080' '-Dspark.driver.port=47482' '-Dspark.ui.port=0' -Dspark.yarn.app.container.log.dir= -XX:MaxPermSize=256m org.apache.spark.executor.CoarseGrainedExecutorBackend --driver-url spark://CoarseGrainedScheduler@10.4.117.3:47482 --executor-id 3 --hostname hdpdatanode59 --cores 1 --app-id application_1515444508016_276923 --user-class-path file:$PWD/app.jar 1> /stdout 2> /stderr

18/01/16 00:10:39 INFO ContainerManagementProtocolProxy: Opening proxy : hdpdatanode59:45454
18/01/16 00:10:39 WARN Client: No spark assembly jar for HDP on HDFS, defaultSparkAssembly:hdfs://kandula/hdp/apps/2.4.2.0-258/spark/spark-hdp-assembly.jar
18/01/16 00:10:39 INFO ExecutorRunnable:

YARN executor launch context for the second executor (container_e82_1515444508016_276923_01_000003 on hdpdatanode33): identical to the launch context above except for the container ID and the SPARK_LOG_URL_STDOUT / SPARK_LOG_URL_STDERR hostnames (hdpdatanode33 instead of hdpdatanode59); the CLASSPATH, cache files, and all other environment variables are the same.

command: {{JAVA_HOME}}/bin/java -server -XX:OnOutOfMemoryError='kill %p' -Xms2048m -Xmx2048m -Djava.io.tmpdir={{PWD}}/tmp '-Dspark.history.ui.port=18080' '-Dspark.driver.port=47482' '-Dspark.ui.port=0' -Dspark.yarn.app.container.log.dir=&lt;LOG_DIR&gt; -XX:MaxPermSize=256m org.apache.spark.executor.CoarseGrainedExecutorBackend --driver-url spark://CoarseGrainedScheduler@10.4.117.3:47482 --executor-id 2 --hostname hdpdatanode33 --cores 1 --app-id application_1515444508016_276923 --user-class-path file:$PWD/\_\_app\_\_.jar 1> &lt;LOG_DIR&gt;/stdout 2> &lt;LOG_DIR&gt;/stderr

18/01/16 00:10:39 INFO ContainerManagementProtocolProxy: Opening proxy : hdpdatanode33:45454 18/01/16 00:10:39 INFO ExecutorRunnable:

YARN executor launch context: env: SPARK_YARN_CACHE_ARCHIVES -> hdfs://kandula/user/amantrach/Python.zip#Python,hdfs://kandula/user/amantrach/.sparkStaging/application_1515444508016_276923/spark_conf3358051839168329638.zip#spark_conf CLASSPATH -> {{PWD}}{{PWD}}/spark_conf{{PWD}}/spark__.jar$HADOOP_CONF_DIR/usr/hdp/current/hadoop-client//usr/hdp/current/hadoop-client/lib//usr/hdp/current/hadoop-hdfs-client//usr/hdp/current/hadoop-hdfs-client/lib//usr/hdp/current/hadoop-yarn-client//usr/hdp/current/hadoop-yarn-client/lib//usr/hdp/current/oozie-client/lib/$PWD/mr-framework/hadoop/share/hadoop/mapreduce/:$PWD/mr-framework/hadoop/share/hadoop/mapreduce/lib/:$PWD/mr-framework/hadoop/share/hadoop/common/:$PWD/mr-framework/hadoop/share/hadoop/common/lib/:$PWD/mr-framework/hadoop/share/hadoop/yarn/:$PWD/mr-framework/hadoop/share/hadoop/yarn/lib/:$PWD/mr-framework/hadoop/share/hadoop/hdfs/:$PWD/mr-framework/hadoop/share/hadoop/hdfs/lib/:$PWD/mr-framework/hadoop/share/hadoop/tools/lib/:/usr/hdp/2.4.2.0-258/hadoop/lib/hadoop-lzo-0.6.0.2.4.2.0-258.jar:/etc/hadoop/conf/secure/usr/hdp/2.4.2.0-258/hadoop/conf:/usr/hdp/2.4.2.0-258/hadoop/lib/commons-beanutils-1.7.0.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/jettison-1.1.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/jackson-core-2.2.3.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/commons-compress-1.4.1.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/asm-3.2.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/avro-1.7.4.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/jersey-json-1.9.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/ranger-plugin-classloader-0.5.0.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/log4j-1.2.17.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/curator-framework-2.7.1.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/spark-yarn-shuffle.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/commons-io-2.4.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/commons-httpclient-3.1.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/htrace-core-3.1.0-incubating.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/jetty-util-6.1.26.hwx.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/mockito-all-1.8.5.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/snappy-java-1.0.4.1.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/ojdbc6.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/commons-cli-1.2.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/activation-1.1.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/jersey-server-1.9.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/xz-1.0.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/hadoop-lzo-0.6.0.2.4.2.0-258-sources.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/jaxb-impl-2.2.3-1.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/microsoft-windowsazure-storage-sdk-0.6.0.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/commons-configuration-1.6.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/azure-storage-2.2.0.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/servlet-api-2.5.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/commons-net-3.1.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/java-xmlbuilder-0.4.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/api-util-1.0.0-M20.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/curator-client-2.7.1.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/commons-math3-3.1.1.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/gson-2.2.4.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/ranger-hdfs-plugin-shim-0.5.0.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/jets3t-0.9.0.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/api-asn1-api-1.0.0-M20.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/jsr305-3.0.0.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/jaxb-api-2.2.2.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/jersey-core-1.9.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/commons-collections-3.2.2.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/junit-4.11.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/slf4j-api-1.7.10.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/netty-3.6.2.Final.j
ar:/usr/hdp/2.4.2.0-258/hadoop/lib/protobuf-java-2.5.0.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/apacheds-i18n-2.0.0-M15.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/httpcore-4.2.5.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/ranger-yarn-plugin-shim-0.5.0.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/zookeeper-3.4.6.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/paranamer-2.3.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/commons-logging-1.1.3.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/hadoop-lzo-0.6.0.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/hadoop-lzo-0.6.0.2.4.2.0-258-javadoc.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/jsp-api-2.1.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/commons-digester-1.8.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/commons-lang-2.6.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/xmlenc-0.52.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/stax-api-1.0-2.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/commons-beanutils-core-1.8.0.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/guava-11.0.2.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/jackson-mapper-asl-1.9.13.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/aws-java-sdk-1.7.4.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/jackson-databind-2.2.3.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/jsch-0.1.42.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/jackson-xc-1.9.13.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/httpclient-4.2.5.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/jackson-core-asl-1.9.13.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/commons-codec-1.4.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/jackson-jaxrs-1.9.13.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/jetty-6.1.26.hwx.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/hamcrest-core-1.3.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/slf4j-log4j12-1.7.10.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/curator-recipes-2.7.1.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/jackson-annotations-2.2.3.jar:/usr/hdp/2.4.2.0-258/hadoop/.//hadoop-common.jar:/usr/hdp/2.4.2.0-258/hadoop/.//hadoop-aws-2.7.1.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/hadoop/.//hadoop-auth.jar:/usr/hdp/2.4.2.0-258/hadoop/.//hadoop-auth-2.7.1.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/hadoop/.//hadoop-common-2.7.1.2.4.2.0-258-tests.jar:/usr/hdp/2.4.2.0-258/hadoop/.//hadoop-aws.jar:/usr/hdp/2.4.2.0-258/hadoop/.//hadoop-nfs.jar:/usr/hdp/2.4.2.0-258/hadoop/.//hadoop-nfs-2.7.1.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/hadoop/.//hadoop-common-2.7.1.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/hadoop/.//hadoop-azure-2.7.1.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/hadoop/.//hadoop-annotations-2.7.1.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/hadoop/.//hadoop-annotations.jar:/usr/hdp/2.4.2.0-258/hadoop/.//hadoop-common-tests.jar:/usr/hdp/2.4.2.0-258/hadoop/.//hadoop-azure.jar:/usr/hdp/2.4.2.0-258/hadoop-hdfs/./:/usr/hdp/2.4.2.0-258/hadoop-hdfs/lib/asm-3.2.jar:/usr/hdp/2.4.2.0-258/hadoop-hdfs/lib/log4j-1.2.17.jar:/usr/hdp/2.4.2.0-258/hadoop-hdfs/lib/commons-io-2.4.jar:/usr/hdp/2.4.2.0-258/hadoop-hdfs/lib/htrace-core-3.1.0-incubating.jar:/usr/hdp/2.4.2.0-258/hadoop-hdfs/lib/jetty-util-6.1.26.hwx.jar:/usr/hdp/2.4.2.0-258/hadoop-hdfs/lib/commons-daemon-1.0.13.jar:/usr/hdp/2.4.2.0-258/hadoop-hdfs/lib/commons-cli-1.2.jar:/usr/hdp/2.4.2.0-258/hadoop-hdfs/lib/jersey-server-1.9.jar:/usr/hdp/2.4.2.0-258/hadoop-hdfs/lib/servlet-api-2.5.jar:/usr/hdp/2.4.2.0-258/hadoop-hdfs/lib/xercesImpl-2.9.1.jar:/usr/hdp/2.4.2.0-258/hadoop-hdfs/lib/okio-1.4.0.jar:/usr/hdp/2.4.2.0-258/hadoop-hdfs/lib/jsr305-3.0.0.jar:/usr/hdp/2.4.2.0-258/hadoop-hdfs/lib/jersey-core-1.9.jar:/usr/hdp/2.4.2.0-258/hadoop-hdfs/lib/netty-3.6.2.Final.jar:/usr/hdp/2.4.2.0-258/hadoop-hdfs/lib/protobuf-java-2.5.0.jar:/usr/hdp/2.4.2.0-258/hadoop-hdfs/lib/commons-logg
ing-1.1.3.jar:/usr/hdp/2.4.2.0-258/hadoop-hdfs/lib/xml-apis-1.3.04.jar:/usr/hdp/2.4.2.0-258/hadoop-hdfs/lib/commons-lang-2.6.jar:/usr/hdp/2.4.2.0-258/hadoop-hdfs/lib/xmlenc-0.52.jar:/usr/hdp/2.4.2.0-258/hadoop-hdfs/lib/netty-all-4.0.23.Final.jar:/usr/hdp/2.4.2.0-258/hadoop-hdfs/lib/leveldbjni-all-1.8.jar:/usr/hdp/2.4.2.0-258/hadoop-hdfs/lib/guava-11.0.2.jar:/usr/hdp/2.4.2.0-258/hadoop-hdfs/lib/jackson-mapper-asl-1.9.13.jar:/usr/hdp/2.4.2.0-258/hadoop-hdfs/lib/okhttp-2.4.0.jar:/usr/hdp/2.4.2.0-258/hadoop-hdfs/lib/jackson-core-asl-1.9.13.jar:/usr/hdp/2.4.2.0-258/hadoop-hdfs/lib/commons-codec-1.4.jar:/usr/hdp/2.4.2.0-258/hadoop-hdfs/lib/jetty-6.1.26.hwx.jar:/usr/hdp/2.4.2.0-258/hadoop-hdfs/.//hadoop-hdfs-tests.jar:/usr/hdp/2.4.2.0-258/hadoop-hdfs/.//hadoop-hdfs.jar:/usr/hdp/2.4.2.0-258/hadoop-hdfs/.//hadoop-hdfs-2.7.1.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/hadoop-hdfs/.//hadoop-hdfs-nfs-2.7.1.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/hadoop-hdfs/.//hadoop-hdfs-nfs.jar:/usr/hdp/2.4.2.0-258/hadoop-hdfs/.//hadoop-hdfs-2.7.1.2.4.2.0-258-tests.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/commons-beanutils-1.7.0.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/aopalliance-1.0.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/jettison-1.1.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/jackson-core-2.2.3.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/commons-compress-1.4.1.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/asm-3.2.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/jersey-client-1.9.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/avro-1.7.4.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/jersey-json-1.9.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/log4j-1.2.17.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/curator-framework-2.7.1.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/commons-io-2.4.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/commons-httpclient-3.1.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/htrace-core-3.1.0-incubating.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/jetty-util-6.1.26.hwx.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/objenesis-2.1.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/snappy-java-1.0.4.1.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/commons-cli-1.2.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/activation-1.1.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/jersey-server-1.9.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/xz-1.0.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/jaxb-impl-2.2.3-1.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/microsoft-windowsazure-storage-sdk-0.6.0.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/jersey-guice-1.9.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/commons-configuration-1.6.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/servlet-api-2.5.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/commons-net-3.1.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/java-xmlbuilder-0.4.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/api-util-1.0.0-M20.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/fst-2.24.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/javassist-3.18.1-GA.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/curator-client-2.7.1.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/commons-math3-3.1.1.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/gson-2.2.4.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/jets3t-0.9.0.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/api-asn1-api-1.0.0-M20.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/jsr305-3.0.0.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/jaxb-api-2.2.2.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/jersey-core-1.9.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/commons-collections-3.2.2.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/guice-3.0.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/netty-3.6.2.Final.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn
/lib/javax.inject-1.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/protobuf-java-2.5.0.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/apacheds-i18n-2.0.0-M15.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/httpcore-4.2.5.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/zookeeper-3.4.6.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/paranamer-2.3.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/zookeeper-3.4.6.2.4.2.0-258-tests.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/commons-logging-1.1.3.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/jsp-api-2.1.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/commons-digester-1.8.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/commons-lang-2.6.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/xmlenc-0.52.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/stax-api-1.0-2.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/leveldbjni-all-1.8.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/commons-beanutils-core-1.8.0.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/guava-11.0.2.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/jackson-mapper-asl-1.9.13.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/jackson-databind-2.2.3.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/jsch-0.1.42.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/jackson-xc-1.9.13.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/httpclient-4.2.5.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/jackson-core-asl-1.9.13.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/commons-codec-1.4.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/jackson-jaxrs-1.9.13.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/jetty-6.1.26.hwx.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/guice-servlet-3.0.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/curator-recipes-2.7.1.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/jackson-annotations-2.2.3.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/.//hadoop-yarn-server-tests-2.7.1.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/.//hadoop-yarn-applications-distributedshell.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/.//hadoop-yarn-common.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/.//hadoop-yarn-server-applicationhistoryservice-2.7.1.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/.//hadoop-yarn-server-web-proxy.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/.//hadoop-yarn-server-timeline-plugins.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/.//hadoop-yarn-client-2.7.1.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/.//hadoop-yarn-server-common.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/.//hadoop-yarn-server-sharedcachemanager-2.7.1.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/.//hadoop-yarn-server-sharedcachemanager.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/.//hadoop-yarn-common-2.7.1.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/.//hadoop-yarn-server-nodemanager-2.7.1.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/.//hadoop-yarn-registry.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/.//hadoop-yarn-api-2.7.1.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/.//hadoop-yarn-server-applicationhistoryservice.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/.//hadoop-yarn-applications-unmanaged-am-launcher.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/.//hadoop-yarn-client.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/.//hadoop-yarn-registry-2.7.1.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/.//hadoop-yarn-applications-unmanaged-am-launcher-2.7.1.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/.//hadoop-yarn-applications-distributedshell-2.7.1.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/.//hadoop-yarn-server-nodemanager.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/.//hadoop-yarn-server-timeline-plugins-2.7.1.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/.//hadoop
-yarn-server-tests.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/.//hadoop-yarn-server-resourcemanager.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/.//hadoop-yarn-server-web-proxy-2.7.1.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/.//hadoop-yarn-server-common-2.7.1.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/.//hadoop-yarn-api.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/.//hadoop-yarn-server-resourcemanager-2.7.1.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/lib/aopalliance-1.0.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/lib/commons-compress-1.4.1.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/lib/asm-3.2.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/lib/avro-1.7.4.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/lib/log4j-1.2.17.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/lib/commons-io-2.4.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/lib/snappy-java-1.0.4.1.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/lib/jersey-server-1.9.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/lib/xz-1.0.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/lib/jersey-guice-1.9.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/lib/jersey-core-1.9.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/lib/guice-3.0.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/lib/junit-4.11.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/lib/netty-3.6.2.Final.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/lib/javax.inject-1.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/lib/protobuf-java-2.5.0.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/lib/paranamer-2.3.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/lib/leveldbjni-all-1.8.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/lib/jackson-mapper-asl-1.9.13.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/lib/jackson-core-asl-1.9.13.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/lib/hamcrest-core-1.3.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/lib/guice-servlet-3.0.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//hadoop-openstack-2.7.1.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//commons-beanutils-1.7.0.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//hadoop-sls.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//hadoop-distcp-2.7.1.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//jettison-1.1.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//jackson-core-2.2.3.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//commons-compress-1.4.1.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//asm-3.2.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//avro-1.7.4.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//hadoop-mapreduce-client-app.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//jersey-json-1.9.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//hadoop-extras-2.7.1.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//hadoop-mapreduce-client-jobclient-tests.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//log4j-1.2.17.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//hadoop-ant.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//curator-framework-2.7.1.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//commons-io-2.4.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//commons-httpclient-3.1.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//hadoop-mapreduce-client-shuffle-2.7.1.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//hadoop-streaming-2.7.1.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//htrace-core-3.1.0-incubating.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//jetty-util-6.1.26.hwx.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//hadoop-openstack.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//hadoop-mapreduce-client-hs.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//hadoop-extras.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//hadoop-distcp.jar:/usr/hdp/2.4
.2.0-258/hadoop-mapreduce/.//mockito-all-1.8.5.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//snappy-java-1.0.4.1.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//commons-cli-1.2.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//activation-1.1.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//hadoop-mapreduce-client-hs-2.7.1.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//jersey-server-1.9.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//xz-1.0.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//hadoop-auth.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//jaxb-impl-2.2.3-1.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//microsoft-windowsazure-storage-sdk-0.6.0.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//commons-configuration-1.6.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//hadoop-auth-2.7.1.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//hadoop-mapreduce-client-jobclient.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//hadoop-datajoin-2.7.1.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//servlet-api-2.5.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//hadoop-mapreduce-client-jobclient-2.7.1.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//commons-net-3.1.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//hadoop-archives-2.7.1.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//java-xmlbuilder-0.4.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//hadoop-rumen.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//api-util-1.0.0-M20.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//hadoop-mapreduce-client-app-2.7.1.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//curator-client-2.7.1.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//commons-math3-3.1.1.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//hadoop-mapreduce-examples-2.7.1.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//gson-2.2.4.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//jets3t-0.9.0.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//hadoop-ant-2.7.1.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//hadoop-mapreduce-client-core.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//hadoop-mapreduce-client-core-2.7.1.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//hadoop-mapreduce-client-hs-plugins-2.7.1.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//api-asn1-api-1.0.0-M20.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//jsr305-3.0.0.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//jaxb-api-2.2.2.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//jersey-core-1.9.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//commons-collections-3.2.2.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//junit-4.11.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//netty-3.6.2.Final.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//protobuf-java-2.5.0.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//hadoop-archives.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//hadoop-datajoin.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//apacheds-i18n-2.0.0-M15.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//httpcore-4.2.5.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//zookeeper-3.4.6.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//apacheds-kerberos-codec-2.0.0-M15.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//paranamer-2.3.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//hadoop-streaming.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//hadoop-gridmix-2.7.1.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//commons-logging-1.1.3.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//hadoop-mapreduce-examples.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//hadoop-rumen-2.7.1.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce
/.//metrics-core-3.0.1.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//jsp-api-2.1.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//commons-digester-1.8.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//commons-lang-2.6.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//hadoop-mapreduce-client-shuffle.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//xmlenc-0.52.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//hadoop-mapreduce-client-hs-plugins.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//stax-api-1.0-2.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//commons-beanutils-core-1.8.0.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//guava-11.0.2.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//jackson-mapper-asl-1.9.13.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//jsch-0.1.42.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//jackson-xc-1.9.13.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//hadoop-mapreduce-client-jobclient-2.7.1.2.4.2.0-258-tests.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//httpclient-4.2.5.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//jackson-core-asl-1.9.13.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//commons-codec-1.4.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//hadoop-mapreduce-client-common.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//commons-lang3-3.3.2.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//jackson-jaxrs-1.9.13.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//hadoop-sls-2.7.1.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//jetty-6.1.26.hwx.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//joda-time-2.9.3.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//hamcrest-core-1.3.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//hadoop-gridmix.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//hadoop-mapreduce-client-common-2.7.1.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//curator-recipes-2.7.1.jar:.:/etc/hadoop/conf:/etc/oozie/conf:/usr/hdp/current/oozie-client/lib/.jar:/usr/hdp/current/hadoop-client/lib/.jar:jdbc-mysql.jar:mysql-connector-java-5.1.37-bin.jar:mysql-connector-java.jar:/usr/hdp/current/oozie-client/lib/commons-beanutils-1.7.0.jar:/usr/hdp/current/oozie-client/lib/activemq-client-5.8.0.jar:/usr/hdp/current/oozie-client/lib/jettison-1.1.jar:/usr/hdp/current/oozie-client/lib/jackson-core-2.2.3.jar:/usr/hdp/current/oozie-client/lib/commons-compress-1.4.1.jar:/usr/hdp/current/oozie-client/lib/avro-1.7.4.jar:/usr/hdp/current/oozie-client/lib/jersey-json-1.9.jar:/usr/hdp/current/oozie-client/lib/curator-framework-2.7.1.jar:/usr/hdp/current/oozie-client/lib/commons-io-2.4.jar:/usr/hdp/current/oozie-client/lib/commons-httpclient-3.1.jar:/usr/hdp/current/oozie-client/lib/htrace-core-3.1.0-incubating.jar:/usr/hdp/current/oozie-client/lib/jetty-util-6.1.26.hwx.jar:/usr/hdp/current/oozie-client/lib/hadoop-aws-2.7.1.2.4.2.0-258.jar:/usr/hdp/current/oozie-client/lib/snappy-java-1.0.4.1.jar:/usr/hdp/current/oozie-client/lib/commons-cli-1.2.jar:/usr/hdp/current/oozie-client/lib/activation-1.1.jar:/usr/hdp/current/oozie-client/lib/jersey-server-1.9.jar:/usr/hdp/current/oozie-client/lib/xz-1.0.jar:/usr/hdp/current/oozie-client/lib/jaxb-impl-2.2.3-1.jar:/usr/hdp/current/oozie-client/lib/microsoft-windowsazure-storage-sdk-0.6.0.jar:/usr/hdp/current/oozie-client/lib/jackson-jaxrs-1.8.3.jar:/usr/hdp/current/oozie-client/lib/joda-time-2.1.jar:/usr/hdp/current/oozie-client/lib/log4j-1.2.16.jar:/usr/hdp/current/oozie-client/lib/commons-configuration-1.6.jar:/usr/hdp/current/oozie-client/lib/hadoop-auth-2.7.1.2.4.2.0-258.jar:/usr/hdp/current/oozie-client/lib/azure-storage-2.2.0.jar:/usr/hdp/current/oozie-client/lib/servlet-api-2.5.jar:/
usr/hdp/current/oozie-client/lib/commons-net-3.1.jar:/usr/hdp/current/oozie-client/lib/java-xmlbuilder-0.4.jar:/usr/hdp/current/oozie-client/lib/api-util-1.0.0-M20.jar:/usr/hdp/current/oozie-client/lib/curator-client-2.7.1.jar:/usr/hdp/current/oozie-client/lib/commons-math3-3.1.1.jar:/usr/hdp/current/oozie-client/lib/json-simple-1.1.jar:/usr/hdp/current/oozie-client/lib/gson-2.2.4.jar:/usr/hdp/current/oozie-client/lib/jets3t-0.9.0.jar:/usr/hdp/current/oozie-client/lib/geronimo-j2ee-management_1.1_spec-1.0.1.jar:/usr/hdp/current/oozie-client/lib/api-asn1-api-1.0.0-M20.jar:/usr/hdp/current/oozie-client/lib/jaxb-api-2.2.2.jar:/usr/hdp/current/oozie-client/lib/jersey-core-1.9.jar:/usr/hdp/current/oozie-client/lib/curator-recipes-2.5.0.jar:/usr/hdp/current/oozie-client/lib/commons-collections-3.2.2.jar:/usr/hdp/current/oozie-client/lib/hadoop-common-2.7.1.2.4.2.0-258.jar:/usr/hdp/current/oozie-client/lib/xercesImpl-2.10.0.jar:/usr/hdp/current/oozie-client/lib/hadoop-azure-2.7.1.2.4.2.0-258.jar:/usr/hdp/current/oozie-client/lib/protobuf-java-2.5.0.jar:/usr/hdp/current/oozie-client/lib/hadoop-annotations-2.7.1.2.4.2.0-258.jar:/usr/hdp/current/oozie-client/lib/hawtbuf-1.9.jar:/usr/hdp/current/oozie-client/lib/apacheds-i18n-2.0.0-M15.jar:/usr/hdp/current/oozie-client/lib/zookeeper-3.4.6.2.4.2.0-258.jar:/usr/hdp/current/oozie-client/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/hdp/current/oozie-client/lib/paranamer-2.3.jar:/usr/hdp/current/oozie-client/lib/jsr305-1.3.9.jar:/usr/hdp/current/oozie-client/lib/jetty-6.1.14.jar:/usr/hdp/current/oozie-client/lib/oozie-client-4.2.0.2.4.2.0-258.jar:/usr/hdp/current/oozie-client/lib/commons-logging-1.1.jar:/usr/hdp/current/oozie-client/lib/httpclient-4.3.jar:/usr/hdp/current/oozie-client/lib/commons-digester-1.8.jar:/usr/hdp/current/oozie-client/lib/xmlenc-0.52.jar:/usr/hdp/current/oozie-client/lib/httpcore-4.3.jar:/usr/hdp/current/oozie-client/lib/jline-2.12.jar:/usr/hdp/current/oozie-client/lib/jdk.tools-1.7.jar:/usr/hdp/current/oozie-client/lib/stax-api-1.0-2.jar:/usr/hdp/current/oozie-client/lib/commons-beanutils-core-1.8.0.jar:/usr/hdp/current/oozie-client/lib/guava-11.0.2.jar:/usr/hdp/current/oozie-client/lib/jackson-mapper-asl-1.9.13.jar:/usr/hdp/current/oozie-client/lib/servlet-api-2.5-6.1.14.jar:/usr/hdp/current/oozie-client/lib/aws-java-sdk-1.7.4.jar:/usr/hdp/current/oozie-client/lib/geronimo-jms_1.1_spec-1.1.1.jar:/usr/hdp/current/oozie-client/lib/jackson-databind-2.2.3.jar:/usr/hdp/current/oozie-client/lib/jsch-0.1.42.jar:/usr/hdp/current/oozie-client/lib/jackson-xc-1.8.3.jar:/usr/hdp/current/oozie-client/lib/jackson-core-asl-1.9.13.jar:/usr/hdp/current/oozie-client/lib/commons-codec-1.4.jar:/usr/hdp/current/oozie-client/lib/commons-lang3-3.3.2.jar:/usr/hdp/current/oozie-client/lib/slf4j-api-1.6.6.jar:/usr/hdp/current/oozie-client/lib/asm-3.1.jar:/usr/hdp/current/oozie-client/lib/commons-lang-2.4.jar:/usr/hdp/current/oozie-client/lib/xml-apis-1.4.01.jar:/usr/hdp/current/oozie-client/lib/jackson-annotations-2.2.3.jar:/usr/hdp/current/oozie-client/lib/netty-3.7.0.Final.jar:/usr/hdp/current/oozie-client/lib/oozie-hadoop-auth-hadoop-2-4.2.0.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/tez/tez-yarn-timeline-history-with-fs-0.7.0.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/tez/tez-yarn-timeline-cache-plugin-0.7.0.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/tez/tez-runtime-library-0.7.0.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/tez/tez-dag-0.7.0.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/tez/tez-yarn-timeline-history-with-acls-0.7.0.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/tez/tez-api-0.7
.0.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/tez/tez-examples-0.7.0.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/tez/tez-mapreduce-0.7.0.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/tez/tez-history-parser-0.7.0.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/tez/tez-runtime-internals-0.7.0.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/tez/tez-tests-0.7.0.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/tez/tez-yarn-timeline-history-0.7.0.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/tez/tez-common-0.7.0.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/tez/lib/commons-collections4-4.1.jar:/usr/hdp/2.4.2.0-258/tez/lib/jersey-client-1.9.jar:/usr/hdp/2.4.2.0-258/tez/lib/jersey-json-1.9.jar:/usr/hdp/2.4.2.0-258/tez/lib/commons-io-2.4.jar:/usr/hdp/2.4.2.0-258/tez/lib/jetty-util-6.1.26.hwx.jar:/usr/hdp/2.4.2.0-258/tez/lib/jettison-1.3.4.jar:/usr/hdp/2.4.2.0-258/tez/lib/hadoop-aws-2.7.1.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/tez/lib/jsr305-2.0.3.jar:/usr/hdp/2.4.2.0-258/tez/lib/slf4j-api-1.7.5.jar:/usr/hdp/2.4.2.0-258/tez/lib/commons-cli-1.2.jar:/usr/hdp/2.4.2.0-258/tez/lib/servlet-api-2.5.jar:/usr/hdp/2.4.2.0-258/tez/lib/commons-math3-3.1.1.jar:/usr/hdp/2.4.2.0-258/tez/lib/hadoop-mapreduce-client-core-2.7.1.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/tez/lib/commons-collections-3.2.2.jar:/usr/hdp/2.4.2.0-258/tez/lib/hadoop-azure-2.7.1.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/tez/lib/protobuf-java-2.5.0.jar:/usr/hdp/2.4.2.0-258/tez/lib/hadoop-annotations-2.7.1.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/tez/lib/commons-lang-2.6.jar:/usr/hdp/2.4.2.0-258/tez/lib/hadoop-yarn-server-timeline-plugins-2.7.1.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/tez/lib/guava-11.0.2.jar:/usr/hdp/2.4.2.0-258/tez/lib/hadoop-yarn-server-web-proxy-2.7.1.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/tez/lib/commons-codec-1.4.jar:/usr/hdp/2.4.2.0-258/tez/lib/jetty-6.1.26.hwx.jar:/usr/hdp/2.4.2.0-258/tez/lib/hadoop-mapreduce-client-common-2.7.1.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/tez/conf:.:/etc/hadoop/conf:/etc/oozie/conf:/usr/hdp/current/oozie-client/lib/activation-1.1.jar:/usr/hdp/current/oozie-client/lib/activemq-client-5.8.0.jar:/usr/hdp/current/oozie-client/lib/apacheds-i18n-2.0.0-M15.jar:/usr/hdp/current/oozie-client/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/hdp/current/oozie-client/lib/api-asn1-api-1.0.0-M20.jar:/usr/hdp/current/oozie-client/lib/api-util-1.0.0-M20.jar:/usr/hdp/current/oozie-client/lib/asm-3.1.jar:/usr/hdp/current/oozie-client/lib/avro-1.7.4.jar:/usr/hdp/current/oozie-client/lib/aws-java-sdk-1.7.4.jar:/usr/hdp/current/oozie-client/lib/azure-storage-2.2.0.jar:/usr/hdp/current/oozie-client/lib/commons-beanutils-1.7.0.jar:/usr/hdp/current/oozie-client/lib/commons-beanutils-core-1.8.0.jar:/usr/hdp/current/oozie-client/lib/commons-cli-1.2.jar:/usr/hdp/current/oozie-client/lib/commons-codec-1.4.jar:/usr/hdp/current/oozie-client/lib/commons-collections-3.2.2.jar:/usr/hdp/current/oozie-client/lib/commons-compress-1.4.1.jar:/usr/hdp/current/oozie-client/lib/comm SPARK_YARN_USER_ENV -> PYSPARK_PYTHON=/usr/bin/python SPARK_LOG_URL_STDERR -> http://hdpdatanode40:8042/node/containerlogs/container_e82_1515444508016_276923_01_000005/amantrach/stderr?start=-4096 SPARK_YARN_STAGING_DIR -> .sparkStaging/application_1515444508016_276923 SPARK_YARN_CACHE_FILES_FILE_SIZES -> 185971180,357163,44846,31147,6409 SPARK_YARN_CACHE_FILES_VISIBILITIES -> PRIVATE,PRIVATE,PRIVATE,PRIVATE,PRIVATE SPARK_YARN_CACHE_ARCHIVES_FILE_SIZES -> 189601770,132754 SPARK_USER -> amantrach HADOOP_HDFS_HOME -> ”/usr/hdp/current/hadoop-client/” HADOOP_PREFIX -> /usr/hdp/current/hadoop-client SPARK_YARN_CACHE_ARCHIVES_TIME_STAMPS -> 
1516058786174,1516061423590 SPARK_YARN_MODE -> true SPARK_YARN_CACHE_FILES_TIME_STAMPS -> 1516061423167,1516061423332,1516061423383,1516061423441,1516061423491 JAVA_HOME -> /usr/lib/jvm/java-7-openjdk-amd64 PYTHONPATH -> {{PWD}}/pyfiles{{PWD}}/pyspark.zip{{PWD}}/py4j-0.9-src.zip{{PWD}}/tfspark.zip LD_LIBRARY_PATH -> /usr/local/cuda/lib64:/usr/lib/jvm/java-7-openjdk-amd64/jre/lib/amd64/server:/usr/hdp/current/hadoop-client//lib/native:/usr/hdp/2.4.2.0-258/usr/lib SPARK_LOG_URL_STDOUT -> http://hdpdatanode40:8042/node/containerlogs/container_e82_1515444508016_276923_01_000005/amantrach/stdout?start=-4096 PYSPARK_PYTHON -> /usr/bin/python SPARK_YARN_CACHE_ARCHIVES_VISIBILITIES -> PUBLIC,PRIVATE SPARK_YARN_CACHE_FILES -> hdfs://kandula/user/amantrach/.sparkStaging/application_1515444508016_276923/spark-assembly-1.6.1.2.4.2.0-258-hadoop2.7.1.2.4.2.0-258.jar#spark.jar,hdfs://kandula/user/amantrach/.sparkStaging/application_1515444508016_276923/pyspark.zip#pyspark.zip,hdfs://kandula/user/amantrach/.sparkStaging/application_1515444508016_276923/py4j-0.9-src.zip#py4j-0.9-src.zip,hdfs://kandula/user/amantrach/.sparkStaging/application_1515444508016_276923/tfspark.zip#tfspark.zip,hdfs://kandula/user/amantrach/.sparkStaging/application_1515444508016_276923/mnist_dist.py#pyfiles__/mnist_dist.py

command: {{JAVA_HOME}}/bin/java -server -XX:OnOutOfMemoryError='kill %p' -Xms2048m -Xmx2048m -Djava.io.tmpdir={{PWD}}/tmp '-Dspark.history.ui.port=18080' '-Dspark.driver.port=47482' '-Dspark.ui.port=0' -Dspark.yarn.app.container.log.dir=&lt;LOG_DIR&gt; -XX:MaxPermSize=256m org.apache.spark.executor.CoarseGrainedExecutorBackend --driver-url spark://CoarseGrainedScheduler@10.4.117.3:47482 --executor-id 4 --hostname hdpdatanode40 --cores 1 --app-id application_1515444508016_276923 --user-class-path file:$PWD/\_\_app\_\_.jar 1> &lt;LOG_DIR&gt;/stdout 2> &lt;LOG_DIR&gt;/stderr

18/01/16 00:10:39 INFO ContainerManagementProtocolProxy: Opening proxy : hdpdatanode40:45454 18/01/16 00:10:44 INFO AMRMClientImpl: Received new token for : hdpdatanode67:45454 18/01/16 00:10:44 INFO AMRMClientImpl: Received new token for : hdpdatanode79:45454 18/01/16 00:10:44 INFO AMRMClientImpl: Received new token for : hdpdatanode15:45454 18/01/16 00:10:44 INFO YarnAllocator: Received 3 containers from YARN, launching executors on 0 of them. 18/01/16 00:10:52 INFO YarnClusterSchedulerBackend: Registered executor NettyRpcEndpointRef(null) (hdpdatanode33:54414) with ID 2 18/01/16 00:10:52 INFO BlockManagerMasterEndpoint: Registering block manager hdpdatanode33:52249 with 1247.6 MB RAM, BlockManagerId(2, hdpdatanode33, 52249) 18/01/16 00:10:53 INFO YarnClusterSchedulerBackend: Registered executor NettyRpcEndpointRef(null) (hdpdatanode40:58875) with ID 4 18/01/16 00:10:53 INFO BlockManagerMasterEndpoint: Registering block manager hdpdatanode40:47214 with 1247.6 MB RAM, BlockManagerId(4, hdpdatanode40, 47214) 18/01/16 00:10:53 INFO YarnClusterSchedulerBackend: Registered executor NettyRpcEndpointRef(null) (hdpdatanode59:46650) with ID 3 18/01/16 00:10:53 INFO BlockManagerMasterEndpoint: Registering block manager hdpdatanode59:58653 with 1247.6 MB RAM, BlockManagerId(3, hdpdatanode59, 58653) 18/01/16 00:10:53 INFO YarnClusterSchedulerBackend: Registered executor NettyRpcEndpointRef(null) (hdpdatanode28:56003) with ID 1 18/01/16 00:10:53 INFO YarnClusterSchedulerBackend: SchedulerBackend is ready for scheduling beginning after reached minRegisteredResourcesRatio: 0.8 18/01/16 00:10:53 INFO YarnClusterScheduler: YarnClusterScheduler.postStartHook done 18/01/16 00:10:53 INFO BlockManagerMasterEndpoint: Registering block manager hdpdatanode28:35635 with 1247.6 MB RAM, BlockManagerId(1, hdpdatanode28, 35635) 18/01/16 00:10:54 INFO MemoryStore: Block broadcast_0 stored as values in memory (estimated size 235.5 KB, free 235.5 KB) 18/01/16 00:10:54 INFO MemoryStore: Block broadcast_0_piece0 stored as bytes in memory (estimated size 28.0 KB, free 263.5 KB) 18/01/16 00:10:54 INFO BlockManagerInfo: Added broadcast_0_piece0 in memory on 10.4.117.3:48485 (size: 28.0 KB, free: 457.8 MB) 18/01/16 00:10:54 INFO SparkContext: Created broadcast 0 from textFile at NativeMethodAccessorImpl.java:-2 18/01/16 00:10:54 INFO MemoryStore: Block broadcast_1 stored as values in memory (estimated size 235.5 KB, free 499.0 KB) 18/01/16 00:10:54 INFO MemoryStore: Block broadcast_1_piece0 stored as bytes in memory (estimated size 28.0 KB, free 527.0 KB) 18/01/16 00:10:54 INFO BlockManagerInfo: Added broadcast_1_piece0 in memory on 10.4.117.3:48485 (size: 28.0 KB, free: 457.8 MB) 18/01/16 00:10:54 INFO SparkContext: Created broadcast 1 from textFile at NativeMethodAccessorImpl.java:-2 18/01/16 00:10:54 INFO FileInputFormat: Total input paths to process : 10 18/01/16 00:10:54 INFO FileInputFormat: Total input paths to process : 10 18/01/16 00:10:54 INFO SparkContext: Starting job: foreachPartition at /data1/hadoop/yarn/local/usercache/amantrach/appcache/application_1515444508016_276923/container_e82_1515444508016_276923_01_000001/tfspark.zip/tensorflowonspark/TFCluster.py:247 18/01/16 00:10:54 INFO DAGScheduler: Got job 0 (foreachPartition at /data1/hadoop/yarn/local/usercache/amantrach/appcache/application_1515444508016_276923/container_e82_1515444508016_276923_01_000001/tfspark.zip/tensorflowonspark/TFCluster.py:247) with 4 output partitions 18/01/16 00:10:54 INFO DAGScheduler: Final stage: ResultStage 0 (foreachPartition 
at /data1/hadoop/yarn/local/usercache/amantrach/appcache/application_1515444508016_276923/container_e82_1515444508016_276923_01_000001/tfspark.zip/tensorflowonspark/TFCluster.py:247) 18/01/16 00:10:54 INFO DAGScheduler: Parents of final stage: List() 18/01/16 00:10:54 INFO DAGScheduler: Missing parents: List() 18/01/16 00:10:54 INFO DAGScheduler: Submitting ResultStage 0 (PythonRDD[8] at foreachPartition at /data1/hadoop/yarn/local/usercache/amantrach/appcache/application_1515444508016_276923/container_e82_1515444508016_276923_01_000001/tfspark.zip/tensorflowonspark/TFCluster.py:247), which has no missing parents 18/01/16 00:10:54 INFO MemoryStore: Block broadcast_2 stored as values in memory (estimated size 41.9 KB, free 568.9 KB) 18/01/16 00:10:54 INFO MemoryStore: Block broadcast_2_piece0 stored as bytes in memory (estimated size 13.5 KB, free 582.3 KB) 18/01/16 00:10:54 INFO BlockManagerInfo: Added broadcast_2_piece0 in memory on 10.4.117.3:48485 (size: 13.5 KB, free: 457.8 MB) 18/01/16 00:10:54 INFO SparkContext: Created broadcast 2 from broadcast at DAGScheduler.scala:1006 18/01/16 00:10:54 INFO DAGScheduler: Submitting 4 missing tasks from ResultStage 0 (PythonRDD[8] at foreachPartition at /data1/hadoop/yarn/local/usercache/amantrach/appcache/application_1515444508016_276923/container_e82_1515444508016_276923_01_000001/tfspark.zip/tensorflowonspark/TFCluster.py:247) 18/01/16 00:10:54 INFO YarnClusterScheduler: Adding task set 0.0 with 4 tasks 18/01/16 00:10:54 INFO TaskSetManager: Starting task 0.0 in stage 0.0 (TID 0, hdpdatanode59, partition 0,PROCESS_LOCAL, 2083 bytes) 18/01/16 00:10:54 INFO TaskSetManager: Starting task 1.0 in stage 0.0 (TID 1, hdpdatanode33, partition 1,PROCESS_LOCAL, 2083 bytes) 18/01/16 00:10:54 INFO TaskSetManager: Starting task 2.0 in stage 0.0 (TID 2, hdpdatanode28, partition 2,PROCESS_LOCAL, 2083 bytes) 18/01/16 00:10:54 INFO TaskSetManager: Starting task 3.0 in stage 0.0 (TID 3, hdpdatanode40, partition 3,PROCESS_LOCAL, 2083 bytes) 18/01/16 00:10:55 INFO BlockManagerInfo: Added broadcast_2_piece0 in memory on hdpdatanode28:35635 (size: 13.5 KB, free: 1247.6 MB) 18/01/16 00:10:55 INFO BlockManagerInfo: Added broadcast_2_piece0 in memory on hdpdatanode59:58653 (size: 13.5 KB, free: 1247.6 MB) 18/01/16 00:10:55 INFO BlockManagerInfo: Added broadcast_2_piece0 in memory on hdpdatanode33:52249 (size: 13.5 KB, free: 1247.6 MB) 18/01/16 00:10:55 INFO BlockManagerInfo: Added broadcast_2_piece0 in memory on hdpdatanode40:47214 (size: 13.5 KB, free: 1247.6 MB) 18/01/16 00:10:56 INFO SparkContext: Starting job: collect at PythonRDD.scala:405 18/01/16 00:10:56 INFO DAGScheduler: Got job 1 (collect at PythonRDD.scala:405) with 10 output partitions 18/01/16 00:10:56 INFO DAGScheduler: Final stage: ResultStage 1 (collect at PythonRDD.scala:405) 18/01/16 00:10:56 INFO DAGScheduler: Parents of final stage: List() 18/01/16 00:10:56 INFO DAGScheduler: Missing parents: List() 18/01/16 00:10:56 INFO DAGScheduler: Submitting ResultStage 1 (PythonRDD[10] at RDD at PythonRDD.scala:43), which has no missing parents 18/01/16 00:10:56 INFO MemoryStore: Block broadcast_3 stored as values in memory (estimated size 103.7 KB, free 686.0 KB) 18/01/16 00:10:56 INFO MemoryStore: Block broadcast_3_piece0 stored as bytes in memory (estimated size 24.6 KB, free 710.6 KB) 18/01/16 00:10:56 INFO BlockManagerInfo: Added broadcast_3_piece0 in memory on 10.4.117.3:48485 (size: 24.6 KB, free: 457.8 MB) 18/01/16 00:10:56 INFO SparkContext: Created broadcast 3 from broadcast at 
DAGScheduler.scala:1006 18/01/16 00:10:56 INFO DAGScheduler: Submitting 10 missing tasks from ResultStage 1 (PythonRDD[10] at RDD at PythonRDD.scala:43) 18/01/16 00:10:56 INFO YarnClusterScheduler: Adding task set 1.0 with 10 tasks 18/01/16 00:10:58 INFO TaskSetManager: Starting task 1.0 in stage 1.0 (TID 4, hdpdatanode40, partition 1,NODE_LOCAL, 2638 bytes) 18/01/16 00:10:58 INFO TaskSetManager: Finished task 3.0 in stage 0.0 (TID 3) in 3240 ms on hdpdatanode40 (1/4) 18/01/16 00:10:58 INFO BlockManagerInfo: Added broadcast_3_piece0 in memory on hdpdatanode40:47214 (size: 24.6 KB, free: 1247.6 MB) 18/01/16 00:10:58 INFO TaskSetManager: Finished task 2.0 in stage 0.0 (TID 2) in 3323 ms on hdpdatanode28 (2/4) 18/01/16 00:10:58 INFO BlockManagerInfo: Added broadcast_0_piece0 in memory on hdpdatanode40:47214 (size: 28.0 KB, free: 1247.6 MB) 18/01/16 00:10:58 INFO TaskSetManager: Finished task 1.0 in stage 0.0 (TID 1) in 3344 ms on hdpdatanode33 (3/4) 18/01/16 00:10:59 INFO BlockManagerInfo: Added broadcast_1_piece0 in memory on hdpdatanode40:47214 (size: 28.0 KB, free: 1247.5 MB) 18/01/16 00:11:01 INFO TaskSetManager: Starting task 2.0 in stage 1.0 (TID 5, hdpdatanode33, partition 2,RACK_LOCAL, 2638 bytes) 18/01/16 00:11:01 INFO TaskSetManager: Starting task 0.0 in stage 1.0 (TID 6, hdpdatanode28, partition 0,RACK_LOCAL, 2638 bytes) 18/01/16 00:11:01 INFO BlockManagerInfo: Added broadcast_3_piece0 in memory on hdpdatanode28:35635 (size: 24.6 KB, free: 1247.6 MB) 18/01/16 00:11:01 INFO BlockManagerInfo: Added broadcast_3_piece0 in memory on hdpdatanode33:52249 (size: 24.6 KB, free: 1247.6 MB) 18/01/16 00:11:01 INFO BlockManagerInfo: Added broadcast_0_piece0 in memory on hdpdatanode33:52249 (size: 28.0 KB, free: 1247.6 MB) 18/01/16 00:11:01 INFO BlockManagerInfo: Added broadcast_0_piece0 in memory on hdpdatanode28:35635 (size: 28.0 KB, free: 1247.6 MB) 18/01/16 00:11:02 INFO BlockManagerInfo: Added broadcast_1_piece0 in memory on hdpdatanode28:35635 (size: 28.0 KB, free: 1247.5 MB) 18/01/16 00:11:02 INFO BlockManagerInfo: Added broadcast_1_piece0 in memory on hdpdatanode33:52249 (size: 28.0 KB, free: 1247.5 MB)`
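
For context on the repeated "Waiting for model to be ready" messages in the stderr below: in the between-graph pattern used by the MNIST example, only worker:0 (the chief) runs the variable-init op, while every other worker just polls the ps task until `hid_w`, `hid_b`, `sm_w`, `sm_b`, etc. report as initialized. The following is a minimal sketch of that polling behaviour (TensorFlow 1.x API; the cluster addresses and model path are taken from the log below, but the variable shapes and training loop are illustrative, not the exact `mnist_dist.py` code):

```python
# Minimal sketch (TensorFlow 1.x) of why a non-chief worker logs
# "Waiting for model to be ready": the variable shapes here are
# illustrative assumptions, not the exact mnist_dist.py code.
import tensorflow as tf

cluster = tf.train.ClusterSpec({
    "ps": ["10.4.111.25:55327"],
    "worker": ["10.4.116.175:36831", "10.4.97.115:53601", "10.4.100.8:38280"],
})
task_index = 1                                    # this container is worker:1
server = tf.train.Server(cluster, job_name="worker", task_index=task_index)

with tf.device(tf.train.replica_device_setter(
        worker_device="/job:worker/task:%d" % task_index, cluster=cluster)):
    # Variables are placed on the ps task; the chief must initialize them there.
    hid_w = tf.Variable(tf.truncated_normal([784, 128]), name="hid_w")
    global_step = tf.Variable(0, name="global_step", trainable=False)

sv = tf.train.Supervisor(is_chief=(task_index == 0),  # only worker:0 is chief
                         logdir="hdfs://kandula/user/amantrach/mnist_model",
                         global_step=global_step)

# A non-chief worker blocks here, re-polling the ps roughly every 30 seconds
# until the chief has run the init op -- exactly the "Variables not
# initialized: hid_w, hid_b, ..." loop seen in the logs.
with sv.prepare_or_wait_for_session(server.target) as sess:
    print(sess.run(global_step))
```

The loop only ends once the chief's init op has completed against the same ps task, so the messages below by themselves only say that the chief never got that far.
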

Job stderr: `SLF4J: Class path contains multiple SLF4J bindings. SLF4J: Found binding in [jar:file:/data4/hadoop/yarn/local/usercache/amantrach/filecache/2998/spark-assembly-1.6.1.2.4.2.0-258-hadoop2.7.1.2.4.2.0-258.jar!/org/slf4j/impl/StaticLoggerBinder.class] SLF4J: Found binding in [jar:file:/usr/hdp/2.4.2.0-258/hadoop/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class] SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation. SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory] 18/01/16 00:10:51 INFO CoarseGrainedExecutorBackend: Registered signal handlers for [TERM, HUP, INT] 18/01/16 00:10:52 INFO SecurityManager: Changing view acls to: yarn,amantrach 18/01/16 00:10:52 INFO SecurityManager: Changing modify acls to: yarn,amantrach 18/01/16 00:10:52 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(yarn, amantrach); users with modify permissions: Set(yarn, amantrach) 18/01/16 00:10:52 INFO SecurityManager: Changing view acls to: yarn,amantrach 18/01/16 00:10:52 INFO SecurityManager: Changing modify acls to: yarn,amantrach 18/01/16 00:10:52 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(yarn, amantrach); users with modify permissions: Set(yarn, amantrach) 18/01/16 00:10:53 INFO Slf4jLogger: Slf4jLogger started 18/01/16 00:10:53 INFO Remoting: Starting remoting 18/01/16 00:10:53 INFO Remoting: Remoting started; listening on addresses :[akka.tcp://sparkExecutorActorSystem@hdpdatanode28:56206] 18/01/16 00:10:53 INFO Utils: Successfully started service 'sparkExecutorActorSystem' on port 56206. 18/01/16 00:10:53 INFO DiskBlockManager: Created local directory at /data6/hadoop/yarn/local/usercache/amantrach/appcache/application_1515444508016_276923/blockmgr-b5195463-3ac9-40d0-a119-cb3571862f51 18/01/16 00:10:53 INFO DiskBlockManager: Created local directory at /data7/hadoop/yarn/local/usercache/amantrach/appcache/application_1515444508016_276923/blockmgr-45fa63ee-52ed-41d6-9e22-eedad3f9d3c6 18/01/16 00:10:53 INFO DiskBlockManager: Created local directory at /data1/hadoop/yarn/local/usercache/amantrach/appcache/application_1515444508016_276923/blockmgr-7fc92b55-4039-464a-9f58-577bd8183273 18/01/16 00:10:53 INFO DiskBlockManager: Created local directory at /data2/hadoop/yarn/local/usercache/amantrach/appcache/application_1515444508016_276923/blockmgr-c0f30d04-bed5-4ff3-9d4c-05bb67057ef1 18/01/16 00:10:53 INFO DiskBlockManager: Created local directory at /data3/hadoop/yarn/local/usercache/amantrach/appcache/application_1515444508016_276923/blockmgr-117f68a4-efa7-49ca-bae1-31916b89313d 18/01/16 00:10:53 INFO DiskBlockManager: Created local directory at /data10/hadoop/yarn/local/usercache/amantrach/appcache/application_1515444508016_276923/blockmgr-97281634-57bc-4c48-9a86-fe285d5ced36 18/01/16 00:10:53 INFO DiskBlockManager: Created local directory at /data4/hadoop/yarn/local/usercache/amantrach/appcache/application_1515444508016_276923/blockmgr-4570e7d5-72c9-4147-9ed1-bf549703fd6b 18/01/16 00:10:53 INFO DiskBlockManager: Created local directory at /data5/hadoop/yarn/local/usercache/amantrach/appcache/application_1515444508016_276923/blockmgr-851b3f10-b6f9-437a-8867-aad2c81eb145 18/01/16 00:10:53 INFO DiskBlockManager: Created local directory at /data9/hadoop/yarn/local/usercache/amantrach/appcache/application_1515444508016_276923/blockmgr-16bb01bc-aca9-460e-870a-97a6697434b9 18/01/16 00:10:53 INFO 
DiskBlockManager: Created local directory at /data8/hadoop/yarn/local/usercache/amantrach/appcache/application_1515444508016_276923/blockmgr-cea50a3a-5ac8-4c9b-b8a2-c76b9226b793 18/01/16 00:10:53 INFO MemoryStore: MemoryStore started with capacity 1247.6 MB 18/01/16 00:10:53 INFO CoarseGrainedExecutorBackend: Connecting to driver: spark://CoarseGrainedScheduler@10.4.117.3:47482 18/01/16 00:10:53 INFO CoarseGrainedExecutorBackend: Successfully registered with driver 18/01/16 00:10:53 INFO Executor: Starting executor ID 1 on host hdpdatanode28 18/01/16 00:10:53 INFO Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 35635. 18/01/16 00:10:53 INFO NettyBlockTransferService: Server created on 35635 18/01/16 00:10:53 INFO BlockManagerMaster: Trying to register BlockManager 18/01/16 00:10:54 INFO BlockManagerMaster: Registered BlockManager 18/01/16 00:10:54 INFO CoarseGrainedExecutorBackend: Got assigned task 2 18/01/16 00:10:54 INFO Executor: Running task 2.0 in stage 0.0 (TID 2) 18/01/16 00:10:54 INFO TorrentBroadcast: Started reading broadcast variable 2 18/01/16 00:10:55 INFO MemoryStore: Block broadcast_2_piece0 stored as bytes in memory (estimated size 13.5 KB, free 13.5 KB) 18/01/16 00:10:55 INFO TorrentBroadcast: Reading broadcast variable 2 took 129 ms 18/01/16 00:10:55 INFO MemoryStore: Block broadcast_2 stored as values in memory (estimated size 41.9 KB, free 55.4 KB) 2018-01-16 00:10:56,109 INFO (MainThread-110455) connected to server at ('10.4.117.3', 39909) 2018-01-16 00:10:56,113 INFO (MainThread-110455) TFSparkNode.reserve: {'authkey': '\x04\xdf\x87_X\x8eM\x1e\xb0|2\x95\xcf\x8c\xa6L', 'worker_num': 2, 'host': '10.4.97.115', 'tb_port': 0, 'addr': '/tmp/pymp-IPJQ1U/listener-GpK5sS', 'ppid': 110354, 'task_index': 1, 'job_name': 'worker', 'tb_pid': 0, 'port': 53601} 2018-01-16 00:10:58,123 INFO (MainThread-110455) node: {'addr': ('10.4.111.25', 35673), 'task_index': 0, 'job_name': 'ps', 'authkey': '@\x93\xfc\xf1u!D\xc3\xa8jk\x1d\xad\xc7E,', 'worker_num': 0, 'host': '10.4.111.25', 'ppid': 33624, 'port': 55327, 'tb_pid': 0, 'tb_port': 0} 2018-01-16 00:10:58,124 INFO (MainThread-110455) node: {'addr': '/tmp/pymp-X8WJpQ/listener-9MvGAO', 'task_index': 0, 'job_name': 'worker', 'authkey': "\x90\xde\x1d\xb5~;@\xe4\x939.\x93\t\xe5\x8d'", 'worker_num': 1, 'host': '10.4.116.175', 'ppid': 74380, 'port': 36831, 'tb_pid': 0, 'tb_port': 0} 2018-01-16 00:10:58,124 INFO (MainThread-110455) node: {'addr': '/tmp/pymp-IPJQ1U/listener-GpK5sS', 'task_index': 1, 'job_name': 'worker', 'authkey': '\x04\xdf\x87_X\x8eM\x1e\xb0|2\x95\xcf\x8c\xa6L', 'worker_num': 2, 'host': '10.4.97.115', 'ppid': 110354, 'port': 53601, 'tb_pid': 0, 'tb_port': 0} 2018-01-16 00:10:58,124 INFO (MainThread-110455) node: {'addr': '/tmp/pymp-zf55NH/listener-A8vko2', 'task_index': 2, 'job_name': 'worker', 'authkey': 'u(CW\xd0\xe1O\x13\x98\x8er\xdb\xcc\x91sY', 'worker_num': 3, 'host': '10.4.100.8', 'ppid': 66809, 'port': 38280, 'tb_pid': 0, 'tb_port': 0} 2018-01-16 00:10:58,128 INFO (MainThread-110455) Starting TensorFlow worker:1 on cluster node 2 on background process 18/01/16 00:10:58 INFO PythonRunner: Times: total = 2974, boot = 648, init = 83, finish = 2243 18/01/16 00:10:58 INFO Executor: Finished task 2.0 in stage 0.0 (TID 2). 
999 bytes result sent to driver 2018-01-16 00:10:59,506 INFO (MainThread-110524) 2: ======== worker:1 ======== 2018-01-16 00:10:59,506 INFO (MainThread-110524) 2: Cluster spec: {'ps': ['10.4.111.25:55327'], 'worker': ['10.4.116.175:36831', '10.4.97.115:53601', '10.4.100.8:38280']} 2018-01-16 00:10:59,506 INFO (MainThread-110524) 2: Using CPU D0116 00:10:59.507997717 110524 env_linux.c:77] Warning: insecure environment read function 'getenv' used 2018-01-16 00:10:59.518565: I tensorflow/core/distributed_runtime/rpc/grpc_channel.cc:215] Initialize GrpcChannelCache for job ps -> {0 -> 10.4.111.25:55327} 2018-01-16 00:10:59.518585: I tensorflow/core/distributed_runtime/rpc/grpc_channel.cc:215] Initialize GrpcChannelCache for job worker -> {0 -> 10.4.116.175:36831, 1 -> localhost:53601, 2 -> 10.4.100.8:38280} 2018-01-16 00:10:59.519473: I tensorflow/core/distributed_runtime/rpc/grpc_server_lib.cc:316] Started server with target: grpc://localhost:53601 tensorflow model path: hdfs://kandula/user/amantrach/mnist_model 18/01/16 00:11:01 INFO CoarseGrainedExecutorBackend: Got assigned task 6 18/01/16 00:11:01 INFO Executor: Running task 0.0 in stage 1.0 (TID 6) 18/01/16 00:11:01 INFO TorrentBroadcast: Started reading broadcast variable 3 18/01/16 00:11:01 INFO MemoryStore: Block broadcast_3_piece0 stored as bytes in memory (estimated size 24.6 KB, free 80.0 KB) 18/01/16 00:11:01 INFO TorrentBroadcast: Reading broadcast variable 3 took 45 ms 18/01/16 00:11:01 INFO MemoryStore: Block broadcast_3 stored as values in memory (estimated size 103.7 KB, free 183.7 KB) 18/01/16 00:11:01 INFO HadoopRDD: Input split: hdfs://kandula/user/amantrach/mnist/csv/train/images/part-00000.deflate:0+1134757 18/01/16 00:11:01 INFO TorrentBroadcast: Started reading broadcast variable 0 18/01/16 00:11:01 INFO MemoryStore: Block broadcast_0_piece0 stored as bytes in memory (estimated size 28.0 KB, free 211.6 KB) 18/01/16 00:11:01 INFO TorrentBroadcast: Reading broadcast variable 0 took 17 ms 18/01/16 00:11:01 INFO MemoryStore: Block broadcast_0 stored as values in memory (estimated size 343.1 KB, free 554.7 KB) 18/01/16 00:11:02 INFO deprecation: mapred.tip.id is deprecated. Instead, use mapreduce.task.id 18/01/16 00:11:02 INFO deprecation: mapred.task.id is deprecated. Instead, use mapreduce.task.attempt.id 18/01/16 00:11:02 INFO deprecation: mapred.task.is.map is deprecated. Instead, use mapreduce.task.ismap 18/01/16 00:11:02 INFO deprecation: mapred.task.partition is deprecated. Instead, use mapreduce.task.partition 18/01/16 00:11:02 INFO deprecation: mapred.job.id is deprecated. 
Instead, use mapreduce.job.id 18/01/16 00:11:02 INFO ZlibFactory: Successfully loaded & initialized native-zlib library 18/01/16 00:11:02 INFO CodecPool: Got brand-new decompressor [.deflate] 18/01/16 00:11:02 INFO HadoopRDD: Input split: hdfs://kandula/user/amantrach/mnist/csv/train/labels/part-00000.deflate:0+7565 18/01/16 00:11:02 INFO TorrentBroadcast: Started reading broadcast variable 1 18/01/16 00:11:02 INFO MemoryStore: Block broadcast_1_piece0 stored as bytes in memory (estimated size 28.0 KB, free 582.7 KB) 18/01/16 00:11:02 INFO TorrentBroadcast: Reading broadcast variable 1 took 19 ms 18/01/16 00:11:02 INFO MemoryStore: Block broadcast_1 stored as values in memory (estimated size 343.1 KB, free 925.8 KB) 18/01/16 00:11:02 INFO CodecPool: Got brand-new decompressor [.deflate] 2018-01-16 00:11:02,646 INFO (MainThread-111831) Connected to TFSparkNode.mgr on 10.4.97.115, ppid=110354, state='running' 2018-01-16 00:11:02,651 INFO (MainThread-111831) mgr.state='running' 2018-01-16 00:11:02,651 INFO (MainThread-111831) Feeding partition <generator object load_stream at 0x7f9d39fcedc0> into input queue <multiprocessing.queues.JoinableQueue object at 0x7f9d39ff3e50> 2018-01-16 00:11:03.944359: I tensorflow/core/distributed_runtime/master_session.cc:999] Start master session 6776c04ae3c78e3b with config:

INFO:tensorflow:Waiting for model to be ready. Ready_for_local_init_op: None, ready: Variables not initialized: hid_w, hid_b, sm_w, sm_b, Variable, hid_w/Adagrad, hid_b/Adagrad, sm_w/Adagrad, sm_b/Adagrad 2018-01-16 00:11:03,996 INFO (MainThread-110524) Waiting for model to be ready. Ready_for_local_init_op: None, ready: Variables not initialized: hid_w, hid_b, sm_w, sm_b, Variable, hid_w/Adagrad, hid_b/Adagrad, sm_w/Adagrad, sm_b/Adagrad 18/01/16 00:11:04 INFO PythonRunner: Times: total = 2052, boot = -4195, init = 4244, finish = 2003 18/01/16 00:11:04 INFO PythonRunner: Times: total = 93, boot = 3, init = 38, finish = 52 2018-01-16 00:11:34.029776: I tensorflow/core/distributed_runtime/master_session.cc:999] Start master session f213c988ef700adb with config:

INFO:tensorflow:Waiting for model to be ready. Ready_for_local_init_op: None, ready: Variables not initialized: hid_w, hid_b, sm_w, sm_b, Variable, hid_w/Adagrad, hid_b/Adagrad, sm_w/Adagrad, sm_b/Adagrad 2018-01-16 00:11:34,049 INFO (MainThread-110524) Waiting for model to be ready. Ready_for_local_init_op: None, ready: Variables not initialized: hid_w, hid_b, sm_w, sm_b, Variable, hid_w/Adagrad, hid_b/Adagrad, sm_w/Adagrad, sm_b/Adagrad 2018-01-16 00:12:04.089953: I tensorflow/core/distributed_runtime/master_session.cc:999] Start master session 3885418072df8dbf with config:

INFO:tensorflow:Waiting for model to be ready. Ready_for_local_init_op: None, ready: Variables not initialized: hid_w, hid_b, sm_w, sm_b, Variable, hid_w/Adagrad, hid_b/Adagrad, sm_w/Adagrad, sm_b/Adagrad 2018-01-16 00:12:04,115 INFO (MainThread-110524) Waiting for model to be ready. Ready_for_local_init_op: None, ready: Variables not initialized: hid_w, hid_b, sm_w, sm_b, Variable, hid_w/Adagrad, hid_b/Adagrad, sm_w/Adagrad, sm_b/Adagrad 2018-01-16 00:12:34.142165: I tensorflow/core/distributed_runtime/master_session.cc:999] Start master session a594f123869e09ca with config:

INFO:tensorflow:Waiting for model to be ready. Ready_for_local_init_op: None, ready: Variables not initialized: hid_w, hid_b, sm_w, sm_b, Variable, hid_w/Adagrad, hid_b/Adagrad, sm_w/Adagrad, sm_b/Adagrad 2018-01-16 00:12:34,167 INFO (MainThread-110524) Waiting for model to be ready. Ready_for_local_init_op: None, ready: Variables not initialized: hid_w, hid_b, sm_w, sm_b, Variable, hid_w/Adagrad, hid_b/Adagrad, sm_w/Adagrad, sm_b/Adagrad 2018-01-16 00:13:04.208841: I tensorflow/core/distributed_runtime/master_session.cc:999] Start master session ea6ecf68fcc742e6 with config:

INFO:tensorflow:Waiting for model to be ready. Ready_for_local_init_op: None, ready: Variables not initialized: hid_w, hid_b, sm_w, sm_b, Variable, hid_w/Adagrad, hid_b/Adagrad, sm_w/Adagrad, sm_b/Adagrad 2018-01-16 00:13:04,234 INFO (MainThread-110524) Waiting for model to be ready. Ready_for_local_init_op: None, ready: Variables not initialized: hid_w, hid_b, sm_w, sm_b, Variable, hid_w/Adagrad, hid_b/Adagrad, sm_w/Adagrad, sm_b/Adagrad 2018-01-16 00:13:34.274622: I tensorflow/core/distributed_runtime/master_session.cc:999] Start master session 8ed302ed64aab3b2 with config:

INFO:tensorflow:Waiting for model to be ready. Ready_for_local_init_op: None, ready: Variables not initialized: hid_w, hid_b, sm_w, sm_b, Variable, hid_w/Adagrad, hid_b/Adagrad, sm_w/Adagrad, sm_b/Adagrad 2018-01-16 00:13:34,293 INFO (MainThread-110524) Waiting for model to be ready. Ready_for_local_init_op: None, ready: Variables not initialized: hid_w, hid_b, sm_w, sm_b, Variable, hid_w/Adagrad, hid_b/Adagrad, sm_w/Adagrad, sm_b/Adagrad 2018-01-16 00:14:04.333314: I tensorflow/core/distributed_runtime/master_session.cc:999] Start master session fcf8060f5e8b4fcc with config:

INFO:tensorflow:Waiting for model to be ready. Ready_for_local_init_op: None, ready: Variables not initialized: hid_w, hid_b, sm_w, sm_b, Variable, hid_w/Adagrad, hid_b/Adagrad, sm_w/Adagrad, sm_b/Adagrad 2018-01-16 00:14:04,361 INFO (MainThread-110524) Waiting for model to be ready. Ready_for_local_init_op: None, ready: Variables not initialized: hid_w, hid_b, sm_w, sm_b, Variable, hid_w/Adagrad, hid_b/Adagrad, sm_w/Adagrad, sm_b/Adagrad 2018-01-16 00:14:34.402339: I tensorflow/core/distributed_runtime/master_session.cc:999] Start master session 519da1d4ef764fb0 with config:

INFO:tensorflow:Waiting for model to be ready. Ready_for_local_init_op: None, ready: Variables not initialized: hid_w, hid_b, sm_w, sm_b, Variable, hid_w/Adagrad, hid_b/Adagrad, sm_w/Adagrad, sm_b/Adagrad 2018-01-16 00:14:34,435 INFO (MainThread-110524) Waiting for model to be ready. Ready_for_local_init_op: None, ready: Variables not initialized: hid_w, hid_b, sm_w, sm_b, Variable, hid_w/Adagrad, hid_b/Adagrad, sm_w/Adagrad, sm_b/Adagrad 2018-01-16 00:15:04.478151: I tensorflow/core/distributed_runtime/master_session.cc:999] Start master session a4a660e54da3ad4c with config:

INFO:tensorflow:Waiting for model to be ready. Ready_for_local_init_op: None, ready: Variables not initialized: hid_w, hid_b, sm_w, sm_b, Variable, hid_w/Adagrad, hid_b/Adagrad, sm_w/Adagrad, sm_b/Adagrad 2018-01-16 00:15:04,497 INFO (MainThread-110524) Waiting for model to be ready. Ready_for_local_init_op: None, ready: Variables not initialized: hid_w, hid_b, sm_w, sm_b, Variable, hid_w/Adagrad, hid_b/Adagrad, sm_w/Adagrad, sm_b/Adagrad 2018-01-16 00:15:34.539368: I tensorflow/core/distributed_runtime/master_session.cc:999] Start master session dd0c80a4ab69325f with config:

INFO:tensorflow:Waiting for model to be ready. Ready_for_local_init_op: None, ready: Variables not initialized: hid_w, hid_b, sm_w, sm_b, Variable, hid_w/Adagrad, hid_b/Adagrad, sm_w/Adagrad, sm_b/Adagrad 2018-01-16 00:15:34,557 INFO (MainThread-110524) Waiting for model to be ready. Ready_for_local_init_op: None, ready: Variables not initialized: hid_w, hid_b, sm_w, sm_b, Variable, hid_w/Adagrad, hid_b/Adagrad, sm_w/Adagrad, sm_b/Adagrad 2018-01-16 00:16:04.578888: I tensorflow/core/distributed_runtime/master_session.cc:999] Start master session cb348b03d102b40a with config:

INFO:tensorflow:Waiting for model to be ready. Ready_for_local_init_op: None, ready: Variables not initialized: hid_w, hid_b, sm_w, sm_b, Variable, hid_w/Adagrad, hid_b/Adagrad, sm_w/Adagrad, sm_b/Adagrad 2018-01-16 00:16:04,604 INFO (MainThread-110524) Waiting for model to be ready. Ready_for_local_init_op: None, ready: Variables not initialized: hid_w, hid_b, sm_w, sm_b, Variable, hid_w/Adagrad, hid_b/Adagrad, sm_w/Adagrad, sm_b/Adagrad 2018-01-16 00:16:34.643456: I tensorflow/core/distributed_runtime/master_session.cc:999] Start master session 07cc27662f1ad758 with config:

INFO:tensorflow:Waiting for model to be ready. Ready_for_local_init_op: None, ready: Variables not initialized: hid_w, hid_b, sm_w, sm_b, Variable, hid_w/Adagrad, hid_b/Adagrad, sm_w/Adagrad, sm_b/Adagrad 2018-01-16 00:16:34,671 INFO (MainThread-110524) Waiting for model to be ready. Ready_for_local_init_op: None, ready: Variables not initialized: hid_w, hid_b, sm_w, sm_b, Variable, hid_w/Adagrad, hid_b/Adagrad, sm_w/Adagrad, sm_b/Adagrad 2018-01-16 00:17:04.711058: I tensorflow/core/distributed_runtime/master_session.cc:999] Start master session 74298bc9ee827000 with config:

INFO:tensorflow:Waiting for model to be ready. Ready_for_local_init_op: None, ready: Variables not initialized: hid_w, hid_b, sm_w, sm_b, Variable, hid_w/Adagrad, hid_b/Adagrad, sm_w/Adagrad, sm_b/Adagrad 2018-01-16 00:17:04,735 INFO (MainThread-110524) Waiting for model to be ready. Ready_for_local_init_op: None, ready: Variables not initialized: hid_w, hid_b, sm_w, sm_b, Variable, hid_w/Adagrad, hid_b/Adagrad, sm_w/Adagrad, sm_b/Adagrad 2018-01-16 00:17:34.776427: I tensorflow/core/distributed_runtime/master_session.cc:999] Start master session 820fcb6f8e42f46e with config:

INFO:tensorflow:Waiting for model to be ready. Ready_for_local_init_op: None, ready: Variables not initialized: hid_w, hid_b, sm_w, sm_b, Variable, hid_w/Adagrad, hid_b/Adagrad, sm_w/Adagrad, sm_b/Adagrad 2018-01-16 00:17:34,801 INFO (MainThread-110524) Waiting for model to be ready. Ready_for_local_init_op: None, ready: Variables not initialized: hid_w, hid_b, sm_w, sm_b, Variable, hid_w/Adagrad, hid_b/Adagrad, sm_w/Adagrad, sm_b/Adagrad`

Job 2 stderr

SLF4J: Class path contains multiple SLF4J bindings. SLF4J: Found binding in [jar:file:/data3/hadoop/yarn/local/usercache/amantrach/filecache/768/spark-assembly-1.6.1.2.4.2.0-258-hadoop2.7.1.2.4.2.0-258.jar!/org/slf4j/impl/StaticLoggerBinder.class] SLF4J: Found binding in [jar:file:/usr/hdp/2.4.2.0-258/hadoop/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class] SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation. SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory] 18/01/16 00:10:49 INFO CoarseGrainedExecutorBackend: Registered signal handlers for [TERM, HUP, INT] 18/01/16 00:10:50 INFO SecurityManager: Changing view acls to: yarn,amantrach 18/01/16 00:10:50 INFO SecurityManager: Changing modify acls to: yarn,amantrach 18/01/16 00:10:50 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(yarn, amantrach); users with modify permissions: Set(yarn, amantrach) 18/01/16 00:10:51 INFO SecurityManager: Changing view acls to: yarn,amantrach 18/01/16 00:10:51 INFO SecurityManager: Changing modify acls to: yarn,amantrach 18/01/16 00:10:51 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(yarn, amantrach); users with modify permissions: Set(yarn, amantrach) 18/01/16 00:10:51 INFO Slf4jLogger: Slf4jLogger started 18/01/16 00:10:51 INFO Remoting: Starting remoting 18/01/16 00:10:51 INFO Remoting: Remoting started; listening on addresses :[akka.tcp://sparkExecutorActorSystem@hdpdatanode33:53854] 18/01/16 00:10:51 INFO Utils: Successfully started service 'sparkExecutorActorSystem' on port 53854. 18/01/16 00:10:51 INFO DiskBlockManager: Created local directory at /data5/hadoop/yarn/local/usercache/amantrach/appcache/application_1515444508016_276923/blockmgr-fb898ac5-4f57-4c2f-87fe-9def943e1e57 18/01/16 00:10:51 INFO DiskBlockManager: Created local directory at /data7/hadoop/yarn/local/usercache/amantrach/appcache/application_1515444508016_276923/blockmgr-13de7323-4267-446b-86e5-b5901872efb4 18/01/16 00:10:51 INFO DiskBlockManager: Created local directory at /data4/hadoop/yarn/local/usercache/amantrach/appcache/application_1515444508016_276923/blockmgr-9d32e3fa-71ed-4517-97c0-7454490aee54 18/01/16 00:10:51 INFO DiskBlockManager: Created local directory at /data6/hadoop/yarn/local/usercache/amantrach/appcache/application_1515444508016_276923/blockmgr-544c8d0f-02bb-4f2d-8e67-962181671567 18/01/16 00:10:51 INFO DiskBlockManager: Created local directory at /data8/hadoop/yarn/local/usercache/amantrach/appcache/application_1515444508016_276923/blockmgr-82276aca-9497-4003-9db6-65b5bdd93252 18/01/16 00:10:51 INFO DiskBlockManager: Created local directory at /data9/hadoop/yarn/local/usercache/amantrach/appcache/application_1515444508016_276923/blockmgr-2d538d4f-c1e6-44b8-82ba-d7a1ed9dd8e8 18/01/16 00:10:51 INFO DiskBlockManager: Created local directory at /data1/hadoop/yarn/local/usercache/amantrach/appcache/application_1515444508016_276923/blockmgr-0ddb5184-81fa-44b9-9afb-90fa6fa779b7 18/01/16 00:10:51 INFO DiskBlockManager: Created local directory at /data2/hadoop/yarn/local/usercache/amantrach/appcache/application_1515444508016_276923/blockmgr-eba04e3d-2a96-486c-bb20-019d97a752db 18/01/16 00:10:51 INFO DiskBlockManager: Created local directory at /data3/hadoop/yarn/local/usercache/amantrach/appcache/application_1515444508016_276923/blockmgr-f51edba9-b0a2-491c-bed6-73e6a35d9fbb 18/01/16 00:10:51 INFO DiskBlockManager: Created 
local directory at /data10/hadoop/yarn/local/usercache/amantrach/appcache/application_1515444508016_276923/blockmgr-168db3da-11b2-46ff-b8ce-3bb885149875 18/01/16 00:10:51 INFO MemoryStore: MemoryStore started with capacity 1247.6 MB 18/01/16 00:10:52 INFO CoarseGrainedExecutorBackend: Connecting to driver: spark://CoarseGrainedScheduler@10.4.117.3:47482 18/01/16 00:10:52 INFO CoarseGrainedExecutorBackend: Successfully registered with driver 18/01/16 00:10:52 INFO Executor: Starting executor ID 2 on host hdpdatanode33 18/01/16 00:10:52 INFO Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 52249. 18/01/16 00:10:52 INFO NettyBlockTransferService: Server created on 52249 18/01/16 00:10:52 INFO BlockManagerMaster: Trying to register BlockManager 18/01/16 00:10:52 INFO BlockManagerMaster: Registered BlockManager 18/01/16 00:10:54 INFO CoarseGrainedExecutorBackend: Got assigned task 1 18/01/16 00:10:54 INFO Executor: Running task 1.0 in stage 0.0 (TID 1) 18/01/16 00:10:54 INFO TorrentBroadcast: Started reading broadcast variable 2 18/01/16 00:10:55 INFO MemoryStore: Block broadcast_2_piece0 stored as bytes in memory (estimated size 13.5 KB, free 13.5 KB) 18/01/16 00:10:55 INFO TorrentBroadcast: Reading broadcast variable 2 took 127 ms 18/01/16 00:10:55 INFO MemoryStore: Block broadcast_2 stored as values in memory (estimated size 41.9 KB, free 55.4 KB) 2018-01-16 00:10:56,134 INFO (MainThread-74459) connected to server at ('10.4.117.3', 39909) 2018-01-16 00:10:56,137 INFO (MainThread-74459) TFSparkNode.reserve: {'authkey': "\x90\xde\x1d\xb5~;@\xe4\x939.\x93\t\xe5\x8d'", 'worker_num': 1, 'host': '10.4.116.175', 'tb_port': 0, 'addr': '/tmp/pymp-X8WJpQ/listener-9MvGAO', 'ppid': 74380, 'task_index': 0, 'job_name': 'worker', 'tb_pid': 0, 'port': 36831} 2018-01-16 00:10:58,144 INFO (MainThread-74459) node: {'addr': ('10.4.111.25', 35673), 'task_index': 0, 'job_name': 'ps', 'authkey': '@\x93\xfc\xf1u!D\xc3\xa8jk\x1d\xad\xc7E,', 'worker_num': 0, 'host': '10.4.111.25', 'ppid': 33624, 'port': 55327, 'tb_pid': 0, 'tb_port': 0} 2018-01-16 00:10:58,144 INFO (MainThread-74459) node: {'addr': '/tmp/pymp-X8WJpQ/listener-9MvGAO', 'task_index': 0, 'job_name': 'worker', 'authkey': "\x90\xde\x1d\xb5~;@\xe4\x939.\x93\t\xe5\x8d'", 'worker_num': 1, 'host': '10.4.116.175', 'ppid': 74380, 'port': 36831, 'tb_pid': 0, 'tb_port': 0} 2018-01-16 00:10:58,144 INFO (MainThread-74459) node: {'addr': '/tmp/pymp-IPJQ1U/listener-GpK5sS', 'task_index': 1, 'job_name': 'worker', 'authkey': '\x04\xdf\x87_X\x8eM\x1e\xb0|2\x95\xcf\x8c\xa6L', 'worker_num': 2, 'host': '10.4.97.115', 'ppid': 110354, 'port': 53601, 'tb_pid': 0, 'tb_port': 0} 2018-01-16 00:10:58,145 INFO (MainThread-74459) node: {'addr': '/tmp/pymp-zf55NH/listener-A8vko2', 'task_index': 2, 'job_name': 'worker', 'authkey': 'u(CW\xd0\xe1O\x13\x98\x8er\xdb\xcc\x91sY', 'worker_num': 3, 'host': '10.4.100.8', 'ppid': 66809, 'port': 38280, 'tb_pid': 0, 'tb_port': 0} 2018-01-16 00:10:58,149 INFO (MainThread-74459) Starting TensorFlow worker:0 on cluster node 1 on background process 18/01/16 00:10:58 INFO PythonRunner: Times: total = 2995, boot = 683, init = 91, finish = 2221 18/01/16 00:10:58 INFO Executor: Finished task 1.0 in stage 0.0 (TID 1). 
999 bytes result sent to driver 2018-01-16 00:10:59,555 INFO (MainThread-74532) 1: ======== worker:0 ======== 2018-01-16 00:10:59,555 INFO (MainThread-74532) 1: Cluster spec: {'ps': ['10.4.111.25:55327'], 'worker': ['10.4.116.175:36831', '10.4.97.115:53601', '10.4.100.8:38280']} 2018-01-16 00:10:59,555 INFO (MainThread-74532) 1: Using CPU D0116 00:10:59.557356018 74532 env_linux.c:77] Warning: insecure environment read function 'getenv' used 2018-01-16 00:10:59.564540: I tensorflow/core/distributed_runtime/rpc/grpc_channel.cc:215] Initialize GrpcChannelCache for job ps -> {0 -> 10.4.111.25:55327} 2018-01-16 00:10:59.564571: I tensorflow/core/distributed_runtime/rpc/grpc_channel.cc:215] Initialize GrpcChannelCache for job worker -> {0 -> localhost:36831, 1 -> 10.4.97.115:53601, 2 -> 10.4.100.8:38280} 2018-01-16 00:10:59.565500: I tensorflow/core/distributed_runtime/rpc/grpc_server_lib.cc:316] Started server with target: grpc://localhost:36831 tensorflow model path: hdfs://kandula/user/amantrach/mnist_model 18/01/16 00:11:01 INFO CoarseGrainedExecutorBackend: Got assigned task 5 18/01/16 00:11:01 INFO Executor: Running task 2.0 in stage 1.0 (TID 5) 18/01/16 00:11:01 INFO TorrentBroadcast: Started reading broadcast variable 3 18/01/16 00:11:01 INFO MemoryStore: Block broadcast_3_piece0 stored as bytes in memory (estimated size 24.6 KB, free 80.0 KB) 18/01/16 00:11:01 INFO TorrentBroadcast: Reading broadcast variable 3 took 51 ms 18/01/16 00:11:01 INFO MemoryStore: Block broadcast_3 stored as values in memory (estimated size 103.7 KB, free 183.7 KB) 18/01/16 00:11:01 INFO HadoopRDD: Input split: hdfs://kandula/user/amantrach/mnist/csv/train/images/part-00002.deflate:0+1353414 18/01/16 00:11:01 INFO TorrentBroadcast: Started reading broadcast variable 0 18/01/16 00:11:01 INFO MemoryStore: Block broadcast_0_piece0 stored as bytes in memory (estimated size 28.0 KB, free 211.6 KB) 18/01/16 00:11:01 INFO TorrentBroadcast: Reading broadcast variable 0 took 18 ms 18/01/16 00:11:01 INFO MemoryStore: Block broadcast_0 stored as values in memory (estimated size 343.1 KB, free 554.7 KB) 18/01/16 00:11:02 INFO deprecation: mapred.tip.id is deprecated. Instead, use mapreduce.task.id 18/01/16 00:11:02 INFO deprecation: mapred.task.id is deprecated. Instead, use mapreduce.task.attempt.id 18/01/16 00:11:02 INFO deprecation: mapred.task.is.map is deprecated. Instead, use mapreduce.task.ismap 18/01/16 00:11:02 INFO deprecation: mapred.task.partition is deprecated. Instead, use mapreduce.task.partition 18/01/16 00:11:02 INFO deprecation: mapred.job.id is deprecated. 
Instead, use mapreduce.job.id 18/01/16 00:11:02 INFO ZlibFactory: Successfully loaded & initialized native-zlib library 18/01/16 00:11:02 INFO CodecPool: Got brand-new decompressor [.deflate] 18/01/16 00:11:02 INFO HadoopRDD: Input split: hdfs://kandula/user/amantrach/mnist/csv/train/labels/part-00002.deflate:0+9061 18/01/16 00:11:02 INFO TorrentBroadcast: Started reading broadcast variable 1 18/01/16 00:11:02 INFO MemoryStore: Block broadcast_1_piece0 stored as bytes in memory (estimated size 28.0 KB, free 582.7 KB) 18/01/16 00:11:02 INFO TorrentBroadcast: Reading broadcast variable 1 took 15 ms 18/01/16 00:11:02 INFO MemoryStore: Block broadcast_1 stored as values in memory (estimated size 343.1 KB, free 925.8 KB) 18/01/16 00:11:02 INFO CodecPool: Got brand-new decompressor [.deflate] 2018-01-16 00:11:02,706 INFO (MainThread-75304) Connected to TFSparkNode.mgr on 10.4.116.175, ppid=74380, state='running' 2018-01-16 00:11:02,709 INFO (MainThread-75304) mgr.state='running' 2018-01-16 00:11:02,709 INFO (MainThread-75304) Feeding partition <generator object load_stream at 0x7ff6ddad0dc0> into input queue <multiprocessing.queues.JoinableQueue object at 0x7ff6ddaefe50> 18/01/16 00:11:04 INFO PythonRunner: Times: total = 2592, boot = -4211, init = 4281, finish = 2522 18/01/16 00:11:04 INFO PythonRunner: Times: total = 109, boot = 3, init = 52, finish = 54

Job 3 stderr SLF4J: Class path contains multiple SLF4J bindings. SLF4J: Found binding in [jar:file:/data1/hadoop/yarn/local/usercache/amantrach/filecache/3073/spark-assembly-1.6.1.2.4.2.0-258-hadoop2.7.1.2.4.2.0-258.jar!/org/slf4j/impl/StaticLoggerBinder.class] SLF4J: Found binding in [jar:file:/usr/hdp/2.4.2.0-258/hadoop/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class] SLF4J: Found binding in [jar:file:/usr/hdp/2.4.2.0-258/oozie/lib/slf4j-simple-1.6.6.jar!/org/slf4j/impl/StaticLoggerBinder.class] SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation. SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory] 18/01/16 00:10:51 INFO CoarseGrainedExecutorBackend: Registered signal handlers for [TERM, HUP, INT] 18/01/16 00:10:52 INFO SecurityManager: Changing view acls to: yarn,amantrach 18/01/16 00:10:52 INFO SecurityManager: Changing modify acls to: yarn,amantrach 18/01/16 00:10:52 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(yarn, amantrach); users with modify permissions: Set(yarn, amantrach) 18/01/16 00:10:52 INFO SecurityManager: Changing view acls to: yarn,amantrach 18/01/16 00:10:52 INFO SecurityManager: Changing modify acls to: yarn,amantrach 18/01/16 00:10:52 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(yarn, amantrach); users with modify permissions: Set(yarn, amantrach) 18/01/16 00:10:53 INFO Slf4jLogger: Slf4jLogger started 18/01/16 00:10:53 INFO Remoting: Starting remoting 18/01/16 00:10:53 INFO Remoting: Remoting started; listening on addresses :[akka.tcp://sparkExecutorActorSystem@hdpdatanode59:46395] 18/01/16 00:10:53 INFO Utils: Successfully started service 'sparkExecutorActorSystem' on port 46395. 
18/01/16 00:10:53 INFO DiskBlockManager: Created local directory at /data5/hadoop/yarn/local/usercache/amantrach/appcache/application_1515444508016_276923/blockmgr-e12e7d30-aa16-4e24-8bc3-88e019eb5854 18/01/16 00:10:53 INFO DiskBlockManager: Created local directory at /data8/hadoop/yarn/local/usercache/amantrach/appcache/application_1515444508016_276923/blockmgr-a92597db-4be8-4e22-a739-f56d7da76048 18/01/16 00:10:53 INFO DiskBlockManager: Created local directory at /data1/hadoop/yarn/local/usercache/amantrach/appcache/application_1515444508016_276923/blockmgr-d0225a2b-7323-4ed0-a010-d0447e96358a 18/01/16 00:10:53 INFO DiskBlockManager: Created local directory at /data7/hadoop/yarn/local/usercache/amantrach/appcache/application_1515444508016_276923/blockmgr-db0a38e4-244e-4470-8745-7c162c589b37 18/01/16 00:10:53 INFO DiskBlockManager: Created local directory at /data10/hadoop/yarn/local/usercache/amantrach/appcache/application_1515444508016_276923/blockmgr-e18617e3-cc00-42a0-965e-c37f75b84281 18/01/16 00:10:53 INFO DiskBlockManager: Created local directory at /data9/hadoop/yarn/local/usercache/amantrach/appcache/application_1515444508016_276923/blockmgr-17ef2394-8c37-428a-899a-e27fe45eeaea 18/01/16 00:10:53 INFO DiskBlockManager: Created local directory at /data2/hadoop/yarn/local/usercache/amantrach/appcache/application_1515444508016_276923/blockmgr-dca64e85-62a7-4b44-97eb-fd498a150d95 18/01/16 00:10:53 INFO DiskBlockManager: Created local directory at /data3/hadoop/yarn/local/usercache/amantrach/appcache/application_1515444508016_276923/blockmgr-668c2ec0-157b-4e99-8c2d-4c3d6b1aedd4 18/01/16 00:10:53 INFO DiskBlockManager: Created local directory at /data6/hadoop/yarn/local/usercache/amantrach/appcache/application_1515444508016_276923/blockmgr-d766bb82-ac51-4032-8347-9c0446e14f74 18/01/16 00:10:53 INFO DiskBlockManager: Created local directory at /data4/hadoop/yarn/local/usercache/amantrach/appcache/application_1515444508016_276923/blockmgr-f532f50a-b948-4cfd-983b-f951f1284d00 18/01/16 00:10:53 INFO MemoryStore: MemoryStore started with capacity 1247.6 MB 18/01/16 00:10:53 INFO CoarseGrainedExecutorBackend: Connecting to driver: spark://CoarseGrainedScheduler@10.4.117.3:47482 18/01/16 00:10:53 INFO CoarseGrainedExecutorBackend: Successfully registered with driver 18/01/16 00:10:53 INFO Executor: Starting executor ID 3 on host hdpdatanode59 18/01/16 00:10:53 INFO Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 58653. 
18/01/16 00:10:53 INFO NettyBlockTransferService: Server created on 58653 18/01/16 00:10:53 INFO BlockManagerMaster: Trying to register BlockManager 18/01/16 00:10:53 INFO BlockManagerMaster: Registered BlockManager 18/01/16 00:10:54 INFO CoarseGrainedExecutorBackend: Got assigned task 0 18/01/16 00:10:54 INFO Executor: Running task 0.0 in stage 0.0 (TID 0) 18/01/16 00:10:54 INFO TorrentBroadcast: Started reading broadcast variable 2 18/01/16 00:10:55 INFO MemoryStore: Block broadcast_2_piece0 stored as bytes in memory (estimated size 13.5 KB, free 13.5 KB) 18/01/16 00:10:55 INFO TorrentBroadcast: Reading broadcast variable 2 took 122 ms 18/01/16 00:10:55 INFO MemoryStore: Block broadcast_2 stored as values in memory (estimated size 41.9 KB, free 55.4 KB) 2018-01-16 00:10:56,252 INFO (MainThread-33666) connected to server at ('10.4.117.3', 39909) 2018-01-16 00:10:56,254 INFO (MainThread-33666) TFSparkNode.reserve: {'authkey': '@\x93\xfc\xf1u!D\xc3\xa8jk\x1d\xad\xc7E,', 'worker_num': 0, 'host': '10.4.111.25', 'tb_port': 0, 'addr': ('10.4.111.25', 35673), 'ppid': 33624, 'task_index': 0, 'job_name': 'ps', 'tb_pid': 0, 'port': 55327} 2018-01-16 00:10:57,259 INFO (MainThread-33666) node: {'addr': ('10.4.111.25', 35673), 'task_index': 0, 'job_name': 'ps', 'authkey': '@\x93\xfc\xf1u!D\xc3\xa8jk\x1d\xad\xc7E,', 'worker_num': 0, 'host': '10.4.111.25', 'ppid': 33624, 'port': 55327, 'tb_pid': 0, 'tb_port': 0} 2018-01-16 00:10:57,259 INFO (MainThread-33666) node: {'addr': '/tmp/pymp-X8WJpQ/listener-9MvGAO', 'task_index': 0, 'job_name': 'worker', 'authkey': "\x90\xde\x1d\xb5~;@\xe4\x939.\x93\t\xe5\x8d'", 'worker_num': 1, 'host': '10.4.116.175', 'ppid': 74380, 'port': 36831, 'tb_pid': 0, 'tb_port': 0} 2018-01-16 00:10:57,260 INFO (MainThread-33666) node: {'addr': '/tmp/pymp-IPJQ1U/listener-GpK5sS', 'task_index': 1, 'job_name': 'worker', 'authkey': '\x04\xdf\x87_X\x8eM\x1e\xb0|2\x95\xcf\x8c\xa6L', 'worker_num': 2, 'host': '10.4.97.115', 'ppid': 110354, 'port': 53601, 'tb_pid': 0, 'tb_port': 0} 2018-01-16 00:10:57,260 INFO (MainThread-33666) node: {'addr': '/tmp/pymp-zf55NH/listener-A8vko2', 'task_index': 2, 'job_name': 'worker', 'authkey': 'u(CW\xd0\xe1O\x13\x98\x8er\xdb\xcc\x91sY', 'worker_num': 3, 'host': '10.4.100.8', 'ppid': 66809, 'port': 38280, 'tb_pid': 0, 'tb_port': 0} 2018-01-16 00:10:57,263 INFO (MainThread-33666) Starting TensorFlow ps:0 on cluster node 0 on background process 2018-01-16 00:11:03,631 INFO (MainThread-33782) 0: ======== ps:0 ======== 2018-01-16 00:11:03,632 INFO (MainThread-33782) 0: Cluster spec: {'ps': ['10.4.111.25:55327'], 'worker': ['10.4.116.175:36831', '10.4.97.115:53601', '10.4.100.8:38280']} 2018-01-16 00:11:03,632 INFO (MainThread-33782) 0: Using CPU D0116 00:11:03.636705083 33782 env_linux.c:77] Warning: insecure environment read function 'getenv' used 2018-01-16 00:11:03.649579: I tensorflow/core/distributed_runtime/rpc/grpc_channel.cc:215] Initialize GrpcChannelCache for job ps -> {0 -> localhost:55327} 2018-01-16 00:11:03.649615: I tensorflow/core/distributed_runtime/rpc/grpc_channel.cc:215] Initialize GrpcChannelCache for job worker -> {0 -> 10.4.116.175:36831, 1 -> 10.4.97.115:53601, 2 -> 10.4.100.8:38280} 2018-01-16 00:11:03.650593: I tensorflow/core/distributed_runtime/rpc/grpc_server_lib.cc:316] Started server with target: grpc://localhost:55327

Job 4 stderr `SLF4J: Class path contains multiple SLF4J bindings. SLF4J: Found binding in [jar:file:/data1/hadoop/yarn/local/usercache/amantrach/filecache/2902/spark-assembly-1.6.1.2.4.2.0-258-hadoop2.7.1.2.4.2.0-258.jar!/org/slf4j/impl/StaticLoggerBinder.class] SLF4J: Found binding in [jar:file:/usr/hdp/2.4.2.0-258/hadoop/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class] SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation. SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory] 18/01/16 00:10:50 INFO CoarseGrainedExecutorBackend: Registered signal handlers for [TERM, HUP, INT] 18/01/16 00:10:51 INFO SecurityManager: Changing view acls to: yarn,amantrach 18/01/16 00:10:51 INFO SecurityManager: Changing modify acls to: yarn,amantrach 18/01/16 00:10:51 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(yarn, amantrach); users with modify permissions: Set(yarn, amantrach) 18/01/16 00:10:52 INFO SecurityManager: Changing view acls to: yarn,amantrach 18/01/16 00:10:52 INFO SecurityManager: Changing modify acls to: yarn,amantrach 18/01/16 00:10:52 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(yarn, amantrach); users with modify permissions: Set(yarn, amantrach) 18/01/16 00:10:52 INFO Slf4jLogger: Slf4jLogger started 18/01/16 00:10:52 INFO Remoting: Starting remoting 18/01/16 00:10:52 INFO Remoting: Remoting started; listening on addresses :[akka.tcp://sparkExecutorActorSystem@hdpdatanode40:53244] 18/01/16 00:10:52 INFO Utils: Successfully started service 'sparkExecutorActorSystem' on port 53244. 18/01/16 00:10:52 INFO DiskBlockManager: Created local directory at /data8/hadoop/yarn/local/usercache/amantrach/appcache/application_1515444508016_276923/blockmgr-a6bb8e36-47d5-42f4-a66e-72204fefc5b2 18/01/16 00:10:52 INFO DiskBlockManager: Created local directory at /data6/hadoop/yarn/local/usercache/amantrach/appcache/application_1515444508016_276923/blockmgr-551d4033-c0dd-4e0a-a8ee-0562ee93bdcf 18/01/16 00:10:52 INFO DiskBlockManager: Created local directory at /data1/hadoop/yarn/local/usercache/amantrach/appcache/application_1515444508016_276923/blockmgr-4c073a66-ebfb-4e0e-a207-aa6acb38770f 18/01/16 00:10:52 INFO DiskBlockManager: Created local directory at /data2/hadoop/yarn/local/usercache/amantrach/appcache/application_1515444508016_276923/blockmgr-2f542a1e-f4cc-4823-83bc-3a636034daf1 18/01/16 00:10:52 INFO DiskBlockManager: Created local directory at /data9/hadoop/yarn/local/usercache/amantrach/appcache/application_1515444508016_276923/blockmgr-ebf46833-35a1-46f5-a5dd-5abfa7b90a79 18/01/16 00:10:52 INFO DiskBlockManager: Created local directory at /data3/hadoop/yarn/local/usercache/amantrach/appcache/application_1515444508016_276923/blockmgr-12e45582-5f71-46af-a7dc-9f09aede4ffd 18/01/16 00:10:52 INFO DiskBlockManager: Created local directory at /data5/hadoop/yarn/local/usercache/amantrach/appcache/application_1515444508016_276923/blockmgr-dc4dd47e-0ae0-44b8-b5cd-e51450bf03e9 18/01/16 00:10:52 INFO DiskBlockManager: Created local directory at /data10/hadoop/yarn/local/usercache/amantrach/appcache/application_1515444508016_276923/blockmgr-74e3a6f6-f930-46c5-8ab6-8fc18dcf9d06 18/01/16 00:10:52 INFO DiskBlockManager: Created local directory at /data4/hadoop/yarn/local/usercache/amantrach/appcache/application_1515444508016_276923/blockmgr-f981390c-9f4b-4560-9a25-e0b669b533cb 18/01/16 00:10:52 INFO 
DiskBlockManager: Created local directory at /data7/hadoop/yarn/local/usercache/amantrach/appcache/application_1515444508016_276923/blockmgr-aefa1c21-66c7-421b-b8ad-4acc1c942fd7 18/01/16 00:10:52 INFO MemoryStore: MemoryStore started with capacity 1247.6 MB 18/01/16 00:10:52 INFO CoarseGrainedExecutorBackend: Connecting to driver: spark://CoarseGrainedScheduler@10.4.117.3:47482 18/01/16 00:10:53 INFO CoarseGrainedExecutorBackend: Successfully registered with driver 18/01/16 00:10:53 INFO Executor: Starting executor ID 4 on host hdpdatanode40 18/01/16 00:10:53 INFO Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 47214. 18/01/16 00:10:53 INFO NettyBlockTransferService: Server created on 47214 18/01/16 00:10:53 INFO BlockManagerMaster: Trying to register BlockManager 18/01/16 00:10:53 INFO BlockManagerMaster: Registered BlockManager 18/01/16 00:10:54 INFO CoarseGrainedExecutorBackend: Got assigned task 3 18/01/16 00:10:54 INFO Executor: Running task 3.0 in stage 0.0 (TID 3) 18/01/16 00:10:54 INFO TorrentBroadcast: Started reading broadcast variable 2 18/01/16 00:10:55 INFO MemoryStore: Block broadcast_2_piece0 stored as bytes in memory (estimated size 13.5 KB, free 13.5 KB) 18/01/16 00:10:55 INFO TorrentBroadcast: Reading broadcast variable 2 took 113 ms 18/01/16 00:10:55 INFO MemoryStore: Block broadcast_2 stored as values in memory (estimated size 41.9 KB, free 55.4 KB) 2018-01-16 00:10:56,007 INFO (MainThread-66847) connected to server at ('10.4.117.3', 39909) 2018-01-16 00:10:56,010 INFO (MainThread-66847) TFSparkNode.reserve: {'authkey': 'u(CW\xd0\xe1O\x13\x98\x8er\xdb\xcc\x91sY', 'worker_num': 3, 'host': '10.4.100.8', 'tb_port': 0, 'addr': '/tmp/pymp-zf55NH/listener-A8vko2', 'ppid': 66809, 'task_index': 2, 'job_name': 'worker', 'tb_pid': 0, 'port': 38280} 2018-01-16 00:10:58,023 INFO (MainThread-66847) node: {'addr': ('10.4.111.25', 35673), 'task_index': 0, 'job_name': 'ps', 'authkey': '@\x93\xfc\xf1u!D\xc3\xa8jk\x1d\xad\xc7E,', 'worker_num': 0, 'host': '10.4.111.25', 'ppid': 33624, 'port': 55327, 'tb_pid': 0, 'tb_port': 0} 2018-01-16 00:10:58,023 INFO (MainThread-66847) node: {'addr': '/tmp/pymp-X8WJpQ/listener-9MvGAO', 'task_index': 0, 'job_name': 'worker', 'authkey': "\x90\xde\x1d\xb5~;@\xe4\x939.\x93\t\xe5\x8d'", 'worker_num': 1, 'host': '10.4.116.175', 'ppid': 74380, 'port': 36831, 'tb_pid': 0, 'tb_port': 0} 2018-01-16 00:10:58,024 INFO (MainThread-66847) node: {'addr': '/tmp/pymp-IPJQ1U/listener-GpK5sS', 'task_index': 1, 'job_name': 'worker', 'authkey': '\x04\xdf\x87_X\x8eM\x1e\xb0|2\x95\xcf\x8c\xa6L', 'worker_num': 2, 'host': '10.4.97.115', 'ppid': 110354, 'port': 53601, 'tb_pid': 0, 'tb_port': 0} 2018-01-16 00:10:58,024 INFO (MainThread-66847) node: {'addr': '/tmp/pymp-zf55NH/listener-A8vko2', 'task_index': 2, 'job_name': 'worker', 'authkey': 'u(CW\xd0\xe1O\x13\x98\x8er\xdb\xcc\x91sY', 'worker_num': 3, 'host': '10.4.100.8', 'ppid': 66809, 'port': 38280, 'tb_pid': 0, 'tb_port': 0} 2018-01-16 00:10:58,029 INFO (MainThread-66847) Starting TensorFlow worker:2 on cluster node 3 on background process 18/01/16 00:10:58 INFO PythonRunner: Times: total = 2870, boot = 569, init = 75, finish = 2226 18/01/16 00:10:58 INFO Executor: Finished task 3.0 in stage 0.0 (TID 3). 
999 bytes result sent to driver 18/01/16 00:10:58 INFO CoarseGrainedExecutorBackend: Got assigned task 4 18/01/16 00:10:58 INFO Executor: Running task 1.0 in stage 1.0 (TID 4) 18/01/16 00:10:58 INFO TorrentBroadcast: Started reading broadcast variable 3 18/01/16 00:10:58 INFO MemoryStore: Block broadcast_3_piece0 stored as bytes in memory (estimated size 24.6 KB, free 80.0 KB) 18/01/16 00:10:58 INFO TorrentBroadcast: Reading broadcast variable 3 took 22 ms 18/01/16 00:10:58 INFO MemoryStore: Block broadcast_3 stored as values in memory (estimated size 103.7 KB, free 183.7 KB) 18/01/16 00:10:58 INFO HadoopRDD: Input split: hdfs://kandula/user/amantrach/mnist/csv/train/images/part-00001.deflate:0+1358889 18/01/16 00:10:58 INFO TorrentBroadcast: Started reading broadcast variable 0 18/01/16 00:10:58 INFO MemoryStore: Block broadcast_0_piece0 stored as bytes in memory (estimated size 28.0 KB, free 211.6 KB) 18/01/16 00:10:58 INFO TorrentBroadcast: Reading broadcast variable 0 took 18 ms 18/01/16 00:10:58 INFO MemoryStore: Block broadcast_0 stored as values in memory (estimated size 343.1 KB, free 554.7 KB) 18/01/16 00:10:58 INFO deprecation: mapred.tip.id is deprecated. Instead, use mapreduce.task.id 18/01/16 00:10:58 INFO deprecation: mapred.task.id is deprecated. Instead, use mapreduce.task.attempt.id 18/01/16 00:10:58 INFO deprecation: mapred.task.is.map is deprecated. Instead, use mapreduce.task.ismap 18/01/16 00:10:58 INFO deprecation: mapred.task.partition is deprecated. Instead, use mapreduce.task.partition 18/01/16 00:10:58 INFO deprecation: mapred.job.id is deprecated. Instead, use mapreduce.job.id 18/01/16 00:10:58 INFO ZlibFactory: Successfully loaded & initialized native-zlib library 18/01/16 00:10:58 INFO CodecPool: Got brand-new decompressor [.deflate] 18/01/16 00:10:59 INFO HadoopRDD: Input split: hdfs://kandula/user/amantrach/mnist/csv/train/labels/part-00001.deflate:0+9080 18/01/16 00:10:59 INFO TorrentBroadcast: Started reading broadcast variable 1 18/01/16 00:10:59 INFO MemoryStore: Block broadcast_1_piece0 stored as bytes in memory (estimated size 28.0 KB, free 582.7 KB) 18/01/16 00:10:59 INFO TorrentBroadcast: Reading broadcast variable 1 took 19 ms 18/01/16 00:10:59 INFO MemoryStore: Block broadcast_1 stored as values in memory (estimated size 343.1 KB, free 925.8 KB) 18/01/16 00:10:59 INFO CodecPool: Got brand-new decompressor [.deflate] 2018-01-16 00:10:59,408 INFO (MainThread-66919) 3: ======== worker:2 ======== 2018-01-16 00:10:59,408 INFO (MainThread-66919) 3: Cluster spec: {'ps': ['10.4.111.25:55327'], 'worker': ['10.4.116.175:36831', '10.4.97.115:53601', '10.4.100.8:38280']} 2018-01-16 00:10:59,408 INFO (MainThread-66919) 3: Using CPU D0116 00:10:59.409959060 66919 env_linux.c:77] Warning: insecure environment read function 'getenv' used 2018-01-16 00:10:59.420757: I tensorflow/core/distributed_runtime/rpc/grpc_channel.cc:215] Initialize GrpcChannelCache for job ps -> {0 -> 10.4.111.25:55327} 2018-01-16 00:10:59.420775: I tensorflow/core/distributed_runtime/rpc/grpc_channel.cc:215] Initialize GrpcChannelCache for job worker -> {0 -> 10.4.116.175:36831, 1 -> 10.4.97.115:53601, 2 -> localhost:38280} 2018-01-16 00:10:59.421683: I tensorflow/core/distributed_runtime/rpc/grpc_server_lib.cc:316] Started server with target: grpc://localhost:38280 2018-01-16 00:10:59,479 INFO (MainThread-67015) Connected to TFSparkNode.mgr on 10.4.100.8, ppid=66809, state='running' 2018-01-16 00:10:59,483 INFO (MainThread-67015) mgr.state='running' 2018-01-16 00:10:59,484 INFO 
(MainThread-67015) Feeding partition <generator object load_stream at 0x7fa7ce501dc0> into input queue <multiprocessing.queues.JoinableQueue object at 0x7fa7ce523e50> tensorflow model path: hdfs://kandula/user/amantrach/mnist_model 18/01/16 00:11:01 INFO PythonRunner: Times: total = 2469, boot = -916, init = 1095, finish = 2290 18/01/16 00:11:01 INFO PythonRunner: Times: total = 137, boot = 3, init = 77, finish = 57 2018-01-16 00:11:05.429875: I tensorflow/core/distributed_runtime/master_session.cc:999] Start master session b8fe901b3aedc4ab with config:

INFO:tensorflow:Waiting for model to be ready. Ready_for_local_init_op: None, ready: Variables not initialized: hid_w, hid_b, sm_w, sm_b, Variable, hid_w/Adagrad, hid_b/Adagrad, sm_w/Adagrad, sm_b/Adagrad 2018-01-16 00:11:05,460 INFO (MainThread-66919) Waiting for model to be ready. Ready_for_local_init_op: None, ready: Variables not initialized: hid_w, hid_b, sm_w, sm_b, Variable, hid_w/Adagrad, hid_b/Adagrad, sm_w/Adagrad, sm_b/Adagrad 2018-01-16 00:11:35.500385: I tensorflow/core/distributed_runtime/master_session.cc:999] Start master session 37d1ecd07921dd76 with config:

INFO:tensorflow:Waiting for model to be ready. Ready_for_local_init_op: None, ready: Variables not initialized: hid_w, hid_b, sm_w, sm_b, Variable, hid_w/Adagrad, hid_b/Adagrad, sm_w/Adagrad, sm_b/Adagrad 2018-01-16 00:11:35,523 INFO (MainThread-66919) Waiting for model to be ready. Ready_for_local_init_op: None, ready: Variables not initialized: hid_w, hid_b, sm_w, sm_b, Variable, hid_w/Adagrad, hid_b/Adagrad, sm_w/Adagrad, sm_b/Adagrad 2018-01-16 00:12:05.568085: I tensorflow/core/distributed_runtime/master_session.cc:999] Start master session bfb5bca4880cf680 with config:

INFO:tensorflow:Waiting for model to be ready. Ready_for_local_init_op: None, ready: Variables not initialized: hid_w, hid_b, sm_w, sm_b, Variable, hid_w/Adagrad, hid_b/Adagrad, sm_w/Adagrad, sm_b/Adagrad 2018-01-16 00:12:05,598 INFO (MainThread-66919) Waiting for model to be ready. Ready_for_local_init_op: None, ready: Variables not initialized: hid_w, hid_b, sm_w, sm_b, Variable, hid_w/Adagrad, hid_b/Adagrad, sm_w/Adagrad, sm_b/Adagrad 2018-01-16 00:12:35.649191: I tensorflow/core/distributed_runtime/master_session.cc:999] Start master session ec07c3acbce38503 with config:

INFO:tensorflow:Waiting for model to be ready. Ready_for_local_init_op: None, ready: Variables not initialized: hid_w, hid_b, sm_w, sm_b, Variable, hid_w/Adagrad, hid_b/Adagrad, sm_w/Adagrad, sm_b/Adagrad 2018-01-16 00:12:35,677 INFO (MainThread-66919) Waiting for model to be ready. Ready_for_local_init_op: None, ready: Variables not initialized: hid_w, hid_b, sm_w, sm_b, Variable, hid_w/Adagrad, hid_b/Adagrad, sm_w/Adagrad, sm_b/Adagrad 2018-01-16 00:13:05.717363: I tensorflow/core/distributed_runtime/master_session.cc:999] Start master session 7074155e41e78c8d with config:

INFO:tensorflow:Waiting for model to be ready. Ready_for_local_init_op: None, ready: Variables not initialized: hid_w, hid_b, sm_w, sm_b, Variable, hid_w/Adagrad, hid_b/Adagrad, sm_w/Adagrad, sm_b/Adagrad 2018-01-16 00:13:05,745 INFO (MainThread-66919) Waiting for model to be ready. Ready_for_local_init_op: None, ready: Variables not initialized: hid_w, hid_b, sm_w, sm_b, Variable, hid_w/Adagrad, hid_b/Adagrad, sm_w/Adagrad, sm_b/Adagrad 2018-01-16 00:13:35.785213: I tensorflow/core/distributed_runtime/master_session.cc:999] Start master session 4dfdd988ac20f7af with config:

INFO:tensorflow:Waiting for model to be ready. Ready_for_local_init_op: None, ready: Variables not initialized: hid_w, hid_b, sm_w, sm_b, Variable, hid_w/Adagrad, hid_b/Adagrad, sm_w/Adagrad, sm_b/Adagrad 2018-01-16 00:13:35,804 INFO (MainThread-66919) Waiting for model to be ready. Ready_for_local_init_op: None, ready: Variables not initialized: hid_w, hid_b, sm_w, sm_b, Variable, hid_w/Adagrad, hid_b/Adagrad, sm_w/Adagrad, sm_b/Adagrad 2018-01-16 00:14:05.852098: I tensorflow/core/distributed_runtime/master_session.cc:999] Start master session 961a498961985c33 with config:

INFO:tensorflow:Waiting for model to be ready. Ready_for_local_init_op: None, ready: Variables not initialized: hid_w, hid_b, sm_w, sm_b, Variable, hid_w/Adagrad, hid_b/Adagrad, sm_w/Adagrad, sm_b/Adagrad 2018-01-16 00:14:05,873 INFO (MainThread-66919) Waiting for model to be ready. Ready_for_local_init_op: None, ready: Variables not initialized: hid_w, hid_b, sm_w, sm_b, Variable, hid_w/Adagrad, hid_b/Adagrad, sm_w/Adagrad, sm_b/Adagrad 2018-01-16 00:14:35.897161: I tensorflow/core/distributed_runtime/master_session.cc:999] Start master session 7ea6660abb267b19 with config:

INFO:tensorflow:Waiting for model to be ready. Ready_for_local_init_op: None, ready: Variables not initialized: hid_w, hid_b, sm_w, sm_b, Variable, hid_w/Adagrad, hid_b/Adagrad, sm_w/Adagrad, sm_b/Adagrad 2018-01-16 00:14:35,921 INFO (MainThread-66919) Waiting for model to be ready. Ready_for_local_init_op: None, ready: Variables not initialized: hid_w, hid_b, sm_w, sm_b, Variable, hid_w/Adagrad, hid_b/Adagrad, sm_w/Adagrad, sm_b/Adagrad 2018-01-16 00:15:05.952272: I tensorflow/core/distributed_runtime/master_session.cc:999] Start master session 2d252fd1eb09fd9f with config:

INFO:tensorflow:Waiting for model to be ready. Ready_for_local_init_op: None, ready: Variables not initialized: hid_w, hid_b, sm_w, sm_b, Variable, hid_w/Adagrad, hid_b/Adagrad, sm_w/Adagrad, sm_b/Adagrad 2018-01-16 00:15:05,977 INFO (MainThread-66919) Waiting for model to be ready. Ready_for_local_init_op: None, ready: Variables not initialized: hid_w, hid_b, sm_w, sm_b, Variable, hid_w/Adagrad, hid_b/Adagrad, sm_w/Adagrad, sm_b/Adagrad 2018-01-16 00:15:36.018526: I tensorflow/core/distributed_runtime/master_session.cc:999] Start master session 5d3bf193dc3a3135 with config:

INFO:tensorflow:Waiting for model to be ready. Ready_for_local_init_op: None, ready: Variables not initialized: hid_w, hid_b, sm_w, sm_b, Variable, hid_w/Adagrad, hid_b/Adagrad, sm_w/Adagrad, sm_b/Adagrad 2018-01-16 00:15:36,042 INFO (MainThread-66919) Waiting for model to be ready. Ready_for_local_init_op: None, ready: Variables not initialized: hid_w, hid_b, sm_w, sm_b, Variable, hid_w/Adagrad, hid_b/Adagrad, sm_w/Adagrad, sm_b/Adagrad 2018-01-16 00:16:06.082185: I tensorflow/core/distributed_runtime/master_session.cc:999] Start master session 6fdf1f21d84f7765 with config:

INFO:tensorflow:Waiting for model to be ready. Ready_for_local_init_op: None, ready: Variables not initialized: hid_w, hid_b, sm_w, sm_b, Variable, hid_w/Adagrad, hid_b/Adagrad, sm_w/Adagrad, sm_b/Adagrad 2018-01-16 00:16:06,106 INFO (MainThread-66919) Waiting for model to be ready. Ready_for_local_init_op: None, ready: Variables not initialized: hid_w, hid_b, sm_w, sm_b, Variable, hid_w/Adagrad, hid_b/Adagrad, sm_w/Adagrad, sm_b/Adagrad 2018-01-16 00:16:36.152453: I tensorflow/core/distributed_runtime/master_session.cc:999] Start master session fc05d46ec9be6d1c with config:

INFO:tensorflow:Waiting for model to be ready. Ready_for_local_init_op: None, ready: Variables not initialized: hid_w, hid_b, sm_w, sm_b, Variable, hid_w/Adagrad, hid_b/Adagrad, sm_w/Adagrad, sm_b/Adagrad 2018-01-16 00:16:36,179 INFO (MainThread-66919) Waiting for model to be ready. Ready_for_local_init_op: None, ready: Variables not initialized: hid_w, hid_b, sm_w, sm_b, Variable, hid_w/Adagrad, hid_b/Adagrad, sm_w/Adagrad, sm_b/Adagrad 2018-01-16 00:17:06.227613: I tensorflow/core/distributed_runtime/master_session.cc:999] Start master session fe085eacb535107f with config:

INFO:tensorflow:Waiting for model to be ready. Ready_for_local_init_op: None, ready: Variables not initialized: hid_w, hid_b, sm_w, sm_b, Variable, hid_w/Adagrad, hid_b/Adagrad, sm_w/Adagrad, sm_b/Adagrad 2018-01-16 00:17:06,257 INFO (MainThread-66919) Waiting for model to be ready. Ready_for_local_init_op: None, ready: Variables not initialized: hid_w, hid_b, sm_w, sm_b, Variable, hid_w/Adagrad, hid_b/Adagrad, sm_w/Adagrad, sm_b/Adagrad 2018-01-16 00:17:36.295440: I tensorflow/core/distributed_runtime/master_session.cc:999] Start master session e42d5b58eed740b4 with config:

INFO:tensorflow:Waiting for model to be ready. Ready_for_local_init_op: None, ready: Variables not initialized: hid_w, hid_b, sm_w, sm_b, Variable, hid_w/Adagrad, hid_b/Adagrad, sm_w/Adagrad, sm_b/Adagrad 2018-01-16 00:17:36,314 INFO (MainThread-66919) Waiting for model to be ready. Ready_for_local_init_op: None, ready: Variables not initialized: hid_w, hid_b, sm_w, sm_b, Variable, hid_w/Adagrad, hid_b/Adagrad, sm_w/Adagrad, sm_b/Adagrad 2018-01-16 00:18:06.359294: I tensorflow/core/distributed_runtime/master_session.cc:999] Start master session 9080e3e0c95f9333 with config:

INFO:tensorflow:Waiting for model to be ready. Ready_for_local_init_op: None, ready: Variables not initialized: hid_w, hid_b, sm_w, sm_b, Variable, hid_w/Adagrad, hid_b/Adagrad, sm_w/Adagrad, sm_b/Adagrad 2018-01-16 00:18:06,383 INFO (MainThread-66919) Waiting for model to be ready. Ready_for_local_init_op: None, ready: Variables not initialized: hid_w, hid_b, sm_w, sm_b, Variable, hid_w/Adagrad, hid_b/Adagrad, sm_w/Adagrad, sm_b/Adagrad 2018-01-16 00:18:36.404463: I tensorflow/core/distributed_runtime/master_session.cc:999] Start master session 3c5e8bd6b9606aa4 with config:

INFO:tensorflow:Waiting for model to be ready. Ready_for_local_init_op: None, ready: Variables not initialized: hid_w, hid_b, sm_w, sm_b, Variable, hid_w/Adagrad, hid_b/Adagrad, sm_w/Adagrad, sm_b/Adagrad 2018-01-16 00:18:36,431 INFO (MainThread-66919) Waiting for model to be ready. Ready_for_local_init_op: None, ready: Variables not initialized: hid_w, hid_b, sm_w, sm_b, Variable, hid_w/Adagrad, hid_b/Adagrad, sm_w/Adagrad, sm_b/Adagrad 2018-01-16 00:19:06.480016: I tensorflow/core/distributed_runtime/master_session.cc:999] Start master session 0d971e95963b666e with config:

INFO:tensorflow:Waiting for model to be ready. Ready_for_local_init_op: None, ready: Variables not initialized: hid_w, hid_b, sm_w, sm_b, Variable, hid_w/Adagrad, hid_b/Adagrad, sm_w/Adagrad, sm_b/Adagrad 2018-01-16 00:19:06,505 INFO (MainThread-66919) Waiting for model to be ready. Ready_for_local_init_op: None, ready: Variables not initialized: hid_w, hid_b, sm_w, sm_b, Variable, hid_w/Adagrad, hid_b/Adagrad, sm_w/Adagrad, sm_b/Adagrad 2018-01-16 00:19:36.555923: I tensorflow/core/distributed_runtime/master_session.cc:999] Start master session 1af86966475c2b99 with config:

INFO:tensorflow:Waiting for model to be ready. Ready_for_local_init_op: None, ready: Variables not initialized: hid_w, hid_b, sm_w, sm_b, Variable, hid_w/Adagrad, hid_b/Adagrad, sm_w/Adagrad, sm_b/Adagrad 2018-01-16 00:19:36,586 INFO (MainThread-66919) Waiting for model to be ready. Ready_for_local_init_op: None, ready: Variables not initialized: hid_w, hid_b, sm_w, sm_b, Variable, hid_w/Adagrad, hid_b/Adagrad, sm_w/Adagrad, sm_b/Adagrad 2018-01-16 00:20:06.608083: I tensorflow/core/distributed_runtime/master_session.cc:999] Start master session e367a445725e5f25 with config:

INFO:tensorflow:Waiting for model to be ready. Ready_for_local_init_op: None, ready: Variables not initialized: hid_w, hid_b, sm_w, sm_b, Variable, hid_w/Adagrad, hid_b/Adagrad, sm_w/Adagrad, sm_b/Adagrad 2018-01-16 00:20:06,626 INFO (MainThread-66919) Waiting for model to be ready. Ready_for_local_init_op: None, ready: Variables not initialized: hid_w, hid_b, sm_w, sm_b, Variable, hid_w/Adagrad, hid_b/Adagrad, sm_w/Adagrad, sm_b/Adagrad 2018-01-16 00:20:36.658444: I tensorflow/core/distributed_runtime/master_session.cc:999] Start master session ec84278be8ec325d with config:

INFO:tensorflow:Waiting for model to be ready. Ready_for_local_init_op: None, ready: Variables not initialized: hid_w, hid_b, sm_w, sm_b, Variable, hid_w/Adagrad, hid_b/Adagrad, sm_w/Adagrad, sm_b/Adagrad 2018-01-16 00:20:36,689 INFO (MainThread-66919) Waiting for model to be ready. Ready_for_local_init_op: None, ready: Variables not initialized: hi`

Many thanks if you can suggest any help with this. Best

leewyang commented 6 years ago

Can you confirm that libhdfs.so is present at /usr/hdp/2.4.2.0-258/usr/lib on ALL machines of the cluster?
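If it helps, here is a rough way to check, assuming you can ssh to the worker hosts (the hostnames below are just the executor hosts visible in your logs, not necessarily the whole cluster):

```bash
# Check for libhdfs.so on each node; host list taken from the executor logs above -- adjust to your cluster.
for host in hdpdatanode28 hdpdatanode33 hdpdatanode40 hdpdatanode59; do
  echo "== $host =="
  ssh "$host" 'ls -l /usr/hdp/2.4.2.0-258/usr/lib/libhdfs.so* 2>/dev/null || echo "libhdfs.so NOT found"'
done
```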

If so, can you try using: --model hdfs://default/user/amantrach/mnist_model?

leewyang commented 6 years ago

Also, please see #195

amantrac commented 6 years ago

I confirm that the variables are correct and the paths are valid on the nodes of the cluster; I checked this with the admin. The only thing I still need to check is whether the temp directory is set in core-site.xml. Is that important?
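For reference, this is how I plan to check it (assuming the standard property name hadoop.tmp.dir and the usual /etc/hadoop/conf location):

```bash
# Look for the configured Hadoop temp directory in core-site.xml.
grep -A1 'hadoop.tmp.dir' /etc/hadoop/conf/core-site.xml
# Or resolve it through the Hadoop CLI:
hdfs getconf -confKey hadoop.tmp.dir
```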

leewyang commented 6 years ago

Not sure if that is important. One other thing to try is to make your hdfs://kandula/user/amantrach folder world-writable temporarily, since I've seen some folks' setups write the model as the yarn or hadoop user.
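For example, a quick (temporary) test of that theory, assuming you have rights to change the directory and revert it afterwards, could look like:

```bash
# See who currently owns the user directory and the model path.
hadoop fs -ls hdfs://kandula/user/amantrach
# Temporarily open it up so executors (possibly running as yarn/hadoop) can write the checkpoint.
hadoop fs -chmod -R 777 hdfs://kandula/user/amantrach
```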

amantrac commented 6 years ago

Great, thanks @leewyang!! That solved the problem exactly: after setting 777 permissions on hdfs://kandula/user/amantrach, the jobs run without any problem!

For reference, here are the exact commands and parameters I run on my machine:

export JAVA_HOME=/usr/lib/jvm/java-7-openjdk-amd64
export HADOOP_HOME=/usr/hdp/current/hadoop-client
export SPARK_HOME=/usr/hdp/current/spark-client
export HADOOP_HDFS_HOME=/usr/hdp/current/hadoop-hdfs-client
export PYTHON_ROOT=./Python

export PATH=${PATH}:${HADOOP_HOME}/bin:${SPARK_HOME}/bin:${HADOOP_HDFS_HOME}/bin:${PYTHON_ROOT}/bin
export PYSPARK_PYTHON=${PYTHON_ROOT}/bin/python
export SPARK_YARN_USER_ENV="PYSPARK_PYTHON=/usr/bin/python"
export QUEUE=default
export LIB_HDFS=/usr/lib/ams-hbase/lib/hadoop-native
export LIB_JVM=/usr/lib/jvm/java-7-openjdk-amd64/jre/lib/amd64/server

${SPARK_HOME}/bin/spark-submit \
--master yarn \
--deploy-mode cluster \
--queue ${QUEUE} \
--num-executors 4 \
--executor-memory 2G \
--py-files TensorFlowOnSpark/tfspark.zip,TensorFlowOnSpark/examples/mnist/spark/mnist_dist.py \
--conf spark.dynamicAllocation.enabled=false \
--conf spark.yarn.maxAppAttempts=1 \
--archives hdfs:///user/${USER}/Python.zip#Python \
--conf spark.executorEnv.LD_LIBRARY_PATH="$LIB_HDFS:$LIB_JVM" \
--conf spark.executorEnv.HADOOP_HDFS_HOME="$HADOOP_HDFS_HOME" \
--conf spark.executorEnv.CLASSPATH="$($HADOOP_HOME/bin/hadoop classpath --glob):${CLASSPATH}" \
TensorFlowOnSpark/examples/mnist/spark/mnist_spark.py \
--images mnist/csv/train/images \
--labels mnist/csv/train/labels \
--mode train \
--model hdfs://default/user/amantrach/mnist_model

leewyang commented 6 years ago

@amantrac Obviously, setting your user directory to be world-writable is not ideal in the long term; it was just a quick test. A better workaround is to point the model directory somewhere else on HDFS, e.g. /tmp, where you can adjust the permissions without too much worry.
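A sketch of that workaround, using a hypothetical path under /tmp, could be:

```bash
# Create a scratch model directory on HDFS where permissions are less sensitive (the path is just an example).
hadoop fs -mkdir -p /tmp/amantrach/mnist_model
hadoop fs -chmod 777 /tmp/amantrach/mnist_model
```

Then pass `--model hdfs://default/tmp/amantrach/mnist_model` to mnist_spark.py instead of the path under your user directory.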

Note: I've created #210 to track the permission issue going forward, and I'll close this one for now.