This repository contains my solution to the project "Machine learning algorithms with global state" from the BDAPRO class at TU Berlin. (The repo is based on BDAPRO.WS1617.)
I don't actually get an error, but the receiver is stopped unexpectedly with the message "Registered unsuccessfully because Driver refused to start receiver 0". Below is the output relevant to the receiver; could you please take a look? I searched for this but found no good answer.
The code for the receiver is here [1], and it is called here [2].
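For quick orientation, the receiver in [1] follows the usual custom-receiver shape: onStart() spawns a worker thread that reads the CSV and calls store() for each line, and onStop() stops that thread. Below is a minimal, Spark-free sketch of that lifecycle; the class and member names are illustrative stand-ins, not the actual code from [1]:

```scala
import java.util.concurrent.atomic.AtomicBoolean
import scala.collection.mutable.ArrayBuffer

// Spark-free sketch of a custom receiver's lifecycle: onStart() launches a
// worker thread that reads records and hands each one to store(); onStop()
// signals the worker to finish. The names mirror the
// org.apache.spark.streaming.receiver.Receiver contract, but this class has
// no Spark dependency and is purely illustrative.
class CsvReceiverSketch(lines: Seq[String]) {
  private val running = new AtomicBoolean(false)
  private var worker: Thread = _
  val stored = ArrayBuffer.empty[String] // stand-in for Receiver.store()

  def onStart(): Unit = {
    running.set(true)
    worker = new Thread(new Runnable {
      override def run(): Unit = {
        val it = lines.iterator
        while (running.get() && it.hasNext) {
          val line = it.next()
          stored.synchronized { stored += line } // store(line) in the real receiver
        }
      }
    })
    worker.start()
  }

  def onStop(): Unit = {
    running.set(false)
    if (worker != null) worker.join() // wait for the worker thread to exit
  }
}

object CsvReceiverSketch {
  def main(args: Array[String]): Unit = {
    val r = new CsvReceiverSketch(Seq("1.0,2.0", "3.0,4.0"))
    r.onStart()
    // Wait until both records have been "stored" before stopping.
    while (r.stored.synchronized(r.stored.size) < 2) Thread.sleep(10)
    r.onStop()
    println(r.stored.mkString("; ")) // prints "1.0,2.0; 3.0,4.0"
  }
}
```

In the real receiver, store() hands each record to Spark's block generator; this sketch just collects into a buffer so it can run standalone.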
17/02/02 17:35:41 INFO RecurringTimer: Started timer for BlockGenerator at time 1486053341600
17/02/02 17:35:41 INFO BlockGenerator: Started block pushing thread
17/02/02 17:35:41 INFO BlockGenerator: Started BlockGenerator
17/02/02 17:35:41 INFO ReceiverSupervisorImpl: Stopping receiver with message: Registered unsuccessfully because Driver refused to start receiver 0:
17/02/02 17:35:41 WARN ReceiverSupervisorImpl: Skip stopping receiver because it has not yet started
17/02/02 17:35:41 INFO BlockGenerator: Stopping BlockGenerator
17/02/02 17:35:41 INFO RecurringTimer: Stopped timer for BlockGenerator after time 1486053341800
17/02/02 17:35:41 INFO BlockGenerator: Waiting for block pushing thread to terminate
17/02/02 17:35:41 INFO BlockGenerator: Pushing out the last 0 blocks
17/02/02 17:35:41 INFO BlockGenerator: Stopped block pushing thread
17/02/02 17:35:41 INFO BlockGenerator: Stopped BlockGenerator
17/02/02 17:35:41 INFO ReceiverSupervisorImpl: Waiting for receiver to be stopped
17/02/02 17:35:41 INFO ReceiverSupervisorImpl: Stopped receiver without error
Here is the command line and full output just in case:
[hadoop@ibm-power-1 ~]$ /share/hadoop/peel/systems/spark-2.1.0-bin-hadoop2.4/bin/spark-submit --class org.apache.spark.mllib.clustering.SequentialKMeansGlobalState --master spark://ibm-power-1:7077 /home/prigoana/bdapro-ws1617-spark-jobs-1.0-SNAPSHOT.jar
17/02/02 17:35:36 INFO SparkContext: Running Spark version 2.1.0
17/02/02 17:35:36 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
17/02/02 17:35:36 WARN SparkConf: In Spark 1.0 and later spark.local.dir will be overridden by the value set by the cluster manager (via SPARK_LOCAL_DIRS in mesos/standalone and LOCAL_DIRS in YARN).
17/02/02 17:35:36 INFO SecurityManager: Changing view acls to: hadoop
17/02/02 17:35:36 INFO SecurityManager: Changing modify acls to: hadoop
17/02/02 17:35:36 INFO SecurityManager: Changing view acls groups to:
17/02/02 17:35:36 INFO SecurityManager: Changing modify acls groups to:
17/02/02 17:35:36 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(hadoop); groups with view permissions: Set(); users with modify permissions: Set(hadoop); groups with modify permissions: Set()
17/02/02 17:35:37 INFO Utils: Successfully started service 'sparkDriver' on port 15693.
17/02/02 17:35:37 INFO SparkEnv: Registering MapOutputTracker
17/02/02 17:35:37 INFO SparkEnv: Registering BlockManagerMaster
17/02/02 17:35:37 INFO BlockManagerMasterEndpoint: Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information
17/02/02 17:35:37 INFO BlockManagerMasterEndpoint: BlockManagerMasterEndpoint up
17/02/02 17:35:37 INFO DiskBlockManager: Created local directory at /data/1/hadoop/peel/spark/tmp/blockmgr-7abea63d-69ef-4973-8992-a04db0453d46
17/02/02 17:35:37 INFO DiskBlockManager: Created local directory at /data/2/hadoop/peel/spark/tmp/blockmgr-4f02f89d-45f1-498f-ae81-e0e2e865db6e
17/02/02 17:35:37 INFO DiskBlockManager: Created local directory at /data/3/hadoop/peel/spark/tmp/blockmgr-ec160ea3-ccc7-4c8d-a005-5084165e9a11
17/02/02 17:35:37 INFO DiskBlockManager: Created local directory at /data/4/hadoop/peel/spark/tmp/blockmgr-12f3f173-6bc3-47ec-9a7f-dabcd1f0f69d
17/02/02 17:35:37 INFO MemoryStore: MemoryStore started with capacity 25.4 GB
17/02/02 17:35:37 INFO SparkEnv: Registering OutputCommitCoordinator
17/02/02 17:35:37 INFO log: Logging initialized @3021ms
17/02/02 17:35:37 INFO Server: jetty-9.2.z-SNAPSHOT
17/02/02 17:35:37 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@4e28bdd1{/jobs,null,AVAILABLE}
17/02/02 17:35:37 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@53f48368{/jobs/json,null,AVAILABLE}
17/02/02 17:35:37 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@24d4d7c9{/jobs/job,null,AVAILABLE}
17/02/02 17:35:37 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@f0e995e{/jobs/job/json,null,AVAILABLE}
17/02/02 17:35:37 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@4c37b5b{/stages,null,AVAILABLE}
17/02/02 17:35:37 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@73db4768{/stages/json,null,AVAILABLE}
17/02/02 17:35:37 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@71b3bc45{/stages/stage,null,AVAILABLE}
17/02/02 17:35:37 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@a8c1f44{/stages/stage/json,null,AVAILABLE}
17/02/02 17:35:37 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@150ab4ed{/stages/pool,null,AVAILABLE}
17/02/02 17:35:37 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@3c435123{/stages/pool/json,null,AVAILABLE}
17/02/02 17:35:37 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@50fe837a{/storage,null,AVAILABLE}
17/02/02 17:35:37 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@3a62c01e{/storage/json,null,AVAILABLE}
17/02/02 17:35:37 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@7a8fa663{/storage/rdd,null,AVAILABLE}
17/02/02 17:35:37 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@5ce33a58{/storage/rdd/json,null,AVAILABLE}
17/02/02 17:35:37 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@78a287ed{/environment,null,AVAILABLE}
17/02/02 17:35:37 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@546ccad7{/environment/json,null,AVAILABLE}
17/02/02 17:35:37 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@5357c287{/executors,null,AVAILABLE}
17/02/02 17:35:37 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@1623134f{/executors/json,null,AVAILABLE}
17/02/02 17:35:37 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@7a527389{/executors/threadDump,null,AVAILABLE}
17/02/02 17:35:37 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@485a3466{/executors/threadDump/json,null,AVAILABLE}
17/02/02 17:35:37 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@25748410{/static,null,AVAILABLE}
17/02/02 17:35:37 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@2b43529a{/,null,AVAILABLE}
17/02/02 17:35:37 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@4264b240{/api,null,AVAILABLE}
17/02/02 17:35:37 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@5b04476e{/jobs/job/kill,null,AVAILABLE}
17/02/02 17:35:37 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@5ad10c1a{/stages/stage/kill,null,AVAILABLE}
17/02/02 17:35:37 INFO ServerConnector: Started ServerConnector@4c51bb7{HTTP/1.1}{0.0.0.0:4040}
17/02/02 17:35:37 INFO Server: Started @3228ms
17/02/02 17:35:37 INFO Utils: Successfully started service 'SparkUI' on port 4040.
17/02/02 17:35:37 INFO SparkUI: Bound SparkUI to 0.0.0.0, and started at http://130.149.21.78:4040
17/02/02 17:35:38 INFO SparkContext: Added JAR file:/home/prigoana/bdapro-ws1617-spark-jobs-1.0-SNAPSHOT.jar at spark://130.149.21.78:15693/jars/bdapro-ws1617-spark-jobs-1.0-SNAPSHOT.jar with timestamp 1486053338038
17/02/02 17:35:38 INFO Executor: Starting executor ID driver on host localhost
17/02/02 17:35:38 INFO Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 14027.
17/02/02 17:35:38 INFO NettyBlockTransferService: Server created on 130.149.21.78:14027
17/02/02 17:35:38 INFO BlockManager: Using org.apache.spark.storage.RandomBlockReplicationPolicy for block replication policy
17/02/02 17:35:38 INFO BlockManagerMaster: Registering BlockManager BlockManagerId(driver, 130.149.21.78, 14027, None)
17/02/02 17:35:38 INFO BlockManagerMasterEndpoint: Registering block manager 130.149.21.78:14027 with 25.4 GB RAM, BlockManagerId(driver, 130.149.21.78, 14027, None)
17/02/02 17:35:38 INFO BlockManagerMaster: Registered BlockManager BlockManagerId(driver, 130.149.21.78, 14027, None)
17/02/02 17:35:38 INFO BlockManager: Initialized BlockManager: BlockManagerId(driver, 130.149.21.78, 14027, None)
17/02/02 17:35:38 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@7577b641{/metrics/json,null,AVAILABLE}
17/02/02 17:35:38 INFO EventLoggingListener: Logging events to file:///share/hadoop/peel/systems/spark-2.1.0-bin-hadoop2.4/logs/local-1486053338099
Creating function called to create new StreamingContext
New context created from currently defined creating function
17/02/02 17:35:39 INFO ReceiverTracker: Starting 1 receivers
17/02/02 17:35:39 INFO ReceiverTracker: ReceiverTracker started
17/02/02 17:35:39 INFO ForEachDStream: Duration for remembering RDDs set to 60000 ms for org.apache.spark.streaming.dstream.ForEachDStream@147a58e2
17/02/02 17:35:39 INFO MappedDStream: Duration for remembering RDDs set to 60000 ms for org.apache.spark.streaming.dstream.MappedDStream@18ee9cab
17/02/02 17:35:39 INFO MappedDStream: Duration for remembering RDDs set to 60000 ms for org.apache.spark.streaming.dstream.MappedDStream@5e156a1c
17/02/02 17:35:39 INFO PluggableInputDStream: Duration for remembering RDDs set to 60000 ms for org.apache.spark.streaming.dstream.PluggableInputDStream@7194bb42
17/02/02 17:35:39 INFO PluggableInputDStream: Slide time = 1000 ms
17/02/02 17:35:39 INFO PluggableInputDStream: Storage level = Serialized 1x Replicated
17/02/02 17:35:39 INFO PluggableInputDStream: Checkpoint interval = null
17/02/02 17:35:39 INFO PluggableInputDStream: Remember interval = 60000 ms
17/02/02 17:35:39 INFO PluggableInputDStream: Initialized and validated org.apache.spark.streaming.dstream.PluggableInputDStream@7194bb42
17/02/02 17:35:39 INFO MappedDStream: Slide time = 1000 ms
17/02/02 17:35:39 INFO MappedDStream: Storage level = Serialized 1x Replicated
17/02/02 17:35:39 INFO MappedDStream: Checkpoint interval = null
17/02/02 17:35:39 INFO MappedDStream: Remember interval = 60000 ms
17/02/02 17:35:39 INFO MappedDStream: Initialized and validated org.apache.spark.streaming.dstream.MappedDStream@5e156a1c
17/02/02 17:35:39 INFO MappedDStream: Slide time = 1000 ms
17/02/02 17:35:39 INFO MappedDStream: Storage level = Serialized 1x Replicated
17/02/02 17:35:39 INFO MappedDStream: Checkpoint interval = null
17/02/02 17:35:39 INFO MappedDStream: Remember interval = 60000 ms
17/02/02 17:35:39 INFO MappedDStream: Initialized and validated org.apache.spark.streaming.dstream.MappedDStream@18ee9cab
17/02/02 17:35:39 INFO ForEachDStream: Slide time = 1000 ms
17/02/02 17:35:39 INFO ForEachDStream: Storage level = Serialized 1x Replicated
17/02/02 17:35:39 INFO ForEachDStream: Checkpoint interval = null
17/02/02 17:35:39 INFO ForEachDStream: Remember interval = 60000 ms
17/02/02 17:35:39 INFO ForEachDStream: Initialized and validated org.apache.spark.streaming.dstream.ForEachDStream@147a58e2
17/02/02 17:35:39 INFO RecurringTimer: Started timer for JobGenerator at time 1486053340000
17/02/02 17:35:39 INFO JobGenerator: Started JobGenerator at 1486053340000 ms
17/02/02 17:35:39 INFO ReceiverTracker: Receiver 0 started
17/02/02 17:35:39 INFO JobScheduler: Started JobScheduler
17/02/02 17:35:39 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@3f29e26{/streaming,null,AVAILABLE}
17/02/02 17:35:39 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@4393593c{/streaming/json,null,AVAILABLE}
17/02/02 17:35:39 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@5cbd159f{/streaming/batch,null,AVAILABLE}
17/02/02 17:35:39 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@2c43eb8{/streaming/batch/json,null,AVAILABLE}
17/02/02 17:35:39 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@7ce4de34{/static/streaming,null,AVAILABLE}
17/02/02 17:35:39 INFO StreamingContext: StreamingContext started
17/02/02 17:35:39 INFO DAGScheduler: Got job 0 (main at NativeMethodAccessorImpl.java:0) with 1 output partitions
17/02/02 17:35:39 INFO DAGScheduler: Final stage: ResultStage 0 (main at NativeMethodAccessorImpl.java:0)
17/02/02 17:35:39 INFO DAGScheduler: Parents of final stage: List()
17/02/02 17:35:39 INFO StreamingContext: Invoking stop(stopGracefully=false) from shutdown hook
17/02/02 17:35:39 INFO DAGScheduler: Missing parents: List()
17/02/02 17:35:39 INFO ReceiverTracker: Sent stop signal to all 1 receivers
17/02/02 17:35:39 INFO DAGScheduler: Submitting ResultStage 0 (Receiver 0 ParallelCollectionRDD[1] at makeRDD at ReceiverTracker.scala:620), which has no missing parents
17/02/02 17:35:39 INFO MemoryStore: Block broadcast_0 stored as values in memory (estimated size 54.4 KB, free 25.4 GB)
17/02/02 17:35:39 INFO MemoryStore: Block broadcast_0_piece0 stored as bytes in memory (estimated size 18.9 KB, free 25.4 GB)
17/02/02 17:35:39 INFO BlockManagerInfo: Added broadcast_0_piece0 in memory on 130.149.21.78:14027 (size: 18.9 KB, free: 25.4 GB)
17/02/02 17:35:39 INFO SparkContext: Created broadcast 0 from broadcast at DAGScheduler.scala:996
17/02/02 17:35:39 INFO DAGScheduler: Submitting 1 missing tasks from ResultStage 0 (Receiver 0 ParallelCollectionRDD[1] at makeRDD at ReceiverTracker.scala:620)
17/02/02 17:35:39 INFO TaskSchedulerImpl: Adding task set 0.0 with 1 tasks
17/02/02 17:35:40 INFO TaskSetManager: Starting task 0.0 in stage 0.0 (TID 0, localhost, executor driver, partition 0, PROCESS_LOCAL, 7242 bytes)
17/02/02 17:35:40 INFO Executor: Running task 0.0 in stage 0.0 (TID 0)
17/02/02 17:35:40 INFO Executor: Fetching spark://130.149.21.78:15693/jars/bdapro-ws1617-spark-jobs-1.0-SNAPSHOT.jar with timestamp 1486053338038
17/02/02 17:35:40 INFO JobScheduler: Added jobs for time 1486053340000 ms
17/02/02 17:35:40 INFO JobScheduler: Starting job streaming job 1486053340000 ms.0 from job set of time 1486053340000 ms
-------------------------------------------
Time: 1486053340000 ms
-------------------------------------------
17/02/02 17:35:40 INFO JobScheduler: Finished job streaming job 1486053340000 ms.0 from job set of time 1486053340000 ms
17/02/02 17:35:40 INFO JobScheduler: Total delay: 0.087 s for time 1486053340000 ms (execution: 0.007 s)
17/02/02 17:35:40 INFO ReceivedBlockTracker: Deleting batches:
17/02/02 17:35:40 INFO InputInfoTracker: remove old batch metadata:
17/02/02 17:35:40 INFO TransportClientFactory: Successfully created connection to /130.149.21.78:15693 after 44 ms (0 ms spent in bootstraps)
17/02/02 17:35:40 INFO Utils: Fetching spark://130.149.21.78:15693/jars/bdapro-ws1617-spark-jobs-1.0-SNAPSHOT.jar to /data/1/hadoop/peel/spark/tmp/spark-373414dc-f20f-45f0-aed7-87f19425565f/userFiles-5745fc97-dbd5-4a04-84f7-bacd4f10f1a2/fetchFileTemp1627062284506665166.tmp
17/02/02 17:35:41 INFO JobScheduler: Added jobs for time 1486053341000 ms
17/02/02 17:35:41 INFO JobScheduler: Starting job streaming job 1486053341000 ms.0 from job set of time 1486053341000 ms
-------------------------------------------
Time: 1486053341000 ms
-------------------------------------------
17/02/02 17:35:41 INFO JobScheduler: Finished job streaming job 1486053341000 ms.0 from job set of time 1486053341000 ms
17/02/02 17:35:41 INFO ReceivedBlockTracker: Deleting batches:
17/02/02 17:35:41 INFO JobScheduler: Total delay: 0.016 s for time 1486053341000 ms (execution: 0.001 s)
17/02/02 17:35:41 INFO InputInfoTracker: remove old batch metadata:
17/02/02 17:35:41 INFO Executor: Adding file:/data/1/hadoop/peel/spark/tmp/spark-373414dc-f20f-45f0-aed7-87f19425565f/userFiles-5745fc97-dbd5-4a04-84f7-bacd4f10f1a2/bdapro-ws1617-spark-jobs-1.0-SNAPSHOT.jar to class loader
17/02/02 17:35:41 INFO RecurringTimer: Started timer for BlockGenerator at time 1486053341600
17/02/02 17:35:41 INFO BlockGenerator: Started block pushing thread
17/02/02 17:35:41 INFO BlockGenerator: Started BlockGenerator
17/02/02 17:35:41 INFO ReceiverSupervisorImpl: Stopping receiver with message: Registered unsuccessfully because Driver refused to start receiver 0:
17/02/02 17:35:41 WARN ReceiverSupervisorImpl: Skip stopping receiver because it has not yet started
17/02/02 17:35:41 INFO BlockGenerator: Stopping BlockGenerator
17/02/02 17:35:41 INFO RecurringTimer: Stopped timer for BlockGenerator after time 1486053341800
17/02/02 17:35:41 INFO BlockGenerator: Waiting for block pushing thread to terminate
17/02/02 17:35:41 INFO BlockGenerator: Pushing out the last 0 blocks
17/02/02 17:35:41 INFO BlockGenerator: Stopped block pushing thread
17/02/02 17:35:41 INFO BlockGenerator: Stopped BlockGenerator
17/02/02 17:35:41 INFO ReceiverSupervisorImpl: Waiting for receiver to be stopped
17/02/02 17:35:41 INFO ReceiverSupervisorImpl: Stopped receiver without error
17/02/02 17:35:41 INFO Executor: Finished task 0.0 in stage 0.0 (TID 0). 1012 bytes result sent to driver
17/02/02 17:35:41 INFO TaskSetManager: Finished task 0.0 in stage 0.0 (TID 0) in 1881 ms on localhost (executor driver) (1/1)
17/02/02 17:35:41 INFO TaskSchedulerImpl: Removed TaskSet 0.0, whose tasks have all completed, from pool
17/02/02 17:35:41 INFO DAGScheduler: ResultStage 0 (main at NativeMethodAccessorImpl.java:0) finished in 1.913 s
17/02/02 17:35:41 INFO ReceiverTracker: All of the receivers have deregistered successfully
17/02/02 17:35:41 INFO ReceiverTracker: ReceiverTracker stopped
17/02/02 17:35:41 INFO JobGenerator: Stopping JobGenerator immediately
17/02/02 17:35:41 INFO RecurringTimer: Stopped timer for JobGenerator after time 1486053341000
17/02/02 17:35:41 INFO JobGenerator: Stopped JobGenerator
17/02/02 17:35:41 INFO JobScheduler: Stopped JobScheduler
17/02/02 17:35:41 INFO ContextHandler: Stopped o.s.j.s.ServletContextHandler@3f29e26{/streaming,null,UNAVAILABLE}
17/02/02 17:35:41 INFO ContextHandler: Stopped o.s.j.s.ServletContextHandler@5cbd159f{/streaming/batch,null,UNAVAILABLE}
17/02/02 17:35:41 INFO ContextHandler: Stopped o.s.j.s.ServletContextHandler@7ce4de34{/static/streaming,null,UNAVAILABLE}
17/02/02 17:35:41 INFO StreamingContext: StreamingContext stopped successfully
17/02/02 17:35:41 INFO SparkContext: Invoking stop() from shutdown hook
17/02/02 17:35:41 INFO ServerConnector: Stopped ServerConnector@4c51bb7{HTTP/1.1}{0.0.0.0:4040}
17/02/02 17:35:41 INFO ContextHandler: Stopped o.s.j.s.ServletContextHandler@5ad10c1a{/stages/stage/kill,null,UNAVAILABLE}
17/02/02 17:35:41 INFO ContextHandler: Stopped o.s.j.s.ServletContextHandler@5b04476e{/jobs/job/kill,null,UNAVAILABLE}
17/02/02 17:35:41 INFO ContextHandler: Stopped o.s.j.s.ServletContextHandler@4264b240{/api,null,UNAVAILABLE}
17/02/02 17:35:41 INFO ContextHandler: Stopped o.s.j.s.ServletContextHandler@2b43529a{/,null,UNAVAILABLE}
17/02/02 17:35:41 INFO ContextHandler: Stopped o.s.j.s.ServletContextHandler@25748410{/static,null,UNAVAILABLE}
17/02/02 17:35:41 INFO ContextHandler: Stopped o.s.j.s.ServletContextHandler@485a3466{/executors/threadDump/json,null,UNAVAILABLE}
17/02/02 17:35:41 INFO ContextHandler: Stopped o.s.j.s.ServletContextHandler@7a527389{/executors/threadDump,null,UNAVAILABLE}
17/02/02 17:35:41 INFO ContextHandler: Stopped o.s.j.s.ServletContextHandler@1623134f{/executors/json,null,UNAVAILABLE}
17/02/02 17:35:41 INFO ContextHandler: Stopped o.s.j.s.ServletContextHandler@5357c287{/executors,null,UNAVAILABLE}
17/02/02 17:35:41 INFO ContextHandler: Stopped o.s.j.s.ServletContextHandler@546ccad7{/environment/json,null,UNAVAILABLE}
17/02/02 17:35:41 INFO ContextHandler: Stopped o.s.j.s.ServletContextHandler@78a287ed{/environment,null,UNAVAILABLE}
17/02/02 17:35:41 INFO ContextHandler: Stopped o.s.j.s.ServletContextHandler@5ce33a58{/storage/rdd/json,null,UNAVAILABLE}
17/02/02 17:35:41 INFO ContextHandler: Stopped o.s.j.s.ServletContextHandler@7a8fa663{/storage/rdd,null,UNAVAILABLE}
17/02/02 17:35:41 INFO ContextHandler: Stopped o.s.j.s.ServletContextHandler@3a62c01e{/storage/json,null,UNAVAILABLE}
17/02/02 17:35:41 INFO ContextHandler: Stopped o.s.j.s.ServletContextHandler@50fe837a{/storage,null,UNAVAILABLE}
17/02/02 17:35:41 INFO ContextHandler: Stopped o.s.j.s.ServletContextHandler@3c435123{/stages/pool/json,null,UNAVAILABLE}
17/02/02 17:35:41 INFO ContextHandler: Stopped o.s.j.s.ServletContextHandler@150ab4ed{/stages/pool,null,UNAVAILABLE}
17/02/02 17:35:41 INFO ContextHandler: Stopped o.s.j.s.ServletContextHandler@a8c1f44{/stages/stage/json,null,UNAVAILABLE}
17/02/02 17:35:41 INFO ContextHandler: Stopped o.s.j.s.ServletContextHandler@71b3bc45{/stages/stage,null,UNAVAILABLE}
17/02/02 17:35:41 INFO ContextHandler: Stopped o.s.j.s.ServletContextHandler@73db4768{/stages/json,null,UNAVAILABLE}
17/02/02 17:35:41 INFO ContextHandler: Stopped o.s.j.s.ServletContextHandler@4c37b5b{/stages,null,UNAVAILABLE}
17/02/02 17:35:41 INFO ContextHandler: Stopped o.s.j.s.ServletContextHandler@f0e995e{/jobs/job/json,null,UNAVAILABLE}
17/02/02 17:35:41 INFO ContextHandler: Stopped o.s.j.s.ServletContextHandler@24d4d7c9{/jobs/job,null,UNAVAILABLE}
17/02/02 17:35:41 INFO ContextHandler: Stopped o.s.j.s.ServletContextHandler@53f48368{/jobs/json,null,UNAVAILABLE}
17/02/02 17:35:41 INFO ContextHandler: Stopped o.s.j.s.ServletContextHandler@4e28bdd1{/jobs,null,UNAVAILABLE}
17/02/02 17:35:41 INFO SparkUI: Stopped Spark web UI at http://130.149.21.78:4040
17/02/02 17:35:41 INFO MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
17/02/02 17:35:42 INFO MemoryStore: MemoryStore cleared
17/02/02 17:35:42 INFO BlockManager: BlockManager stopped
17/02/02 17:35:42 INFO BlockManagerMaster: BlockManagerMaster stopped
17/02/02 17:35:42 INFO OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
17/02/02 17:35:42 INFO SparkContext: Successfully stopped SparkContext
17/02/02 17:35:42 INFO ShutdownHookManager: Shutdown hook called
17/02/02 17:35:42 INFO ShutdownHookManager: Deleting directory /data/1/hadoop/peel/spark/tmp/spark-373414dc-f20f-45f0-aed7-87f19425565f
17/02/02 17:35:42 INFO ShutdownHookManager: Deleting directory /data/3/hadoop/peel/spark/tmp/spark-7b6727d6-759f-4720-ba31-e15c2dd812a4
17/02/02 17:35:42 INFO ShutdownHookManager: Deleting directory /data/4/hadoop/peel/spark/tmp/spark-c4fb1e6c-365b-4b17-8f85-6fa36d141f54
17/02/02 17:35:42 INFO ShutdownHookManager: Deleting directory /data/2/hadoop/peel/spark/tmp/spark-5f8fdd90-396c-4de4-9e34-86c15bfb4431
@jeyhunkarimov
Hi Jeyhun,
[1] https://github.com/cristiprg/BDAPRO.GlobalStateML/blob/mllibMatrixRedis/bdapro-ws1617-spark-jobs/src/main/scala/de/tu_berlin/dima/bdapro/spark/mlalg/util/CSVFileSource.scala
[2] https://github.com/cristiprg/BDAPRO.GlobalStateML/blob/mllibMatrixRedis/bdapro-ws1617-spark-jobs/src/main/scala/org/apache/spark/mllib/clustering/SequentialKMeansGlobalState.scala#L86