The Spark job fails with the following log:
24/03/25 11:26:19 INFO HadoopRDD: Input split: hdfs://localhost:9000/demo/hello.txt:0+35
24/03/25 11:26:32 WARN ProcfsMetricsGetter: Exception when trying to compute pagesize, as a result reporting of ProcessTree metrics is stopped
24/03/25 11:26:40 WARN BlockReaderFactory: I/O error constructing remote block reader.
java.net.ConnectException: Connection timed out: no further information
24/03/25 11:26:40 WARN DFSClient: Failed to connect to /172.23.0.3:9866 for file /demo/hello.txt for block BP-105279756-172.21.0.3-1711089941744:blk_1073741836_1012, add to deadNodes and continue.
24/03/25 11:26:40 WARN DFSClient: No live nodes contain block BP-105279756-172.21.0.3-1711089941744:blk_1073741836_1012 after checking nodes = [DatanodeInfoWithStorage[172.23.0.3:9866,DS-e86dab8d-de62-461a-a527-08ff43c7640a,DISK]], ignoredNodes = null
24/03/25 11:26:40 INFO DFSClient: Could not obtain BP-105279756-172.21.0.3-1711089941744:blk_1073741836_1012 from any node: No live nodes contain current block Block locations: DatanodeInfoWithStorage[172.23.0.3:9866,DS-e86dab8d-de62-461a-a527-08ff43c7640a,DISK] Dead nodes: DatanodeInfoWithStorage[172.23.0.3:9866,DS-e86dab8d-de62-461a-a527-08ff43c7640a,DISK]. Will get new block locations from namenode and retry...
24/03/25 11:26:40 WARN DFSClient: DFS chooseDataNode: got # 1 IOException, will wait for 1010.6453538830201 msec.
24/03/25 11:27:02 WARN BlockReaderFactory: I/O error constructing remote block reader.
This HDFS instance runs in Docker (started from this repository), but when I run HDFS in a VMware VM without Docker, the same code works:
JavaRDD<String> distFile = sc.textFile("hdfs://localhost:9000/demo/hello.txt");
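To narrow this down, here is a small diagnostic sketch (my own check, not part of the Spark job) that tests whether the driver host can open a TCP connection to the datanode address that appears in the log (172.23.0.3:9866 is taken from the WARN lines above; the class name is a placeholder):

```java
import java.io.IOException;
import java.net.InetSocketAddress;
import java.net.Socket;

public class DatanodeReachability {

    // Returns true if a TCP connection to host:port succeeds within timeoutMs.
    static boolean canConnect(String host, int port, int timeoutMs) {
        try (Socket socket = new Socket()) {
            socket.connect(new InetSocketAddress(host, port), timeoutMs);
            return true;
        } catch (IOException e) {
            // Connection refused or timed out, matching the ConnectException in the log.
            return false;
        }
    }

    public static void main(String[] args) {
        // 172.23.0.3:9866 is the Docker-internal datanode address from the log.
        // From outside the Docker network this is expected to time out.
        System.out.println("datanode reachable: " + canConnect("172.23.0.3", 9866, 3000));
    }
}
```

When this prints `false` on the machine running the Spark driver, it confirms the driver receives the datanode's Docker-internal IP from the namenode but cannot route to it from outside the Docker network, which matches the "add to deadNodes and continue" behavior in the log.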