onehao opened this issue 6 years ago
Started the Hadoop NameNode and found that the SecondaryNameNode's HTTP port was already in use:
java.net.BindException: Port in use: 0.0.0.0:50090
        at org.apache.hadoop.http.HttpServer2.openListeners(HttpServer2.java:940)
        at org.apache.hadoop.http.HttpServer2.start(HttpServer2.java:876)
        at org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode.initialize(SecondaryNameNode.java:276)
        at org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode.<init>(SecondaryNameNode.java:192)
        at org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode.main(SecondaryNameNode.java:671)
Caused by: java.net.BindException: Address already in use
        at sun.nio.ch.Net.bind0(Native Method)
        at sun.nio.ch.Net.bind(Net.java:433)
        at sun.nio.ch.Net.bind(Net.java:425)
        at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:223)
        at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
        at org.mortbay.jetty.nio.SelectChannelConnector.open(SelectChannelConnector.java:216)
        at org.apache.hadoop.http.HttpServer2.openListeners(HttpServer2.java:934)
        ... 4 more
18/02/13 16:49:35 FATAL namenode.SecondaryNameNode: Failed to start secondary namenode
java.net.BindException: Port in use: 0.0.0.0:50090
        ... (same stack trace as above)
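To see which process already holds 50090, lsof can be run against the port and its output parsed for the PID. A minimal sketch follows; the sample lsof output is illustrative, not taken from this report:

```shell
# Print the PID of whatever is listening on the SecondaryNameNode HTTP port.
# On a live system you would run:  lsof -i :50090
# (or  lsof -t -i :50090  to print only the PID).
# The sample output below is a made-up stand-in for real lsof output.
sample='COMMAND   PID USER   FD   TYPE DEVICE SIZE/OFF NODE NAME
java    98925  hao 212u  IPv4 0x1a2b      0t0  TCP *:50090 (LISTEN)'

# Column 2 of lsof output is the PID; skip the header row.
pid=$(printf '%s\n' "$sample" | awk 'NR > 1 { print $2; exit }')
echo "$pid"
```

With the real command, the printed PID is what you would pass to kill.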
Used lsof -i:50070, lsof -i:50090, and lsof -i:9000 to find the processes holding those ports, killed them, then re-ran start-dfs.sh, and all nodes started:

➜ hadoop-2.7.5 git:(master) ✗ jps
99028 Jps
98696 NameNode
72075
96747 FsShell
98925 SecondaryNameNode
98796 DataNode
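The manual steps above (lsof to find the holder of each port, kill, then restart) can be collected into one small script. This is only a sketch under assumptions: the port list is taken from this report, the free_port helper and the LSOF override are inventions for illustration, and the restart line is left commented out so the sketch stays inert:

```shell
#!/bin/sh
# Sketch: free the HDFS ports named in this report, then restart DFS by hand.
# LSOF can be overridden with a stub for testing; it defaults to the real lsof.
LSOF=${LSOF:-lsof}

# free_port PORT: kill every process currently bound to PORT.
free_port() {
    # lsof -t prints only the PIDs of processes bound to the port.
    pids=$("$LSOF" -t -i ":$1" 2>/dev/null)
    [ -n "$pids" ] && kill $pids
}

# Example invocation (commented out so sourcing this file has no side effects):
# for port in 50070 50090 9000; do free_port "$port"; done
# start-dfs.sh   # then verify with jps
```

After the ports are free, start-dfs.sh should bring up NameNode, DataNode, and SecondaryNameNode, as the jps output above shows.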
Before that, I had run start-dfs.sh, then tried stop-all.sh followed by start-dfs.sh again, and still found that no nodes started.
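This is consistent with stale daemons: the stop scripts locate daemons through pid files (under HADOOP_PID_DIR, which defaults to /tmp in Hadoop 2.x), so if those files are gone, stop-all.sh cannot stop the old JVMs and they keep the ports bound. A quick way to spot leftovers is to filter jps output; the sketch below runs the filter over the jps output captured earlier in this report:

```shell
# Find leftover HDFS daemons that the stop scripts failed to stop.
# On a live system:  jps | grep -E 'NameNode|DataNode|SecondaryNameNode'
# The sample below is the jps output from this report.
sample='99028 Jps
98696 NameNode
72075
96747 FsShell
98925 SecondaryNameNode
98796 DataNode'

leftovers=$(printf '%s\n' "$sample" | grep -E 'NameNode|DataNode|SecondaryNameNode')
echo "$leftovers"
```

Any PIDs this prints while HDFS is supposedly stopped are candidates for a manual kill before restarting.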