cindysz110 / blog


[Hadoop] Viewing Hadoop Logs #12

Open cindygl opened 6 years ago

cindygl commented 6 years ago

Hadoop Logs

Viewing Hadoop logs

Hadoop's default log directory is $HADOOP_HOME/logs/; it can be confirmed with ps -ef.

[root@hadoop01 logs]# pwd
/opt/software/hadoop-2.8.1/logs
# Hadoop log files all end in .log
# A log file starts out empty; once writes reach 200M it is rotated automatically,
# and the rotated files get suffixes .log.1, .log.2, ...
[root@hadoop01 logs]# ls -ltr *.log
-rw-rw-r--. 1 hadoop hadoop 199871 May 25 22:06 hadoop-hadoop-namenode-hadoop01.log
-rw-rw-r--. 1 hadoop hadoop  62041 May 25 22:06 hadoop-hadoop-datanode-hadoop01.log
-rw-rw-r--. 1 hadoop hadoop 207830 May 25 22:06 hadoop-hadoop-secondarynamenode-hadoop01.log
-rw-rw-r--. 1 hadoop hadoop  80937 May 25 22:06 yarn-hadoop-resourcemanager-hadoop01.log
-rw-rw-r--. 1 hadoop hadoop 162803 May 25 22:06 yarn-hadoop-nodemanager-hadoop01.log
[root@hadoop01 logs]# 
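The same directory can also be read off a running daemon's command line, since each Hadoop JVM is started with a -Dhadoop.log.dir option. A minimal sketch; the ps output below is simulated (on a real node you would pipe `ps -ef | grep -i namenode` instead of using the $ps_line variable):

```shell
# Simulated `ps -ef` output for a NameNode process (real output is one long line).
ps_line='hadoop 2912 1 0 22:01 ? 00:00:12 /usr/java/jdk1.8.0/bin/java -Dhadoop.log.dir=/opt/software/hadoop-2.8.1/logs org.apache.hadoop.hdfs.server.namenode.NameNode'
# Split the command line into one token per line and keep only the log-dir option.
echo "$ps_line" | tr ' ' '\n' | grep '^-Dhadoop.log.dir='
```

On a live cluster the same filter shows where each daemon is actually writing its logs, which matters when hadoop-env.sh overrides the default.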

Observing the Hadoop startup log

Open another session window and empty the NameNode log:

[hadoop@hadoop01 logs]$ cat /dev/null > hadoop-hadoop-namenode-hadoop01.log
# Restart the NameNode process; the startup log will be written into the .log file
[root@hadoop01 logs]# ../sbin/start-dfs.sh 
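To watch the startup messages arrive as the daemon comes up, follow the freshly truncated log from the other session. A minimal sketch using a stand-in file /tmp/nn.log (the log line is made up; on a real node run `tail -f` on the actual hadoop-hadoop-namenode-hadoop01.log):

```shell
# Truncate the log; `: > file` has the same effect as `cat /dev/null > file`.
: > /tmp/nn.log
# Simulate the daemon appending a startup line.
echo 'STARTUP_MSG: Starting NameNode' >> /tmp/nn.log
# Show the newest lines; interactively you would use `tail -f` to follow live.
tail -n 10 /tmp/nn.log
```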

Then copy hadoop-hadoop-namenode-hadoop01.log to Windows and open it with Sublime to locate the error.
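Instead of copying the file off the server, the interesting lines can usually be found in place with grep. A minimal sketch against a tiny simulated log (the timestamps and messages are made up; the same grep works on the real .log file):

```shell
# Create a tiny fake NameNode log to grep against.
cat > /tmp/sample-namenode.log <<'EOF'
2018-05-25 22:06:01,123 INFO  org.apache.hadoop.hdfs.server.namenode.NameNode: STARTUP_MSG
2018-05-25 22:06:02,456 ERROR org.apache.hadoop.hdfs.server.namenode.NameNode: Failed to start
EOF
# Print ERROR/FATAL lines with their line numbers.
grep -nE 'ERROR|FATAL' /tmp/sample-namenode.log
```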