Closed devayan851989 closed 5 years ago
Issue resolved by the following points:
1. A space in the JDK path was causing the issue. Workaround: use the 8.3 short name ( PROGRA~1 ) for the folder, i.e. C:\PROGRA~1\Java\jdk1.8.0_152
2. The C: drive was not recognized. Workaround: remove C:/ from the path in hdfs-site.xml, leaving /hadoop-2.8.0/data/namenode
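For point 1, a minimal sketch of what the hadoop-env.cmd change could look like, assuming the JDK version shown above (adjust the version folder to match your own install):

```shell
:: In %HADOOP_HOME%\etc\hadoop\hadoop-env.cmd, avoid the space in
:: "Program Files" by using its 8.3 short name PROGRA~1:
set JAVA_HOME=C:\PROGRA~1\Java\jdk1.8.0_152
```

For point 2, the corresponding hdfs-site.xml property value would then read /hadoop-2.8.0/data/namenode rather than C:/hadoop-2.8.0/data/namenode.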
Thanks @devayan851989! This is also covered in the main article.
@devayan851989 I do not understand how you solved issue number 1. This is the JDK path I saved in the hadoop-env.cmd file: C:\Program Files\Java\jdk1.8.0_351. How should I modify it? I would appreciate an answer. Thanks
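A hedged sketch of one way to apply the same workaround to the path above, assuming 8.3 short names are enabled on your volume (you can verify the actual short name with `dir /x C:\`):

```shell
:: In hadoop-env.cmd, replace the spaced "Program Files" path with
:: its 8.3 short name so Hadoop's .cmd scripts do not split on the space:
set JAVA_HOME=C:\PROGRA~1\Java\jdk1.8.0_351
```

If `dir /x C:\` shows no short name for Program Files, 8.3 name generation may be disabled on that volume; in that case, installing the JDK to a space-free path such as C:\Java is a common alternative.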
Hello. I can't seem to figure out my error: I can run the hdfs command, but I can't run any other commands after it. For example, with hdfs namenode -format, my Windows cmd says the command is not recognized as an internal or external command. What should I do?
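One common cause of this symptom is that HADOOP_HOME and PATH are not set for the session. A minimal sketch, assuming the distribution was extracted to C:\hadoop-2.8.0 (adjust to your actual folder):

```shell
:: Point HADOOP_HOME at the extracted distribution and expose its
:: bin and sbin folders on PATH for this cmd session:
set HADOOP_HOME=C:\hadoop-2.8.0
set PATH=%PATH%;%HADOOP_HOME%\bin;%HADOOP_HOME%\sbin
```

These `set` commands only affect the current window; for a permanent change, set the variables under System Properties > Environment Variables.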
Command ( hdfs namenode -format ) is not working.
Error message:
C:\Spark\hadoop\New folder\hadoop-2.8.0.tar\hadoop-2.8.0\hadoop-2.8.0\bin>hdfs namenode -format
'C:\Spark\hadoop\New' is not recognized as an internal or external command, operable program or batch file.
'-classpath' is not recognized as an internal or external command, operable program or batch file.
Could you please help me troubleshoot this issue?
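The error message itself points at the cause: the path is cut off at 'C:\Spark\hadoop\New', i.e. the space in "New folder" breaks the unquoted paths inside Hadoop's .cmd scripts. A hedged sketch of a fix, assuming you are free to relocate the install (the target folder C:\hadoop-2.8.0 is an example, not a requirement):

```shell
:: Move the extracted distribution to a path with no spaces
:: (this also flattens the doubled hadoop-2.8.0\hadoop-2.8.0 nesting):
move "C:\Spark\hadoop\New folder\hadoop-2.8.0.tar\hadoop-2.8.0\hadoop-2.8.0" C:\hadoop-2.8.0
set HADOOP_HOME=C:\hadoop-2.8.0
```

After moving, update HADOOP_HOME, PATH, and any config files that reference the old location, then retry hdfs namenode -format.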