HDFS Shell is an HDFS manipulation tool for working with the functions integrated in Hadoop DFS.
There are 3 possible usecases: running the interactive shell, executing commands from a script file, and running in daemon mode.

- Commands can be used in a short way: both `hdfs dfs -ls /` and `ls /` will work.
- `cd` and `pwd` are supported.
- `su`, `groups` and `whoami` are supported.
- Piping such as `ls /analytics | less` is not possible at this time; you have to use HDFS Shell in daemon mode.

HDFS Shell is a standard Java application. For its launch you need to define 2 things on your classpath:
- `./lib/*.jar` (the `./lib` dependencies are included in the binary bundle, or they are located in the Gradle `build/distributions/*.zip`)
- the `/etc/hadoop/conf` folder (Linux) or the `%HADOOP_HOME%\etc\hadoop\` folder (Windows)

Note that paths inside the `java -cp` switch are separated by `:` on Linux and `;` on Windows.
Pre-defined launch scripts are located in the zip file. You can modify them locally as needed.
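Putting the classpath rules above together, a minimal Linux launcher might look like the sketch below. The install directory, variable defaults and the commented-out `java` line are illustrative assumptions; the pre-defined scripts in the zip are the authoritative reference.

```shell
#!/bin/sh
# Sketch of a launcher, assuming an install under /opt/hdfs-shell
# (both default paths here are illustrative, not documented values).
HDFS_SHELL_HOME="${HDFS_SHELL_HOME:-/opt/hdfs-shell}"
HADOOP_CONF_DIR="${HADOOP_CONF_DIR:-/etc/hadoop/conf}"

# Entry 1: all jars in ./lib; entry 2: the Hadoop config folder.
# ':' separates classpath entries on Linux; use ';' on Windows.
CLASSPATH="$HDFS_SHELL_HOME/lib/*:$HADOOP_CONF_DIR"
printf '%s\n' "$CLASSPATH"

# The actual launch would then be (main class omitted here; see the
# bundled scripts for the real invocation):
#   java -cp "$CLASSPATH" <main-class> "$@"
```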
To launch HDFS Shell in interactive mode, simply run `hdfs-shell.sh` (without parameters); otherwise use:

- `hdfs-shell.sh script <file_path>` to execute commands from a file
- `hdfs-shell.sh xscript <file_path>` to execute commands from a file but ignore command errors (skip errors)

Inside the shell, the following commands are supported:

- `help` to get a list of all supported commands
- `clear` or `cls` to clear the screen
- `exit`, `quit` or just `q` to exit the shell
- `! <command>`, e.g. `! echo hello`, will call the system command `echo`
- `script <file_path>` to execute commands from a file
- `xscript <file_path>` to execute commands from a file but ignore command errors (skip errors)

For our purposes we have also integrated the following commands:
- `set showResultCodeON` and `set showResultCodeOFF` - if enabled, the command result code is written after its completion
- `cd`, `pwd`
- `su <username>` - experimental - changes the current active user; it probably won't work on secured HDFS (Kerberos)
- `whoami` - prints the effective username
- `groups <username1 <username2,...>>` - e.g. `groups hdfs` - prints groups for the given users, same as the `hdfs groups my_user my_user2` functionality
- `edit 'my file'` - see the config below

Since version 1.0.4 the simple command `edit` is available. It fetches the selected file from HDFS into a local temporary directory and launches the editor. Once the editor saves the file (and exits with result code 0), the file is uploaded back into HDFS (the target file is overwritten).
By default the editor path is taken from the `$EDITOR` environment variable. If `$EDITOR` is not set, `vim` (Linux, Mac) or `notepad.exe` (Windows) is used.
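As a worked example, the commands documented above can be collected in a plain-text commands file and run non-interactively. The file contents below are illustrative:

```
set showResultCodeON
cd /user
pwd
ls
```

Run it with `hdfs-shell.sh script <file_path>`; with `xscript` instead of `script`, any failing command would be skipped rather than reported as an error.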
HDFS Shell supports a customized bash-like prompt setting!
I implemented support for the switches listed in this table (colors included; `\!` and `\#` are excluded).
You can also use this online prompt generator to create the prompt value of your wish.
To set up your favorite prompt, simply add `export HDFS_SHELL_PROMPT="value"` to your `.bashrc` (or set the environment variable on Windows), and that's it. Restart HDFS Shell to apply the change.
The default value is currently set to `\e[36m\u@\h \e[0;39m\e[33m\w\e[0;39m\e[36m\\$ \e[37;0;39m`.
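For example, to try out a simple prompt value before putting it in `.bashrc` (the value below is an illustrative example without colors, not the default):

```shell
# Illustrative prompt value producing "user@host cwd$ ".
# \u, \h, \w and \$ are the bash-like switches referred to above.
export HDFS_SHELL_PROMPT='\u@\h \w\$ '
printf '%s\n' "$HDFS_SHELL_PROMPT"
```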
In daemon mode, HDFS Shell listens on a UNIX socket, so commands can be piped to it, e.g. `echo ls / | nc -U /var/tmp/hdfs-shell.sock`.
The project uses Gradle 3.x to build. By default it uses Hadoop 2.6.0, but it has also been successfully tested with version 2.7.x. It is based on Spring Shell (which includes the JLine component). Using the Spring Shell mechanism you can easily add your own commands to HDFS Shell (see `com.avast.server.hdfsshell.commands.ContextCommands` or `com.avast.server.hdfsshell.commands.HadoopDfsCommands` for more details).
All suggestions and merge requests are welcome.
For development, add the following to the JVM args in your IDE launch configuration dialog: `-Djline.WindowsTerminal.directConsole=false -Djline.terminal=jline.UnsupportedTerminal`
It is not possible to create a directory named `My dir` using the command `mkdir "My dir"`. This should probably be resolved by an upgrade to Spring Shell 2.

It is not possible to remove a directory (`rm -R dir`) relative to the root (`/`) directory; you have to use an absolute path instead (`rm -R /dir`). This is caused by a bug in Hadoop; see HADOOP-15233 for more details. Removing a directory from another cwd is not affected.

Author & Maintainer: Ladislav Vitasek - vitasek/@/avast.com