An introduction to the Hadoop filesystem shell commands. Almost all of them mirror their Linux counterparts. Below is the layout of my Hadoop filesystem.
Listing command: hadoop fs -ls /home/hadoop/
drwxr-xr-x - hadoop supergroup 0 2013-11-30 17:51 /home/hadoop/dir
drwxr-xr-x - hadoop supergroup 0 2013-11-30 17:48 /home/hadoop/input
-rw-r--r-- 1 hadoop supergroup 64 2013-11-30 17:50 /home/hadoop/ouput
drwxr-xr-x - hadoop supergroup 0 2013-11-29 22:50 /home/hadoop/output
drwxr-xr-x - hadoop supergroup 0 2013-11-29 22:50 /home/hadoop/tmp
To see help for all commands: hadoop fs -help
cat (view file contents) Usage: hadoop fs -cat URI [URI …] Example: - hadoop fs -cat /home/hadoop/input/content.txt
Exit Code:
Returns 0 on success and -1 on error.
chgrp Usage: hadoop fs -chgrp [-R] GROUP URI [URI …] Change group association of files. With -R, make the change recursively through the directory structure. The user must be the owner of files, or else a super-user. Additional information is in the Permissions User Guide.
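The original guide gives no example for chgrp; a minimal sketch, assuming the group supergroup from the listing above:
- hadoop fs -chgrp -R supergroup /home/hadoop/input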
chmod Usage: hadoop fs -chmod [-R] <MODE[,MODE]... | OCTALMODE> URI [URI …] Change the permissions of files. With -R, make the change recursively through the directory structure. The user must be the owner of the file, or else a super-user. Additional information is in the Permissions User Guide.
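A minimal sketch (the mode 755 is illustrative, not from the original):
- hadoop fs -chmod -R 755 /home/hadoop/input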
chown Usage: hadoop fs -chown [-R] [OWNER][:[GROUP]] URI [URI …] Change the owner of files. With -R, make the change recursively through the directory structure. The user must be a super-user. Additional information is in the Permissions User Guide.
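A minimal sketch, assuming the owner and group shown in the listing above (note this normally requires a super-user):
- hadoop fs -chown -R hadoop:supergroup /home/hadoop/dir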
copyFromLocal Usage: hadoop fs -copyFromLocal <localsrc> URI Example: hadoop fs -copyFromLocal /home/hadoop/address /home/hadoop/input Similar to the put command, except that the source is restricted to a local file reference.
copyToLocal Usage: hadoop fs -copyToLocal [-ignorecrc] [-crc] URI <localdst> Example: hadoop fs -copyToLocal /home/hadoop/input/content.txt /home/hadoop/mylocal Similar to the get command, except that the destination is restricted to a local file reference.
cp Usage: hadoop fs -cp URI [URI …] <dest> Copy files from source to destination. This command allows multiple sources as well, in which case the destination must be a directory.
Example: - hadoop fs -cp /home/hadoop/input/content.txt /home/hadoop/ouput
- hadoop fs -cp /home/hadoop/input/address /home/hadoop/ouput /home/hadoop/dir
Exit Code: Returns 0 on success and -1 on error.
du Usage: hadoop fs -du URI [URI …] Displays aggregate length of files contained in the directory, or the length of a file in case it's just a file.
Example:
hadoop fs -du /home/hadoop/dir /home/hadoop/input
Output:
Found 2 items
65 hdfs://localhost:9000/home/hadoop/dir/address
64 hdfs://localhost:9000/home/hadoop/dir/ouput
Found 2 items
65 hdfs://localhost:9000/home/hadoop/input/address
64 hdfs://localhost:9000/home/hadoop/input/content.txt
Exit Code:
Returns 0 on success and -1 on error.
dus Usage: hadoop fs -dus <args> Displays a summary of file lengths. Example: hadoop fs -dus /home/hadoop/dir
Output: hdfs://localhost:9000/home/hadoop/dir 129
expunge (manually empty the trash) Usage: hadoop fs -expunge Empty the Trash. Refer to HDFS Design for more information on the Trash feature.
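Note: files are only moved to the trash when it is enabled, which (as an assumption about your configuration) means fs.trash.interval is set to a nonzero value in core-site.xml. With trash enabled, a delete followed by an immediate purge looks like:
- hadoop fs -rm /home/hadoop/output/lzw
- hadoop fs -expunge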
get Usage: hadoop fs -get [-ignorecrc] [-crc] <src> <localdst>
Copy files to the local file system. Files that fail the CRC check may be copied with the -ignorecrc option. Files and CRCs may be copied using the -crc option. Example: - hadoop fs -get /home/hadoop/input/content.txt /home/hadoop/mylocal/
- hadoop fs -get hdfs://nn.example.com/user/hadoop/file localfile
Exit Code: Returns 0 on success and -1 on error.
getmerge Usage: hadoop fs -getmerge <src> <localdst> [addnl] Takes a source directory and a destination file as input and concatenates the files in src into the destination local file. Optionally addnl can be set to enable adding a newline character at the end of each file.
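No example appears in the original; a minimal sketch that merges the input directory from the listing above into an assumed local file:
- hadoop fs -getmerge /home/hadoop/input /home/hadoop/mylocal/merged.txt addnl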
ls Usage: hadoop fs -ls <args> For a file, returns stat on the file with the following format:
filename <number of replicas> filesize modification_date modification_time permissions userid groupid
For a directory it returns the list of its direct children, as in Unix. A directory is listed as:
dirname <dir> modification_date modification_time permissions userid groupid
Example:
hadoop fs -ls /home/hadoop/input/ /home/hadoop/ouput/
Output:
Found 2 items
-rw-r--r-- 1 hadoop supergroup 65 2013-11-30 17:48 /home/hadoop/input/address
-rw-r--r-- 1 hadoop supergroup 64 2013-11-29 22:48 /home/hadoop/input/content.txt
Found 1 items
-rw-r--r-- 1 hadoop supergroup 64 2013-11-30 17:50 /home/hadoop/ouput
Exit Code:
Returns 0 on success and -1 on error.
lsr Usage: hadoop fs -lsr <args> Recursive version of ls. Similar to Unix ls -R. Example: hadoop fs -lsr /home/hadoop/input/ /home/hadoop/ouput/
Output:
-rw-r--r-- 1 hadoop supergroup 65 2013-11-30 17:48 /home/hadoop/input/address
-rw-r--r-- 1 hadoop supergroup 64 2013-11-29 22:48 /home/hadoop/input/content.txt
-rw-r--r-- 1 hadoop supergroup 64 2013-11-30 17:50 /home/hadoop/ouput
mkdir Usage: hadoop fs -mkdir <paths>
Takes path URIs as arguments and creates directories. The behavior is much like Unix mkdir -p, creating parent directories along the path. Example: - hadoop fs -mkdir /home/hadoop/input/ /home/hadoop/ouput/
- hadoop fs -mkdir hdfs://nn1.example.com/user/hadoop/dir hdfs://nn2.example.com/user/hadoop/dir
Exit Code: Returns 0 on success and -1 on error.
moveFromLocal Usage: hadoop fs -moveFromLocal <src> <dst> Displays a "not implemented" message.
mv Usage: hadoop fs -mv URI [URI …] <dest> Moves files from source to destination. This command allows multiple sources as well, in which case the destination needs to be a directory. Moving files across filesystems is not permitted.
Example: - hadoop fs -mv /home/hadoop/input/lzw /home/hadoop/output
- hadoop fs -mv hdfs://nn.example.com/file1 hdfs://nn.example.com/file2 hdfs://nn.example.com/file3 hdfs://nn.example.com/dir1
Exit Code: Returns 0 on success and -1 on error.
put Usage: hadoop fs -put <localsrc> ... <dst> Copy single src, or multiple srcs from the local file system to the destination filesystem. Also reads input from stdin and writes to the destination filesystem. Example:
- hadoop fs -put /home/hadoop/lzw /home/hadoop/input
- hadoop fs -put /home/hadoop/lzw /home/hadoop/lzw1 /home/hadoop/input
- hadoop fs -put localfile hdfs://nn.example.com/hadoop/hadoopfile
- hadoop fs -put - hdfs://nn.example.com/hadoop/hadoopfile
Reads the input from stdin.
Exit Code: Returns 0 on success and -1 on error.
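The final form above reads from stdin, so the output of a local pipeline can be written straight into HDFS. A minimal sketch (the target path is assumed):
- echo "hello hdfs" | hadoop fs -put - /home/hadoop/input/hello.txt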
rm (delete files) Usage: hadoop fs -rm URI [URI …] Delete files specified as args. Does not delete directories; refer to rmr for recursive deletes.
Example: - hadoop fs -rm /home/hadoop/output/lzw
Exit Code: Returns 0 on success and -1 on error.
rmr (delete directories) Usage: hadoop fs -rmr URI [URI …] Recursive version of delete.
Example: - hadoop fs -rmr /home/hadoop/output
- hadoop fs -rmr hdfs://nn.example.com/user/hadoop/dir
Exit Code: Returns 0 on success and -1 on error.
setrep Usage: hadoop fs -setrep [-R] [-w] <rep> <path> Changes the replication factor of a file. The -R option recursively changes the replication factor of files within a directory; -w waits for the replication to complete. Example: - hadoop fs -setrep -w 3 -R /user/hadoop/dir1
Exit Code: Returns 0 on success and -1 on error.
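To verify the change, note that the second column of the ls output (the number of replicas, per the ls entry above) reflects the new factor; a sketch using a file from the listing above:
- hadoop fs -setrep 3 /home/hadoop/input/content.txt
- hadoop fs -ls /home/hadoop/input/content.txt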
stat Usage: hadoop fs -stat URI [URI …] Returns the stat information on the path. Example: - hadoop fs -stat /home/hadoop/input
Output: 2013-11-30 16:00:27
Exit Code:
Returns 0 on success and -1 on error.
tail Usage: hadoop fs -tail [-f] URI Displays the last kilobyte of the file to stdout. The -f option can be used as in Unix. Example: - hadoop fs -tail /home/hadoop/input/content.txt
Exit Code:
Returns 0 on success and -1 on error.
test Usage: hadoop fs -test -[ezd] URI Options:
-e check to see if the file exists. Returns 0 if true.
-z check to see if the file is zero length. Returns 0 if true.
-d check to see if the path is a directory. Returns 1 if true, else returns 0.
Example: - hadoop fs -test -d /home/hadoop/input
- hadoop fs -test -e /home/hadoop/input
- hadoop fs -test -z /home/hadoop/input
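Because test reports its result through the exit code, it is most useful inside a shell script; a minimal sketch (the path comes from the listing above):
if hadoop fs -test -e /home/hadoop/input/content.txt; then
    echo "content.txt exists in HDFS"
fi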
text (view file contents, much like cat) Usage: hadoop fs -text <src> Takes a source file and outputs the file in text format. The allowed formats are zip and TextRecordInputStream. Example: hadoop fs -text /home/hadoop/input/address
Output:
addressID addressname
1 Beijing
2 Guangzhou
3 Shenzhen
4 Xian
touchz (create an empty file) Usage: hadoop fs -touchz URI [URI …]
Create a file of zero length. Example: - hadoop fs -touchz /home/hadoop/input/lzw
Exit Code:
Returns 0 on success and -1 on error.
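The result of touchz can be checked with the test command described above (the file name is assumed):
- hadoop fs -touchz /home/hadoop/input/empty
- hadoop fs -test -z /home/hadoop/input/empty (exits 0 because the new file has zero length)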