  • Hadoop HDFS commands

    Common hdfs command operations:

    hdfs help

    -help [cmd] shows the help information for a command.

    [hadoop@hadoop-01 ~]$ hdfs dfs -help ls
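
    Running -help with no command name prints the help text for every command:

    [hadoop@hadoop-01 ~]$ hdfs dfs -help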
    

    List files; -h prints human-readable sizes, and -R lists recursively:

    [hadoop@hadoop-01 ~]$ hdfs dfs -ls -h /
    Found 1 items
    drwxrwx---   - hadoop supergroup          0 2017-11-23 13:09 /tmp
    
    [hadoop@hadoop-01 ~]$ hdfs dfs -ls -h -R /
    drwxrwx---   - hadoop supergroup          0 2017-11-23 13:09 /tmp
    drwxrwx---   - hadoop supergroup          0 2017-11-23 13:09 /tmp/hadoop-yarn
    drwxrwx---   - hadoop supergroup          0 2017-11-23 13:09 /tmp/hadoop-yarn/staging
    drwxrwx---   - hadoop supergroup          0 2017-11-23 13:09 /tmp/hadoop-yarn/staging/history
    drwxrwx---   - hadoop supergroup          0 2017-11-23 13:09 /tmp/hadoop-yarn/staging/history/done
    drwxrwxrwt   - hadoop supergroup          0 2017-11-23 13:09 /tmp/hadoop-yarn/staging/history/done_intermediate
    

    -du shows the sizes of files under a path; -s prints a single summarized total, and -h makes the sizes human-readable:

    [hadoop@hadoop-01 ~]$ hdfs dfs -du -s -h /tmp/
    0  /tmp
    [hadoop@hadoop-01 ~]$
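
    Without -s, -du reports each entry under the path on its own line (output omitted here, since it depends on what the cluster contains):

    [hadoop@hadoop-01 ~]$ hdfs dfs -du -h /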
    

    -count counts the directories, files, and bytes under a path. With -q, quota information is included; the columns are, in order: QUOTA, REMAINING_QUOTA, SPACE_QUOTA, REMAINING_SPACE_QUOTA, DIR_COUNT, FILE_COUNT, CONTENT_SIZE, PATHNAME.

    [hadoop@hadoop-01 ~]$ hdfs dfs -count -q -h /tmp/
            none             inf            none             inf            6            0                  0 /tmp
    [hadoop@hadoop-01 ~]$
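
    Without -q, only the last four columns (DIR_COUNT, FILE_COUNT, CONTENT_SIZE, PATHNAME) are printed:

    [hadoop@hadoop-01 ~]$ hdfs dfs -count -h /tmp/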
    

    -mkdir creates an HDFS directory at the given path. A relative path is resolved against the user's HDFS home directory (here /user/hadoop).

    [hadoop@hadoop-01 ~]$ hdfs dfs -mkdir testdhadoop
    

    Create nested directories recursively with -p:

    [hadoop@hadoop-01 bin]$ ./hdfs dfs -mkdir -p /test1/test2/test3
    # Recursively list all files under the new directory.
    [hadoop@hadoop-01 bin]$ ./hdfs dfs -ls -R /test1
    drwxr-xr-x   - hadoop supergroup          0 2017-11-23 15:17 /test1/test2
    drwxr-xr-x   - hadoop supergroup          0 2017-11-23 15:17 /test1/test2/test3
    

    -mv moves one or more files or directories to a target directory. Both the sources and the target must be paths within HDFS.

    [hadoop@hadoop-01 ~]$ hdfs dfs -mv /tmp/hadoop-yarn /user/hadoop/testdhadoop
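
    -mv can also rename within HDFS; a hypothetical example, assuming /logs/test.txt exists:

    [hadoop@hadoop-01 ~]$ hdfs dfs -mv /logs/test.txt /logs/test.bak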
    

    -cp copies one or more HDFS files to a target directory:

    [hadoop@hadoop-01 ~]$ hdfs dfs -cp /user/hadoop/testdhadoop /tmp/hadoop-yarn
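
    Adding -f overwrites the destination if it already exists (hypothetical paths):

    [hadoop@hadoop-01 ~]$ hdfs dfs -cp -f /logs/test.txt /tmp/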
    

    -put copies a local file to HDFS:

    [hadoop@hadoop-01 ~]$ hdfs dfs -put /etc/passwd /user/hadoop/testdhadoop
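
    -put can also read from stdin when the source is given as "-"; a sketch with a hypothetical target path:

    [hadoop@hadoop-01 ~]$ echo "hello" | hdfs dfs -put - /logs/stdin.txt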
    

    -copyFromLocal is equivalent to -put:

    [hadoop@hadoop-01 ~]$ hdfs dfs -copyFromLocal /etc/yum.conf /user/hadoop
    [hadoop@hadoop-01 ~]$ hdfs dfs -ls -R /user/hadoop                        
    drwxr-xr-x   - hadoop supergroup          0 2017-11-23 14:37 /user/hadoop/testdhadoop
    drwxrwx---   - hadoop supergroup          0 2017-11-23 13:09 /user/hadoop/testdhadoop/hadoop-yarn
    drwxrwx---   - hadoop supergroup          0 2017-11-23 13:09 /user/hadoop/testdhadoop/passwd
    -rw-r--r--   2 hadoop supergroup        969 2017-11-23 14:41 /user/hadoop/yum.conf
    

    -moveFromLocal moves a local file to HDFS; the local copy is deleted once the upload completes.

    [hadoop@hadoop-01 ~]$ hdfs dfs -mkdir /logs
    [hadoop@hadoop-01 ~]$ hdfs dfs -ls -d /logs
    drwxr-xr-x   - hadoop supergroup          0 2017-11-23 14:47 /logs
    [hadoop@hadoop-01 ~]$ hdfs dfs -moveFromLocal test.txt /logs
    [hadoop@hadoop-01 ~]$ hdfs dfs -ls -h /logs
    Found 1 items
    -rw-r--r--   2 hadoop supergroup         12 2017-11-23 14:49 /logs/test.txt
    

    -get [-ignoreCrc] copies an HDFS file to the local filesystem; CRC verification can optionally be skipped.

    [hadoop@hadoop-01 ~]$ hdfs dfs -get /logs/test.txt /tmp/
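
    To skip CRC verification on download, pass -ignoreCrc (the flag's spelling varies between releases; check hdfs dfs -help get on your version):

    [hadoop@hadoop-01 ~]$ hdfs dfs -get -ignoreCrc /logs/test.txt /tmp/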
    

    -copyToLocal is equivalent to -get: it copies an HDFS file to the local filesystem.

    [hadoop@hadoop-01 bin]$ ./hdfs dfs -copyToLocal /logs/test.txt /home/hadoop/
    [hadoop@hadoop-01 ~]$ ls -lh /home/hadoop/
    total 16K
    drwxrwxr-x.  4 hadoop hadoop 4.0K Nov 23 12:12 dfs
    drwxr-xr-x. 11 hadoop hadoop 4.0K Nov 23 12:47 hadoop
    -rw-r--r--.  1 hadoop hadoop   12 Nov 23 15:05 test.txt
    drwxrwxr-x.  3 hadoop hadoop 4.0K Nov 23 12:48 tmp
    

    -cat prints a file's contents to the terminal:

    [hadoop@hadoop-01 /]$ hdfs dfs -cat /logs/test.txt
    hello world
    [hadoop@hadoop-01 /]$
    

    -text prints a file to the terminal, rendering the source file in text format. The allowed formats are zip and TextRecordInputStream.

    [hadoop@hadoop-01 bin]$ ./hdfs dfs -text /logs/test.txt
    hello world
    [hadoop@hadoop-01 /]$ hdfs dfs -tail /logs/part-00000 (shows the last 1KB of the file)
    [hadoop@hadoop-01 /]$ hdfs dfs -cat /logs/part-00000  | head
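
    -text also decompresses codec-compressed files such as gzip; a hypothetical example, assuming /logs/app.log.gz exists:

    [hadoop@hadoop-01 bin]$ ./hdfs dfs -text /logs/app.log.gz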
    

    -touchz creates an empty file in HDFS:

    [hadoop@hadoop-01 bin]$ ./hdfs dfs -touchz /test1/1.txt
    [hadoop@hadoop-01 bin]$ ./hdfs dfs -ls -R /test1         
    -rw-r--r--   2 hadoop supergroup          0 2017-11-23 15:20 /test1/1.txt
    drwxr-xr-x   - hadoop supergroup          0 2017-11-23 15:17 /test1/test2
    drwxr-xr-x   - hadoop supergroup          0 2017-11-23 15:17 /test1/test2/test3
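
    To confirm the file exists, -test -e returns exit status 0 when the path is present:

    [hadoop@hadoop-01 bin]$ ./hdfs dfs -test -e /test1/1.txt && echo exists
    exists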
    

    -getmerge [addnl] sorts and merges all files in an HDFS source directory into a single local file. It takes a source directory and a destination file as input and concatenates the files in the source directory into the local destination file. The optional addnl specifies that a newline be added at the end of each file.

    # Merge all files under /logs/* on HDFS and download them into the local file /tmp/hello.
    [hadoop@hadoop-01 bin]$ ./hdfs dfs -getmerge /logs/* /tmp/hello
    [hadoop@hadoop-01 bin]$ cat /tmp/hello 
    111111111111111111111111
    hello world
    [hadoop@hadoop-01 bin]$
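
    In Hadoop 2.x the newline option is the -nl flag, which appends a newline after each merged file (hypothetical destination):

    [hadoop@hadoop-01 bin]$ ./hdfs dfs -getmerge -nl /logs /tmp/hello.nl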
    

    Filtering: grep is not an hdfs subcommand; to show only the lines containing a given string, pipe -cat output through grep:

    [hadoop@hadoop-01 bin]$ ./hdfs dfs -cat /logs/* | grep <pattern>
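
    The same pipe works with any filter; for example, counting matching lines with grep -c:

    [hadoop@hadoop-01 bin]$ ./hdfs dfs -cat /logs/* | grep -c hello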
    

     

    Reference: http://blog.csdn.net/zhaojw_420/article/details/53161624
