    Hadoop HDFS URI Explained

    1. Basic HDFS command syntax:
    hadoop fs -cmd <args>
    Options:
    cmd: the specific operation; these largely mirror their UNIX command-line counterparts
    args: arguments to the command

    2. HDFS resource URI format:
    Usage: scheme://authority/path
    Fields:
    scheme -> protocol name, file or hdfs
    authority -> namenode hostname
    path -> path to the resource
    Example: hdfs://localhost:54310/user/hadoop/test.txt
    If fs.default.name=hdfs://localhost:54310 is already configured in /home/hadoop/hadoop-1.1.1/conf/core-site.xml, then /user/hadoop/test.txt alone is enough. The default HDFS working directory is /user/$USER, where $USER is the current login user name.
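    As a quick illustration of the scheme/authority/path split described above, the sample URI can be pulled apart with plain POSIX shell parameter expansion (a sketch; the URI is just the example value from this section, not read from any configuration):

```shell
# Sample URI from the section above
uri="hdfs://localhost:54310/user/hadoop/test.txt"

scheme="${uri%%://*}"      # everything before "://"        -> hdfs
rest="${uri#*://}"         # drop the leading "scheme://"
authority="${rest%%/*}"    # up to the first "/"            -> localhost:54310
path="/${rest#*/}"         # remainder, with a leading "/"  -> /user/hadoop/test.txt

echo "$scheme $authority $path"
# -> hdfs localhost:54310 /user/hadoop/test.txt
```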

    3. HDFS command examples
    hadoop fs -mkdir /user/hadoop
    hadoop fs -ls /user
    hadoop fs -lsr /user (recursive listing)
    hadoop fs -put test.txt /user/hadoop (copies to the hdfs://localhost:54310/user/hadoop directory; the target directory must be created first)
    hadoop fs -get /user/hadoop/test.txt . (copies test.txt to the local current directory)
    hadoop fs -cat /user/hadoop/test.txt
    hadoop fs -tail /user/hadoop/test.txt (shows the last kilobyte of the file)
    hadoop fs -rm /user/hadoop/test.txt
    hadoop fs -help ls (shows the help text for the ls command)
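    Since -put will fail or clobber depending on the destination's state, a small wrapper can check first with hadoop fs -test -e (a real flag: exit status 0 if the path exists). This is a sketch with a hypothetical helper name; defining the function does not touch HDFS, and calling it assumes a working hadoop on the PATH:

```shell
# Hypothetical helper: copy a local file into HDFS only if the
# destination path does not already exist.
hdfs_put_safe() {
  src="$1"
  dest="$2"
  if hadoop fs -test -e "$dest" 2>/dev/null; then
    echo "refusing to overwrite existing path: $dest" >&2
    return 1
  fi
  hadoop fs -put "$src" "$dest"
}
```

    Usage would look like: hdfs_put_safe test.txt /user/hadoop/test.txt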

    4. A problem encountered during put
    Exception: org.apache.hadoop.hdfs.server.namenode.SafeModeException: Cannot create file/user/hadoopadmin. Name node is in safe mode.
    Fix: hadoop dfsadmin -safemode leave
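    A gentler variant of the fix is to check the namenode's state first and force it out only when safe mode is actually on. The get, leave, and wait subcommands of hadoop dfsadmin -safemode are real; the wrapper function name is hypothetical, and defining it does not require a running cluster:

```shell
# Hypothetical helper: leave safe mode only when the namenode reports
# it is on. "hadoop dfsadmin -safemode get" prints e.g. "Safe mode is ON".
leave_safemode() {
  state="$(hadoop dfsadmin -safemode get 2>/dev/null)"
  case "$state" in
    *"is ON"*) hadoop dfsadmin -safemode leave ;;
    *)         echo "namenode is not in safe mode: $state" ;;
  esac
}
```

    Note that hadoop dfsadmin -safemode wait blocks until the namenode exits safe mode on its own, which is usually preferable to forcing it out while block reports are still incomplete.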





  • Original article: https://www.cnblogs.com/wang3680/p/f071c4ea6c85e5d994ceda2a7714f7b6.html