1. List HDFS files:
hdfs dfs -ls hdfs://haruna/home/byte_ecom_knowledge_graph/user/sunjianchao/
2. Delete HDFS files and directories
hadoop fs -rm hdfs://haruna/home/byte_ecom_knowledge_graph/user/sunjianchao/udf_main.py
hdfs dfs -rmdir /user/zhang/demo  # remove a directory (only works if it is empty)
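Note that `-rmdir` only removes empty directories; for a non-empty tree the usual tool is `-rm -r`. A minimal sketch of the common delete variants, reusing the `/user/zhang/demo` path from above (the `echo` lines only print the commands instead of running them against a cluster):

```shell
# Delete variants, assuming the /user/zhang/demo example path used above.
TARGET="/user/zhang/demo"

# Recursive delete: works on non-empty directories; files go to the
# trash if the cluster has trash enabled.
RM_CMD="hdfs dfs -rm -r ${TARGET}"

# Permanent delete, bypassing the trash (cannot be restored):
RM_FORCE_CMD="hdfs dfs -rm -r -skipTrash ${TARGET}"

echo "${RM_CMD}"
echo "${RM_FORCE_CMD}"
```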
3. Create a directory
hdfs dfs -mkdir /user/zhang/abc
4. Download to the local filesystem
hadoop fs -get hdfs://haruna/home/byte_ecom_knowledge_graph/user/sunjianchao/newUserEcomPre2.tar
hadoop fs -get hdfs://haruna/home/byte_ecom_knowledge_graph/user/sunjianchao/m_video_30d_info_level2_sample
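`-get` copies a file or directory as-is. When a directory holds many `part-*` files, `hadoop fs -getmerge` instead concatenates them into a single local file. A sketch using the sample directory above (the local filename is a hypothetical choice; the `echo` only prints the command):

```shell
# Merge-download: concatenate all files under an HDFS directory
# into one local file, instead of copying the directory with -get.
SRC="hdfs://haruna/home/byte_ecom_knowledge_graph/user/sunjianchao/m_video_30d_info_level2_sample"
DST="merged_local.txt"   # hypothetical local output filename

GETMERGE_CMD="hadoop fs -getmerge ${SRC} ${DST}"
echo "${GETMERGE_CMD}"
```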
5. Dump HDFS files written by a Hive table to a local txt file with hadoop fs -text
hadoop fs -text hdfs://haruna/home/byte_ecom_knowledge_graph/user/sunjianchao/m_video_30d_info_level2_sample/part* > all_hdfs_local.txt
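`-text` differs from `-cat` in that it decodes compressed and SequenceFile data before printing, which is why it is used here for Hive output. A sketch of the dump plus a quick local sanity check (filenames reuse the command above; the `echo` lines only print the commands rather than contact the cluster):

```shell
# -text decodes gzip/SequenceFile parts; plain -cat would print raw bytes.
SRC="hdfs://haruna/home/byte_ecom_knowledge_graph/user/sunjianchao/m_video_30d_info_level2_sample"
TEXT_CMD="hadoop fs -text ${SRC}/part* > all_hdfs_local.txt"

# Local sanity check after the dump: count the lines that landed on disk.
CHECK_CMD="wc -l all_hdfs_local.txt"

echo "${TEXT_CMD}"
echo "${CHECK_CMD}"
```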