  • Mounting an HDFS directory to local Linux with FUSE

    1. Install fuse

    yum -y  install hadoop-hdfs-fuse  
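
    This assumes an HDP/CDH yum repository is already configured on the node. To confirm what the package installed (a generic check, not from the original post):

    rpm -ql hadoop-hdfs-fuse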
    

     2. Set the environment variables

    vi /etc/profile
    

     Add the following configuration:

    export JAVA_HOME=/usr/jdk64/jdk1.8.0_60
    export HADOOP_HOME=/usr/hdp/2.4.0.0-169/hadoop
    export PATH=$HADOOP_HOME/bin:$JAVA_HOME/bin:$PATH
    export LD_LIBRARY_PATH=/usr/hdp/2.4.0.0-169/usr/lib/:/usr/local/lib:/usr/lib:$LD_LIBRARY_PATH:$HADOOP_HOME/build/c++/Linux-amd64-64/lib:${JAVA_HOME}/jre/lib/amd64/server
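
     Changes to /etc/profile only apply to new login shells. To apply them to the current shell and double-check the result (a generic step, not specific to this cluster):

    source /etc/profile
    echo $LD_LIBRARY_PATH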
    

    3. Create the mount point (the local Linux directory where HDFS will be mounted)

    mkdir /hdfs

    4. Mount

    Method 1: hadoop-fuse-dfs dfs://ocdp /hdfs

    [root@vmocdp125 lib]# hadoop-fuse-dfs dfs://ocdp /hdfs
    INFO /grid/0/jenkins/workspace/HDP-build-centos6/bigtop/build/hadoop/rpm/BUILD/hadoop-2.7.1.2.4.0.0-src/hadoop-hdfs-project/hadoop-hdfs/src/main/native/fuse-dfs/fuse_options.c:164 Adding FUSE arg /hdfs
    

      "ocdp" is the cluster name, i.e. the nameservice value configured in hdfs-site.xml.
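
    If you are not sure which nameservice name to use, it can be read from the dfs.nameservices property; a quick lookup (assuming the hdfs client is on the PATH) is:

    hdfs getconf -confKey dfs.nameservices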

    Method 2: automatic mounting via /etc/fstab

     Edit the fstab file:

    Check whether an entry already exists, then edit:

    grep hadoop /etc/fstab 
    
    vi /etc/fstab
    

      Add the following entry:

        hadoop-fuse-dfs#dfs://ocdp  /hdfs fuse usetrash,rw 0 0  
    

     Mount all fstab entries:

    mount -a
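
    mount -a mounts every fstab entry that is not already mounted. To confirm the FUSE mount, or to unmount it later, the standard commands apply (not from the original post):

    mount | grep fuse_dfs
    umount /hdfs        # or: fusermount -u /hdfs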

    5. Verify

    [root@vmocdp125 bin]# df -h
    Filesystem            Size  Used Avail Use% Mounted on
    /dev/mapper/vg_ocdp01-lv_root
                           50G   14G   34G  29% /
    tmpfs                  11G  8.0K   11G   1% /dev/shm
    /dev/sda1             477M   33M  419M   8% /boot
    /dev/mapper/vg_ocdp01-lv_home
                          948G  674M  900G   1% /home
    fuse_dfs              337G  3.3G  334G   1% /hdfs
    

      Enter the mount directory and the directories on HDFS are all visible under the mount point /hdfs:

    [root@vmocdp125 bin]# cd /hdfs
    [root@vmocdp125 hdfs]# ll
    total 52
    drwxrwxrwx  5 yarn   hadoop 4096 Oct 12 16:11 app-logs
    drwxr-xr-x  4 hdfs   hdfs   4096 Sep 14 20:09 apps
    drwxr-xr-x  4 yarn   hadoop 4096 Sep 14 19:48 ats
    drwxr-xr-x  4 flume  hdfs   4096 Oct 31 18:55 flume
    drwxr-xr-x  3 ocetl  hdfs   4096 Oct 13 14:52 ftp
    drwxr-xr-x  3 hdfs   hdfs   4096 Sep 14 19:48 hdp
    drwxr-xr-x  3 ocetl  hdfs   4096 Oct 21 16:05 hiveQuery
    drwxrwxrwx  4 ocetl  hdfs   4096 Oct 18 17:45 home
    drwxr-xr-x  3 mapred hdfs   4096 Sep 14 19:48 mapred
    drwxrwxrwx  4 mapred hadoop 4096 Sep 14 19:48 mr-history
    drwxrwxrwx 46 spark  hadoop 4096 Nov  1 18:26 spark-history
    drwxrwxrwx  9 hdfs   hdfs   4096 Oct 14 17:22 tmp
    drwxr-xr-x  9 hdfs   hdfs   4096 Oct 11 16:54 user
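
      The mount behaves like an ordinary local filesystem, so standard tools can read and write HDFS through it. A small sanity check, assuming the current user may write to /hdfs/tmp (file name chosen only for illustration):

    cp /etc/hosts /hdfs/tmp/hosts.fuse-test
    cat /hdfs/tmp/hosts.fuse-test
    hdfs dfs -ls /tmp        # the same file is visible through the HDFS client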
    

     

    Troubleshooting:

    1. The error "error while loading shared libraries: libjvm.so: cannot open shared object file: No such file or directory" is caused by a misconfigured environment.

    Most likely LD_LIBRARY_PATH was not set (export LD_LIBRARY_PATH=...).

    Adding the fuse shared-library path and the Java shared-library path to /etc/profile, as shown above, fixes it; see the check below.
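
    On 64-bit Linux with JDK 8, libjvm.so normally lives under the JRE's server directory. To confirm the path that must appear in LD_LIBRARY_PATH (assuming JAVA_HOME is set as above):

    find ${JAVA_HOME} -name libjvm.so
    # expected: ${JAVA_HOME}/jre/lib/amd64/server/libjvm.so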

    2. The error "error while loading shared libraries: libhdfs.so.0.0.0: cannot open shared object file: No such file or directory"

    Find the directory that contains libhdfs.so.0.0.0: find / -name libhdfs.so.0.0.0

    Add that directory to LD_LIBRARY_PATH:

     export LD_LIBRARY_PATH=/usr/hdp/2.4.0.0-169/usr/lib/:/usr/local/lib:/usr/lib:$LD_LIBRARY_PATH:$HADOOP_HOME/build/c++/Linux-amd64-64/lib:${JAVA_HOME}/jre/lib/amd64/server
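
    After updating LD_LIBRARY_PATH, you can verify that every shared library resolves. hadoop-fuse-dfs is typically a wrapper script around a fuse_dfs binary whose location varies by distribution, so locate the binary first (an illustrative check; replace the path with the one find reports):

    find / -name fuse_dfs 2>/dev/null
    ldd /path/to/fuse_dfs | grep "not found"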

    3. hadoop-fuse-dfs: command not found

    After installing hadoop-hdfs-fuse there is a hadoop-fuse-dfs executable under $HADOOP_HOME/bin; the command is not found because $HADOOP_HOME/bin has not been added to PATH.

    Add $HADOOP_HOME/bin to PATH, as in the /etc/profile configuration above.
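
    After re-sourcing /etc/profile, a quick check that the command now resolves:

    source /etc/profile
    which hadoop-fuse-dfs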

     
