
Fixing YARN's "Unable to load native-hadoop library" warning

After installing the official Hadoop 2.1.0-beta release, every hadoop command prints this warning:

    WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable

Raise the log level to see the underlying cause:

export HADOOP_ROOT_LOGGER=DEBUG,console

    ...

    DEPRECATED: Use of this script to execute hdfs command is deprecated.
    Instead use the hdfs command for it.
    
    14/08/23 10:04:21 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
    report: Failed on local exception: java.io.IOException: Connection reset by peer; Host Details : local host is: "VM_160_34_centos/127.0.0.1"; destination host is: "Master":9000; 
     
The key message is "wrong ELFCLASS32": is the loaded .so built for the wrong architecture? Check it with file:

file libhadoop.so.1.0.0
    hadoop@VM_160_34_centos:/usr/local/hadoop-2.4.0/lib/native> file libhadoop.so.1.0.0
    libhadoop.so.1.0.0: ELF 32-bit LSB shared object, Intel 80386, version 1 (SYSV), dynamically linked, not stripped

Sure enough: "Intel 80386" means a 32-bit binary, while this machine runs a 64-bit OS.
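The check above can be wrapped in a small helper so any native library is easy to compare against the OS word size. A minimal sketch, assuming `file` and `uname` are available; `check_elf_class` is a name invented here:

```shell
# Print the ELF class of a file so it can be compared with `uname -m`.
check_elf_class() {
  case "$(file -bL "$1")" in          # -b: brief output, -L: follow symlinks
    *"ELF 32-bit"*) echo "32-bit" ;;
    *"ELF 64-bit"*) echo "64-bit" ;;
    *)              echo "not an ELF binary" ;;
  esac
}

# Usage against the library from this install (path as in the post):
#   check_elf_class /usr/local/hadoop-2.4.0/lib/native/libhadoop.so.1.0.0
#   uname -m   # x86_64 here, so a "32-bit" answer means a mismatch
```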

It turns out the pre-built Hadoop tarballs on the Apache mirrors all ship 32-bit native libraries; for 64-bit support you have to recompile Hadoop yourself. That is unfortunate, since almost every production environment runs a 64-bit OS. The official native-library documentation confirms it:

"The pre-built 32-bit i386-Linux native hadoop library is available as part of the hadoop distribution and is located in the lib/native directory"

Solution: recompile Hadoop

The fix is to rebuild the Hadoop software from source.

Set up the build environment

1. Install the required packages:

yum install svn
yum install autoconf automake libtool cmake
yum install ncurses-devel
yum install openssl-devel
yum install gcc*
2. Install Maven

Download and extract:

    wget  -c http://mirrors.hust.edu.cn/apache/maven/maven-3/3.2.3/binaries/apache-maven-3.2.3-bin.tar.gz
    tar -zxvf apache-maven-3.2.3-bin.tar.gz  -C /usr/local/

Add /usr/local/apache-maven-3.2.3/bin to the PATH:

    root@VM_160_34_centos:~/tools> vi /etc/profile.d/maven-development.sh 
    export M2_HOME=/usr/local/apache-maven-3.2.3
    export PATH=$PATH:$M2_HOME/bin
    root@VM_160_34_centos:~/tools> source /etc/profile
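Running that append twice would leave duplicate export lines in the profile script; a small idempotent helper avoids this. A sketch only: `ensure_line` is a name invented here:

```shell
# Append a line to a file only if that exact line is not already present.
ensure_line() {
  grep -qxF "$2" "$1" 2>/dev/null || echo "$2" >> "$1"
}

# As root, equivalent to the edit above (path from the post):
#   ensure_line /etc/profile.d/maven-development.sh 'export M2_HOME=/usr/local/apache-maven-3.2.3'
#   ensure_line /etc/profile.d/maven-development.sh 'export PATH=$PATH:$M2_HOME/bin'
```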

Test Maven:

    root@VM_160_34_centos:/usr/local/apache-maven-3.2.3> mvn -version
    Apache Maven 3.2.3 (33f8c3e1027c3ddde99d3cdebad2656a31e8fdf4; 2014-08-12T04:58:10+08:00)
    Maven home: /usr/local/apache-maven-3.2.3
    Java version: 1.7.0_55, vendor: Oracle Corporation
    Java home: /usr/local/java/jdk1.7.0_55/jre
    Default locale: en_US, platform encoding: ANSI_X3.4-1968
    OS name: "linux", version: "2.6.32-220.el6.x86_64", arch: "amd64", family: "unix"
3. Install protobuf

Without protobuf the Hadoop build fails partway through, with output like this:

[INFO] --- hadoop-maven-plugins:2.4.0:protoc (compile-protoc) @ hadoop-common ---
[WARNING] [protoc, --version] failed: java.io.IOException: Cannot run program "protoc": error=2, No such file or directory
[ERROR] stdout: []
...
[INFO] Apache Hadoop Main ................................ SUCCESS [5.672s]
[INFO] Apache Hadoop Project POM ......................... SUCCESS [3.682s]
[INFO] Apache Hadoop Annotations ......................... SUCCESS [8.921s]
[INFO] Apache Hadoop Assemblies .......................... SUCCESS [0.676s]
[INFO] Apache Hadoop Project Dist POM .................... SUCCESS [4.590s]
[INFO] Apache Hadoop Maven Plugins ....................... SUCCESS [9.172s]
[INFO] Apache Hadoop Auth ................................ SUCCESS [10.123s]
[INFO] Apache Hadoop Auth Examples ....................... SUCCESS [5.170s]
[INFO] Apache Hadoop Common .............................. FAILURE [1.224s]
[INFO] Apache Hadoop NFS ................................. SKIPPED
[INFO] Apache Hadoop Common Project ...................... SKIPPED
[INFO] Apache Hadoop HDFS ................................ SKIPPED
[INFO] Apache Hadoop HttpFS .............................. SKIPPED
[INFO] Apache Hadoop HDFS BookKeeper Journal ............. SKIPPED
[INFO] Apache Hadoop HDFS-NFS ............................ SKIPPED
[INFO] Apache Hadoop HDFS Project ........................ SKIPPED
Installing protobuf:

Download:

    root@VM_160_34_centos:~/tools> wget -c https://protobuf.googlecode.com/files/protobuf-2.5.0.tar.gz

Extract and enter the directory:

    root@VM_160_34_centos:~/tools> tar -xvzf protobuf-2.5.0.tar.gz 
root@VM_160_34_centos:~/tools> cd protobuf-2.5.0

Then run the standard build-and-install sequence:

./configure
make
make check
make install

Verify the installation:

root@VM_160_34_centos:~/tools/release-2.4.0> protoc --version
protoc: error while loading shared libraries: libprotobuf.so.8: cannot open shared object file: No such file or directory

This fails: the dynamic linker cannot find /usr/local/lib. Fix it by extending LD_LIBRARY_PATH:
     
root@VM_160_34_centos:~/tools/release-2.4.0> cat >> /etc/profile.d/protoc-development.sh << 'end'
> export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:/usr/local/lib
> end
root@VM_160_34_centos:~/tools/release-2.4.0> source /etc/profile
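An alternative to extending LD_LIBRARY_PATH is to register /usr/local/lib with the dynamic linker cache, which applies system-wide without touching profile scripts. A config sketch; it needs root, and the drop-in filename is invented here:

```shell
# Tell the dynamic linker to search /usr/local/lib (run as root).
echo "/usr/local/lib" > /etc/ld.so.conf.d/usr-local-lib.conf
ldconfig   # rebuild the shared-library cache
```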

Check again:

root@VM_160_34_centos:~/tools/release-2.4.0> protoc --version
libprotoc 2.5.0
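Since building Hadoop 2.4.x requires protobuf 2.5.0 specifically, a quick guard before starting the long Maven build can save time. A sketch; `protoc_version_of` is a helper name invented here:

```shell
# Extract the version number from `protoc --version` output, e.g. "libprotoc 2.5.0".
protoc_version_of() {
  echo "$1" | awk '{print $2}'
}

# Before building, e.g.:
#   [ "$(protoc_version_of "$(protoc --version)")" = "2.5.0" ] || echo "wrong protoc version"
```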

Check out the source code again:

svn checkout http://svn.apache.org/repos/asf/hadoop/common/tags/release-2.4.0/

Add the native profile to the build; Maven will then compile native libraries for the current OS architecture:

mvn package -Pdist,native -DskipTests -Dtar

Verify the result:

    root@VM_160_34_centos:~/tools/release-2.4.0>cd hadoop-dist/target/hadoop-2.4.0/lib/native
    root@VM_160_34_centos:~/tools/release-2.4.0/hadoop-dist/target/hadoop-2.4.0/lib/native> file libhadoop.so.1.0.0 
    libhadoop.so.1.0.0: ELF 64-bit LSB shared object, x86-64, version 1 (SYSV), dynamically linked, not stripped

The directory hadoop-dist/target also now contains hadoop-2.4.0.tar.gz, which can be reused directly from now on.
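To upgrade an existing installation in place rather than re-deploying the whole tarball, the freshly built libraries can be copied over the bundled 32-bit ones, keeping a backup. A sketch; `replace_native` is a name invented here and the paths are the ones used in this post:

```shell
# Back up the old native-library directory, then copy the new libs over it.
replace_native() {
  src=$1    # e.g. hadoop-dist/target/hadoop-2.4.0/lib/native
  dst=$2    # e.g. /usr/local/hadoop-2.4.0/lib/native
  cp -a "$dst" "${dst}.bak-32bit"   # keep the 32-bit originals, just in case
  cp -Rf "$src"/. "$dst"/
}
```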

      

Credit: http://www.kankanews.com/ICkengine/archives/81648.shtml

Original post: https://www.cnblogs.com/mjorcen/p/3930808.html