  • Compiling Hadoop 2.4.0 on CentOS 6.4

     
    When I set up the Hadoop environment, the system image was emi-centos-6.4-x86_64, which is 64-bit, while the Hadoop distribution ships with 32-bit native libraries by default. As a result, many operations trigger this warning: (Java HotSpot(TM) 64-Bit Server VM warning: You have loaded library /opt/hadoop-2.4.0/lib/native/libhadoop.so.1.0.0 which might have disabled stack guard. The VM will try to fix the stack guard now.)
    To fix this, Hadoop must be recompiled, and the generated hadoop-2.4.0-src/hadoop-dist/target/hadoop-2.4.0/lib/native must then overwrite /opt/hadoop-2.4.0/lib/native.
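    Before rebuilding, the mismatch can be confirmed directly with `file`: a stock Apache 2.4.0 tarball typically reports a 32-bit ELF shared object, while `uname -m` reports x86_64 (the library path below is the one from the warning):

```shell
# Inspect the bitness of the shipped native library; on an unpatched
# download this prints something like "ELF 32-bit LSB shared object".
file /opt/hadoop-2.4.0/lib/native/libhadoop.so.1.0.0

# The OS architecture, for comparison; x86_64 on this system.
uname -m
```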

    The detailed build steps are as follows:

    1. Install the following packages
    [root@hd1 software]# yum install lzo-devel zlib-devel gcc autoconf automake libtool ncurses-devel openssl-devel

    2. Install Maven

    [hxiaolong@hd1 software]$ wget http://mirror.esocc.com/apache/maven/maven-3/3.0.5/binaries/apache-maven-3.0.5-bin.tar.gz
    [hxiaolong@hd1 software]$ tar zxf apache-maven-3.0.5-bin.tar.gz -C /opt
    
    [hxiaolong@hd1 software]$ vi /etc/profile
    export MAVEN_HOME=/opt/apache-maven-3.0.5
    export PATH=$PATH:$MAVEN_HOME/bin

    3. Install Ant

    [hxiaolong@hd1 software]$ wget http://mirror.bit.edu.cn/apache/ant/binaries/apache-ant-1.9.4-bin.tar.gz
    [hxiaolong@hd1 software]$ tar zxf apache-ant-1.9.4-bin.tar.gz -C /opt
    
    [hxiaolong@hd1 software]$ vi /etc/profile
    export ANT_HOME=/opt/apache-ant-1.9.4
    export PATH=$PATH:$ANT_HOME/bin
    4. Install Findbugs
    [hxiaolong@hd1 software]$ wget http://prdownloads.sourceforge.net/findbugs/findbugs-2.0.3.tar.gz?download
    [hxiaolong@hd1 software]$ tar zxf findbugs-2.0.3.tar.gz -C /opt
    
    [hxiaolong@hd1 software]$ vi /etc/profile
    export FINDBUGS_HOME=/opt/findbugs-2.0.3
    export PATH=$PATH:$FINDBUGS_HOME/bin
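    The /etc/profile edits above only take effect in a new login shell (or after `source /etc/profile`). A quick sanity check, using the install paths from the steps above:

```shell
# Verify that each tool's bin directory actually made it onto PATH
for dir in /opt/apache-maven-3.0.5/bin /opt/apache-ant-1.9.4/bin /opt/findbugs-2.0.3/bin; do
  case ":$PATH:" in
    *":$dir:"*) echo "on PATH: $dir" ;;
    *)          echo "MISSING: $dir" ;;
  esac
done

# Each tool should then print its version
mvn -version && ant -version && findbugs -version
```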
    5. Install protobuf
    [hxiaolong@hd1 software]$ tar zxf protobuf-2.5.0.tar.gz
    [hxiaolong@hd1 software]$ cd protobuf-2.5.0
    [hxiaolong@hd1 protobuf-2.5.0]$ ./configure
    [hxiaolong@hd1 protobuf-2.5.0]$ make
    [hxiaolong@hd1 protobuf-2.5.0]$ make install
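    The Hadoop 2.4.0 build requires protoc 2.5.0 specifically, so it is worth verifying the install before starting Maven. A small check (note `make install` defaults to /usr/local, so a missing binary usually means a PATH or linker-cache issue):

```shell
# protoc should print "libprotoc 2.5.0"; if it is found but fails with a
# shared-library error, refreshing the linker cache (ldconfig) usually helps.
if command -v protoc >/dev/null 2>&1; then
  protoc --version
else
  echo "protoc not on PATH (try: sudo ldconfig, or add /usr/local/bin to PATH)"
fi
```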

     Honestly, building and installing from source like this is quite a hassle, and it is easy to run into all kinds of dependency problems. Installing via yum is recommended instead.

    [root@hd1 protobuf-2.5.0]# yum install protobuf
    6. Build Hadoop

    1) Build Hadoop on the name node first
    [hxiaolong@hd1 software]$ wget http://mirrors.cnnic.cn/apache/hadoop/common/hadoop-2.4.0/hadoop-2.4.0-src.tar.gz
    [hxiaolong@hd1 software]$ tar zxf hadoop-2.4.0-src.tar.gz
    [hxiaolong@hd1 software]$ cd hadoop-2.4.0-src
    
    [hxiaolong@hd1 hadoop-2.4.0-src]$ mvn package -DskipTests -Pdist,native -Dtar

     The build failed partway through with the following error:

    [ERROR] Failed to execute goal org.apache.maven.plugins:maven-antrun-plugin:1.7:run (make) on project hadoop-common: An Ant BuildException has occured: Execute failed: java.io.IOException: Cannot run program "cmake" (in directory "/home/hxiaolong/software/hadoop-2.4.0-src/hadoop-common-project/hadoop-common/target/native"): error=2, No such file or directory
    [ERROR] around Ant part ...<exec dir="/home/hxiaolong/software/hadoop-2.4.0-src/hadoop-common-project/hadoop-common/target/native" executable="cmake" failonerror="true">... @ 4:145 in /home/hxiaolong/software/hadoop-2.4.0-src/hadoop-common-project/hadoop-common/target/antrun/build-main.xml
    [ERROR] -> [Help 1]
    [ERROR]
    [ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
    [ERROR] Re-run Maven using the -X switch to enable full debug logging.
    [ERROR]
    [ERROR] For more information about the errors and possible solutions, please read the following articles:
    [ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
    [ERROR]
    [ERROR] After correcting the problems, you can resume the build with the command
    [ERROR]   mvn <goals> -rf :hadoop-common
    A quick search showed the cause: cmake was not installed. Install it and try again.
    [root@hd1 hadoop-2.4.0-src]# yum install cmake

     Rebuilt, and this time it finally succeeded.

    [hxiaolong@hd1 hadoop-2.4.0-src]$ mvn package -DskipTests -Pdist,native -Dtar
    
    main:
         [exec] $ tar cf hadoop-2.4.0.tar hadoop-2.4.0
         [exec] $ gzip -f hadoop-2.4.0.tar
         [exec]
         [exec] Hadoop dist tar available at: /home/hxiaolong/software/hadoop-2.4.0-src/hadoop-dist/target/hadoop-2.4.0.tar.gz
        
    [INFO] BUILD SUCCESS
    [INFO] ------------------------------------------------------------------------
    [INFO] Total time: 12:41.833s
    [INFO] Finished at: Wed Jul 23 03:01:18 UTC 2014
    [INFO] Final Memory: 159M/646M
    [INFO] ------------------------------------------------------------------------

    2) Copy the compiled native directory into /opt/hadoop-2.4.0/lib/
    [hxiaolong@hd1 lib]$ rm -rf /opt/hadoop-2.4.0/lib/native
    [hxiaolong@hd1 lib]$ cp -R /home/hxiaolong/software/hadoop-2.4.0-src/hadoop-dist/target/hadoop-2.4.0/lib/native /opt/hadoop-2.4.0/lib/

     This is a critically important step.


    3) scp the compiled native directory to the other nodes
    [root@hd1 lib]# scp -r /home/hxiaolong/software/hadoop-2.4.0-src/hadoop-dist/target/hadoop-2.4.0/lib/native/ hd2:/opt/hadoop-2.4.0/lib/ 
    [root@hd1 lib]# scp -r /home/hxiaolong/software/hadoop-2.4.0-src/hadoop-dist/target/hadoop-2.4.0/lib/native/ hd3:/opt/hadoop-2.4.0/lib/

     If the recompiled native directory is not synced to the other nodes, they will hit exactly the same problem.
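    With more worker nodes, a small loop keeps this sync step from being copy-pasted per host (hostnames hd2/hd3 as above):

```shell
# Push the rebuilt native libraries to every other node in the cluster
NATIVE_DIR=/home/hxiaolong/software/hadoop-2.4.0-src/hadoop-dist/target/hadoop-2.4.0/lib/native
for host in hd2 hd3; do
  scp -r "$NATIVE_DIR" "$host":/opt/hadoop-2.4.0/lib/
done
```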


    4) Verify
    [hxiaolong@hd2 native]$ hadoop fs -ls /
    Found 1 items
    drwxr-xr-x   - hxiaolong supergroup          0 2014-07-23 05:21 /input
    OK, the warning no longer appears.
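    A more direct check than listing HDFS is `hadoop checknative`, which reports whether the native libraries actually loaded; it should be available in a 2.4.0 install, and is worth running on each node:

```shell
# For a successful rebuild, the 'hadoop' line should read 'true' followed by
# a path under /opt/hadoop-2.4.0/lib/native
hadoop checknative -a
```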
    Incidentally, I noticed today that the mirror mirror.bit.edu.cn is quite fast and fairly stable. It turns out to be run by Beijing Institute of Technology. Nice!
     
  • Original post: https://www.cnblogs.com/toughhou/p/3864273.html