  • Compiling Hadoop 2.6.1 from Source for 64-bit Systems

    I. The Problem

        The Hadoop native libraries provided on the Apache website are 32-bit builds, so running them on a 64-bit Linux server causes a problem.

        When we run a hadoop command on a 64-bit server, the following warning is reported:

        WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable

        The reason is that the official binary tarball (hadoop-2.6.1.tar.gz) was compiled on a 32-bit machine, so a 64-bit machine reports an error when loading the native .so libraries; usage itself is not affected.

    To resolve this, we need to compile a 64-bit build of Hadoop ourselves.
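
    To see which architecture the native library bundled with a binary release was built for, you can inspect it with the file command (the path below assumes the pre-built Hadoop was unpacked under /opt/hadoop-2.6.1; adjust it to your installation):

    file /opt/hadoop-2.6.1/lib/native/libhadoop.so.1.0.0
    # A 32-bit build reports:  ELF 32-bit LSB shared object, Intel 80386 ...
    # A 64-bit build reports:  ELF 64-bit LSB shared object, x86-64 ...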

    II. Software Required to Compile Hadoop 2.6.1

    1. JDK 1.7

    2. gcc 4.4.5 and gcc-c++

    3. Maven 3.3.3

    4. protobuf 2.5.0 (Google's serialization library)

    5. CMake 2.8.12.2

    6. make

    7. Ant 1.9.6

    8. FindBugs (optional)

    Note:
    FindBugs is not required for the build and does not have to be downloaded.

    III. Preparing the Build Tools

    1. Installing the JDK

    • Extract the archive: tar -zxvf jdk-7u79-linux-x64.tar.gz

    • Configure environment variables by editing /etc/profile:

    • export JAVA_HOME=/opt/jdk1.7.0_79

    • export CLASSPATH=.:$JAVA_HOME/jre/lib/rt.jar:$JAVA_HOME/lib/dt.jar:$JAVA_HOME/lib/tools.jar

    • export PATH=$PATH:$JAVA_HOME/bin

    • source /etc/profile

    • java -version to check that the JDK was installed successfully.

    2. Installing gcc

    Most Linux distributions already ship with gcc, so before installing, check whether it is present on the server.
    Run: gcc -v
    If version information is printed, gcc is already installed; otherwise install it, together with the C++ compiler, using:

    yum install gcc

    yum install gcc-c++
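
    Note that having gcc does not guarantee that the C++ compiler is present, and the native Hadoop code needs both. A quick way to check the two at once (a "command not found" message means the corresponding yum package above still needs to be installed):

    gcc --version
    g++ --version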

    3. Installing Maven

    • Extract the archive: tar -zxvf apache-maven-3.3.3-bin.tar.gz

    • Configure environment variables by editing /etc/profile (MAVEN_HOME must be set before it is added to PATH; the path below assumes Maven was extracted to /opt):

    • export MAVEN_HOME=/opt/apache-maven-3.3.3

    • export PATH=$PATH:$JAVA_HOME/bin:$MAVEN_HOME/bin

    • source /etc/profile

    • mvn -version to check that Maven was installed successfully.

    4. Installing protobuf

    • Extract the archive: tar -zxvf protobuf-2.5.0.tar.gz

    • Enter the extracted protobuf directory, e.g.: cd /opt/protobuf-2.5.0/

    • In that directory, run the following commands:

      • ./configure --prefix=/opt/protobuf

      • make

      • make check

      • make install

    • After the build succeeds: export PATH=/opt/protobuf/bin:$PATH

    • protoc --version to check that protoc was installed successfully (expected output shown below).
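
    If the installation succeeded and /opt/protobuf/bin is on the PATH, the version check simply prints the protobuf version:

    protoc --version
    # libprotoc 2.5.0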

    5. Installing CMake

    Install: yum install cmake

    Install: yum install openssl-devel

    Install: yum install ncurses-devel

    Or build from source:

    • tar -zxvf cmake-2.8.12.2.tar.gz

    • Enter the extracted cmake directory, e.g.: cd /opt/cmake-2.8.12.2/

    • In that directory, run the following commands:

      • ./bootstrap

      • make

      • make install

    • cmake -version to check that CMake was installed successfully.

    6. Installing make

    yum install make

    Verify: make --version

    7. Installing Ant

    • Extract the archive: tar -zxvf apache-ant-1.9.6-bin.tar.gz

    • Configure environment variables by editing /etc/profile:

    • export ANT_HOME=/opt/apache-ant-1.9.6

    • export PATH=$PATH:$JAVA_HOME/bin:$MAVEN_HOME/bin:$ANT_HOME/bin

    • source /etc/profile to reload the modified environment variables

    • ant -version to check that Ant was installed successfully.

    8. Installing other required packages

    • Install the autotools:

      • Run: yum install autoconf automake libtool

    9. Add a dependency to the pom.xml in hadoop-common-project (needed for hadoop-2.2.0; not required for hadoop-2.6.x)

    <dependency>
      <groupId>org.mortbay.jetty</groupId>
      <artifactId>jetty-util</artifactId>
      <scope>test</scope>
    </dependency>

    10. Before compiling, guard against java.lang.OutOfMemoryError: Java heap space

    Run the following command in the shell (it only applies to the current session):

    export MAVEN_OPTS="-Xms256m -Xmx512m"
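
    For convenience, the environment variables configured in the steps above can be collected into one snippet and sourced before building (a sketch that assumes the /opt extraction paths used in this guide; adjust them to your actual locations):

    # Build environment for compiling Hadoop 2.6.1 (paths assume the /opt installs above)
    export JAVA_HOME=/opt/jdk1.7.0_79
    export CLASSPATH=.:$JAVA_HOME/jre/lib/rt.jar:$JAVA_HOME/lib/dt.jar:$JAVA_HOME/lib/tools.jar
    export MAVEN_HOME=/opt/apache-maven-3.3.3
    export ANT_HOME=/opt/apache-ant-1.9.6
    export PATH=/opt/protobuf/bin:$PATH:$JAVA_HOME/bin:$MAVEN_HOME/bin:$ANT_HOME/bin
    export MAVEN_OPTS="-Xms256m -Xmx512m"   # keep the Maven JVM from running out of heap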

    IV. Compiling Hadoop 2.6.1

    • Download the hadoop-2.6.1 source package hadoop-2.6.1-src.tar.gz from the Apache website.

    • Extract the source package: tar zxvf hadoop-2.6.1-src.tar.gz

    • Enter the extracted hadoop-2.6.1-src directory: cd /opt/hadoop-2.6.1-src/

    • Run mvn clean package -Pdist,native -DskipTests -Dtar to build. The dist and native profiles produce the binary distribution together with the native libraries, -DskipTests skips the unit tests, and -Dtar also packages the result as a .tar.gz.

    • The build downloads a large number of artifacts, so it takes quite a long time. When every Hadoop module has built successfully, i.e. a long list of SUCCESS lines appears, the compilation is complete.

    • The compiled package hadoop-2.6.1.tar.gz can be found under hadoop-2.6.1-src/hadoop-dist/target/ (a quick way to verify that the native libraries are 64-bit follows below).
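
    To confirm that the freshly built native libraries really are 64-bit, inspect one of them with the file command (a quick sanity check; the path assumes the default layout of the hadoop-dist target directory):

    file hadoop-2.6.1-src/hadoop-dist/target/hadoop-2.6.1/lib/native/libhadoop.so.1.0.0
    # Expected output begins with: ELF 64-bit LSB shared object, x86-64 ...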

    V. Notes

    The build has to download many packages; sometimes, due to network problems, a download is incomplete and the compilation fails.

    Error 1:

        Remote host closed connection during handshake: SSL peer shut down incorrectly.......

    Solution: simply re-run the build; after a few attempts the downloads complete and the build passes.
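
    When restarting after such a download failure, Maven's --resume-from (-rf) option can continue from the module that failed instead of rebuilding everything (the module name below is only an illustrative placeholder; substitute the module reported as failed in your own output):

    mvn package -Pdist,native -DskipTests -Dtar -rf :hadoop-hdfs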

    =====================================================================

    Build log:
    [INFO] Reactor Summary:
    [INFO]
    [INFO] Apache Hadoop Main ................................. SUCCESS [ 13.582 s] 
    [INFO] Apache Hadoop Project POM .......................... SUCCESS [ 9.846 s] 
    [INFO] Apache Hadoop Annotations .......................... SUCCESS [ 24.408 s]
    [INFO] Apache Hadoop Assemblies ........................... SUCCESS [ 1.967 s] 
    [INFO] Apache Hadoop Project Dist POM ..................... SUCCESS [ 6.443 s] 
    [INFO] Apache Hadoop Maven Plugins ........................ SUCCESS [ 20.692 s] 
    [INFO] Apache Hadoop MiniKDC .............................. SUCCESS [ 14.250 s] 
    [INFO] Apache Hadoop Auth ................................. SUCCESS [ 23.716 s] 
    [INFO] Apache Hadoop Auth Examples ........................ SUCCESS [ 13.714 s] 
    [INFO] Apache Hadoop Common ............................... SUCCESS [08:46 min] 
    [INFO] Apache Hadoop NFS .................................. SUCCESS [ 47.127 s] 
    [INFO] Apache Hadoop KMS .................................. SUCCESS [ 48.790 s] 
    [INFO] Apache Hadoop Common Project ....................... SUCCESS [ 0.316 s] 
    [INFO] Apache Hadoop HDFS ................................. SUCCESS [14:58 min] 
    [INFO] Apache Hadoop HttpFS ............................... SUCCESS [11:10 min] 
    [INFO] Apache Hadoop HDFS BookKeeper Journal .............. SUCCESS [01:43 min] 
    [INFO] Apache Hadoop HDFS-NFS ............................. SUCCESS [ 27.438 s] 
    [INFO] Apache Hadoop HDFS Project ......................... SUCCESS [ 0.146 s] 
    [INFO] hadoop-yarn ........................................ SUCCESS [ 0.165 s] 
    [INFO] hadoop-yarn-api .................................... SUCCESS [07:03 min] 
    [INFO] hadoop-yarn-common ................................. SUCCESS [03:31 min] 
    [INFO] hadoop-yarn-server ................................. SUCCESS [ 0.827 s] 
    [INFO] hadoop-yarn-server-common .......................... SUCCESS [01:11 min] 
    [INFO] hadoop-yarn-server-nodemanager ..................... SUCCESS [02:25 min] 
    [INFO] hadoop-yarn-server-web-proxy ....................... SUCCESS [ 17.129 s] 
    [INFO] hadoop-yarn-server-applicationhistoryservice ....... SUCCESS [ 39.350 s] 
    [INFO] hadoop-yarn-server-resourcemanager ................. SUCCESS [01:44 min] 
    [INFO] hadoop-yarn-server-tests ........................... SUCCESS [ 32.941 s] 
    [INFO] hadoop-yarn-client ................................. SUCCESS [ 44.664 s] 
    [INFO] hadoop-yarn-applications ........................... SUCCESS [ 0.197 s] 
    [INFO] hadoop-yarn-applications-distributedshell .......... SUCCESS [ 15.165 s] 
    [INFO] hadoop-yarn-applications-unmanaged-am-launcher ..... SUCCESS [ 9.604 s] 
    [INFO] hadoop-yarn-site ................................... SUCCESS [ 0.149 s] 
    [INFO] hadoop-yarn-registry ............................... SUCCESS [ 31.971 s] 
    [INFO] hadoop-yarn-project ................................ SUCCESS [ 22.195 s] 
    [INFO] hadoop-mapreduce-client ............................ SUCCESS [ 0.673 s] 
    [INFO] hadoop-mapreduce-client-core ....................... SUCCESS [02:08 min] 
    [INFO] hadoop-mapreduce-client-common ..................... SUCCESS [01:38 min] 
    [INFO] hadoop-mapreduce-client-shuffle .................... SUCCESS [ 24.796 s] 
    [INFO] hadoop-mapreduce-client-app ........................ SUCCESS [01:02 min] 
    [INFO] hadoop-mapreduce-client-hs ......................... SUCCESS [ 43.043 s] 
    [INFO] hadoop-mapreduce-client-jobclient .................. SUCCESS [01:09 min] 
    [INFO] hadoop-mapreduce-client-hs-plugins ................. SUCCESS [ 9.662 s] 
    [INFO] Apache Hadoop MapReduce Examples ................... SUCCESS [ 40.439 s] 
    [INFO] hadoop-mapreduce ................................... SUCCESS [ 13.894 s] 
    [INFO] Apache Hadoop MapReduce Streaming .................. SUCCESS [ 32.797 s] 
    [INFO] Apache Hadoop Distributed Copy ..................... SUCCESS [01:00 min] 
    [INFO] Apache Hadoop Archives ............................. SUCCESS [ 11.333 s] 
    [INFO] Apache Hadoop Rumen ................................ SUCCESS [ 35.122 s] 
    [INFO] Apache Hadoop Gridmix .............................. SUCCESS [ 22.939 s] 
    [INFO] Apache Hadoop Data Join ............................ SUCCESS [ 17.568 s] 
    [INFO] Apache Hadoop Ant Tasks ............................ SUCCESS [ 12.339 s] 
    [INFO] Apache Hadoop Extras ............................... SUCCESS [ 18.325 s] 
    [INFO] Apache Hadoop Pipes ................................ SUCCESS [ 27.889 s] 
    [INFO] Apache Hadoop OpenStack support .................... SUCCESS [ 30.148 s] 
    [INFO] Apache Hadoop Amazon Web Services support .......... SUCCESS [01:28 min] 
    [INFO] Apache Hadoop Client ............................... SUCCESS [ 25.086 s] 
    [INFO] Apache Hadoop Mini-Cluster ......................... SUCCESS [ 0.657 s] 
    [INFO] Apache Hadoop Scheduler Load Simulator ............. SUCCESS [ 25.302 s] 
    [INFO] Apache Hadoop Tools Dist ........................... SUCCESS [ 23.268 s] 
    [INFO] Apache Hadoop Tools ................................ SUCCESS [ 0.156 s] 
    [INFO] Apache Hadoop Distribution ......................... SUCCESS [01:06 min] 
    [INFO] ------------------------------------------------------------------------ 
    [INFO] BUILD SUCCESS
    [INFO] ------------------------------------------------------------------------ 
    [INFO] Total time: 01:17 h
    [INFO] Finished at: 2014-12-29T20:45:54-08:00 
    [INFO] Final Memory: 94M/193M
    [INFO] ------------------------------------------------------------------------ 
    [root@Master hadoop-2.6.1-src]#
  • Original article: https://www.cnblogs.com/chgxtony/p/4921632.html