  • Compiling the Hadoop 2.4 source on CentOS 6.4

    4.1 Environment:

    1) 64-bit Linux, CentOS 6.4, running in a VMware virtual machine

    2) The virtual machine has Internet access

    4.2 Official build instructions:

    Unpack the source: tar -zxvf hadoop-2.4.0-src.tar.gz

    Then change into the extracted directory and read BUILDING.txt (for example with more BUILDING.txt; the space bar pages down). Its contents include:

    Requirements:

    * Unix System

    * JDK 1.6+

    * Maven 3.0 or later

    * Findbugs 1.3.9 (if running findbugs)

    * ProtocolBuffer 2.5.0

    * CMake 2.6 or newer (if compiling native code)

    * Internet connection for first build (to fetch all Maven and Hadoop dependencies)

    ----------------------------------------------------------------------------------

    Maven main modules:

      hadoop (Main Hadoop project)

             - hadoop-project (Parent POM for all Hadoop Maven modules. )

                                        (All plugins & dependencies versions are defined here.)

             - hadoop-project-dist (Parent POM for modules that generate distributions.)

             - hadoop-annotations (Generates the Hadoop doclet used to generate the Java docs)

             - hadoop-assemblies (Maven assemblies used by the different modules)

             - hadoop-common-project (Hadoop Common)

             - hadoop-hdfs-project (Hadoop HDFS)

             - hadoop-mapreduce-project (Hadoop MapReduce)

             - hadoop-tools (Hadoop tools like Streaming, Distcp, etc.)

             - hadoop-dist (Hadoop distribution assembler)

    ----------------------------------------------------------------------------------
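
    Before starting, it can save time to check which of the prerequisites listed above are already installed. A minimal sketch (not part of BUILDING.txt) that reports which tools are on PATH:

    ```shell
    # Report which prerequisite build tools are already available on PATH.
    check_tool() {
      if command -v "$1" >/dev/null 2>&1; then
        echo "OK: $1"
      else
        echo "MISSING: $1"
      fi
    }

    for t in java mvn protoc cmake findbugs; do
      check_tool "$t"
    done
    ```

    Anything reported MISSING should be installed per section 4.3 before running the build.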

    After the build finishes, you can verify the compiled native Hadoop library:

    libhadoop.so.1.0.0: ELF 64-bit LSB shared object, x86-64, version 1 (SYSV), dynamically linked, not stripped 
    [root@centos native]# pwd 
    /opt/hadoop-2.4.0-src/hadoop-dist/target/hadoop-2.4.0/lib/native 
    [root@centos native]#

    4.3 Pre-build setup: installing dependencies

    Install the Linux system packages:

    • yum install autoconf automake libtool cmake
    • yum install ncurses-devel
    • yum install openssl-devel
    • yum install lzo-devel zlib-devel gcc gcc-c++

    Install Maven

    • Download: apache-maven-3.0.5-bin.tar.gz
    • Unpack: tar -zxvf apache-maven-3.0.5-bin.tar.gz
    • Set environment variables by adding the following to /etc/profile:
      • export MAVEN_HOME=/opt/apache-maven-3.0.5
      • export PATH=$PATH:$MAVEN_HOME/bin
    • Apply the changes: source /etc/profile (or . /etc/profile)
    • Verify: mvn -v

    Install protobuf (as root)
    • Unpack: tar -zxvf protobuf-2.5.0.tar.gz
    • Enter the source directory and configure it: ./configure
    • Build and install: make && make check && make install
    • Verify: protoc --version

    vi /etc/profile  

    export PROTOC_HOME=/opt/protobuf-2.5.0

    export PATH=$PATH:$PROTOC_HOME/src

    Then:

    $ protoc --version

    libprotoc 2.5.0

    Install findbugs
    • Unpack: tar -zxvf findbugs.tar.gz
    • Set environment variables:
    • vi /etc/profile
    • export FINDBUGS_HOME=/opt/findbugs-3.0.0
    • export PATH=$PATH:$FINDBUGS_HOME/bin
    • Verify: findbugs -version
    Install Java
    After downloading the rpm package, run rpm -ivh jre-7u71-linux-x64.rpm. Once installation finishes:
    [root@centos ~]# java -version 
    java version "1.7.0_71" 
    Java(TM) SE Runtime Environment (build 1.7.0_71-b14) 
    Java HotSpot(TM) 64-Bit Server VM (build 24.71-b01, mixed mode) 
    [root@centos ~]# javac -version 
    javac 1.7.0_71 
     
    Note

    Hadoop is written in Java, but it will not build against the OpenJDK preinstalled on many Linux systems, so install a JDK (1.6 or later) before building Hadoop.
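
If Maven still picks up the wrong JDK after installation, pointing JAVA_HOME at the new install usually fixes it. A sketch using the install path from this walkthrough (adjust if the rpm placed your JDK elsewhere):

```shell
# Point the build at the installed Oracle JDK; the path below matches
# the /etc/profile shown later in this article — adjust for your system.
export JAVA_HOME=/usr/lib/jvm/jdk1.7.0_71
export PATH="$JAVA_HOME/bin:$PATH"
echo "$JAVA_HOME"
```

After re-sourcing /etc/profile, java -version and mvn -v should both report the Oracle JDK.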


    4.4 How to build

    Enter the Hadoop source directory /opt/hadoop-2.4.0-src and run one of the following commands (bracketed profiles are optional):

    Building distributions:

    Create binary distribution without native code and without documentation:

      $ mvn package -Pdist -DskipTests -Dtar

    Create binary distribution with native code and with documentation:

      $ mvn package -Pdist,native,docs -DskipTests -Dtar

    Create source distribution:

      $ mvn package -Psrc -DskipTests

    Create source and binary distributions with native code and documentation:

      $ mvn -e -X package -Pdist,native[,docs,src] -DskipTests -Dtar

    Create a local staging version of the website (in /tmp/hadoop-site)

      $ mvn clean site; mvn site:stage -DstagingDirectory=/tmp/hadoop-site

    4.5 Before building, you may need to configure a Maven mirror for mainland China
    1. Enter the configuration directory /opt/modules/apache-maven-3.0.5/conf and edit the settings.xml file

    * Inside <mirrors>, add:

    <mirror>
      <id>nexus-osc</id>
      <mirrorOf>*</mirrorOf>
      <name>Nexus osc</name>
      <url>http://maven.oschina.net/content/groups/public/</url>
    </mirror>

    * Inside <profiles>, add:

    <profile>
      <id>jdk-1.6</id>
      <activation>
        <jdk>1.6</jdk>
      </activation>
      <repositories>
        <repository>
          <id>nexus</id>
          <name>local private nexus</name>
          <url>http://maven.oschina.net/content/groups/public/</url>
          <releases>
            <enabled>true</enabled>
          </releases>
          <snapshots>
            <enabled>false</enabled>
          </snapshots>
        </repository>
      </repositories>
      <pluginRepositories>
        <pluginRepository>
          <id>nexus</id>
          <name>local private nexus</name>
          <url>http://maven.oschina.net/content/groups/public/</url>
          <releases>
            <enabled>true</enabled>
          </releases>
          <snapshots>
            <enabled>false</enabled>
          </snapshots>
        </pluginRepository>
      </pluginRepositories>
    </profile>

    Copy the configuration

    Copy this configuration file into the user's home directory so that every Maven invocation picks it up.

    * Check whether the .m2 folder exists in the user's home directory (/home/hadoop); if not, create it:

    $ cd /home/hadoop

    $ mkdir .m2

    * Copy the file:

    $ cp /opt/modules/apache-maven-3.0.5/conf/settings.xml ~/.m2/
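
    The copy steps above can be wrapped in one re-runnable script (paths mirror the text; the guard skips the copy if Maven is installed elsewhere):

    ```shell
    # Install the customized settings.xml into the per-user Maven config dir.
    MAVEN_SETTINGS=/opt/modules/apache-maven-3.0.5/conf/settings.xml

    mkdir -p "$HOME/.m2"                       # create ~/.m2 if it is missing
    if [ -f "$MAVEN_SETTINGS" ]; then
      cp "$MAVEN_SETTINGS" "$HOME/.m2/settings.xml"
    fi
    ```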

    4.6 Configure DNS

    Edit: vi /etc/resolv.conf

    nameserver 8.8.8.8

    nameserver 8.8.4.4
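
    A sketch that stages the change in a scratch file first, so it can be reviewed before replacing /etc/resolv.conf (the final copy still requires root):

    ```shell
    # Write the new resolver config to a staging file; as root you would then
    # install it with: cp /tmp/resolv.conf.new /etc/resolv.conf
    cat > /tmp/resolv.conf.new <<'EOF'
    nameserver 8.8.8.8
    nameserver 8.8.4.4
    EOF

    cat /tmp/resolv.conf.new
    ```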

    4.7 Importing the Hadoop project into Eclipse

    Importing projects to eclipse

    When you import the project to eclipse, install hadoop-maven-plugins first.

      $ cd hadoop-maven-plugins

      $ mvn install

    Then, generate eclipse project files.

      $ mvn eclipse:eclipse -DskipTests

    At last, import to eclipse by specifying the root directory of the project via

    [File] > [Import] > [Existing Projects into Workspace].

     

    Error: Failed to execute goal org.apache.maven.plugins:maven-antrun-plugin:1.6:run (site) on project hadoop-hdfs: An Ant BuildException has occured: input file /opt/hadoop-2.4.0-src/hadoop-hdfs-project/hadoop-hdfs/target/findbugsXml.xml does not exist
    Solution:

    cd ~/hadoop-2.4.0-src/
    mvn clean package -Pdist,native,docs -DskipTests -Dtar

    If the build fails partway, you can fix the problem and resume from a specific module by changing the last argument. If you hit "hadoop-hdfs/target/findbugsXml.xml does not exist", drop the docs profile from the command and resume:

    mvn package -Pdist,native -DskipTests -Dtar -rf :hadoop-pipes

     
    After a successful build, go to /opt/hadoop-2.4.0-src/hadoop-dist/target; the hadoop-2.4.0.tar.gz there is the finished tarball.
     
    Error: Could not find goal 'protoc' in plugin org.apache.hadoop:hadoop-maven-plugins:2.2.0 among available 
    Solution: add the following to /etc/profile, then run source /etc/profile
    export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:/opt/protobuf/lib
    export PATH=$PATH:/usr/local/bin
    It is generally recommended to install protobuf under /usr/local by passing --prefix=/usr/local/protobuf to configure. If errors occur, run make clean and repeat the steps.
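
Assuming protobuf was configured with --prefix=/usr/local/protobuf as suggested above, the matching environment settings would be (the configure/make/make install itself runs as root and is shown only in the comment):

```shell
# Environment for a protobuf installed via:
#   ./configure --prefix=/usr/local/protobuf && make && make install  (as root)
export PATH="/usr/local/protobuf/bin:$PATH"
export LD_LIBRARY_PATH="/usr/local/protobuf/lib:${LD_LIBRARY_PATH:-}"

# With the bin directory on PATH, `protoc --version` should now resolve.
echo "$PATH" | grep -q "/usr/local/protobuf/bin" && echo "PATH configured"
```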
     
    The contents of my /etc/profile:
    export MAVEN_HOME=/opt/apache-maven-3.0.5
    export PATH=$PATH:$MAVEN_HOME/bin
    export JAVA_HOME=/usr/lib/jvm/jdk1.7.0_71/
    export JRE_HOME=/usr/lib/jvm/jdk1.7.0_71/jre
    export ANT_HOME=/usr/lib/jvm/apache-ant/
    export CLASSPATH=.:$JRE_HOME/lib/rt.jar:$JAVA_HOME/lib/dt.jar:$JAVA_HOME/lib/tools.jar
    export PATH=$PATH:$JAVA_HOME/bin:$JRE_HOME/bin:$ANT_HOME/bin
    export FINDBUGS_HOME=/opt/findbugs-3.0.0
    export PATH=$PATH:$FINDBUGS_HOME/bin:/opt/protoc/bin
    export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:/opt/protoc/lib

    export PROTOC_HOME=/home/hadoop/protobuf-2.5.0
    export PATH=${PATH}:${FINDBUGS_HOME}/bin:$PROTOC_HOME/src

    The build completes with the following output:

    [INFO] ------------------------------------------------------------------------
    [INFO] Reactor Summary:
    [INFO]
    [INFO] Apache Hadoop Main ................................. SUCCESS [ 0.923 s]
    [INFO] Apache Hadoop Project POM .......................... SUCCESS [ 0.734 s]
    [INFO] Apache Hadoop Annotations .......................... SUCCESS [ 2.009 s]
    [INFO] Apache Hadoop Assemblies ........................... SUCCESS [ 0.416 s]
    [INFO] Apache Hadoop Project Dist POM ..................... SUCCESS [ 3.871 s]
    [INFO] Apache Hadoop Maven Plugins ........................ SUCCESS [ 3.672 s]
    [INFO] Apache Hadoop MiniKDC .............................. SUCCESS [ 2.528 s]
    [INFO] Apache Hadoop Auth ................................. SUCCESS [ 17.347 s]
    [INFO] Apache Hadoop Auth Examples ........................ SUCCESS [ 3.163 s]
    [INFO] Apache Hadoop Common ............................... SUCCESS [03:46 min]
    [INFO] Apache Hadoop NFS .................................. SUCCESS [ 11.383 s]
    [INFO] Apache Hadoop Common Project ....................... SUCCESS [ 0.032 s]
    [INFO] Apache Hadoop HDFS ................................. SUCCESS [08:17 min]
    [INFO] Apache Hadoop HttpFS ............................... SUCCESS [04:10 min]
    [INFO] Apache Hadoop HDFS BookKeeper Journal .............. SUCCESS [ 27.153 s]
    [INFO] Apache Hadoop HDFS-NFS ............................. SUCCESS [ 4.014 s]
    [INFO] Apache Hadoop HDFS Project ......................... SUCCESS [ 0.076 s]
    [INFO] hadoop-yarn ........................................ SUCCESS [ 0.074 s]
    [INFO] hadoop-yarn-api .................................... SUCCESS [ 55.567 s]
    [INFO] hadoop-yarn-common ................................. SUCCESS [ 30.243 s]
    [INFO] hadoop-yarn-server ................................. SUCCESS [ 0.027 s]
    [INFO] hadoop-yarn-server-common .......................... SUCCESS [ 8.851 s]
    [INFO] hadoop-yarn-server-nodemanager ..................... SUCCESS [ 33.811 s]
    [INFO] hadoop-yarn-server-web-proxy ....................... SUCCESS [ 3.315 s]
    [INFO] hadoop-yarn-server-applicationhistoryservice ....... SUCCESS [ 8.813 s]
    [INFO] hadoop-yarn-server-resourcemanager ................. SUCCESS [ 12.100 s]
    [INFO] hadoop-yarn-server-tests ........................... SUCCESS [ 0.343 s]
    [INFO] hadoop-yarn-client ................................. SUCCESS [ 4.797 s]
    [INFO] hadoop-yarn-applications ........................... SUCCESS [ 0.027 s]
    [INFO] hadoop-yarn-applications-distributedshell .......... SUCCESS [ 3.495 s]
    [INFO] hadoop-yarn-applications-unmanaged-am-launcher ..... SUCCESS [ 2.208 s]
    [INFO] hadoop-yarn-site ................................... SUCCESS [ 0.038 s]
    [INFO] hadoop-yarn-project ................................ SUCCESS [ 6.086 s]
    [INFO] hadoop-mapreduce-client ............................ SUCCESS [ 0.125 s]
    [INFO] hadoop-mapreduce-client-core ....................... SUCCESS [ 18.008 s]
    [INFO] hadoop-mapreduce-client-common ..................... SUCCESS [ 14.628 s]
    [INFO] hadoop-mapreduce-client-shuffle .................... SUCCESS [ 3.223 s]
    [INFO] hadoop-mapreduce-client-app ........................ SUCCESS [ 9.358 s]
    [INFO] hadoop-mapreduce-client-hs ......................... SUCCESS [ 8.184 s]
    [INFO] hadoop-mapreduce-client-jobclient .................. SUCCESS [ 12.318 s]
    [INFO] hadoop-mapreduce-client-hs-plugins ................. SUCCESS [ 1.600 s]
    [INFO] Apache Hadoop MapReduce Examples ................... SUCCESS [ 5.915 s]
    [INFO] hadoop-mapreduce ................................... SUCCESS [ 4.150 s]
    [INFO] Apache Hadoop MapReduce Streaming .................. SUCCESS [ 15.438 s]
    [INFO] Apache Hadoop Distributed Copy ..................... SUCCESS [ 7.712 s]
    [INFO] Apache Hadoop Archives ............................. SUCCESS [ 2.838 s]
    [INFO] Apache Hadoop Rumen ................................ SUCCESS [ 6.190 s]
    [INFO] Apache Hadoop Gridmix .............................. SUCCESS [ 4.524 s]
    [INFO] Apache Hadoop Data Join ............................ SUCCESS [ 3.694 s]
    [INFO] Apache Hadoop Extras ............................... SUCCESS [ 3.687 s]
    [INFO] Apache Hadoop Pipes ................................ SUCCESS [ 0.023 s]
    [INFO] Apache Hadoop OpenStack support .................... SUCCESS [ 6.197 s]
    [INFO] Apache Hadoop Client ............................... SUCCESS [ 7.037 s]
    [INFO] Apache Hadoop Mini-Cluster ......................... SUCCESS [ 0.072 s]
    [INFO] Apache Hadoop Scheduler Load Simulator ............. SUCCESS [ 25.116 s]
    [INFO] Apache Hadoop Tools Dist ........................... SUCCESS [ 6.242 s]
    [INFO] Apache Hadoop Tools ................................ SUCCESS [ 0.023 s]
    [INFO] Apache Hadoop Distribution ......................... SUCCESS [ 46.024 s]
    [INFO] ------------------------------------------------------------------------
    [INFO] BUILD SUCCESS
    [INFO] ------------------------------------------------------------------------
    [INFO] Total time: 23:56 min
    [INFO] Finished at: 2016-02-17T16:34:55+08:00
    [INFO] Final Memory: 221M/6213M
    [INFO] ------------------------------------------------------------------------

    When the build completes, a new hadoop-dist folder appears in the source directory; its target subfolder holds the build output:

    total 599636

    drwxrwxr-x. 2 hadoop hadoop      4096 Feb 17 16:34 antrun
    -rw-rw-r--. 1 hadoop hadoop      1625 Feb 17 16:34 dist-layout-stitching.sh
    -rw-rw-r--. 1 hadoop hadoop       642 Feb 17 16:34 dist-tar-stitching.sh
    drwxrwxr-x. 8 hadoop hadoop      4096 Feb 17 16:34 hadoop-2.4.0
    -rw-rw-r--. 1 hadoop hadoop 202726573 Feb 17 16:34 hadoop-2.4.0.tar.gz
    -rw-rw-r--. 1 hadoop hadoop      2746 Feb 17 16:34 hadoop-dist-2.4.0.jar
    -rw-rw-r--. 1 hadoop hadoop 411264073 Feb 17 16:34 hadoop-dist-2.4.0-javadoc.jar
    drwxrwxr-x. 2 hadoop hadoop      4096 Feb 17 16:34 javadoc-bundle-options
    drwxrwxr-x. 2 hadoop hadoop      4096 Feb 17 16:34 maven-archiver
    drwxrwxr-x. 2 hadoop hadoop      4096 Feb 17 16:34 test-dir

  • Original article: https://www.cnblogs.com/hd-zg/p/5195939.html