  • Compiling Hadoop from source, step by step: the most concise walkthrough

    Software versions:

    Java: 1.7.0_79
    Hadoop: hadoop-2.6.5-src.tar.gz

    Maven: 3.3.9

    protobuf: 2.5.0

    Unpack each archive with tar -zxvf
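    A sketch of the unpacking step, assuming all three archives were downloaded into /root/compileHadoop (the directory used later in this walkthrough; the exact archive file names are assumptions based on the versions listed above):

```shell
# Archive names are assumptions inferred from the version list; adjust to match
# what you actually downloaded.
cd /root/compileHadoop
tar -zxvf apache-maven-3.3.9-bin.tar.gz
tar -zxvf protobuf-2.5.0.tar.gz
tar -zxvf hadoop-2.6.5-src.tar.gz
```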


    1 Configure the Maven environment variables (append to ~/.bash_profile, then reload):

    export MAVEN_HOME=/root/compileHadoop/maven-3.3.9
    export PATH=$PATH:$MAVEN_HOME/bin
    source ~/.bash_profile

    Check that Maven was installed successfully:


    mvn -version

    [root@bigdatahadoop protobuf-2.5.0]# mvn -version
    Apache Maven 3.3.9 (bb52d8502b132ec0a5a3f4c09453c07478323dc5; 2015-11-11T00:41:47+08:00)
    Maven home: /root/compileHadoop/maven-3.3.9
    Java version: 1.7.0_79, vendor: Oracle Corporation
    Java home: /usr/java/jdk1.7.0_79/jre
    Default locale: en_US, platform encoding: UTF-8
    OS name: "linux", version: "2.6.32-431.el6.x86_64", arch: "amd64", family: "unix"


    2 Build and install protobuf (an install path can be given at configure time)
      cd protobuf-2.5.0
      ./configure 
      make
      make install
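    Since the post notes that configure accepts an install path, here is the same build with an explicit prefix; the prefix value below is an example, not from the original post:

```shell
cd protobuf-2.5.0
# --prefix chooses where "make install" places protoc; example path only.
./configure --prefix=/usr/local/protobuf-2.5.0
make
make install
# Put the freshly built protoc on PATH:
export PATH=$PATH:/usr/local/protobuf-2.5.0/bin
```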

    Verify:  protoc --version

    [root@bigdatahadoop protobuf-2.5.0]# protoc --version
    libprotoc 2.5.0

    Check that a C++ compiler is present, and install one if it is missing:

    yum install  gcc-c++



    3 With the prerequisites in place, start the build (for more build options, see the BUILDING.txt file in the source tree)

    cd hadoop-2.6.5-src

    mvn clean package -Pdist,native -DskipTests -Dtar
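    What each flag does, per BUILDING.txt in the Hadoop source tree:

```shell
# -Pdist,native : activate the "dist" and "native" Maven profiles
#                 (build a distribution and compile the native C/C++ code)
# -DskipTests   : skip the (very long) unit-test run
# -Dtar         : also package the distribution as a .tar.gz
mvn clean package -Pdist,native -DskipTests -Dtar
```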

    --------------------------------

    Error 1

    [INFO] BUILD FAILURE
    [INFO] ------------------------------------------------------------------------
    [INFO] Total time: 07:39 min
    [INFO] Finished at: 2016-10-12T21:50:39+08:00
    [INFO] Final Memory: 36M/87M
    [INFO] ------------------------------------------------------------------------
    [ERROR] Unknown lifecycle phase "–Pdist,native". You must specify a valid lifecycle phase or a goal in the format <plugin-prefix>:<goal> or <plugin-group-id>:<plugin-artifact-id>[:<plugin-version>]:<goal>. Available lifecycle phases are: validate, initialize, generate-sources, process-sources, generate-resources, process-resources, compile, process-classes, generate-test-sources, process-test-sources, generate-test-resources, process-test-resources, test-compile, process-test-classes, test, prepare-package, package, pre-integration-test, integration-test, post-integration-test, verify, install, deploy, pre-clean, clean, post-clean, pre-site, site, post-site, site-deploy. -> [Help 1]
    [ERROR]
    [ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
    [ERROR] Re-run Maven using the -X switch to enable full debug logging.
    [ERROR]
    [ERROR] For more information about the errors and possible solutions, please read the following articles:
    [ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/LifecyclePhaseNotFoundException
    The cause: the command contained a Chinese (full-width) dash "–" instead of the ASCII hyphen "-".
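    The stray dash is easy to miss by eye. One quick sanity check (a hypothetical helper, not from the original post) is to scan the command string for any non-ASCII byte before running it:

```shell
# has_non_ascii: succeeds (exit 0) if the string contains any byte outside
# printable ASCII -- e.g. the en dash U+2013 pasted in from a blog or chat.
has_non_ascii() {
  printf '%s' "$1" | LC_ALL=C grep -q '[^ -~]'
}

bad='mvn clean package –Pdist,native -DskipTests -Dtar'   # en dash before P
good='mvn clean package -Pdist,native -DskipTests -Dtar'  # ASCII hyphen

has_non_ascii "$bad"  && echo "non-ASCII character found -- retype the dashes"
has_non_ascii "$good" || echo "command is clean ASCII"
```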

    Error 2

    [INFO] ------------------------------------------------------------------------
    [INFO] BUILD FAILURE
    [INFO] ------------------------------------------------------------------------
    [INFO] Total time: 01:06 h
    [INFO] Finished at: 2016-10-12T23:07:53+08:00
    [INFO] Final Memory: 81M/320M
    [INFO] ------------------------------------------------------------------------
    [ERROR] Failed to execute goal org.apache.maven.plugins:maven-antrun-plugin:1.7:run (make) on project hadoop-common: An Ant BuildException has occured: Execute failed: java.io.IOException: Cannot run program "cmake" (in directory "/root/compileHadoop/hadoop-2.6.5-src/hadoop-common-project/hadoop-common/target/native"): error=2, No such file or directory
    [ERROR] around Ant part ...<exec dir="/root/compileHadoop/hadoop-2.6.5-src/hadoop-common-project/hadoop-common/target/native" executable="cmake" failonerror="true">... @ 4:140 in /root/compileHadoop/hadoop-2.6.5-src/hadoop-common-project/hadoop-common/target/antrun/build-main.xml
    [ERROR] -> [Help 1]
    [ERROR]
    [ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
    [ERROR] Re-run Maven using the -X switch to enable full debug logging.
    [ERROR]
    [ERROR] For more information about the errors and possible solutions, please read the following articles:
    [ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
    [ERROR]
    [ERROR] After correcting the problems, you can resume the build with the command
    [ERROR]   mvn <goals> -rf :hadoop-common
    [root@bigdatahadoop hadoop-2.6.5-src]#

    Fix: install cmake, and unpack Apache Ant:

    yum install cmake
    tar -zxvf apache-ant-1.9.4-bin.tar.gz

    Configure the environment variables:

     export ANT_HOME=/root/apache-ant-1.9.4
     export PATH=.:$PATH:$JAVA_HOME/bin:$MAVEN_HOME/bin:$ANT_HOME/bin

    Apply and test:

    1  source /etc/profile
    2  ant  -version


    The build finally succeeded, after roughly 70 minutes:

    [INFO] hadoop-mapreduce ................................... SUCCESS [  4.775 s]
    [INFO] Apache Hadoop MapReduce Streaming .................. SUCCESS [ 20.396 s]
    [INFO] Apache Hadoop Distributed Copy ..................... SUCCESS [02:05 min]
    [INFO] Apache Hadoop Archives ............................. SUCCESS [ 14.173 s]
    [INFO] Apache Hadoop Rumen ................................ SUCCESS [ 33.443 s]
    [INFO] Apache Hadoop Gridmix .............................. SUCCESS [ 27.265 s]
    [INFO] Apache Hadoop Data Join ............................ SUCCESS [ 11.006 s]
    [INFO] Apache Hadoop Ant Tasks ............................ SUCCESS [  7.516 s]
    [INFO] Apache Hadoop Extras ............................... SUCCESS [  8.966 s]
    [INFO] Apache Hadoop Pipes ................................ SUCCESS [ 11.747 s]
    [INFO] Apache Hadoop OpenStack support .................... SUCCESS [ 15.736 s]
    [INFO] Apache Hadoop Amazon Web Services support .......... SUCCESS [09:39 min]
    [INFO] Apache Hadoop Client ............................... SUCCESS [  9.931 s]
    [INFO] Apache Hadoop Mini-Cluster ......................... SUCCESS [  1.846 s]
    [INFO] Apache Hadoop Scheduler Load Simulator ............. SUCCESS [ 10.474 s]
    [INFO] Apache Hadoop Tools Dist ........................... SUCCESS [ 17.429 s]
    [INFO] Apache Hadoop Tools ................................ SUCCESS [  0.076 s]
    [INFO] Apache Hadoop Distribution ......................... SUCCESS [01:24 min]
    [INFO] ------------------------------------------------------------------------
    [INFO] BUILD SUCCESS
    [INFO] ------------------------------------------------------------------------
    [INFO] Total time: 01:13 h
    [INFO] Finished at: 2016-10-13T20:36:32+08:00
    [INFO] Final Memory: 115M/487M
    [INFO] ------------------------------------------------------------------------
    [root@bigdatahadoop hadoop-2.6.5-src]#
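    After a successful build, the distribution tarball sits under hadoop-dist/target/ (the standard output location in the Hadoop source tree). One way to confirm the native libraries were really compiled is Hadoop's own checknative command; the /opt install directory below is an example, not from the post:

```shell
# Run from the top of hadoop-2.6.5-src after the build finishes.
ls -lh hadoop-dist/target/hadoop-2.6.5.tar.gz
# Unpack the freshly built distribution (target directory is an example):
tar -zxvf hadoop-dist/target/hadoop-2.6.5.tar.gz -C /opt
# checknative reports whether the native libraries (zlib, snappy, etc.) load:
/opt/hadoop-2.6.5/bin/hadoop checknative -a
```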



  • Original post: https://www.cnblogs.com/TendToBigData/p/10501341.html