Recompiling Hadoop 2.2 on 64-bit Ubuntu: a build log

    The official Hadoop site only provides a 32-bit build of hadoop-2.2.0.tar.gz. To deploy hadoop-2.2.0 on 64-bit Ubuntu, you need to rebuild from source and produce a 64-bit distribution package.
    It is recommended to run the steps below as root, to avoid permission problems.

    Install the JDK

    See the companion article "在ubuntu中安装jdk" (Installing the JDK on Ubuntu).

    Install Maven

    See the companion article "在ubuntu中安装maven" (Installing Maven on Ubuntu).

    Download the Hadoop source

    wget http://mirror.bit.edu.cn/apache/hadoop/common/hadoop-2.2.0/hadoop-2.2.0-src.tar.gz

    Extract the archive

    tar -xzf hadoop-2.2.0-src.tar.gz

    Compile the source code

    cd hadoop-2.2.0-src
    mvn package -Pdist,native -DskipTests -Dtar
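    Before kicking off the long build, it can save time to confirm the prerequisite tools are visible on PATH. A minimal pre-flight sketch (my addition, not part of the original post):

```shell
# Pre-flight sketch: report whether each build prerequisite is on PATH.
# protoc and cmake are only installed in later steps, so they may be
# missing on the first run.
for tool in java mvn protoc cmake; do
  if command -v "$tool" >/dev/null 2>&1; then
    echo "$tool: found"
  else
    echo "$tool: MISSING"
  fi
done
```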

    Build attempt 1: failed (a bug in the Hadoop pom.xml)

    Error message:

    [ERROR] Failed to execute goal on project hadoop-auth: Could not resolve dependencies for project org.apache.hadoop:hadoop-auth:jar:2.2.0: Could not transfer artifact org.mortbay.jetty:jetty:jar:6.1.26 from/to central (https://repo.maven.apache.org/maven2): GET request of: org/mortbay/jetty/jetty/6.1.26/jetty-6.1.26.jar from central failed: SSL peer shut down incorrectly -> [Help 1]
    [ERROR]
    [ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
    [ERROR] Re-run Maven using the -X switch to enable full debug logging.
    [ERROR]
    [ERROR] For more information about the errors and possible solutions, please read the following articles:
    [ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/DependencyResolutionException
    [ERROR]
    [ERROR] After correcting the problems, you can resume the build with the command
    [ERROR] mvn -rf :hadoop-auth

    Fix:
    This is a known Hadoop bug; adding the patch below to the pom.xml resolves it. See https://issues.apache.org/jira/browse/HADOOP-10110 for details.

    Edit the `hadoop-common-project/hadoop-auth/pom.xml` file:

    vi hadoop-common-project/hadoop-auth/pom.xml

    Insert the following into the `<dependencies>` node:

    <dependency>
      <groupId>org.mortbay.jetty</groupId>
      <artifactId>jetty-util</artifactId>
      <scope>test</scope>
    </dependency> 
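    As a quick sanity check (my addition, not from the original post), grep the pom to confirm the new dependency is in place. Here the command is demonstrated against an inline copy of the snippet; in practice, run the same grep on the real `hadoop-common-project/hadoop-auth/pom.xml`:

```shell
# Write the added snippet to a temp file and grep it, the same way you
# would grep the real pom.xml; a count of 1 means the patch is present.
cat > /tmp/jetty-dep-check.xml <<'EOF'
<dependency>
  <groupId>org.mortbay.jetty</groupId>
  <artifactId>jetty-util</artifactId>
  <scope>test</scope>
</dependency>
EOF
grep -c "jetty-util" /tmp/jetty-dep-check.xml
```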

    Build attempt 2: failed (protoc not installed)

    Error message:

    [ERROR] Failed to execute goal org.apache.hadoop:hadoop-maven-plugins:2.2.0:protoc (compile-protoc) on project hadoop-common: org.apache.maven.plugin.MojoExecutionException: 'protoc --version' did not return a version -> [Help 1]
    [ERROR]
    [ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
    [ERROR] Re-run Maven using the -X switch to enable full debug logging.
    [ERROR]
    [ERROR] For more information about the errors and possible solutions, please read the following articles:
    [ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
    [ERROR]
    [ERROR] After correcting the problems, you can resume the build with the command
    [ERROR] mvn -rf :hadoop-common

    Fix:

    The error message indicates that protoc is not installed. Build and install protobuf 2.5.0 from source:

    wget https://protobuf.googlecode.com/files/protobuf-2.5.0.tar.gz
    tar -xzf protobuf-2.5.0.tar.gz
    cd protobuf-2.5.0
    ./configure
    make
    make check
    make install

    When running the ./configure command, the following error may appear:

    checking whether to enable maintainer-specific portions of Makefiles... yes
    checking build system type... x86_64-unknown-linux-gnu
    checking host system type... x86_64-unknown-linux-gnu
    checking target system type... x86_64-unknown-linux-gnu
    checking for a BSD-compatible install... /usr/bin/install -c
    checking whether build environment is sane... yes
    checking for a thread-safe mkdir -p... /bin/mkdir -p
    checking for gawk... gawk
    checking whether make sets $(MAKE)... no
    checking for gcc... no
    checking for cc... no
    checking for cl.exe... no
    configure: error: in `/home/hadoop/protobuf-2.5.0':
    configure: error: no acceptable C compiler found in $PATH
    See `config.log' for more details

    The output says that no C compiler was found, so we also need to install one.

    Ubuntu provides `build-essential`, a package that bundles gcc and the other basic build tools, and it installs with a single command:

    apt-get install build-essential

    If the install reports that packages cannot be found, update the package lists first:

    apt-get update
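    After installing build-essential, you can confirm that configure's compiler probe will now succeed. A small check (my addition, not from the original post):

```shell
# Verify a C compiler is now visible on PATH, in the same order that
# ./configure probes for one (gcc, then cc); print its version line.
for c in gcc cc; do
  if command -v "$c" >/dev/null 2>&1; then
    "$c" --version | head -n 1
    break
  fi
done
```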

    After installation, verifying protobuf may fail with the following error:

    $ protoc --version
    protoc: error while loading shared libraries: libprotoc.so.8: cannot open shared object file: No such file or directory

    Fix:

    $ export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:/usr/local/lib
    $ protoc --version
    libprotoc 2.5.0
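    Note that the export above only lasts for the current shell session. One common way to make it persistent (an assumption on my part, not covered in the original post) is to append the export to `~/.bashrc`; registering /usr/local/lib with ldconfig is another option.

```shell
# Append the export to ~/.bashrc so new shells pick up /usr/local/lib
# automatically; the grep makes this idempotent (skip if already there).
line='export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:/usr/local/lib'
grep -qxF "$line" ~/.bashrc 2>/dev/null || echo "$line" >> ~/.bashrc
tail -n 1 ~/.bashrc
```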

    Build attempt 3: failed (cmake not installed)

    Error message:

    [ERROR] Failed to execute goal org.apache.maven.plugins:maven-antrun-plugin:1.6:run (make) on project hadoop-common: An Ant BuildException has occured: Execute failed: java.io.IOException: Cannot run program "cmake" (in directory "/home/hadoop/hadoop-2.2.0-src/hadoop-common-project/hadoop-common/target/native"): error=2, No such file or directory -> [Help 1]
    [ERROR]
    [ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
    [ERROR] Re-run Maven using the -X switch to enable full debug logging.
    [ERROR]
    [ERROR] For more information about the errors and possible solutions, please read the following articles:
    [ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
    [ERROR]
    [ERROR] After correcting the problems, you can resume the build with the command
    [ERROR] mvn -rf :hadoop-common

    Fix:

    apt-get install cmake

    Build attempt 4: failed (libglib2.0-dev not installed)

    Error message:

    [ERROR] Failed to execute goal org.apache.maven.plugins:maven-antrun-plugin:1.6:run (make) on project hadoop-common: An Ant BuildException has occured: exec returned: 1 -> [Help 1]
    [ERROR]
    [ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
    [ERROR] Re-run Maven using the -X switch to enable full debug logging.
    [ERROR]
    [ERROR] For more information about the errors and possible solutions, please read the following articles:
    [ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
    [ERROR]
    [ERROR] After correcting the problems, you can resume the build with the command
    [ERROR] mvn -rf :hadoop-common

    Fix:

    apt-get install libglib2.0-dev

    Build attempt 5: failed (libssl-dev not installed)

    Error message:

    [ERROR] Failed to execute goal org.apache.maven.plugins:maven-antrun-plugin:1.6:run (make) on project hadoop-pipes: An Ant BuildException has occured: exec returned: 1 -> [Help 1]
    [ERROR]
    [ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
    [ERROR] Re-run Maven using the -X switch to enable full debug logging.
    [ERROR]
    [ERROR] For more information about the errors and possible solutions, please read the following articles:
    [ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
    [ERROR]
    [ERROR] After correcting the problems, you can resume the build with the command
    [ERROR] mvn -rf :hadoop-pipes

    Fix:

    apt-get install libssl-dev

    Build attempt 6: success

    [INFO] ------------------------------------------------------------------------
    [INFO] Reactor Summary:
    [INFO]
    [INFO] Apache Hadoop Main ................................. SUCCESS [ 13.578 s]
    [INFO] Apache Hadoop Project POM .......................... SUCCESS [ 5.183 s]
    [INFO] Apache Hadoop Annotations .......................... SUCCESS [ 9.527 s]
    [INFO] Apache Hadoop Assemblies ........................... SUCCESS [ 1.268 s]
    [INFO] Apache Hadoop Project Dist POM ..................... SUCCESS [ 4.717 s]
    [INFO] Apache Hadoop Maven Plugins ........................ SUCCESS [ 9.966 s]
    [INFO] Apache Hadoop Auth ................................. SUCCESS [ 7.368 s]
    [INFO] Apache Hadoop Auth Examples ........................ SUCCESS [ 3.971 s]
    [INFO] Apache Hadoop Common ............................... SUCCESS [02:27 min]
    [INFO] Apache Hadoop NFS .................................. SUCCESS [ 14.996 s]
    [INFO] Apache Hadoop Common Project ....................... SUCCESS [ 0.078 s]
    [INFO] Apache Hadoop HDFS ................................. SUCCESS [02:32 min]
    [INFO] Apache Hadoop HttpFS ............................... SUCCESS [ 30.260 s]
    [INFO] Apache Hadoop HDFS BookKeeper Journal .............. SUCCESS [ 19.083 s]
    [INFO] Apache Hadoop HDFS-NFS ............................. SUCCESS [ 8.313 s]
    [INFO] Apache Hadoop HDFS Project ......................... SUCCESS [ 0.071 s]
    [INFO] hadoop-yarn ........................................ SUCCESS [ 0.542 s]
    [INFO] hadoop-yarn-api .................................... SUCCESS [01:07 min]
    [INFO] hadoop-yarn-common ................................. SUCCESS [ 48.948 s]
    [INFO] hadoop-yarn-server ................................. SUCCESS [ 0.314 s]
    [INFO] hadoop-yarn-server-common .......................... SUCCESS [ 18.413 s]
    [INFO] hadoop-yarn-server-nodemanager ..................... SUCCESS [ 23.891 s]
    [INFO] hadoop-yarn-server-web-proxy ....................... SUCCESS [ 5.687 s]
    [INFO] hadoop-yarn-server-resourcemanager ................. SUCCESS [ 24.345 s]
    [INFO] hadoop-yarn-server-tests ........................... SUCCESS [ 0.721 s]
    [INFO] hadoop-yarn-client ................................. SUCCESS [ 8.261 s]
    [INFO] hadoop-yarn-applications ........................... SUCCESS [ 0.168 s]
    [INFO] hadoop-yarn-applications-distributedshell .......... SUCCESS [ 6.632 s]
    [INFO] hadoop-mapreduce-client ............................ SUCCESS [ 0.261 s]
    [INFO] hadoop-mapreduce-client-core ....................... SUCCESS [ 40.147 s]
    [INFO] hadoop-yarn-applications-unmanaged-am-launcher ..... SUCCESS [ 3.497 s]
    [INFO] hadoop-yarn-site ................................... SUCCESS [ 0.164 s]
    [INFO] hadoop-yarn-project ................................ SUCCESS [ 6.054 s]
    [INFO] hadoop-mapreduce-client-common ..................... SUCCESS [ 29.892 s]
    [INFO] hadoop-mapreduce-client-shuffle .................... SUCCESS [ 5.450 s]
    [INFO] hadoop-mapreduce-client-app ........................ SUCCESS [ 18.558 s]
    [INFO] hadoop-mapreduce-client-hs ......................... SUCCESS [ 9.045 s]
    [INFO] hadoop-mapreduce-client-jobclient .................. SUCCESS [ 7.740 s]
    [INFO] hadoop-mapreduce-client-hs-plugins ................. SUCCESS [ 2.819 s]
    [INFO] Apache Hadoop MapReduce Examples ................... SUCCESS [ 12.523 s]
    [INFO] hadoop-mapreduce ................................... SUCCESS [ 5.321 s]
    [INFO] Apache Hadoop MapReduce Streaming .................. SUCCESS [ 8.999 s]
    [INFO] Apache Hadoop Distributed Copy ..................... SUCCESS [ 13.044 s]
    [INFO] Apache Hadoop Archives ............................. SUCCESS [ 3.739 s]
    [INFO] Apache Hadoop Rumen ................................ SUCCESS [ 11.307 s]
    [INFO] Apache Hadoop Gridmix .............................. SUCCESS [ 8.223 s]
    [INFO] Apache Hadoop Data Join ............................ SUCCESS [ 6.296 s]
    [INFO] Apache Hadoop Extras ............................... SUCCESS [ 6.341 s]
    [INFO] Apache Hadoop Pipes ................................ SUCCESS [ 14.662 s]
    [INFO] Apache Hadoop Tools Dist ........................... SUCCESS [ 2.694 s]
    [INFO] Apache Hadoop Tools ................................ SUCCESS [ 0.063 s]
    [INFO] Apache Hadoop Distribution ......................... SUCCESS [ 44.996 s]
    [INFO] Apache Hadoop Client ............................... SUCCESS [ 16.908 s]
    [INFO] Apache Hadoop Mini-Cluster ......................... SUCCESS [ 5.014 s]
    [INFO] ------------------------------------------------------------------------
    [INFO] BUILD SUCCESS
    [INFO] ------------------------------------------------------------------------
    [INFO] Total time: 15:23 min
    [INFO] Finished at: 2014-10-04T14:54:28+08:00
    [INFO] Final Memory: 69M/215M
    [INFO] ------------------------------------------------------------------------

    Build artifacts

    The generated files are in the `~/hadoop-2.2.0-src/hadoop-dist/target` directory.

    $ ls ~/hadoop-2.2.0-src/hadoop-dist/target
    antrun hadoop-2.2.0 hadoop-dist-2.2.0-javadoc.jar test-dir
    dist-layout-stitching.sh hadoop-2.2.0.tar.gz javadoc-bundle-options
    dist-tar-stitching.sh hadoop-dist-2.2.0.jar maven-archiver

    Here, hadoop-2.2.0 is the compiled directory tree, and hadoop-2.2.0.tar.gz is the packaged distribution.

    Verify

    $ cd ~/hadoop-2.2.0-src/hadoop-dist/target/hadoop-2.2.0/lib/native/
    $ file libhadoop.so.1.0.0
    libhadoop.so.1.0.0: ELF 64-bit LSB shared object, x86-64, version 1 (SYSV), dynamically linked, BuildID[sha1]=fb43b4ebd092ae8b4a427719b8907e6fdb223ed9, not stripped

    As you can see, libhadoop.so.1.0.0 is now a 64-bit binary.

    Copy

    Copy the compiled 64-bit hadoop-2.2.0.tar.gz distribution package to the current user's home directory.

    cp ~/hadoop-2.2.0-src/hadoop-dist/target/hadoop-2.2.0.tar.gz ~
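    Optionally, compare checksums to confirm the copy completed intact (my addition, not part of the original write-up):

```shell
# Compare checksums of the source tarball and the copy; the two hashes
# should match. Skips silently if the build output is absent.
src=~/hadoop-2.2.0-src/hadoop-dist/target/hadoop-2.2.0.tar.gz
dst=~/hadoop-2.2.0.tar.gz
if [ -f "$src" ] && [ -f "$dst" ]; then
  md5sum "$src" "$dst"
fi
```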
    Original article: https://www.cnblogs.com/imfanqi/p/4319696.html