  • Setting up a local Hadoop runtime environment on Windows

    Many people like to develop Hadoop programs locally on Windows, so here is a tutorial on setting up Hadoop under Windows.

    First download Hadoop from the official site. You also need winutils, the set of Windows-native helper binaries built from the Hadoop source that Hadoop requires to run on Windows. After downloading, unpack the Hadoop archive and copy winutils.exe into its bin directory.
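
    It also helps to point HADOOP_HOME at the unpacked directory and add its bin folder to PATH, because winutils.exe is located through HADOOP_HOME. A minimal sketch for the current cmd session, assuming Hadoop was unpacked to D:\hadoop-2.7.3-win64 (adjust to your own path, or set the variables permanently through the System Properties dialog or setx):

    set HADOOP_HOME=D:\hadoop-2.7.3-win64
    set PATH=%PATH%;%HADOOP_HOME%\bin;%HADOOP_HOME%\sbin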

    Then edit the following files under hadoop/etc/hadoop:

    core-site.xml:

    <?xml version="1.0" encoding="UTF-8"?>
    <?xml-stylesheet type="text/xsl" href="configuration.xsl"?>
    <!--
      Licensed under the Apache License, Version 2.0 (the "License");
      you may not use this file except in compliance with the License.
      You may obtain a copy of the License at
    
        http://www.apache.org/licenses/LICENSE-2.0
    
      Unless required by applicable law or agreed to in writing, software
      distributed under the License is distributed on an "AS IS" BASIS,
      WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
      See the License for the specific language governing permissions and
      limitations under the License. See accompanying LICENSE file.
    -->
    
    <!-- Put site-specific property overrides in this file. -->
    
    <configuration>
      <property>
        <name>fs.defaultFS</name>
        <value>hdfs://localhost:9000/</value>
      </property>
      <property>
        <name>io.native.lib.available</name>
        <value>false</value>
      </property>
      <property>
        <name>hadoop.native.lib</name>
        <value>false</value>
      </property>
      <property>
        <name>io.compression.codecs</name>
        <value>org.apache.hadoop.io.compress.GzipCodec,
               org.apache.hadoop.io.compress.DefaultCodec,
               com.hadoop.compression.lzo.LzoCodec,
               com.hadoop.compression.lzo.LzopCodec,
               org.apache.hadoop.io.compress.BZip2Codec,
               org.apache.hadoop.io.compress.SnappyCodec
            </value>
      </property>
      <property>
        <name>io.compression.codec.lzo.class</name>
        <value>com.hadoop.compression.lzo.LzoCodec</value>
      </property>
    
    </configuration>
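
    Note that the com.hadoop.compression.lzo.* codecs above come from the separate hadoop-lzo project and are not bundled with the Apache release; if that jar is missing, anything that reads this codec list will typically fail with a "codec not found" error. For a plain local setup without LZO, a minimal core-site.xml (a sketch, not the author's file) only needs:

    <configuration>
      <property>
        <name>fs.defaultFS</name>
        <value>hdfs://localhost:9000/</value>
      </property>
    </configuration>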

    hdfs-site.xml:

    <?xml version="1.0" encoding="UTF-8"?>
    <?xml-stylesheet type="text/xsl" href="configuration.xsl"?>
    <!--
      Licensed under the Apache License, Version 2.0 (the "License");
      you may not use this file except in compliance with the License.
      You may obtain a copy of the License at
    
        http://www.apache.org/licenses/LICENSE-2.0
    
      Unless required by applicable law or agreed to in writing, software
      distributed under the License is distributed on an "AS IS" BASIS,
      WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
      See the License for the specific language governing permissions and
      limitations under the License. See accompanying LICENSE file.
    -->
    
    <!-- Put site-specific property overrides in this file. -->
    
    <configuration>
      <property>
        <name>dfs.replication</name>
        <value>1</value>
      </property>
      <property>
        <name>dfs.namenode.name.dir</name>
        <value>file:///D:/Hadoop/namenode</value>
      </property>
      <property>
        <name>dfs.datanode.data.dir</name>
        <value>file:///D:/Hadoop/datanode</value>
      </property>
    </configuration>
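
    The dfs.namenode.name.dir and dfs.datanode.data.dir locations must be writable. hdfs namenode -format will create the namenode directory, but it does no harm to create both up front, matching the paths in the config above:

    mkdir D:\Hadoop\namenode
    mkdir D:\Hadoop\datanode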

    mapred-site.xml:

    <?xml version="1.0"?>
    <?xml-stylesheet type="text/xsl" href="configuration.xsl"?>
    <!--
      Licensed under the Apache License, Version 2.0 (the "License");
      you may not use this file except in compliance with the License.
      You may obtain a copy of the License at
    
        http://www.apache.org/licenses/LICENSE-2.0
    
      Unless required by applicable law or agreed to in writing, software
      distributed under the License is distributed on an "AS IS" BASIS,
      WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
      See the License for the specific language governing permissions and
      limitations under the License. See accompanying LICENSE file.
    -->
    
    <!-- Put site-specific property overrides in this file. -->
    
    <configuration>
        <property>
            <name>mapreduce.framework.name</name>
            <value>yarn</value>
        </property>
        <property>
            <name>mapred.compress.map.output</name>
            <value>true</value>
        </property>
        <property>
            <name>mapred.map.output.compression.codec</name>
            <value>com.hadoop.compression.lzo.LzoCodec</value>
        </property>
        <property>
            <name>mapred.child.env</name>
            <value>LD_LIBRARY_PATH=D:\hadoop-2.7.3-win64\lib</value>
        </property>
    </configuration>
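
    Because mapreduce.framework.name is set to yarn, the NodeManager also needs the MapReduce shuffle service configured. The original post does not show yarn-site.xml; a minimal sketch for a single-node setup (not the author's file) would be:

    yarn-site.xml:

    <configuration>
        <property>
            <name>yarn.nodemanager.aux-services</name>
            <value>mapreduce_shuffle</value>
        </property>
        <property>
            <name>yarn.nodemanager.aux-services.mapreduce_shuffle.class</name>
            <value>org.apache.hadoop.mapred.ShuffleHandler</value>
        </property>
    </configuration>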

    Then open cmd, change into Hadoop's bin directory, and run:

    hdfs namenode -format

    Then, in the sbin directory, run:

    start-all.cmd
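
    start-all.cmd opens separate windows for the NameNode, DataNode, ResourceManager and NodeManager. You can confirm that they are all up with the JDK's jps tool, which lists the running Java processes:

    jps
    rem expect NameNode, DataNode, ResourceManager and NodeManager in the output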

    Then open http://localhost:8088 in a browser; this is the YARN ResourceManager web UI.

    Run a Hadoop command: hadoop fs -ls /

    It is empty, so create a folder: hadoop fs -mkdir /data

    Then list it again: hadoop fs -ls /
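
    As a further check, upload a local file into the new folder and list it (test.txt here is just a placeholder for any file you have on hand):

    hadoop fs -put test.txt /data
    hadoop fs -ls /data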

    With that, the local pseudo-distributed Hadoop environment is set up.

  • Original post: https://www.cnblogs.com/Kaivenblog/p/9311328.html