  • Connecting a local IDEA project to a remote Hadoop cluster

    This article uses Hadoop 2.7.7; if your version differs, download the files for the matching version.

    1. Set up the local Hadoop libraries (a full installation is not required, but the runtime support files must be present)

    Download the files from:

    https://github.com/speedAngel/hadoop2.7.7

    1. Extract to any path that contains no Chinese characters and no spaces

    2. Replace the bin directory under the extraction path with the bin from the downloaded package

    3. Copy hadoop.dll from bin to C:\Windows\System32

    4. Configure the environment variables

    HADOOP_HOME      D:\Environment\hadoop-2.7.7
    HADOOP_CONF_DIR  D:\Environment\hadoop-2.7.7\etc\hadoop
    YARN_CONF_DIR    %HADOOP_CONF_DIR%
    PATH             %HADOOP_HOME%\bin
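    A common failure when launching from IDEA on Windows is the JVM not finding winutils.exe under %HADOOP_HOME%\bin. As a quick sanity check of the layout the steps above produce, here is a sketch using only the JDK (the class name HadoopEnvCheck is made up for illustration):

```java
import java.io.File;

public class HadoopEnvCheck {
    // Returns a description of the problem, or null if the layout looks right.
    static String check(String hadoopHome) {
        if (hadoopHome == null || hadoopHome.isEmpty()) {
            return "HADOOP_HOME is not set";
        }
        File winutils = new File(hadoopHome, "bin" + File.separator + "winutils.exe");
        if (!winutils.isFile()) {
            return "missing " + winutils.getPath();
        }
        return null; // layout looks good
    }

    public static void main(String[] args) {
        String problem = check(System.getenv("HADOOP_HOME"));
        System.out.println(problem == null ? "OK" : problem);
    }
}
```

    On a correctly configured machine this prints OK; otherwise it names the missing piece.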

    2. Set the local Hadoop path in IDEA

    3. Import the dependencies (make sure the versions match your cluster)

        <dependencies>
            <dependency>
                <groupId>junit</groupId>
                <artifactId>junit</artifactId>
                <version>4.12</version>
            </dependency>
            <dependency>
                <groupId>org.apache.hadoop</groupId>
                <artifactId>hadoop-common</artifactId>
                <version>2.7.7</version>
            </dependency>
            <dependency>
                <groupId>org.apache.hadoop</groupId>
                <artifactId>hadoop-hdfs</artifactId>
                <version>2.7.7</version>
            </dependency>
            <!-- https://mvnrepository.com/artifact/org.apache.hadoop/hadoop-mapreduce-client-core -->
            <dependency>
                <groupId>org.apache.hadoop</groupId>
                <artifactId>hadoop-mapreduce-client-core</artifactId>
                <version>2.7.7</version>
            </dependency>
            <dependency>
                <groupId>org.apache.hadoop</groupId>
                <artifactId>hadoop-mapreduce-client-jobclient</artifactId>
                <version>2.7.7</version>
            </dependency>
            <dependency>
                <groupId>commons-cli</groupId>
                <artifactId>commons-cli</artifactId>
                <version>1.3.1</version>
            </dependency>
            <dependency>
                <groupId>org.apache.hadoop</groupId>
                <artifactId>hadoop-client</artifactId>
                <version>2.7.7</version>
            </dependency>
            <dependency>
                <groupId>mysql</groupId>
                <artifactId>mysql-connector-java</artifactId>
                <version>8.0.19</version>
            </dependency>
            <!-- Utility library for copying object properties -->
            <!-- https://mvnrepository.com/artifact/commons-beanutils/commons-beanutils -->
            <dependency>
                <groupId>commons-beanutils</groupId>
                <artifactId>commons-beanutils</artifactId>
                <version>1.9.4</version>
            </dependency>
        </dependencies>
    4. Copy the cluster's core-site.xml and hdfs-site.xml into the project's resources directory, adjusting the IP addresses as needed.

    • core-site.xml

    <?xml version="1.0" encoding="UTF-8"?>
    <?xml-stylesheet type="text/xsl" href="configuration.xsl"?>
    <configuration>
      <property>
        <name>fs.defaultFS</name>
        <value>hdfs://192.168.98.129:9000</value>
      </property>
      <property>
        <name>hadoop.tmp.dir</name>
        <value>/usr/hop/hadoop-2.7.7/data/hopdata</value>
      </property>
    </configuration>
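    Hadoop site files are flat name/value property lists. If the client seems to ignore this file, one way to confirm what the copy on the classpath actually contains is to parse it with the JDK alone. The parser and class name below are illustrative; the sample value is the fs.defaultFS from the file above:

```java
import javax.xml.parsers.DocumentBuilderFactory;
import org.w3c.dom.Document;
import org.w3c.dom.Element;
import org.w3c.dom.NodeList;
import java.io.ByteArrayInputStream;
import java.nio.charset.StandardCharsets;
import java.util.HashMap;
import java.util.Map;

public class SiteXmlReader {
    // Parse Hadoop's <configuration><property><name/><value/></property>... format.
    static Map<String, String> parse(String xml) throws Exception {
        Document doc = DocumentBuilderFactory.newInstance().newDocumentBuilder()
                .parse(new ByteArrayInputStream(xml.getBytes(StandardCharsets.UTF_8)));
        Map<String, String> props = new HashMap<>();
        NodeList nodes = doc.getElementsByTagName("property");
        for (int i = 0; i < nodes.getLength(); i++) {
            Element p = (Element) nodes.item(i);
            props.put(p.getElementsByTagName("name").item(0).getTextContent().trim(),
                      p.getElementsByTagName("value").item(0).getTextContent().trim());
        }
        return props;
    }

    public static void main(String[] args) throws Exception {
        String coreSite = "<configuration><property><name>fs.defaultFS</name>"
                + "<value>hdfs://192.168.98.129:9000</value></property></configuration>";
        System.out.println(parse(coreSite).get("fs.defaultFS"));
        // → hdfs://192.168.98.129:9000
    }
}
```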
    • hdfs-site.xml

    <?xml version="1.0" encoding="UTF-8"?>
    <?xml-stylesheet type="text/xsl" href="configuration.xsl"?>
    <!--
      Licensed under the Apache License, Version 2.0 (the "License");
      you may not use this file except in compliance with the License.
      You may obtain a copy of the License at

        http://www.apache.org/licenses/LICENSE-2.0

      Unless required by applicable law or agreed to in writing, software
      distributed under the License is distributed on an "AS IS" BASIS,
      WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
      See the License for the specific language governing permissions and
      limitations under the License. See accompanying LICENSE file.
    -->

    <!-- Put site-specific property overrides in this file. -->

    <configuration>
      <property>
        <name>dfs.namenode.secondary.http-address</name>
        <value>192.168.98.130:50090</value>
      </property>
      <property>
        <name>dfs.replication</name>
        <value>2</value>
      </property>
    </configuration>
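    Before digging into Hadoop-level stack traces, it can save time to check that the addresses in these two files are reachable from the developer machine at all. Below is a plain-socket probe using only the JDK (PortProbe is a made-up name; the hosts and ports are the ones configured above):

```java
import java.io.IOException;
import java.net.InetSocketAddress;
import java.net.Socket;

public class PortProbe {
    // Try a plain TCP connect with a short timeout; true means something is listening.
    static boolean reachable(String host, int port, int timeoutMs) {
        try (Socket s = new Socket()) {
            s.connect(new InetSocketAddress(host, port), timeoutMs);
            return true;
        } catch (IOException e) {
            return false;
        }
    }

    public static void main(String[] args) {
        // Addresses from core-site.xml and hdfs-site.xml above.
        System.out.println("NameNode 9000:          " + reachable("192.168.98.129", 9000, 2000));
        System.out.println("SecondaryNameNode 50090: " + reachable("192.168.98.130", 50090, 2000));
    }
}
```

    If either probe reports false, fix the network, firewall, or daemon before touching the Java client.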

    • Add the following as the first lines of the main method you run:

            System.setProperty("HADOOP_USER_NAME","root");
            System.setProperty("HADOOP_USER_PASSWORD","PASSWORD");
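    Put together, a minimal entry point might look like the sketch below. The FileSystem calls are left as comments because they require the dependencies above and a reachable cluster; HdfsMain is an illustrative name:

```java
public class HdfsMain {
    public static void main(String[] args) {
        // Must run before any Hadoop FileSystem is created, or the client
        // authenticates as your local OS user instead of "root".
        System.setProperty("HADOOP_USER_NAME", "root");
        System.setProperty("HADOOP_USER_PASSWORD", "PASSWORD");

        // With core-site.xml/hdfs-site.xml on the classpath, the usual next
        // step is (needs the hadoop-client dependency and a running cluster):
        //   Configuration conf = new Configuration();
        //   FileSystem fs = FileSystem.get(conf);
        //   for (FileStatus st : fs.listStatus(new Path("/")))
        //       System.out.println(st.getPath());
        System.out.println("user=" + System.getProperty("HADOOP_USER_NAME"));
    }
}
```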

     Source: http://www.pingtaimeng.com/article/detail/id/1067178

  • Original article: https://www.cnblogs.com/javalinux/p/14930736.html