  • Running a MapReduce jar on YARN

    I wanted to try running a MapReduce program, so here are my notes.

    Running word count locally

    Create a new Maven project and add the hadoop-client dependency, e.g. version 3.1.2.
    The official WordCount example can be used as-is.
    On Windows, winutils.exe and hadoop.dll need to be placed under the directory that the HADOOP_HOME environment variable points to; the two files go into its bin subdirectory (download link).

    Add two run arguments, an input path and an output path, and the program can be run directly.
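
    For reference, the official example's mapper and reducer look roughly like this (a sketch of the stock Hadoop WordCount classes; the imports come from the hadoop-client 3.1.2 dependency, and the official main() simply wires these two classes into a Job using the two arguments):

    import java.io.IOException;
    import java.util.StringTokenizer;

    import org.apache.hadoop.io.IntWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapreduce.Mapper;
    import org.apache.hadoop.mapreduce.Reducer;

    public class WordCount {

        // Mapper: split each input line into tokens and emit (word, 1)
        public static class TokenizerMapper extends Mapper<Object, Text, Text, IntWritable> {
            private final static IntWritable one = new IntWritable(1);
            private final Text word = new Text();

            @Override
            public void map(Object key, Text value, Context context)
                    throws IOException, InterruptedException {
                StringTokenizer itr = new StringTokenizer(value.toString());
                while (itr.hasMoreTokens()) {
                    word.set(itr.nextToken());
                    context.write(word, one);
                }
            }
        }

        // Reducer (also used as the combiner): sum the counts for each word
        public static class IntSumReducer extends Reducer<Text, IntWritable, Text, IntWritable> {
            private final IntWritable result = new IntWritable();

            @Override
            public void reduce(Text key, Iterable<IntWritable> values, Context context)
                    throws IOException, InterruptedException {
                int sum = 0;
                for (IntWritable val : values) {
                    sum += val.get();
                }
                result.set(sum);
                context.write(key, result);
            }
        }
    }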

    Single-server setup (pseudo-distributed)

    The server runs CentOS, with hostname c1 and a static IP address.
    Oracle JDK 8 was installed from the RPM, and hadoop-3.1.2 was unpacked into the /hadoop/ directory.
    Set up passwordless SSH so the Hadoop start scripts can log in:

    ssh-keygen
    ssh-copy-id localhost
    

    core-site.xml:

    <configuration>
      <!-- fs.default.name is the deprecated alias of fs.defaultFS; either name works here -->
      <property>
        <name>fs.default.name</name>
        <value>hdfs://c1:8020</value>
      </property>
    </configuration>
    

    hdfs-site.xml

    <configuration>
      <property>
        <name>dfs.replication</name>
        <value>1</value>
      </property>
      <property>
        <name>hadoop.tmp.dir</name>
        <value>/hadoop/tmp</value>
      </property>
    </configuration>
    

    yarn-site.xml

    <configuration>

      <!-- Site specific YARN configuration properties -->
      <property>
        <name>yarn.resourcemanager.hostname</name>
        <value>c1</value>
      </property>
      <property>
        <name>yarn.nodemanager.aux-services</name>
        <value>mapreduce_shuffle</value>
      </property>
    </configuration>
    

    At this point, running a job still fails because the MRAppMaster class cannot be found, with an error similar to this:

    Container exited with a non-zero exit code 1. Last 4096 bytes of stderr :
    Error: Could not find or load main class org.apache.hadoop.mapreduce.v2.app.MRAppMaster
    
    Please check whether your etc/hadoop/mapred-site.xml contains the below configuration:
    <property>
      <name>yarn.app.mapreduce.am.env</name>
      <value>HADOOP_MAPRED_HOME=${full path of your hadoop distribution directory}</value>
    </property>
    <property>
      <name>mapreduce.map.env</name>
      <value>HADOOP_MAPRED_HOME=${full path of your hadoop distribution directory}</value>
    </property>
    <property>
      <name>mapreduce.reduce.env</name>
      <value>HADOOP_MAPRED_HOME=${full path of your hadoop distribution directory}</value>
    </property>
    

    Fix it as follows:
    mapred-site.xml

    <configuration>
      <property>
        <name>yarn.app.mapreduce.am.env</name>
        <value>HADOOP_MAPRED_HOME=/hadoop/hadoop-3.1.2</value>
      </property>
      <property>
        <name>mapreduce.map.env</name>
        <value>HADOOP_MAPRED_HOME=/hadoop/hadoop-3.1.2</value>
      </property>
      <property>
        <name>mapreduce.reduce.env</name>
        <value>HADOOP_MAPRED_HOME=/hadoop/hadoop-3.1.2</value>
      </property>
    </configuration>
    

    Edit the workers or slaves file (the file name differs between Hadoop versions: Hadoop 3 uses workers, Hadoop 2 uses slaves).
    Change localhost to the hostname, e.g. c1.

    Add c1's static IP address to /etc/hosts, otherwise the NodeManager may fail to connect, for example:

    192.168.1.111 c1
    

    Starting the DFS and YARN services

    The format step only needs to be run once, the first time.

    bin/hdfs namenode -format
    sbin/start-dfs.sh
    sbin/start-yarn.sh
    

    After a stop, the next start may have to wait for the file list to sync (HDFS safe mode), normally around 30 seconds; the progress is shown on the startup information page of the web UI.

    If the dev machine can open c1's port 50070 (Hadoop 2) or 9870 (Hadoop 3) in a browser, HDFS is up;
    if port 8088 is reachable, YARN is up.

    Modifying the program to add the YARN configuration

    The wordcount program needs the server-side XML configuration files. Put them into a conf folder and mark that folder as a Resources root in the IDE (right-click it); otherwise the program cannot find them even via relative paths.
    Modify the wordcount program to load the XML configuration:

        Configuration conf = new Configuration();
        conf.addResource("core-site.xml");
        conf.addResource("yarn-site.xml");
        conf.addResource("hdfs-site.xml");
        conf.addResource("mapred-site.xml");

        // Submit from a Windows dev machine to the Linux cluster
        conf.set("mapreduce.app-submission.cross-platform", "true");
        // Run on YARN instead of the local runner
        conf.set("mapreduce.framework.name", "yarn");
        // The job jar produced by mvn package (use / or \\ in the Java string literal)
        conf.set("mapreduce.job.jar", "target/mr1-1.0-SNAPSHOT.jar");
    

    Run the package goal from the Maven tool window, and write the resulting jar's name and path into the last line of the code above (mapreduce.job.jar).
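
    After the configuration block, the rest of the driver is the same job setup as in the official example; a sketch (WordCount, TokenizerMapper and IntSumReducer are the classes from the local run, and the imports needed are org.apache.hadoop.fs.Path, org.apache.hadoop.mapreduce.Job and the lib.input/lib.output FileInputFormat/FileOutputFormat):

        Job job = Job.getInstance(conf, "word count");
        job.setJarByClass(WordCount.class);
        job.setMapperClass(TokenizerMapper.class);
        job.setCombinerClass(IntSumReducer.class);
        job.setReducerClass(IntSumReducer.class);
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(IntWritable.class);
        FileInputFormat.addInputPath(job, new Path(args[0]));   // input file in HDFS
        FileOutputFormat.setOutputPath(job, new Path(args[1])); // output dir, must not exist yet
        System.exit(job.waitForCompletion(true) ? 0 : 1);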

    Put an input.txt file into HDFS as the input. If you hit permission problems, these commands may help:

    hdfs dfs -chown user:group input.txt
    hdfs dfs -chmod 777 input.txt
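
    Alternatively, the input file can be copied into HDFS from the dev machine with the HDFS client API instead of the command line (a minimal sketch; the PutInput class name and the relative paths are placeholders, and core-site.xml is assumed to be on the classpath as described above):

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;

    public class PutInput {
        public static void main(String[] args) throws Exception {
            Configuration conf = new Configuration();
            conf.addResource("core-site.xml"); // points the client at hdfs://c1:8020

            // A relative destination path resolves to the user's HDFS home directory (/user/<name>)
            try (FileSystem fs = FileSystem.get(conf)) {
                fs.copyFromLocalFile(new Path("input.txt"), new Path("input.txt"));
            }
        }
    }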
    

    Submitting the job to YARN

    The result looks like this:

    2019-03-26 16:49:33,942 INFO  [main] client.RMProxy (RMProxy.java:newProxyInstance(133)) - Connecting to ResourceManager at c1/192.168.1.111:8032
    2019-03-26 16:49:34,363 WARN  [main] mapreduce.JobResourceUploader (JobResourceUploader.java:uploadResourcesInternal(147)) - Hadoop command-line option parsing not performed. Implement the Tool interface and execute your application with ToolRunner to remedy this.
    2019-03-26 16:49:34,372 INFO  [main] mapreduce.JobResourceUploader (JobResourceUploader.java:disableErasureCodingForPath(883)) - Disabling Erasure Coding for path: /tmp/hadoop-yarn/staging/cdarling/.staging/job_1553590163847_0001
    2019-03-26 16:49:34,500 INFO  [main] input.FileInputFormat (FileInputFormat.java:listStatus(292)) - Total input files to process : 1
    2019-03-26 16:49:35,349 INFO  [main] mapreduce.JobSubmitter (JobSubmitter.java:submitJobInternal(202)) - number of splits:1
    2019-03-26 16:49:35,824 INFO  [main] mapreduce.JobSubmitter (JobSubmitter.java:printTokens(298)) - Submitting tokens for job: job_1553590163847_0001
    2019-03-26 16:49:35,826 INFO  [main] mapreduce.JobSubmitter (JobSubmitter.java:printTokens(299)) - Executing with tokens: []
    2019-03-26 16:49:35,932 INFO  [main] conf.Configuration (Configuration.java:getConfResourceAsInputStream(2752)) - resource-types.xml not found
    2019-03-26 16:49:35,932 INFO  [main] resource.ResourceUtils (ResourceUtils.java:addResourcesFileToConf(418)) - Unable to find 'resource-types.xml'.
    2019-03-26 16:49:36,278 INFO  [main] impl.YarnClientImpl (YarnClientImpl.java:submitApplication(324)) - Submitted application application_1553590163847_0001
    2019-03-26 16:49:36,302 INFO  [main] mapreduce.Job (Job.java:submit(1574)) - The url to track the job: http://c1:8088/proxy/application_1553590163847_0001/
    2019-03-26 16:49:36,302 INFO  [main] mapreduce.Job (Job.java:monitorAndPrintJob(1619)) - Running job: job_1553590163847_0001
    2019-03-26 16:49:41,367 INFO  [main] mapreduce.Job (Job.java:monitorAndPrintJob(1640)) - Job job_1553590163847_0001 running in uber mode : false
    2019-03-26 16:49:41,367 INFO  [main] mapreduce.Job (Job.java:monitorAndPrintJob(1647)) -  map 0% reduce 0%
    2019-03-26 16:49:45,410 INFO  [main] mapreduce.Job (Job.java:monitorAndPrintJob(1647)) -  map 100% reduce 0%
    2019-03-26 16:49:49,437 INFO  [main] mapreduce.Job (Job.java:monitorAndPrintJob(1647)) -  map 100% reduce 100%
    2019-03-26 16:49:49,447 INFO  [main] mapreduce.Job (Job.java:monitorAndPrintJob(1658)) - Job job_1553590163847_0001 completed successfully
    2019-03-26 16:49:49,510 INFO  [main] mapreduce.Job (Job.java:monitorAndPrintJob(1665)) - Counters: 53
        File System Counters
            FILE: Number of bytes read=99
            FILE: Number of bytes written=433951
            FILE: Number of read operations=0
            FILE: Number of large read operations=0
            FILE: Number of write operations=0
            HDFS: Number of bytes read=165
            HDFS: Number of bytes written=61
            HDFS: Number of read operations=8
            HDFS: Number of large read operations=0
            HDFS: Number of write operations=2
        Job Counters 
            Launched map tasks=1
            Launched reduce tasks=1
            Data-local map tasks=1
            Total time spent by all maps in occupied slots (ms)=1608
            Total time spent by all reduces in occupied slots (ms)=1685
            Total time spent by all map tasks (ms)=1608
            Total time spent by all reduce tasks (ms)=1685
            Total vcore-milliseconds taken by all map tasks=1608
            Total vcore-milliseconds taken by all reduce tasks=1685
            Total megabyte-milliseconds taken by all map tasks=1646592
            Total megabyte-milliseconds taken by all reduce tasks=1725440
        Map-Reduce Framework
            Map input records=3
            Map output records=13
            Map output bytes=123
            Map output materialized bytes=99
            Input split bytes=94
            Combine input records=13
            Combine output records=8
            Reduce input groups=8
            Reduce shuffle bytes=99
            Reduce input records=8
            Reduce output records=8
            Spilled Records=16
            Shuffled Maps =1
            Failed Shuffles=0
            Merged Map outputs=1
            GC time elapsed (ms)=75
            CPU time spent (ms)=930
            Physical memory (bytes) snapshot=509763584
            Virtual memory (bytes) snapshot=5578809344
            Total committed heap usage (bytes)=410517504
            Peak Map Physical memory (bytes)=294195200
            Peak Map Virtual memory (bytes)=2786607104
            Peak Reduce Physical memory (bytes)=215568384
            Peak Reduce Virtual memory (bytes)=2792202240
        Shuffle Errors
            BAD_ID=0
            CONNECTION=0
            IO_ERROR=0
            WRONG_LENGTH=0
            WRONG_MAP=0
            WRONG_REDUCE=0
        File Input Format Counters 
            Bytes Read=71
        File Output Format Counters 
            Bytes Written=61
    
    Process finished with exit code 0
    
     
     
    Reposted from: https://www.jianshu.com/p/229a30a51c36