  • Exception when running Hadoop's bundled Pi calculation example

    When running the Pi example that ships with Hadoop 2, the job kept failing. After a long search, the cause turned out to be that the Hadoop temporary directory configured earlier had never been given the proper permissions; locating the configured temporary directory and fixing its permissions resolved the error below (see the sketch after the log).

    Application application_1548242073562_0005 failed 2 times due to AM Container for appattempt_1548242073562_0005_000002 exited with exitCode: 1
    Failing this attempt.Diagnostics: [2019-01-23 20:50:53.319]Exception from container-launch.
    Container id: container_1548242073562_0005_02_000001
    Exit code: 1
    [2019-01-23 20:50:53.321]Container exited with a non-zero exit code 1. Error file: prelaunch.err.
    Last 4096 bytes of prelaunch.err :
    Last 4096 bytes of stderr :
    log4j:WARN No appenders could be found for logger (org.apache.hadoop.mapreduce.v2.app.MRAppMaster).
    log4j:WARN Please initialize the log4j system properly.
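
    A minimal sketch of the permission fix, assuming the temporary directory set via hadoop.tmp.dir in core-site.xml resolves to /opt/hadoop/tmp and the daemons run as user hadoop (both the path and the user are assumptions; substitute your own):

    # Look up the configured temporary directory in core-site.xml.
    grep -A 1 'hadoop.tmp.dir' $HADOOP_HOME/etc/hadoop/core-site.xml

    # Hand the directory (hypothetical path) over to the user the daemons run as,
    # so the AM container can create and write its working files there.
    sudo chown -R hadoop:hadoop /opt/hadoop/tmp
    sudo chmod -R 755 /opt/hadoop/tmp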
    

    With small arguments, such as 1 1, there is no error, but with slightly larger ones, such as 10 10, the job immediately fails with the exception below:

    19/01/24 10:20:32 INFO ipc.Client: Retrying connect to server: 0.0.0.0/0.0.0.0:10020. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=10, sleepTime=1000 MILLISECONDS)
    java.io.IOException: java.net.ConnectException: Your endpoint configuration is wrong; For more details see:  http://wiki.apache.org/hadoop/UnsetHostnameOrPort
    	at org.apache.hadoop.mapred.ClientServiceDelegate.invoke(ClientServiceDelegate.java:344)
    	at org.apache.hadoop.mapred.ClientServiceDelegate.getJobStatus(ClientServiceDelegate.java:429)
    	at org.apache.hadoop.mapred.YARNRunner.getJobStatus(YARNRunner.java:804)
    	at org.apache.hadoop.mapreduce.Job$1.run(Job.java:331)
    	at org.apache.hadoop.mapreduce.Job$1.run(Job.java:328)
    	at java.security.AccessController.doPrivileged(Native Method)
    	at javax.security.auth.Subject.doAs(Subject.java:422)
    	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1893)
    	at org.apache.hadoop.mapreduce.Job.updateStatus(Job.java:328)
    	at org.apache.hadoop.mapreduce.Job.isSuccessful(Job.java:624)
    	at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:1603)
    	at org.apache.hadoop.examples.QuasiMonteCarlo.estimatePi(QuasiMonteCarlo.java:307)
    	at org.apache.hadoop.examples.QuasiMonteCarlo.run(QuasiMonteCarlo.java:360)
    	at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:76)
    	at org.apache.hadoop.examples.QuasiMonteCarlo.main(QuasiMonteCarlo.java:368)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.apache.hadoop.util.ProgramDriver$ProgramDescription.invoke(ProgramDriver.java:71)
    	at org.apache.hadoop.util.ProgramDriver.run(ProgramDriver.java:144)
    	at org.apache.hadoop.examples.ExampleDriver.main(ExampleDriver.java:74)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.apache.hadoop.util.RunJar.run(RunJar.java:244)
    	at org.apache.hadoop.util.RunJar.main(RunJar.java:158)
    Caused by: java.net.ConnectException: Your endpoint configuration is wrong; For more details see:  http://wiki.apache.org/hadoop/UnsetHostnameOrPort
    	at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    	at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
    	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    	at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
    	at org.apache.hadoop.net.NetUtils.wrapWithMessage(NetUtils.java:824)
    	at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:750)
    	at org.apache.hadoop.ipc.Client.getRpcResponse(Client.java:1511)
    	at org.apache.hadoop.ipc.Client.call(Client.java:1453)
    	at org.apache.hadoop.ipc.Client.call(Client.java:1363)
    	at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:227)
    	at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:116)
    	at com.sun.proxy.$Proxy15.getJobReport(Unknown Source)
    	at org.apache.hadoop.mapreduce.v2.api.impl.pb.client.MRClientProtocolPBClientImpl.getJobReport(MRClientProtocolPBClientImpl.java:133)
    	at sun.reflect.GeneratedMethodAccessor3.invoke(Unknown Source)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.apache.hadoop.mapred.ClientServiceDelegate.invoke(ClientServiceDelegate.java:325)
    	... 27 more
    Caused by: java.net.ConnectException: Connection refused
    	at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
    	at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:717)
    	at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
    	at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:531)
    	at org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:690)
    	at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:794)
    	at org.apache.hadoop.ipc.Client$Connection.access$3600(Client.java:412)
    	at org.apache.hadoop.ipc.Client.getConnection(Client.java:1568)
    	at org.apache.hadoop.ipc.Client.call(Client.java:1399)
    	... 36 more
    

    The answer finally turned up at https://blog.csdn.net/zhouyan8603/article/details/47398245: the DataNodes need to reach the JobHistory Server on the master/NameNode host, and if mapreduce.jobhistory.address is never set it defaults to 0.0.0.0:10020. The fix is to set the address explicitly in mapred-site.xml:

    <property>
        <name>mapreduce.jobhistory.address</name>
        <!-- set this to the actual master hostname and port -->
        <value>0.0.0.0:10020</value>
    </property>

    <property>
        <name>mapreduce.jobhistory.webapp.address</name>
        <!-- set this to the actual master hostname and port -->
        <value>0.0.0.0:19888</value>
    </property>
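
    On a multi-node cluster the updated mapred-site.xml also has to be present on every node; a rough sketch, assuming the usual scp approach and the hypothetical worker hostnames slave1 and slave2:

    # Push the updated config to the workers (hostnames are placeholders).
    for host in slave1 slave2; do
        scp $HADOOP_HOME/etc/hadoop/mapred-site.xml \
            $host:$HADOOP_HOME/etc/hadoop/mapred-site.xml
    done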

    Then start the history server and retry 10 10; this time the job completes.

    $HADOOP_HOME/sbin/mr-jobhistory-daemon.sh start historyserver
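
    To double-check that the history server is actually up before retrying, and to rerun the example itself (the exact jar filename varies by installed Hadoop version, so the wildcard below is an assumption):

    # JobHistoryServer should now appear in the JVM process list,
    # and port 10020 should be listening.
    jps
    netstat -tlnp | grep 10020

    # Rerun the Pi example: 10 map tasks, 10 samples per map.
    hadoop jar $HADOOP_HOME/share/hadoop/mapreduce/hadoop-mapreduce-examples-*.jar pi 10 10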

  • Original post: https://www.cnblogs.com/lly001/p/10311880.html