  • flink run fails with java.lang.NoSuchMethodError: org.apache.hadoop.ipc.Client.getRpcTimeout(Lorg/apache/hadoop/conf/Configuration;)I

    java.lang.NoSuchMethodError: org.apache.hadoop.ipc.Client.getRpcTimeout(Lorg/apache/hadoop/conf/Configuration;)I
    	at org.apache.hadoop.hdfs.DFSClient$Conf.<init>(DFSClient.java:360) ~[hadoop-hdfs-2.6.0-cdh5.13.1.jar:?]
    	at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:695) ~[hadoop-hdfs-2.6.0-cdh5.13.1.jar:?]
    	at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:678) ~[hadoop-hdfs-2.6.0-cdh5.13.1.jar:?]
    	at org.apache.hadoop.hdfs.DistributedFileSystem.initialize(DistributedFileSystem.java:158) ~[hadoop-hdfs-2.6.0-cdh5.13.1.jar:?]
    	at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:2596) ~[uat.jar:?]
    	at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:91) ~[uat.jar:?]
    	at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:2630) ~[uat.jar:?]
    	at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:2612) ~[uat.jar:?]
    	at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:370) ~[uat.jar:?]
    	at org.apache.hadoop.fs.Path.getFileSystem(Path.java:296) ~[uat.jar:?]
    	at org.apache.flink.yarn.Utils.createTaskExecutorContext(Utils.java:389) ~[flink-dist_2.11-1.11.3.jar:1.11.3]
    	at org.apache.flink.yarn.YarnResourceManager.createTaskExecutorLaunchContext(YarnResourceManager.java:661) ~[flink-dist_2.11-1.11.3.jar:1.11.3]
    	at org.apache.flink.yarn.YarnResourceManager.startTaskExecutorInContainer(YarnResourceManager.java:448) ~[flink-dist_2.11-1.11.3.jar:1.11.3]
    	at org.apache.flink.yarn.YarnResourceManager.onContainersOfResourceAllocated(YarnResourceManager.java:422) ~[flink-dist_2.11-1.11.3.jar:1.11.3]
    	at org.apache.flink.yarn.YarnResourceManager.lambda$onContainersAllocated$1(YarnResourceManager.java:382) ~[flink-dist_2.11-1.11.3.jar:1.11.3]
    	at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRunAsync(AkkaRpcActor.java:404) ~[uat.jar:?]
    	at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRpcMessage(AkkaRpcActor.java:197) ~[uat.jar:?]
    	at org.apache.flink.runtime.rpc.akka.FencedAkkaRpcActor.handleRpcMessage(FencedAkkaRpcActor.java:74) ~[uat.jar:?]
    	at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleMessage(AkkaRpcActor.java:154) ~[uat.jar:?]
    	at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:26) [uat.jar:?]
    	at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:21) [uat.jar:?]
    	at scala.PartialFunction$class.applyOrElse(PartialFunction.scala:123) [uat.jar:?]
    	at akka.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:21) [uat.jar:?]
    	at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:170) [uat.jar:?]
    	at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:171) [uat.jar:?]
    	at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:171) [uat.jar:?]
    	at akka.actor.Actor$class.aroundReceive(Actor.scala:517) [uat.jar:?]
    	at akka.actor.AbstractActor.aroundReceive(AbstractActor.scala:225) [uat.jar:?]
    	at akka.actor.ActorCell.receiveMessage(ActorCell.scala:592) [uat.jar:?]
    	at akka.actor.ActorCell.invoke(ActorCell.scala:561) [uat.jar:?]
    	at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:258) [uat.jar:?]
    	at akka.dispatch.Mailbox.run(Mailbox.scala:225) [uat.jar:?]
    	at akka.dispatch.Mailbox.exec(Mailbox.scala:235) [uat.jar:?]
    	at akka.dispatch.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260) [uat.jar:?]
    	at akka.dispatch.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339) [uat.jar:?]
    	at akka.dispatch.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979) [uat.jar:?]
    	at akka.dispatch.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107) [uat.jar:?]

    The class in question lives in hadoop-common.jar. In the vanilla Hadoop hadoop-common.jar, Client only exposes a no-argument getRpcTimeout() with no overload, while the CDH build of hadoop-common.jar does contain the getRpcTimeout(Configuration) method. The stack trace itself points at a mixed classpath: DFSClient is loaded from the cluster's hadoop-hdfs-2.6.0-cdh5.13.1.jar, but FileSystem comes from classes bundled inside uat.jar, suggesting the vanilla hadoop-common classes shaded into the fat jar won resolution over the CDH ones.
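    One way to confirm such a mismatch (a sketch using standard JDK tooling; the local-repository path and version are assumptions, adjust to where the vanilla hadoop-common jar actually sits) is to disassemble Client from both jars with javap and compare the signatures, and to check what got shaded into the fat jar:

    # inspect the vanilla hadoop-common jar that Maven pulled in (path and version assumed)
    javap -classpath ~/.m2/repository/org/apache/hadoop/hadoop-common/2.6.0/hadoop-common-2.6.0.jar \
        org.apache.hadoop.ipc.Client | grep getRpcTimeout
    # inspect the CDH jar shipped with the cluster -- this one should list
    # getRpcTimeout(org.apache.hadoop.conf.Configuration)
    javap -classpath /opt/cloudera/parcels/CDH-5.13.1-1.cdh5.13.1.p0.2/lib/hadoop/hadoop-common.jar \
        org.apache.hadoop.ipc.Client | grep getRpcTimeout
    # check whether the fat jar bundles its own copy of the class
    jar tf /bigdata/uat.jar | grep 'org/apache/hadoop/ipc/Client'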
    So when packaging the Flink project I kept this dependency out of the fat jar by marking it provided in the pom:
    <dependency>
        <groupId>org.apache.hadoop</groupId>
        <artifactId>hadoop-common</artifactId>
        <version>${hadoop.version}</version>
        <scope>provided</scope>
    </dependency>
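    A quick sanity check after rebuilding (a sketch using standard Maven/JDK tooling): dependency:tree still lists the dependency, but annotated with the provided scope, and the packaged artifact no longer carries the classes:

    # hadoop-common should now appear in the tree marked (provided)
    mvn dependency:tree -Dincludes=org.apache.hadoop:hadoop-common
    # and the packaged jar should no longer carry Hadoop classes -- expect no output
    jar tf /bigdata/uat.jar | grep '^org/apache/hadoop/'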
    After repackaging, the error went away. I wanted to confirm that flink run was now resolving the class from /opt/cloudera/parcels/CDH-5.13.1-1.cdh5.13.1.p0.2/lib/hadoop/hadoop-common.jar, so I ran a test: that file is a symlink whose target is hadoop-common-2.6.0-cdh5.13.1.jar, and I renamed the target to break the link. Starting flink run -m yarn-cluster -c cn.com.MainFeeTestVM /bigdata/uat.jar then failed because the job could not be submitted to YARN; after restoring the name it succeeded again. This shows that the running program picks up the various Hadoop jars from the cluster installation, so there is no need to bundle the Hadoop dependencies into uat.jar.
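    This is also how Flink 1.11 is documented to behave: the distribution no longer bundles Hadoop, and the Hadoop jars are picked up from the environment via HADOOP_CLASSPATH. A minimal submission that relies on the cluster's CDH jars might look like this (the export is the convention from the Flink docs; the job class and jar path are the ones used above):

    # let Flink resolve Hadoop from the cluster installation
    export HADOOP_CLASSPATH=$(hadoop classpath)
    # on this cluster the classpath expands to jars under
    # /opt/cloudera/parcels/CDH-5.13.1-1.cdh5.13.1.p0.2/lib/hadoop*
    flink run -m yarn-cluster -c cn.com.MainFeeTestVM /bigdata/uat.jar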
  • Original post: https://www.cnblogs.com/mryangbo/p/14225483.html