  • Debugging a Spark classpath jar conflict

    A job submitted from spark-shell failed with the error below:

    Caused by: java.lang.NoSuchMethodError: org.apache.spark.network.util.AbstractFileRegion.transferred()J
    at org.apache.spark.network.util.AbstractFileRegion.transfered(AbstractFileRegion.java:28)
    at io.netty.channel.nio.AbstractNioByteChannel.doWrite(AbstractNioByteChannel.java:228)
    at io.netty.channel.socket.nio.NioSocketChannel.doWrite(NioSocketChannel.java:282)
    at io.netty.channel.AbstractChannel$AbstractUnsafe.flush0(AbstractChannel.java:879)
    

    The class AbstractFileRegion is on the classpath, inside spark-network-common.xxx.jar, so a jar conflict was the first suspect.
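    A quick way to confirm the suspicion is to scan every jar on the classpath for the corresponding `.class` entry. Below is a minimal Python sketch of that check; the function name and the directory/class arguments in the usage line are illustrative, not part of any real tool:

    ```python
    import zipfile
    from pathlib import Path

    def jars_containing(class_name, classpath_dirs):
        """Return every jar under the given directories that bundles class_name.

        class_name is in dotted form, e.g.
        'org.apache.spark.network.util.AbstractFileRegion'.
        """
        # A class com.foo.Bar is stored in a jar as com/foo/Bar.class
        entry = class_name.replace(".", "/") + ".class"
        hits = []
        for d in classpath_dirs:
            for jar in sorted(Path(d).glob("*.jar")):
                with zipfile.ZipFile(jar) as zf:
                    if entry in zf.namelist():
                        hits.append(str(jar))
        return hits
    ```

    Running it against each classpath directory (e.g. `jars_containing("org.apache.spark.network.util.AbstractFileRegion", ["/usr/hdp/current/spark2-client/jars"])`) shows exactly which jars ship which copy of a class.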

    # classpath of the failing submission
    [/data/xxx/dts-executor/executions/4998-0-1/resource
     /data/xxx/dts-executor/plugins/sparkShell/conf
     /data/xxx/dts-executor/plugins/sparkShell/di-dts-plugin-ic-1.0.0.jar
     /usr/hdp/current/hadoop-client/*
     /usr/hdp/current/hadoop-client/lib/*
     /usr/hdp/current/hadoop-hdfs-client/*
     /usr/hdp/current/hadoop-hdfs-client/lib/*
     /usr/hdp/current/hadoop-mapreduce-client/*
     /usr/hdp/current/hadoop-mapreduce-client/lib/*
     /usr/hdp/current/hadoop-yarn-client/*
     /usr/hdp/current/hadoop-yarn-client/lib/*
     /usr/hdp/current/hive/lib/*
     /usr/hdp/current/spark2-client/conf
     /usr/hdp/current/spark2-client/jars/*]
    
    # classpath of the successful submission
     [/data/xxx/dts-executor/plugins/sparkShell/di-dts-plugin-ic-1.0.0.jar
     /data/xxx/dts-executor/plugins/sparkShell/conf
     /usr/hdp/current/spark2-client/jars/*
     /usr/hdp/current/spark2-client/conf
     /usr/hdp/current/hive/lib/*
     /usr/hdp/current/hadoop-client/*
     /usr/hdp/current/hadoop-client/lib/*
     /usr/hdp/current/hadoop-mapreduce-client/*
     /usr/hdp/current/hadoop-mapreduce-client/lib/*
     /usr/hdp/current/hadoop-yarn-client/*
     /usr/hdp/current/hadoop-yarn-client/lib/*
     /usr/hdp/current/hadoop-hdfs-client/*
     /usr/hdp/current/hadoop-hdfs-client/lib/*
     /data/xxx/dts-executor/executions/5008-0-1/resource]
    
    The key difference is that /usr/hdp/current/spark2-client/jars/* was moved to the front. The root cause turned out to be a netty conflict: the Hadoop entries such as /usr/hdp/current/hadoop-client/* ship the following two jars, which clash with the netty jars under /usr/hdp/current/spark2-client/jars/*:
    netty-3.9.9.Final.jar
    netty-all-4.1.17.Final.jar
    Which copy wins is determined purely by classpath order, so reordering the entries changes which netty classes actually get loaded.
    Solutions:
    1. Move the Spark entries ahead of the Hadoop entries on the classpath, so the Spark jars are loaded first (this is the approach we took).
    2. If nothing depends on Hadoop's two netty jars, simply delete them and the conflict disappears.
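    Option 1 works because the JVM resolves a class from the first classpath entry that contains it. The sketch below models that "first entry wins" rule so the effect of reordering can be seen directly; it is a simplification (jars inside a wildcard directory are visited here in sorted name order, whereas real `*` expansion order is unspecified), and the function name is hypothetical:

    ```python
    import zipfile
    from pathlib import Path

    def first_provider(class_name, ordered_entries):
        """Mimic the JVM's 'first classpath entry wins' resolution: return the
        first jar, scanning entries in classpath order, that contains
        class_name, or None if no entry provides it.
        """
        entry = class_name.replace(".", "/") + ".class"
        for d in ordered_entries:
            # Simplification: real wildcard expansion order is unspecified.
            for jar in sorted(Path(d).glob("*.jar")):
                with zipfile.ZipFile(jar) as zf:
                    if entry in zf.namelist():
                        return str(jar)
        return None
    ```

    With the Hadoop directories first, `first_provider` picks Hadoop's netty copy; swap the Spark directory to the front of the list and the same call returns the Spark jar instead, which is precisely what the reordering fix relies on.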
    
  • Original post: https://www.cnblogs.com/jiangxiaoxian/p/13638553.html