  • Notes on a Spark classpath jar conflict

    Submitting a job via spark shell failed with the following error:

    Caused by: java.lang.NoSuchMethodError: org.apache.spark.network.util.AbstractFileRegion.transferred()J
    at org.apache.spark.network.util.AbstractFileRegion.transfered(AbstractFileRegion.java:28)
    at io.netty.channel.nio.AbstractNioByteChannel.doWrite(AbstractNioByteChannel.java:228)
    at io.netty.channel.socket.nio.NioSocketChannel.doWrite(NioSocketChannel.java:282)
    at io.netty.channel.AbstractChannel$AbstractUnsafe.flush0(AbstractChannel.java:879)
    

    The class AbstractFileRegion is present on the classpath, inside spark-network-common.xxx.jar, so a jar conflict was the first suspect.
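One quick way to verify this suspicion is to scan every jar on the classpath for the class file in question. A minimal sketch in Python (the `find_jars_with_class` helper and the throwaway jars are illustrative, not from the original troubleshooting session):

```python
import os
import tempfile
import zipfile

def find_jars_with_class(jar_paths, class_entry):
    """Return the jars (in classpath order) containing the given .class
    entry, e.g. 'org/apache/spark/network/util/AbstractFileRegion.class'."""
    hits = []
    for jar in jar_paths:
        with zipfile.ZipFile(jar) as zf:
            if class_entry in zf.namelist():
                hits.append(jar)
    return hits

# Demo with two throwaway "jars" -- a jar is just a zip archive.
tmp = tempfile.mkdtemp()
cls = "org/apache/spark/network/util/AbstractFileRegion.class"
fixtures = {
    "spark-network-common.jar": [cls],
    "netty-all-4.1.17.Final.jar": ["io/netty/channel/FileRegion.class"],
}
for name, entries in fixtures.items():
    with zipfile.ZipFile(os.path.join(tmp, name), "w") as zf:
        for entry in entries:
            zf.writestr(entry, b"")

jars = [os.path.join(tmp, n) for n in sorted(fixtures)]
hits = find_jars_with_class(jars, cls)
print(hits)  # only the spark-network-common.jar path
```

If more than one jar shows up for the same class, the classpath order decides which copy actually gets loaded.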

    # classpath of the failing submission
    [/data/xxx/dts-executor/executions/4998-0-1/resource
     /data/xxx/dts-executor/plugins/sparkShell/conf
     /data/xxx/dts-executor/plugins/sparkShell/di-dts-plugin-ic-1.0.0.jar
     /usr/hdp/current/hadoop-client/*
     /usr/hdp/current/hadoop-client/lib/*
     /usr/hdp/current/hadoop-hdfs-client/*
     /usr/hdp/current/hadoop-hdfs-client/lib/*
     /usr/hdp/current/hadoop-mapreduce-client/*
     /usr/hdp/current/hadoop-mapreduce-client/lib/*
     /usr/hdp/current/hadoop-yarn-client/*
     /usr/hdp/current/hadoop-yarn-client/lib/*
     /usr/hdp/current/hive/lib/*
     /usr/hdp/current/spark2-client/conf
     /usr/hdp/current/spark2-client/jars/*]
    
    # classpath of the successful submission
     [/data/xxx/dts-executor/plugins/sparkShell/di-dts-plugin-ic-1.0.0.jar
     /data/xxx/dts-executor/plugins/sparkShell/conf
     /usr/hdp/current/spark2-client/jars/*
     /usr/hdp/current/spark2-client/conf
     /usr/hdp/current/hive/lib/*
     /usr/hdp/current/hadoop-client/*
     /usr/hdp/current/hadoop-client/lib/*
     /usr/hdp/current/hadoop-mapreduce-client/*
     /usr/hdp/current/hadoop-mapreduce-client/lib/*
     /usr/hdp/current/hadoop-yarn-client/*
     /usr/hdp/current/hadoop-yarn-client/lib/*
     /usr/hdp/current/hadoop-hdfs-client/*
     /usr/hdp/current/hadoop-hdfs-client/lib/*
     /data/xxx/dts-executor/executions/5008-0-1/resource]
    
    The key difference is that /usr/hdp/current/spark2-client/jars/* was moved up. It turned out to be a Netty conflict: the Hadoop directories such as /usr/hdp/current/hadoop-client/* ship two jars that clash with the copies under /usr/hdp/current/spark2-client/jars/*:
    netty-3.9.9.Final.jar
    netty-all-4.1.17.Final.jar
    Solutions:
    1. Move the Spark entries ahead of the Hadoop ones, so the Spark jars are loaded first (this is the fix we adopted).
    2. If nothing depends on Hadoop's two Netty jars, simply delete them; then there is no conflict at all.
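The ordering matters because a JVM URLClassLoader searches classpath entries front to back and loads the first matching class file, silently shadowing any later copy. A hedged Python sketch of that first-match-wins rule (the jar contents here are fabricated stand-ins for the two Netty versions, not real class files):

```python
import os
import tempfile
import zipfile

def resolve_class(classpath, class_entry):
    """Mimic first-match-wins classpath resolution: return the first
    jar in order that contains class_entry, or None if no jar does."""
    for jar in classpath:
        with zipfile.ZipFile(jar) as zf:
            if class_entry in zf.namelist():
                return jar
    return None

# Two fake jars that both ship the same class name, standing in for
# the old and new Netty versions from the conflict above.
tmp = tempfile.mkdtemp()
cls = "io/netty/channel/FileRegion.class"
old_netty = os.path.join(tmp, "netty-3.9.9.Final.jar")
new_netty = os.path.join(tmp, "netty-all-4.1.17.Final.jar")
for path, body in [(old_netty, b"old"), (new_netty, b"new")]:
    with zipfile.ZipFile(path, "w") as zf:
        zf.writestr(cls, body)

# Hadoop jars first: the stale Netty wins (the failing classpath).
# Spark jars first:  the current Netty wins (the fix adopted above).
print(resolve_class([old_netty, new_netty], cls))
print(resolve_class([new_netty, old_netty], cls))
```

This is exactly why a NoSuchMethodError appears only in one ordering: the code was compiled against the newer class, but the older copy was found first.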
    
  • Original post: https://www.cnblogs.com/jiangxiaoxian/p/13638553.html