  • Notes on a Spark classpath jar conflict

    Submitting a job through spark-shell failed with the following error:

    Caused by: java.lang.NoSuchMethodError: org.apache.spark.network.util.AbstractFileRegion.transferred()J
        at org.apache.spark.network.util.AbstractFileRegion.transfered(AbstractFileRegion.java:28)
        at io.netty.channel.nio.AbstractNioByteChannel.doWrite(AbstractNioByteChannel.java:228)
        at io.netty.channel.socket.nio.NioSocketChannel.doWrite(NioSocketChannel.java:282)
        at io.netty.channel.AbstractChannel$AbstractUnsafe.flush0(AbstractChannel.java:879)
    

    The class AbstractFileRegion is in spark-network-common.xxx.jar, which is present on the classpath, so the jar itself is not missing. A NoSuchMethodError at runtime usually means the class that actually got loaded differs from the one the caller was compiled against, which pointed to a jar conflict.
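
    A quick way to confirm which jar a suspect class was actually loaded from is to ask its ProtectionDomain. The sketch below is not from the original post: the class name WhichJar is made up, and it assumes spark-network-common and a Netty 4 jar are on the compile classpath and that getCodeSource() returns non-null for these classes.

    import org.apache.spark.network.util.AbstractFileRegion;

    public class WhichJar {
        public static void main(String[] args) {
            // getCodeSource() reports the jar (or directory) a class was loaded
            // from; it can be null for classes loaded by the bootstrap loader.
            System.out.println(AbstractFileRegion.class
                    .getProtectionDomain().getCodeSource().getLocation());
            // Check the Netty class from the failing stack frame the same way.
            System.out.println(io.netty.channel.FileRegion.class
                    .getProtectionDomain().getCodeSource().getLocation());
        }
    }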

    # Classpath of the failed submission
    [/data/xxx/dts-executor/executions/4998-0-1/resource
     /data/xxx/dts-executor/plugins/sparkShell/conf
     /data/xxx/dts-executor/plugins/sparkShell/di-dts-plugin-ic-1.0.0.jar
     /usr/hdp/current/hadoop-client/*
     /usr/hdp/current/hadoop-client/lib/*
     /usr/hdp/current/hadoop-hdfs-client/*
     /usr/hdp/current/hadoop-hdfs-client/lib/*
     /usr/hdp/current/hadoop-mapreduce-client/*
     /usr/hdp/current/hadoop-mapreduce-client/lib/*
     /usr/hdp/current/hadoop-yarn-client/*
     /usr/hdp/current/hadoop-yarn-client/lib/*
     /usr/hdp/current/hive/lib/*
     /usr/hdp/current/spark2-client/conf
     /usr/hdp/current/spark2-client/jars/*]
    
    # Classpath of the successful submission
     [/data/xxx/dts-executor/plugins/sparkShell/di-dts-plugin-ic-1.0.0.jar
     /data/xxx/dts-executor/plugins/sparkShell/conf
     /usr/hdp/current/spark2-client/jars/*
     /usr/hdp/current/spark2-client/conf
     /usr/hdp/current/hive/lib/*
     /usr/hdp/current/hadoop-client/*
     /usr/hdp/current/hadoop-client/lib/*
     /usr/hdp/current/hadoop-mapreduce-client/*
     /usr/hdp/current/hadoop-mapreduce-client/lib/*
     /usr/hdp/current/hadoop-yarn-client/*
     /usr/hdp/current/hadoop-yarn-client/lib/*
     /usr/hdp/current/hadoop-hdfs-client/*
     /usr/hdp/current/hadoop-hdfs-client/lib/*
     /data/xxx/dts-executor/executions/5008-0-1/resource]
    
    The main difference is that /usr/hdp/current/spark2-client/jars/* was moved ahead of the Hadoop entries. The root cause turned out to be a Netty jar conflict: the Hadoop directories (/usr/hdp/current/hadoop-client/* and the other hadoop-* entries) ship two jars that clash with the copies under /usr/hdp/current/spark2-client/jars/*:
    netty-3.9.9.Final.jar
    netty-all-4.1.17.Final.jar
    Because the JVM resolves a class from the first classpath entry that contains it, whichever Netty copy comes earlier wins. Two fixes:
    1. Move the Spark entries ahead of the Hadoop ones, so Spark's own jars are loaded first (this is what we did).
    2. If nothing depends on Hadoop's two Netty jars, delete them outright and the conflict disappears; the sketch after this list can confirm which jars actually contain the clashing classes before you remove anything.
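
    As a sanity check before deleting jars, here is a minimal, hypothetical helper (DuplicateClassFinder is not from the original post) that walks java.class.path and prints every jar containing a given class entry. Since the JVM launcher expands wildcard entries such as /usr/hdp/current/spark2-client/jars/* before the program starts, the output lists concrete jars in resolution order, so the first hit is the copy the JVM will load.

    import java.io.File;
    import java.util.jar.JarFile;

    public class DuplicateClassFinder {
        public static void main(String[] args) throws Exception {
            // Class to look for, in jar-entry form (slashes, .class suffix).
            String entry = args.length > 0
                    ? args[0]
                    : "io/netty/channel/FileRegion.class";
            String[] cp = System.getProperty("java.class.path")
                    .split(File.pathSeparator);
            for (String path : cp) {
                File f = new File(path);
                // Skip conf directories and anything that is not a real jar file.
                if (!path.endsWith(".jar") || !f.isFile()) continue;
                try (JarFile jar = new JarFile(f)) {
                    if (jar.getEntry(entry) != null) {
                        // Printed in classpath order: the first line wins at load time.
                        System.out.println(path);
                    }
                }
            }
        }
    }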
    
  • Original post: https://www.cnblogs.com/jiangxiaoxian/p/13638553.html