1. Cluster versions
Hadoop 3.1.1.3.1.4.0-315
Hive (version 3.1.0.3.1.4.0-315)
Spark 2.3.2.3.1.4.0-315
Scala version 2.11.12 (Java HotSpot™ 64-Bit Server VM, Java 1.8.0_144)
Guava: guava-28.0-jre.jar
2. Exchange's hard dependency
Exchange explicitly depends on Guava 14.
3. Error when submitting the job
Command:
spark-submit \
  --conf spark.app.name="g1" \
  --master "local" \
  --class com.vesoft.nebula.exchange.Exchange /data/nebula-spark-utils210/nebula-exchange-2.1.0.jar -c /data/nebula/exchange/configs/imp_test1.conf -h
Error output:
21/08/11 23:18:20 INFO HiveMetaStoreClient: Opened a connection to metastore, current connections: 1
21/08/11 23:18:20 INFO HiveMetaStoreClient: Connected to metastore.
21/08/11 23:18:20 INFO RetryingMetaStoreClient: RetryingMetaStoreClient proxy=class org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient ugi=xxx (auth:SIMPLE) retries=1 delay=5 lifetime=0
Exception in thread "main" java.lang.NoSuchMethodError: com.google.common.net.HostAndPort.getHostText()Ljava/lang/String;
at com.vesoft.nebula.exchange.MetaProvider$$anonfun$1.apply(MetaProvider.scala:30)
at com.vesoft.nebula.exchange.MetaProvider$$anonfun$1.apply(MetaProvider.scala:29)
at scala.collection.immutable.List.foreach(List.scala:392)
at com.vesoft.nebula.exchange.MetaProvider.<init>(MetaProvider.scala:29)
at com.vesoft.nebula.exchange.processor.VerticesProcessor.process(VerticesProcessor.scala:109)
at com.vesoft.nebula.exchange.Exchange$$anonfun$main$2.apply(Exchange.scala:152)
at com.vesoft.nebula.exchange.Exchange$$anonfun$main$2.apply(Exchange.scala:129)
at scala.collection.immutable.List.foreach(List.scala:392)
at com.vesoft.nebula.exchange.Exchange$.main(Exchange.scala:129)
at com.vesoft.nebula.exchange.Exchange.main(Exchange.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:904)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:198)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:228)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:137)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
21/08/11 23:18:22 INFO SparkContext: Invoking stop() from shutdown hook
4. Root cause
Exchange has a hard dependency on Guava 14, while the cluster uses Guava 28, and Guava stopped being backward compatible after version 16 (for example, HostAndPort.getHostText() was later removed in favor of getHost()).
When the Exchange import job is submitted to the Spark cluster, the cluster's Guava 28 is what actually ends up on the classpath, so Exchange's code, compiled against Guava 14, fails at runtime with the NoSuchMethodError above.
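The failure can be understood without Spark: the method Exchange calls simply does not exist in the newer jar that won on the classpath. The sketch below is a hypothetical diagnostic helper (not part of Exchange): it checks at runtime whether a class declares a given no-arg method and prints where the class was loaded from, which is a quick way to confirm which Guava copy the classloader actually picked. It probes java.lang.String instead of com.google.common.net.HostAndPort so it runs without any Guava jar present.

```java
import java.security.CodeSource;

public class GuavaProbe {
    /** Returns true if className is loadable and declares a no-arg methodName. */
    static boolean hasMethod(String className, String methodName) {
        try {
            Class<?> c = Class.forName(className);
            // For JDK bootstrap classes the code source is null; for Guava it
            // prints the jar path, revealing which copy the classloader picked.
            CodeSource src = c.getProtectionDomain().getCodeSource();
            System.out.println(className + " loaded from: " + src);
            c.getMethod(methodName);
            return true;
        } catch (NoSuchMethodException e) {
            // This is the condition that surfaces as NoSuchMethodError in Exchange.
            return false;
        } catch (ClassNotFoundException e) {
            return false;
        }
    }

    public static void main(String[] args) {
        // In the real diagnosis you would pass
        // "com.google.common.net.HostAndPort" and "getHostText".
        System.out.println("isEmpty present: " + hasMethod("java.lang.String", "isEmpty"));
        System.out.println("getHostText present: " + hasMethod("java.lang.String", "getHostText"));
    }
}
```

Run against the cluster's guava-28.0-jre.jar, such a probe would report getHostText as missing, matching the stack trace.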
5. Solutions
A. Point spark-submit at the required jar explicitly
Supply the Guava 14 jar through --jars or the class-path options:
spark-submit \
  --conf spark.app.name="g1" \
  --master "local" \
  --files /data/nebula/exchange/configs/i1.conf \
  --driver-class-path /data/nebula/tools/guava-14.0.jar \
  --driver-library-path /data/nebula/tools/guava-14.0.jar \
  --conf spark.executor.extraClassPath=/data/nebula/tools/guava-14.0.jar \
  --conf spark.executor.extraLibraryPath=/data/nebula/tools/guava-14.0.jar \
  --total-executor-cores 15 \
  --class com.vesoft.nebula.exchange.Exchange /data/disk01/nebula/exchange/Exchange/nebula-spark-utilsv2/nebula-exchange-2.0.0.jar -c /data/disk01/nebula/exchange/configs/i1.conf -h
However, forcing Guava 14 onto the class path this way broke other Spark jobs on the cluster, so this approach was abandoned.
B. Rebuild Exchange with the required Guava classes bundled into the application jar
Here the maven-shade-plugin is used to relocate Guava 14 under a new package name and build it into the nebula-java jar.
Note: Exchange depends directly on nebula-java, so nebula-java's Guava dependency has to be fixed first.
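To see why relocation works: after shading, every reference in the rebuilt bytecode to com.google.common.* is rewritten to my.guava.common.*, so the relocated Guava 14 classes and the cluster's Guava 28 classes coexist under different names and can never shadow each other. The sketch below uses two hypothetical stand-in classes (they mimic only the one relevant method of the relocated guava-14 HostAndPort and the cluster's guava-28 HostAndPort, respectively) to show both APIs living on one classpath:

```java
// OldHostAndPort plays the role of my.guava.common.net.HostAndPort
// (relocated Guava 14, which still has getHostText()).
class OldHostAndPort {
    private final String host;
    OldHostAndPort(String host) { this.host = host; }
    String getHostText() { return host; }   // the method Exchange calls
}

// NewHostAndPort plays the cluster's com.google.common.net.HostAndPort
// (Guava 28, which only offers getHost()).
class NewHostAndPort {
    private final String host;
    NewHostAndPort(String host) { this.host = host; }
    String getHost() { return host; }       // the replacement API
}

public class RelocationDemo {
    public static void main(String[] args) {
        // Because the fully qualified names differ after relocation,
        // both versions can be loaded side by side without conflict.
        OldHostAndPort relocated = new OldHostAndPort("metad0");
        NewHostAndPort cluster = new NewHostAndPort("metad0");
        System.out.println(relocated.getHostText()); // prints "metad0"
        System.out.println(cluster.getHost());       // prints "metad0"
    }
}
```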
① Repackage Guava 14 as my-guava
1. pom.xml for building Guava 14 into a standalone shaded jar:
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>
    <groupId>com.data.guava</groupId>
    <artifactId>my-guava</artifactId>
    <version>1.0</version>

    <dependencies>
        <dependency>
            <groupId>com.google.guava</groupId>
            <artifactId>guava</artifactId>
            <version>14.0</version>
        </dependency>
    </dependencies>

    <build>
        <plugins>
            <plugin>
                <groupId>org.apache.maven.plugins</groupId>
                <artifactId>maven-shade-plugin</artifactId>
                <version>3.2.1</version>
                <configuration>
                    <createDependencyReducedPom>false</createDependencyReducedPom>
                </configuration>
                <executions>
                    <execution>
                        <phase>package</phase>
                        <goals>
                            <goal>shade</goal>
                        </goals>
                        <configuration>
                            <relocations>
                                <relocation>
                                    <pattern>com.google.guava</pattern>
                                    <shadedPattern>my.guava</shadedPattern>
                                </relocation>
                                <relocation>
                                    <pattern>com.google.common</pattern>
                                    <shadedPattern>my.guava.common</shadedPattern>
                                </relocation>
                            </relocations>
                            <transformers>
                                <transformer implementation="org.apache.maven.plugins.shade.resource.ManifestResourceTransformer" />
                            </transformers>
                        </configuration>
                    </execution>
                </executions>
            </plugin>
        </plugins>
    </build>
</project>
② Build and install my-guava
mvn clean package
mvn install:install-file -Dfile=target/my-guava-1.0.jar -DgroupId=com.data.guava -DartifactId=my-guava -Dversion=1.0 -Dpackaging=jar
At this point a Java project can already reference the shaded jar:
// maven
// <dependency>
//     <groupId>com.data.guava</groupId>
//     <artifactId>my-guava</artifactId>
//     <version>1.0</version>
// </dependency>
import my.guava.common.collect.Lists;
③ Build and install nebula-java
1. Clone nebula-java:
git clone -b v2.0.0-ga https://github.com/vesoft-inc/nebula-java.git
2. In pom.xml, comment out the Guava 14 dependency and reference the new package instead:
<!-- <dependency>-->
<!-- <groupId>com.google.guava</groupId>-->
<!-- <artifactId>guava</artifactId>-->
<!-- <version>${guava.version}</version>-->
<!-- </dependency>-->
<dependency>
<groupId>com.data.guava</groupId>
<artifactId>my-guava</artifactId>
<version>1.0</version>
</dependency>
3. Replace every reference to import com.google.common.*
Comment out the original import statements, then, at each spot where the code now fails to compile, import the relocated class instead, e.g. import my.guava.common.collect.Maps;
Note: the new import line must sit in dictionary order among the existing imports, otherwise compilation fails the checkstyle check.
4. Build & install
Downloading some dependencies at this step may require a proxy to reach external repositories.
mvn clean install -Dmaven.test.skip=true -Dgpg.skip -Dmaven.javadoc.skip=true
④ Build and package nebula-exchange
1. Clone the latest code:
git clone -b v2.1.0 https://github.com/vesoft-inc/nebula-spark-utils.git
cd nebula-spark-utils/nebula-exchange
2. Add the new dependency to pom.xml:
<!-- shaded Guava build -->
<dependency>
    <groupId>com.data.guava</groupId>
    <artifactId>my-guava</artifactId>
    <version>1.0</version>
</dependency>
3. Replace every reference to import com.google.common.*
Comment out the original import statements, then, at each spot where the code now fails to compile, import the relocated class instead, e.g. import my.guava.common.collect.Maps;
Notes:
1. The new import line must sit in dictionary order among the existing imports, otherwise compilation fails the checkstyle check.
2. import com.google.common.geometry.{S2CellId, S2LatLng} must not be replaced: although the package path looks similar, it comes from the standalone S2 geometry project and does not depend on Guava.
4. Build & package
Downloading some dependencies at this step may require a proxy to reach external repositories.
mvn clean package -Dmaven.test.skip=true -Dgpg.skip -Dmaven.javadoc.skip=true
⑤ Test run
Import the test data (the "group" tag):
spark-submit --conf spark.app.name="g1" --master "local" --class com.vesoft.nebula.exchange.Exchange /data/nebula/exchange/Exchange/nebula-spark-utils210new/nebula-exchange-2.1.0.jar -c /data/nebula/exchange/configs/i1.conf -h
Possible remaining issue
The log4j version may still be incompatible.