
    problems_maven

    1 A dependency deleted from pom.xml still appears in the Maven tool window

    desc:
    After removing a dependency from pom.xml, it still shows up in the Maven tool window on the right. There are also five copies of the same dependency banboo-0.0.1.jar, two of which have red wavy underlines indicating errors.
    RCA:
    To be determined.
    solution:
    Close the project in IDEA, delete the .idea folder in the workspace root, then reopen IDEA and use File - Open to open the project again.
    Re-set all encodings to UTF-8, and re-configure the Maven home path, the JDK used by Maven, and the project JDK version.
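The cleanup step above (close the project, then delete the `.idea` folder so IDEA re-imports everything from pom.xml) can be sketched as a small script. The project path is a placeholder, and IDEA must be closed before running it:

```python
import shutil
from pathlib import Path

def reset_idea_metadata(project_root: str) -> bool:
    """Delete IntelliJ IDEA's .idea folder so the project is
    re-imported from pom.xml on the next open. Returns True if
    a .idea folder was found and removed."""
    idea_dir = Path(project_root) / ".idea"
    if idea_dir.is_dir():
        shutil.rmtree(idea_dir)
        return True
    return False
```

After this, reopening the project with File - Open forces IDEA to rebuild its project model, which clears the stale dependency entries.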

    2 Maven warning message

    desc:

    maven package warning: 
    Some problems were encountered while building the effective settings
    expected START_TAG or END_TAG not TEXT (position: TEXT seen ...</mirror>
         -->
        <mirror>
        \u3000\u3000<i... @161:9)  @ D:developmvnreposettingsforsca.xml, line 161, column 9
    

    RCA:
    The settings file contains stray characters that are not plain ASCII whitespace, such as full-width spaces (U+3000). This often happens when snippets are copied from web pages into the project.
    pom.xml and settings.xml files are especially prone to this problem.
    solution:
    Delete the stray whitespace and normalize the formatting.
    reference: https://www.cnblogs.com/tfxz/p/12662423.html
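A quick way to locate such characters is to scan the file's text for whitespace outside the ASCII range; this is a generic sketch, not tied to any particular settings file:

```python
# Scan XML text for non-ASCII whitespace (e.g. the full-width space
# U+3000 that makes Maven fail with "expected START_TAG or END_TAG not TEXT").
def find_bad_whitespace(text: str):
    """Return (line, column, codepoint) for every non-ASCII
    whitespace character in the given text."""
    hits = []
    for lineno, line in enumerate(text.splitlines(), start=1):
        for col, ch in enumerate(line, start=1):
            if ch.isspace() and ord(ch) > 0x7F:
                hits.append((lineno, col, hex(ord(ch))))
    return hits
```

Run it over the contents of settings.xml and fix the reported positions; the line/column pairs match what Maven prints in its warning.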

    3 Maven packaging fails with invalid LOC header (bad signature)

    Running mvn package fails with the following error:

    Error creating shaded jar: invalid LOC header (bad signature) -> [Help 1]
    

    At first I switched to a higher version of the Maven plugin and packaged again. The build still failed, but this time the error was more detailed and named the problematic jar: /mnt/d/Develop/mavenRepo/com/alibaba/fastjson/1.2.44/fastjson-1.2.44.jar

    [INFO] BUILD FAILURE  
    [INFO] ------------------------------------------------------------------------  
    [INFO] Total time: 32.235 s  
    [INFO] Finished at: 2019-11-28T20:46:18+08:00  
    [INFO] Final Memory: 56M/720M  
    [INFO] ------------------------------------------------------------------------  
    [ERROR] Failed to execute goal org.apache.maven.plugins:maven-shade-plugin:3.1.1:shade (default) on project flink-base: Error creating shaded jar: Problem shading JAR /mnt/d/Develop/mavenRepo/com/alibaba/fastjson/1.2.44/fastjson-1.2.44.jar entry META-INF/services/org.glassfish.jersey.internal.spi.AutoDiscoverable: java.util.zip.ZipException: invalid LOC header (bad signature) -> [Help 1]
    [ERROR]   
    [ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.  
    [ERROR] Re-run Maven using the -X switch to enable full debug logging.    
    [ERROR]   
    [ERROR] For more information about the errors and possible solutions, please read the following articles:  
    [ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException  
    

    As the hint suggests, you can also run mvn -e package or mvn -X package from the project directory. I also saw online posts where the detailed output pinpoints the broken jar, but that approach required re-downloading all the jars, which is too time-consuming, so I did not try it.

    I then went into the directory /mnt/d/Develop/mavenRepo/com/alibaba/fastjson/1.2.44/ and found a file named aether-fb0089ce-dff1-4256-9ccc-a2bb768c48f7-fastjson-1.2.44.jar.sha1-in-progress. After deleting all the files in that directory and running package again, the build succeeded!

    Summary: this problem was again caused by a corrupted jar download; most of the Maven problems I run into come down to broken jar downloads.
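Leftover partial downloads like the *-in-progress file above can be hunted down with a small script. The suffixes checked here are assumptions based on what Maven/Aether typically leave behind after an interrupted download, so review the matches before deleting anything:

```python
from pathlib import Path

# Suffixes that suggest an interrupted Maven/Aether download
# (assumed patterns; verify each match before deleting).
SUSPECT_SUFFIXES = ("-in-progress", ".part", ".lastUpdated")

def find_broken_downloads(repo_root: str):
    """Return paths under a local Maven repository that look like
    leftovers from interrupted downloads."""
    root = Path(repo_root)
    return sorted(
        p for p in root.rglob("*")
        if p.is_file() and p.name.endswith(SUSPECT_SUFFIXES)
    )
```

Deleting the whole artifact directory that contains such a file (as done above) and re-running mvn package forces Maven to re-download a clean copy.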

    4 Importing the Maven dependencies of a Spark project fails with an error

    desc: When importing the Maven dependencies of a Spark project, the import fails and an error is reported.
    errorlog:

    0:23 Unable to import maven project: See logs for details
    
    2019-08-23 00:34:05,140 [ 747292] WARN - #org.jetbrains.idea.maven - Cannot reconnect.
    java.lang.RuntimeException: Cannot reconnect.
    at org.jetbrains.idea.maven.server.RemoteObjectWrapper.perform(RemoteObjectWrapper.java:111)
    at org.jetbrains.idea.maven.server.MavenIndexerWrapper.createIndex(MavenIndexerWrapper.java:61)
    at org.jetbrains.idea.maven.indices.MavenIndex.createContext(MavenIndex.java:396)
    at org.jetbrains.idea.maven.indices.MavenIndex.access$500(MavenIndex.java:48)
    at org.jetbrains.idea.maven.indices.MavenIndex$IndexData.<init>(MavenIndex.java:703)
    at org.jetbrains.idea.maven.indices.MavenIndex.doOpen(MavenIndex.java:236)
    at org.jetbrains.idea.maven.indices.MavenIndex.open(MavenIndex.java:202)
    at org.jetbrains.idea.maven.indices.MavenIndex.<init>(MavenIndex.java:104)
    at org.jetbrains.idea.maven.indices.MavenIndices.add(MavenIndices.java:92)
    at org.jetbrains.idea.maven.indices.MavenIndicesManager.ensureIndicesExist(MavenIndicesManager.java:174)
    at org.jetbrains.idea.maven.indices.MavenProjectIndicesManager$3.run(MavenProjectIndicesManager.java:117)
    at com.intellij.util.ui.update.MergingUpdateQueue.execute(MergingUpdateQueue.java:337)
    at com.intellij.util.ui.update.MergingUpdateQueue.execute(MergingUpdateQueue.java:327)
    at com.intellij.util.ui.update.MergingUpdateQueue.lambda$flush$1(MergingUpdateQueue.java:277)
    at com.intellij.util.ui.update.MergingUpdateQueue.flush(MergingUpdateQueue.java:291)
    at com.intellij.util.ui.update.MergingUpdateQueue.run(MergingUpdateQueue.java:246)
    at com.intellij.util.concurrency.QueueProcessor.runSafely(QueueProcessor.java:246)
    at com.intellij.util.Alarm$Request.runSafely(Alarm.java:417)
    at com.intellij.util.Alarm$Request.access$700(Alarm.java:344)
    at com.intellij.util.Alarm$Request$1.run(Alarm.java:384)
    at com.intellij.util.Alarm$Request.run(Alarm.java:395)
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    at com.intellij.util.concurrency.SchedulingWrapper$MyScheduledFutureTask.run(SchedulingWrapper.java:242)
    at com.intellij.util.concurrency.BoundedTaskExecutor$2.run(BoundedTaskExecutor.java:212)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at java.lang.Thread.run(Thread.java:745)
    Caused by: java.rmi.UnmarshalException: Error unmarshaling return header; nested exception is:
    java.net.SocketException: Connection reset
    

    RCA: a Maven version problem; I was originally using Maven 3.6.0, which is incompatible.

    The Maven dependencies I needed to import are as follows:

    <properties>
      <scala.version>2.11.8</scala.version>
      <hadoop.version>2.7.4</hadoop.version>
      <spark.version>2.1.3</spark.version>
    </properties>

    <dependencies>
      <dependency>
        <groupId>org.scala-lang</groupId>
        <artifactId>scala-library</artifactId>
        <version>${scala.version}</version>
      </dependency>
      <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-core_2.11</artifactId>
        <version>${spark.version}</version>
      </dependency>
    </dependencies>

    <build>
      <sourceDirectory>src/main/scala</sourceDirectory>
      <testSourceDirectory>src/test/scala</testSourceDirectory>
      <plugins>
        <plugin>
          <groupId>net.alchim31.maven</groupId>
          <artifactId>scala-maven-plugin</artifactId>
          <version>3.2.2</version>
          <executions>
            <execution>
              <goals>
                <goal>compile</goal>
                <goal>testCompile</goal>
              </goals>
              <configuration>
                <args>
                  <arg>-dependencyfile</arg>
                  <arg>${project.build.directory}/.scala_dependencies</arg>
                </args>
              </configuration>
            </execution>
          </executions>
        </plugin>
        <plugin>
          <groupId>org.apache.maven.plugins</groupId>
          <artifactId>maven-shade-plugin</artifactId>
          <version>2.4.3</version>
          <executions>
            <execution>
              <phase>package</phase>
              <goals>
                <goal>shade</goal>
              </goals>
              <configuration>
                <filters>
                  <filter>
                    <artifact>*:*</artifact>
                    <excludes>
                      <exclude>META-INF/*.SF</exclude>
                      <exclude>META-INF/*.DSA</exclude>
                      <exclude>META-INF/*.RSA</exclude>
                    </excludes>
                  </filter>
                </filters>
                <transformers>
                  <transformer implementation="org.apache.maven.plugins.shade.resource.ManifestResourceTransformer">
                    <mainClass></mainClass>
                  </transformer>
                </transformers>
              </configuration>
            </execution>
          </executions>
        </plugin>
      </plugins>
    </build>
    

    action:

    1. Switched to an empty local repository with a shallower path, suspecting that the original repository's path was too deep or that its contents were corrupted. Did not help.
    2. Removed some dependencies and plugins from pom.xml, then added them back one by one. Did not help.

    solution: switch Maven to the version bundled with IDEA, Maven 3.3.9.
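Since the fix came down to the Maven version, a small check on `mvn -version` output can make the mismatch explicit. The version rule below is only illustrative, based on this article's experience that 3.6.0 failed with this IDEA build while 3.3.9 worked; it is not a general compatibility table:

```python
import re

def parse_maven_version(banner: str):
    """Extract (major, minor, patch) from `mvn -version` output."""
    m = re.search(r"Apache Maven (\d+)\.(\d+)\.(\d+)", banner)
    return tuple(int(g) for g in m.groups()) if m else None

def compatible_with_this_idea(version) -> bool:
    # Illustrative rule only: in this article, Maven 3.6.0 broke
    # IDEA's Maven import while the bundled 3.3.9 worked.
    return version is not None and version < (3, 6, 0)
```

Run `mvn -version` in a terminal and feed the first line to `parse_maven_version` to see which Maven the PATH actually resolves to.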

    5 Spark program fails to compile with error: object apache is not a member of package org

    The Spark program fails to compile. errorlog:

    [INFO] Compiling 2 source files to E:\Develop\IDEAWorkspace\spark\target\classes at 1567004370534
    [ERROR] E:\Develop\IDEAWorkspace\spark\src\main\scala\cn\develop\wordCount\WordCount.scala:3: error: object apache is not a member of package org
    [ERROR] import org.apache.spark.rdd.RDD
    [ERROR] ^
    [ERROR] E:\Develop\IDEAWorkspace\spark\src\main\scala\cn\develop\wordCount\WordCount.scala:4: error: object apache is not a member of package org
    [ERROR] import org.apache.spark.{SparkConf, SparkContext}
    [ERROR] ^
    [ERROR] E:\Develop\IDEAWorkspace\spark\src\main\scala\cn\develop\wordCount\WordCount.scala:12: error: not found: type SparkConf
    [ERROR] val sparkConf: SparkConf = new SparkConf().setAppName("WordCount").setMaster("local[2]")
    [ERROR] ^
    [ERROR] E:\Develop\IDEAWorkspace\spark\src\main\scala\cn\develop\wordCount\WordCount.scala:12: error: not found: type SparkConf
    [ERROR] val sparkConf: SparkConf = new SparkConf().setAppName("WordCount").setMaster("local[2]")
    [ERROR] ^
    [ERROR] E:\Develop\IDEAWorkspace\spark\src\main\scala\cn\develop\wordCount\WordCount.scala:14: error: not found: type SparkContext
    [ERROR] val sc: SparkContext = new SparkContext(sparkConf)
    [ERROR] ^
    [ERROR] E:\Develop\IDEAWorkspace\spark\src\main\scala\cn\develop\wordCount\WordCount.scala:14: error: not found: type SparkContext
    [ERROR] val sc: SparkContext = new SparkContext(sparkConf)
    [ERROR] ^
    [ERROR] E:\Develop\IDEAWorkspace\spark\src\main\scala\cn\develop\wordCount\WordCount.scala:18: error: not found: type RDD
    [ERROR] val data: RDD[String] = sc.textFile("E:\Study\BigData\heima\stage5\2spark����\words.txt")
    [ERROR] ^
    [ERROR] E:\Develop\IDEAWorkspace\spark\src\main\scala\cn\develop\wordCount\WordCount.scala:20: error: not found: type RDD
    [ERROR] val words: RDD[String] = data.flatMap(_.split(" "))
    [ERROR] ^
    [ERROR] E:\Develop\IDEAWorkspace\spark\src\main\scala\cn\develop\wordCount\WordCount.scala:22: error: not found: type RDD
    [ERROR] val wordToOne: RDD[(String, Int)] = words.map((_,1))
    [ERROR] ^
    [ERROR] E:\Develop\IDEAWorkspace\spark\src\main\scala\cn\develop\wordCount\WordCount.scala:24: error: not found: type RDD
    [ERROR] val result: RDD[(String, Int)] = wordToOne.reduceByKey(_+_)
    [ERROR] ^
    [ERROR] E:\Develop\IDEAWorkspace\spark\src\main\scala\cn\develop\wordCount\WordCount.scala:27: error: not found: type RDD
    [ERROR] val ascResult: RDD[(String, Int)] = result.sortBy(_._2,false)
    [ERROR] ^
    [ERROR] E:\Develop\IDEAWorkspace\spark\src\main\scala\cn\develop\wordCount\WordCountCluster.scala:3: error: object apache is not a member of package org
    [ERROR] import org.apache.spark.{SparkConf, SparkContext}
    [ERROR] ^
    [ERROR] E:\Develop\IDEAWorkspace\spark\src\main\scala\cn\develop\wordCount\WordCountCluster.scala:4: error: object apache is not a member of package org
    [ERROR] import org.apache.spark.rdd.RDD
    [ERROR] ^
    [ERROR] E:\Develop\IDEAWorkspace\spark\src\main\scala\cn\develop\wordCount\WordCountCluster.scala:12: error: not found: type SparkConf
    [ERROR] val sparkConf: SparkConf = new SparkConf().setAppName("WordCountCluster")
    [ERROR] ^
    [ERROR] E:\Develop\IDEAWorkspace\spark\src\main\scala\cn\develop\wordCount\WordCountCluster.scala:12: error: not found: type SparkConf
    [ERROR] val sparkConf: SparkConf = new SparkConf().setAppName("WordCountCluster")
    [ERROR] ^
    [ERROR] E:\Develop\IDEAWorkspace\spark\src\main\scala\cn\develop\wordCount\WordCountCluster.scala:14: error: not found: type SparkContext
    [ERROR] val sc: SparkContext = new SparkContext(sparkConf)
    [ERROR] ^
    [ERROR] E:\Develop\IDEAWorkspace\spark\src\main\scala\cn\develop\wordCount\WordCountCluster.scala:14: error: not found: type SparkContext
    [ERROR] val sc: SparkContext = new SparkContext(sparkConf)
    [ERROR] ^
    [ERROR] E:\Develop\IDEAWorkspace\spark\src\main\scala\cn\develop\wordCount\WordCountCluster.scala:18: error: not found: type RDD
    [ERROR] val data: RDD[String] = sc.textFile(args(0))
    [ERROR] ^
    [ERROR] E:\Develop\IDEAWorkspace\spark\src\main\scala\cn\develop\wordCount\WordCountCluster.scala:20: error: not found: type RDD
    [ERROR] val words: RDD[String] = data.flatMap(_.split(" "))
    [ERROR] ^
    [ERROR] E:\Develop\IDEAWorkspace\spark\src\main\scala\cn\develop\wordCount\WordCountCluster.scala:22: error: not found: type RDD
    [ERROR] val wordToOne: RDD[(String, Int)] = words.map((_,1))
    [ERROR] ^
    [ERROR] E:\Develop\IDEAWorkspace\spark\src\main\scala\cn\develop\wordCount\WordCountCluster.scala:24: error: not found: type RDD
    [ERROR] val result: RDD[(String, Int)] = wordToOne.reduceByKey(_+_)
    [ERROR] ^
    [ERROR] 21 errors found
    [INFO] ------------------------------------------------------------------------
    [INFO] BUILD FAILURE
    

    RCA: the local repository was the problem. Most likely the original repository path was too long and too deeply nested; the repository contents themselves were fine, because copying the repository to the E: root made everything work again.

    solution:
    The Spark project's local Maven repository used to be E:\develop\BigData\maven\maven1\maven2\maven3\sparkRepository.
    After I changed it to E:\repository, it worked.
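Deeply nested repository paths get even longer once Maven appends artifact coordinates, and on a setup without long-path support they can approach the classic Windows limit of 260 characters. A rough headroom check, with that limit hard-coded as an assumption:

```python
WINDOWS_MAX_PATH = 260  # classic Windows limit without long-path support

def path_headroom(repo_root: str, sample_artifact: str) -> int:
    """Characters remaining before the repository root plus a typical
    artifact's relative path would exceed the classic Windows limit."""
    full = repo_root.rstrip("\\") + "\\" + sample_artifact
    return WINDOWS_MAX_PATH - len(full)
```

A shallow root such as E:\repository leaves far more headroom than a deeply nested one, which matches the behavior observed above.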


    Original article: https://www.cnblogs.com/mediocreWorld/p/15145867.html