  • Spark errors

     Error when packaging with Maven:

    Error message:

    "D:Program FilesJavajdk1.8.0_131injava" -Dmaven.multiModuleProjectDirectory=D:WorkspaceIDEA_workSpark_Workspark01sparkCore "-Dmaven.home=D:Program FilesJetBrainsIntelliJ IDEA 2017.3.1pluginsmavenlibmaven3" "-Dclassworlds.conf=D:Program FilesJetBrainsIntelliJ IDEA 2017.3.1pluginsmavenlibmaven3inm2.conf" "-javaagent:D:Program FilesJetBrainsIntelliJ IDEA 2017.3.1libidea_rt.jar=61000:D:Program FilesJetBrainsIntelliJ IDEA 2017.3.1in" -Dfile.encoding=UTF-8 -classpath "D:Program FilesJetBrainsIntelliJ IDEA 2017.3.1pluginsmavenlibmaven3ootplexus-classworlds-2.5.2.jar" org.codehaus.classworlds.Launcher -Didea.version=2017.3.1 -DskipTests=true package
    [INFO] Scanning for projects...
    [INFO]                                                                         
    [INFO] ------------------------------------------------------------------------
    [INFO] Building sparkCore 1.0-SNAPSHOT
    [INFO] ------------------------------------------------------------------------
    [INFO] 
    [INFO] --- maven-resources-plugin:2.6:resources (default-resources) @ sparkCore ---
    [WARNING] Using platform encoding (UTF-8 actually) to copy filtered resources, i.e. build is platform dependent!
    [INFO] Copying 0 resource
    [INFO] 
    [INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ sparkCore ---
    [INFO] Nothing to compile - all classes are up to date
    [INFO] 
    [INFO] --- scala-maven-plugin:3.2.2:compile (default) @ sparkCore ---
    [WARNING]  Expected all dependencies to require Scala version: 2.11.8
    [WARNING]  com.twitter:chill_2.11:0.8.0 requires scala version: 2.11.7
    [WARNING] Multiple versions of scala libraries detected!
    [INFO] D:\Workspace\IDEA_work\Spark_Work\spark01\sparkCore\src\main\java:-1: info: compiling
    [INFO] D:\Workspace\IDEA_work\Spark_Work\spark01\sparkCore\src\main\scala:-1: info: compiling
    [INFO] Compiling 1 source files to D:\Workspace\IDEA_work\Spark_Work\spark01\sparkCore\target\classes at 1562322123123
    [ERROR] error: error while loading <root>, Error accessing C:\Users\67001\.m2\repository\org\codehaus\jackson\jackson-core-asl\1.9.13\jackson-core-asl-1.9.13.jar
    [ERROR] error: scala.reflect.internal.MissingRequirementError: object java.lang.Object in compiler mirror not found.
    [ERROR]     at scala.reflect.internal.MissingRequirementError$.signal(MissingRequirementError.scala:17)
    [ERROR]     at scala.reflect.internal.MissingRequirementError$.notFound(MissingRequirementError.scala:18)
    [ERROR]     at scala.reflect.internal.Mirrors$RootsBase.getModuleOrClass(Mirrors.scala:53)
    [ERROR]     at scala.reflect.internal.Mirrors$RootsBase.getModuleOrClass(Mirrors.scala:45)
    [ERROR]     at scala.reflect.internal.Mirrors$RootsBase.getModuleOrClass(Mirrors.scala:45)
    [ERROR]     at scala.reflect.internal.Mirrors$RootsBase.getModuleOrClass(Mirrors.scala:66)
    [ERROR]     at scala.reflect.internal.Mirrors$RootsBase.getClassByName(Mirrors.scala:102)
    [ERROR]     at scala.reflect.internal.Mirrors$RootsBase.getRequiredClass(Mirrors.scala:105)
    [ERROR]     at scala.reflect.internal.Definitions$DefinitionsClass.ObjectClass$lzycompute(Definitions.scala:257)
    [ERROR]     at scala.reflect.internal.Definitions$DefinitionsClass.ObjectClass(Definitions.scala:257)
    [ERROR]     at scala.reflect.internal.Definitions$DefinitionsClass.init(Definitions.scala:1394)
    [ERROR]     at scala.tools.nsc.Global$Run.<init>(Global.scala:1215)
    [ERROR]     at scala.tools.nsc.Driver.doCompile(Driver.scala:31)
    [ERROR]     at scala.tools.nsc.MainClass.doCompile(Main.scala:23)
    [ERROR]     at scala.tools.nsc.Driver.process(Driver.scala:51)
    [ERROR]     at scala.tools.nsc.Driver.main(Driver.scala:64)
    [ERROR]     at scala.tools.nsc.Main.main(Main.scala)
    [ERROR]     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    [ERROR]     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    [ERROR]     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    [ERROR]     at java.lang.reflect.Method.invoke(Method.java:498)
    [ERROR]     at scala_maven_executions.MainHelper.runMain(MainHelper.java:164)
    [ERROR]     at scala_maven_executions.MainWithArgsInFile.main(MainWithArgsInFile.java:26)
    [INFO] ------------------------------------------------------------------------
    [INFO] BUILD FAILURE
    [INFO] ------------------------------------------------------------------------
    [INFO] Total time: 4.995 s
    [INFO] Finished at: 2019-07-05T18:22:03+08:00
    [INFO] Final Memory: 26M/698M
    [INFO] ------------------------------------------------------------------------
    [ERROR] Failed to execute goal net.alchim31.maven:scala-maven-plugin:3.2.2:compile (default) on project sparkCore: wrap: org.apache.commons.exec.ExecuteException: Process exited with an error: 1 (Exit value: 1) -> [Help 1]
    [ERROR] 
    [ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
    [ERROR] Re-run Maven using the -X switch to enable full debug logging.
    [ERROR] 
    [ERROR] For more information about the errors and possible solutions, please read the following articles:
    [ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
    
    Process finished with exit code 1
    "D:Program FilesJavajdk1.8.0_131injava" -Dmaven.multiModuleProjectDirectory=D:WorkspaceIDEA_workSpark_Workspark02sparkCore 
    "-Dmaven.home=D:Program FilesJetBrainsIntelliJ IDEA 2017.3.1pluginsmavenlibmaven3"
    "-Dclassworlds.conf=D:Program FilesJetBrainsIntelliJ IDEA 2017.3.1pluginsmavenlibmaven3inm2.conf"
    "-javaagent:D:Program FilesJetBrainsIntelliJ IDEA 2017.3.1libidea_rt.jar=64675:D:Program FilesJetBrainsIntelliJ IDEA 2017.3.1in"
    -Dfile.encoding=UTF-8 -classpath "D:Program FilesJetBrainsIntelliJ IDEA 2017.3.1pluginsmavenlibmaven3ootplexus-classworlds-2.5.2.jar"
    org.codehaus.classworlds.Launcher -Didea.version=2017.3.1 -DskipTests=true package [INFO] Scanning for projects... [INFO] [INFO] ------------------------------------------------------------------------ [INFO] Building sparkCore 1.0-SNAPSHOT [INFO] ------------------------------------------------------------------------ [INFO] [INFO] --- maven-resources-plugin:2.6:resources (default-resources) @ sparkCore --- [WARNING] Using platform encoding (UTF-8 actually) to copy filtered resources, i.e. build is platform dependent! [INFO] Copying 0 resource [INFO] [INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ sparkCore --- [INFO] Nothing to compile - all classes are up to date [INFO] [INFO] --- scala-maven-plugin:3.2.2:compile (default) @ sparkCore --- [WARNING] Expected all dependencies to require Scala version: 2.11.8 [WARNING] com.twitter:chill_2.11:0.8.0 requires scala version: 2.11.7 [WARNING] Multiple versions of scala libraries detected! [INFO] D:WorkspaceIDEA_workSpark_Workspark02sparkCoresrcmainjava:-1: info: compiling [INFO] D:WorkspaceIDEA_workSpark_Workspark02sparkCoresrcmainscala:-1: info: compiling [INFO] Compiling 1 source files to D:WorkspaceIDEA_workSpark_Workspark02sparkCore argetclasses at 1562341347300 [ERROR] error: error while loading <root>, Error accessing C:Users67001.m2 epositorycommons-configurationcommons-configuration1.6commons-configuration-1.6.jar [ERROR] error: scala.reflect.internal.MissingRequirementError: object java.lang.Object in compiler mirror not found. [ERROR] at scala.reflect.internal.MissingRequirementError$.signal(MissingRequirementError.scala:17) [ERROR] at scala.reflect.internal.MissingRequirementError$.notFound(MissingRequirementError.scala:18) [ERROR] at scala.reflect.internal.Mirrors$RootsBase.getModuleOrClass(Mirrors.scala:53) [ERROR] at scala.reflect.internal.Mirrors$RootsBase.getModuleOrClass(Mirrors.scala:45) [ERROR] at scala.reflect.internal.Mirrors$RootsBase.getModuleOrClass(Mirrors.scala:45) [ERROR] at scala.reflect.internal.Mirrors$RootsBase.getModuleOrClass(Mirrors.scala:66) [ERROR] at scala.reflect.internal.Mirrors$RootsBase.getClassByName(Mirrors.scala:102) [ERROR] at scala.reflect.internal.Mirrors$RootsBase.getRequiredClass(Mirrors.scala:105) [ERROR] at scala.reflect.internal.Definitions$DefinitionsClass.ObjectClass$lzycompute(Definitions.scala:257) [ERROR] at scala.reflect.internal.Definitions$DefinitionsClass.ObjectClass(Definitions.scala:257) [ERROR] at scala.reflect.internal.Definitions$DefinitionsClass.init(Definitions.scala:1394) [ERROR] at scala.tools.nsc.Global$Run.<init>(Global.scala:1215) [ERROR] at scala.tools.nsc.Driver.doCompile(Driver.scala:31) [ERROR] at scala.tools.nsc.MainClass.doCompile(Main.scala:23) [ERROR] at scala.tools.nsc.Driver.process(Driver.scala:51) [ERROR] at scala.tools.nsc.Driver.main(Driver.scala:64) [ERROR] at scala.tools.nsc.Main.main(Main.scala) [ERROR] at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) [ERROR] at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) [ERROR] at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) [ERROR] at java.lang.reflect.Method.invoke(Method.java:498) [ERROR] at scala_maven_executions.MainHelper.runMain(MainHelper.java:164) [ERROR] at scala_maven_executions.MainWithArgsInFile.main(MainWithArgsInFile.java:26) [INFO] ------------------------------------------------------------------------ [INFO] BUILD FAILURE [INFO] 
------------------------------------------------------------------------ [INFO] Total time: 5.117 s [INFO] Finished at: 2019-07-05T23:42:28+08:00 [INFO] Final Memory: 26M/698M [INFO] ------------------------------------------------------------------------ [ERROR] Failed to execute goal net.alchim31.maven:scala-maven-plugin:3.2.2:compile (default) on project
    sparkCore: wrap: org.apache.commons.exec.ExecuteException: Process exited with an error: 1 (Exit value: 1) -> [Help 1] [ERROR] [ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch. [ERROR] Re-run Maven using the -X switch to enable full debug logging. [ERROR] [ERROR] For more information about the errors and possible solutions, please read the following articles: [ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException Process finished with exit code 1

     Solution:

    The headline error "object java.lang.Object in compiler mirror not found" is misleading; the real problem is the line right before it: "error while loading <root>, Error accessing C:\Users\67001\.m2\repository\...". The Scala compiler aborts as soon as one jar on its classpath cannot be read, and a jar in the local Maven repository that cannot be accessed is almost always corrupt (typically from an interrupted download). Delete the corrupted artifact's directory from the local repository and rebuild so Maven downloads it again. (The Scala-version warnings about chill_2.11 are unrelated: 2.11.x releases are binary compatible.)
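    A minimal sketch of that cleanup, using the local-repository paths shown in the two error messages (run from a Windows command prompt; substitute whichever jar your own build reports):

    :: Remove the directories holding the corrupted jars named in the compiler errors.
    rd /s /q "C:\Users\67001\.m2\repository\org\codehaus\jackson\jackson-core-asl\1.9.13"
    rd /s /q "C:\Users\67001\.m2\repository\commons-configuration\commons-configuration\1.6"
    :: Rebuild; Maven re-downloads any artifact missing from the local repository.
    mvn clean package -DskipTests=true

    If further jars turn out to be unreadable on later runs, repeating the same delete-and-rebuild step for each reported path clears them one by one.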

    Error when starting spark-shell:

    ERROR spark.SparkContext: Error initializing SparkContext.
    java.net.ConnectException: Call From hadoop102/192.168.192.102 to hadoop102:9000 failed on connection exception: java.net.ConnectException: Connection refused;
    For more details see: http://wiki.apache.org/hadoop/ConnectionRefused
      at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
      at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
      at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
      at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
      at org.apache.hadoop.net.NetUtils.wrapWithMessage(NetUtils.java:792)
      at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:732)
      at org.apache.hadoop.ipc.Client.call(Client.java:1479)
      at org.apache.hadoop.ipc.Client.call(Client.java:1412)
      at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229)
      at com.sun.proxy.$Proxy15.getFileInfo(Unknown Source)
      at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getFileInfo(ClientNamenodeProtocolTranslatorPB.java:771)
      at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
      at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
      at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
      at java.lang.reflect.Method.invoke(Method.java:498)
      at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:191)
      at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
      at com.sun.proxy.$Proxy16.getFileInfo(Unknown Source)
      at org.apache.hadoop.hdfs.DFSClient.getFileInfo(DFSClient.java:2108)
      at org.apache.hadoop.hdfs.DistributedFileSystem$22.doCall(DistributedFileSystem.java:1305)
      at org.apache.hadoop.hdfs.DistributedFileSystem$22.doCall(DistributedFileSystem.java:1301)
      at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
      at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1317)
      at org.apache.spark.scheduler.EventLoggingListener.start(EventLoggingListener.scala:93)
      at org.apache.spark.SparkContext.<init>(SparkContext.scala:531)
      at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2320)
      at org.apache.spark.sql.SparkSession$Builder$$anonfun$6.apply(SparkSession.scala:868)
      at org.apache.spark.sql.SparkSession$Builder$$anonfun$6.apply(SparkSession.scala:860)
      at scala.Option.getOrElse(Option.scala:121)
      at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:860)
      at org.apache.spark.repl.Main$.createSparkSession(Main.scala:96)
      at $line3.$read$$iw$$iw.<init>(<console>:15)
      at $line3.$read$$iw.<init>(<console>:42)
      at $line3.$read.<init>(<console>:44)
      at $line3.$read$.<init>(<console>:48)
      at $line3.$read$.<clinit>(<console>)
      at $line3.$eval$.$print$lzycompute(<console>:7)
      at $line3.$eval$.$print(<console>:6)
      at $line3.$eval.$print(<console>)
      at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
      at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
      at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
      at java.lang.reflect.Method.invoke(Method.java:498)
      at scala.tools.nsc.interpreter.IMain$ReadEvalPrint.call(IMain.scala:786)
      at scala.tools.nsc.interpreter.IMain$Request.loadAndRun(IMain.scala:1047)
      at scala.tools.nsc.interpreter.IMain$WrappedRequest$$anonfun$loadAndRunReq$1.apply(IMain.scala:638)
      at scala.tools.nsc.interpreter.IMain$WrappedRequest$$anonfun$loadAndRunReq$1.apply(IMain.scala:637)
      at scala.reflect.internal.util.ScalaClassLoader$class.asContext(ScalaClassLoader.scala:31)
      at scala.reflect.internal.util.AbstractFileClassLoader.asContext(AbstractFileClassLoader.scala:19)
      at scala.tools.nsc.interpreter.IMain$WrappedRequest.loadAndRunReq(IMain.scala:637)
      at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:569)
      at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:565)
      at scala.tools.nsc.interpreter.ILoop.interpretStartingWith(ILoop.scala:807)
      at scala.tools.nsc.interpreter.ILoop.command(ILoop.scala:681)
      at scala.tools.nsc.interpreter.ILoop.processLine(ILoop.scala:395)
      at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply$mcV$sp(SparkILoop.scala:38)
      at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply(SparkILoop.scala:37)
      at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply(SparkILoop.scala:37)
      at scala.tools.nsc.interpreter.IMain.beQuietDuring(IMain.scala:214)
      at org.apache.spark.repl.SparkILoop.initializeSpark(SparkILoop.scala:37)
      at org.apache.spark.repl.SparkILoop.loadFiles(SparkILoop.scala:105)
      at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply$mcZ$sp(ILoop.scala:920)
      at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply(ILoop.scala:909)
      at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply(ILoop.scala:909)
      at scala.reflect.internal.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:97)
      at scala.tools.nsc.interpreter.ILoop.process(ILoop.scala:909)
      at org.apache.spark.repl.Main$.doMain(Main.scala:69)
      at org.apache.spark.repl.Main$.main(Main.scala:52)
      at org.apache.spark.repl.Main.main(Main.scala)
      at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
      at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
      at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
      at java.lang.reflect.Method.invoke(Method.java:498)
      at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:743)
      at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:187)
      at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:212)
      at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:126)
      at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
    Caused by: java.net.ConnectException: Connection refused
      at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
      at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:717)
      at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
      at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:531)
      at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:495)
      at org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:614)
      at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:712)
      at org.apache.hadoop.ipc.Client$Connection.access$2900(Client.java:375)
      at org.apache.hadoop.ipc.Client.getConnection(Client.java:1528)
      at org.apache.hadoop.ipc.Client.call(Client.java:1451)
      ... 71 more

    The REPL then rethrows the same exception and, because no SparkSession was created, the startup imports fail too:

    java.net.ConnectException: Call From hadoop102/192.168.192.102 to hadoop102:9000 failed on connection exception: java.net.ConnectException: Connection refused;
    For more details see: http://wiki.apache.org/hadoop/ConnectionRefused
      (same frames as above, down to org.apache.spark.repl.Main$.createSparkSession(Main.scala:96))
      ... 47 elided
    Caused by: java.net.ConnectException: Connection refused
      (same frames as in the first Caused by)
      ... 71 more
    <console>:14: error: not found: value spark
           import spark.implicits._
                  ^
    <console>:14: error: not found: value spark
           import spark.sql
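    The trace shows where the HDFS dependency comes from: SparkContext initialization fails inside EventLoggingListener.start, i.e. spark-shell tries to open the event-log directory on HDFS (the NameNode at hadoop102:9000) before anything else runs. That is the history-server setup mentioned in the solution below; a spark-defaults.conf for it looks roughly like this (a sketch only: the host and port are taken from the error, while the /spark-logs directory name is an assumed placeholder):

    spark.eventLog.enabled           true
    # Event logs go to HDFS, so the NameNode must be up before spark-shell starts.
    # The directory name below is a placeholder; use whatever your cluster was set up with.
    spark.eventLog.dir               hdfs://hadoop102:9000/spark-logs
    spark.history.fs.logDirectory    hdfs://hadoop102:9000/spark-logs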

    Solution:

    Because JobHistoryServer is configured (with event logs written to HDFS), HDFS and YARN must be started before launching spark-shell.
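    A minimal sketch of the startup sequence, assuming a stock Hadoop/Spark installation with HADOOP_HOME and SPARK_HOME set (the scripts named here are the standard ones shipped with Hadoop and Spark):

    # Start HDFS first (the NameNode on hadoop102 must end up listening on port 9000), then YARN.
    $HADOOP_HOME/sbin/start-dfs.sh
    $HADOOP_HOME/sbin/start-yarn.sh
    # Confirm the daemons are up before retrying.
    jps          # expect NameNode, DataNode, ResourceManager, NodeManager, ...
    # Start the Spark history server, then retry the shell.
    $SPARK_HOME/sbin/start-history-server.sh
    spark-shell

    If the shell still fails with the same error, the quickest check is whether hadoop102 is actually listening on port 9000, the fs.defaultFS address the trace is trying to reach.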

  • Original post: https://www.cnblogs.com/LXL616/p/11140975.html