error: not found: value sqlContext / import sqlContext.implicits._ / import sqlContext.sql / Caused by: java.net.ConnectException: Connection refused

    1. Today, when I launched Spark's spark-shell, it failed with the error below. I searched around quite a bit without finding a fix, until it occurred to me that the Hadoop cluster might not be running.

    Oddly, spark-shell used to start fine even without the Hadoop cluster running, yet today it suddenly threw this error.
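
    Before digging through the logs, it is worth confirming whether HDFS is actually up. A quick check might look like the following (a sketch; it assumes the NameNode is expected on slaver1:9000, as this cluster's fs.defaultFS suggests, and that jps and nc are available on the box):

      # list running Hadoop/Spark JVMs; NameNode and DataNode should appear
      jps
      # probe the NameNode RPC port; "refused" here matches the error below
      nc -z slaver1 9000 && echo "namenode reachable" || echo "connection refused"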

      1 [hadoop@slaver1 spark-1.5.1-bin-hadoop2.4]$ bin/spark-shell \
      2 > --master spark://slaver1:7077 \
      3 > --executor-memory 512M \
      4 > --total-executor-cores 2
      5 18/05/24 10:26:36 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
      6 18/05/24 10:26:37 INFO SecurityManager: Changing view acls to: hadoop
      7 18/05/24 10:26:37 INFO SecurityManager: Changing modify acls to: hadoop
      8 18/05/24 10:26:37 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(hadoop); users with modify permissions: Set(hadoop)
      9 18/05/24 10:26:37 INFO HttpServer: Starting HTTP Server
     10 18/05/24 10:26:37 INFO Utils: Successfully started service 'HTTP class server' on port 40833.
     11 Welcome to
     12       ____              __
     13      / __/__  ___ _____/ /__
     14     _\ \/ _ \/ _ `/ __/  '_/
     15    /___/ .__/\_,_/_/ /_/\_\   version 1.5.1
     16       /_/
     17 
     18 Using Scala version 2.10.4 (Java HotSpot(TM) 64-Bit Server VM, Java 1.7.0_79)
     19 Type in expressions to have them evaluated.
     20 Type :help for more information.
     21 18/05/24 10:26:44 INFO SparkContext: Running Spark version 1.5.1
     22 18/05/24 10:26:44 WARN SparkConf: 
     23 SPARK_WORKER_INSTANCES was detected (set to '1').
     24 This is deprecated in Spark 1.0+.
     25 
     26 Please instead use:
     27  - ./spark-submit with --num-executors to specify the number of executors
     28  - Or set SPARK_EXECUTOR_INSTANCES
     29  - spark.executor.instances to configure the number of instances in the spark config.
     30         
     31 18/05/24 10:26:44 INFO SecurityManager: Changing view acls to: hadoop
     32 18/05/24 10:26:44 INFO SecurityManager: Changing modify acls to: hadoop
     33 18/05/24 10:26:44 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(hadoop); users with modify permissions: Set(hadoop)
     34 18/05/24 10:26:45 INFO Slf4jLogger: Slf4jLogger started
     35 18/05/24 10:26:45 INFO Remoting: Starting remoting
     36 18/05/24 10:26:45 INFO Remoting: Remoting started; listening on addresses :[akka.tcp://sparkDriver@192.168.19.131:54496]
     37 18/05/24 10:26:45 INFO Utils: Successfully started service 'sparkDriver' on port 54496.
     38 18/05/24 10:26:45 INFO SparkEnv: Registering MapOutputTracker
     39 18/05/24 10:26:45 INFO SparkEnv: Registering BlockManagerMaster
     40 18/05/24 10:26:45 INFO DiskBlockManager: Created local directory at /tmp/blockmgr-7badf4b8-7ff2-4e0a-acb9-d91542dec428
     41 18/05/24 10:26:45 INFO MemoryStore: MemoryStore started with capacity 534.5 MB
     42 18/05/24 10:26:46 INFO HttpFileServer: HTTP File server directory is /tmp/spark-de58b384-cdc0-4857-821f-138763cf15ba/httpd-1878a206-a95f-42af-b5de-c41394e7aa7e
     43 18/05/24 10:26:46 INFO HttpServer: Starting HTTP Server
     44 18/05/24 10:26:46 INFO Utils: Successfully started service 'HTTP file server' on port 41154.
     45 18/05/24 10:26:46 INFO SparkEnv: Registering OutputCommitCoordinator
     46 18/05/24 10:26:46 INFO Utils: Successfully started service 'SparkUI' on port 4040.
     47 18/05/24 10:26:46 INFO SparkUI: Started SparkUI at http://192.168.19.131:4040
     48 18/05/24 10:26:46 WARN MetricsSystem: Using default name DAGScheduler for source because spark.app.id is not set.
     49 18/05/24 10:26:47 INFO AppClient$ClientEndpoint: Connecting to master spark://slaver1:7077...
     50 18/05/24 10:26:47 INFO SparkDeploySchedulerBackend: Connected to Spark cluster with app ID app-20180524102647-0001
     51 18/05/24 10:26:47 INFO AppClient$ClientEndpoint: Executor added: app-20180524102647-0001/0 on worker-20180524100824-192.168.19.132-7078 (192.168.19.132:7078) with 1 cores
     52 18/05/24 10:26:47 INFO SparkDeploySchedulerBackend: Granted executor ID app-20180524102647-0001/0 on hostPort 192.168.19.132:7078 with 1 cores, 512.0 MB RAM
     53 18/05/24 10:26:47 INFO AppClient$ClientEndpoint: Executor added: app-20180524102647-0001/1 on worker-20180524100822-192.168.19.133-7078 (192.168.19.133:7078) with 1 cores
     54 18/05/24 10:26:47 INFO SparkDeploySchedulerBackend: Granted executor ID app-20180524102647-0001/1 on hostPort 192.168.19.133:7078 with 1 cores, 512.0 MB RAM
     55 18/05/24 10:26:47 INFO AppClient$ClientEndpoint: Executor updated: app-20180524102647-0001/0 is now RUNNING
     56 18/05/24 10:26:47 INFO AppClient$ClientEndpoint: Executor updated: app-20180524102647-0001/1 is now RUNNING
     57 18/05/24 10:26:48 INFO Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 50133.
     58 18/05/24 10:26:48 INFO NettyBlockTransferService: Server created on 50133
     59 18/05/24 10:26:48 INFO BlockManagerMaster: Trying to register BlockManager
     60 18/05/24 10:26:48 INFO BlockManagerMasterEndpoint: Registering block manager 192.168.19.131:50133 with 534.5 MB RAM, BlockManagerId(driver, 192.168.19.131, 50133)
     61 18/05/24 10:26:48 INFO BlockManagerMaster: Registered BlockManager
     62 18/05/24 10:26:48 INFO AppClient$ClientEndpoint: Executor updated: app-20180524102647-0001/1 is now LOADING
     63 18/05/24 10:26:48 INFO AppClient$ClientEndpoint: Executor updated: app-20180524102647-0001/0 is now LOADING
     64 18/05/24 10:26:49 ERROR SparkContext: Error initializing SparkContext.
     65 java.net.ConnectException: Call From slaver1/192.168.19.131 to slaver1:9000 failed on connection exception: java.net.ConnectException: Connection refused; For more details see:  http://wiki.apache.org/hadoop/ConnectionRefused
     66     at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
     67     at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
     68     at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
     69     at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
     70     at org.apache.hadoop.net.NetUtils.wrapWithMessage(NetUtils.java:783)
     71     at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:730)
     72     at org.apache.hadoop.ipc.Client.call(Client.java:1414)
     73     at org.apache.hadoop.ipc.Client.call(Client.java:1363)
     74     at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:206)
     75     at com.sun.proxy.$Proxy14.getFileInfo(Unknown Source)
     76     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
     77     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
     78     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
     79     at java.lang.reflect.Method.invoke(Method.java:606)
     80     at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:190)
     81     at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:103)
     82     at com.sun.proxy.$Proxy14.getFileInfo(Unknown Source)
     83     at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getFileInfo(ClientNamenodeProtocolTranslatorPB.java:699)
     84     at org.apache.hadoop.hdfs.DFSClient.getFileInfo(DFSClient.java:1762)
     85     at org.apache.hadoop.hdfs.DistributedFileSystem$17.doCall(DistributedFileSystem.java:1124)
     86     at org.apache.hadoop.hdfs.DistributedFileSystem$17.doCall(DistributedFileSystem.java:1120)
     87     at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
     88     at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1120)
     89     at org.apache.spark.scheduler.EventLoggingListener.start(EventLoggingListener.scala:100)
     90     at org.apache.spark.SparkContext.<init>(SparkContext.scala:541)
     91     at org.apache.spark.repl.SparkILoop.createSparkContext(SparkILoop.scala:1017)
     92     at $line3.$read$$iwC$$iwC.<init>(<console>:9)
     93     at $line3.$read$$iwC.<init>(<console>:18)
     94     at $line3.$read.<init>(<console>:20)
     95     at $line3.$read$.<init>(<console>:24)
     96     at $line3.$read$.<clinit>(<console>)
     97     at $line3.$eval$.<init>(<console>:7)
     98     at $line3.$eval$.<clinit>(<console>)
     99     at $line3.$eval.$print(<console>)
    100     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    101     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    102     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    103     at java.lang.reflect.Method.invoke(Method.java:606)
    104     at org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:1065)
    105     at org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1340)
    106     at org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:840)
    107     at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:871)
    108     at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:819)
    109     at org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:857)
    110     at org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:902)
    111     at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:814)
    112     at org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark$1.apply(SparkILoopInit.scala:125)
    113     at org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark$1.apply(SparkILoopInit.scala:124)
    114     at org.apache.spark.repl.SparkIMain.beQuietDuring(SparkIMain.scala:324)
    115     at org.apache.spark.repl.SparkILoopInit$class.initializeSpark(SparkILoopInit.scala:124)
    116     at org.apache.spark.repl.SparkILoop.initializeSpark(SparkILoop.scala:64)
    117     at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1$$anonfun$apply$mcZ$sp$5.apply$mcV$sp(SparkILoop.scala:974)
    118     at org.apache.spark.repl.SparkILoopInit$class.runThunks(SparkILoopInit.scala:159)
    119     at org.apache.spark.repl.SparkILoop.runThunks(SparkILoop.scala:64)
    120     at org.apache.spark.repl.SparkILoopInit$class.postInitialization(SparkILoopInit.scala:108)
    121     at org.apache.spark.repl.SparkILoop.postInitialization(SparkILoop.scala:64)
    122     at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply$mcZ$sp(SparkILoop.scala:991)
    123     at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
    124     at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
    125     at scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)
    126     at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$process(SparkILoop.scala:945)
    127     at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:1059)
    128     at org.apache.spark.repl.Main$.main(Main.scala:31)
    129     at org.apache.spark.repl.Main.main(Main.scala)
    130     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    131     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    132     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    133     at java.lang.reflect.Method.invoke(Method.java:606)
    134     at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:672)
    135     at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:180)
    136     at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:205)
    137     at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:120)
    138     at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
    139 Caused by: java.net.ConnectException: Connection refused
    140     at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
    141     at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:739)
    142     at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
    143     at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:529)
    144     at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:493)
    145     at org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:604)
    146     at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:699)
    147     at org.apache.hadoop.ipc.Client$Connection.access$2800(Client.java:367)
    148     at org.apache.hadoop.ipc.Client.getConnection(Client.java:1462)
    149     at org.apache.hadoop.ipc.Client.call(Client.java:1381)
    150     ... 66 more
    151 18/05/24 10:26:49 INFO SparkUI: Stopped Spark web UI at http://192.168.19.131:4040
    152 18/05/24 10:26:49 INFO DAGScheduler: Stopping DAGScheduler
    153 18/05/24 10:26:49 INFO SparkDeploySchedulerBackend: Shutting down all executors
    154 18/05/24 10:26:49 INFO SparkDeploySchedulerBackend: Asking each executor to shut down
    155 18/05/24 10:26:49 INFO MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
    156 18/05/24 10:26:49 INFO MemoryStore: MemoryStore cleared
    157 18/05/24 10:26:49 INFO BlockManager: BlockManager stopped
    158 18/05/24 10:26:49 INFO BlockManagerMaster: BlockManagerMaster stopped
    159 18/05/24 10:26:49 INFO OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
    160 18/05/24 10:26:49 INFO SparkContext: Successfully stopped SparkContext
    161 18/05/24 10:26:49 INFO RemoteActorRefProvider$RemotingTerminator: Shutting down remote daemon.
    162 18/05/24 10:26:49 INFO RemoteActorRefProvider$RemotingTerminator: Remote daemon shut down; proceeding with flushing remote transports.
    163 18/05/24 10:26:50 INFO RemoteActorRefProvider$RemotingTerminator: Remoting shut down.
    164 java.net.ConnectException: Call From slaver1/192.168.19.131 to slaver1:9000 failed on connection exception: java.net.ConnectException: Connection refused; For more details see:  http://wiki.apache.org/hadoop/ConnectionRefused
    165     at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    166     at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
    167     at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    168     at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
    169     at org.apache.hadoop.net.NetUtils.wrapWithMessage(NetUtils.java:783)
    170     at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:730)
    171     at org.apache.hadoop.ipc.Client.call(Client.java:1414)
    172     at org.apache.hadoop.ipc.Client.call(Client.java:1363)
    173     at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:206)
    174     at com.sun.proxy.$Proxy14.getFileInfo(Unknown Source)
    175     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    176     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    177     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    178     at java.lang.reflect.Method.invoke(Method.java:606)
    179     at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:190)
    180     at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:103)
    181     at com.sun.proxy.$Proxy14.getFileInfo(Unknown Source)
    182     at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getFileInfo(ClientNamenodeProtocolTranslatorPB.java:699)
    183     at org.apache.hadoop.hdfs.DFSClient.getFileInfo(DFSClient.java:1762)
    184     at org.apache.hadoop.hdfs.DistributedFileSystem$17.doCall(DistributedFileSystem.java:1124)
    185     at org.apache.hadoop.hdfs.DistributedFileSystem$17.doCall(DistributedFileSystem.java:1120)
    186     at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
    187     at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1120)
    188     at org.apache.spark.scheduler.EventLoggingListener.start(EventLoggingListener.scala:100)
    189     at org.apache.spark.SparkContext.<init>(SparkContext.scala:541)
    190     at org.apache.spark.repl.SparkILoop.createSparkContext(SparkILoop.scala:1017)
    191     at $iwC$$iwC.<init>(<console>:9)
    192     at $iwC.<init>(<console>:18)
    193     at <init>(<console>:20)
    194     at .<init>(<console>:24)
    195     at .<clinit>(<console>)
    196     at .<init>(<console>:7)
    197     at .<clinit>(<console>)
    198     at $print(<console>)
    199     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    200     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    201     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    202     at java.lang.reflect.Method.invoke(Method.java:606)
    203     at org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:1065)
    204     at org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1340)
    205     at org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:840)
    206     at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:871)
    207     at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:819)
    208     at org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:857)
    209     at org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:902)
    210     at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:814)
    211     at org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark$1.apply(SparkILoopInit.scala:125)
    212     at org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark$1.apply(SparkILoopInit.scala:124)
    213     at org.apache.spark.repl.SparkIMain.beQuietDuring(SparkIMain.scala:324)
    214     at org.apache.spark.repl.SparkILoopInit$class.initializeSpark(SparkILoopInit.scala:124)
    215     at org.apache.spark.repl.SparkILoop.initializeSpark(SparkILoop.scala:64)
    216     at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1$$anonfun$apply$mcZ$sp$5.apply$mcV$sp(SparkILoop.scala:974)
    217     at org.apache.spark.repl.SparkILoopInit$class.runThunks(SparkILoopInit.scala:159)
    218     at org.apache.spark.repl.SparkILoop.runThunks(SparkILoop.scala:64)
    219     at org.apache.spark.repl.SparkILoopInit$class.postInitialization(SparkILoopInit.scala:108)
    220     at org.apache.spark.repl.SparkILoop.postInitialization(SparkILoop.scala:64)
    221     at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply$mcZ$sp(SparkILoop.scala:991)
    222     at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
    223     at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
    224     at scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)
    225     at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$process(SparkILoop.scala:945)
    226     at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:1059)
    227     at org.apache.spark.repl.Main$.main(Main.scala:31)
    228     at org.apache.spark.repl.Main.main(Main.scala)
    229     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    230     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    231     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    232     at java.lang.reflect.Method.invoke(Method.java:606)
    233     at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:672)
    234     at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:180)
    235     at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:205)
    236     at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:120)
    237     at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
    238 Caused by: java.net.ConnectException: Connection refused
    239     at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
    240     at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:739)
    241     at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
    242     at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:529)
    243     at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:493)
    244     at org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:604)
    245     at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:699)
    246     at org.apache.hadoop.ipc.Client$Connection.access$2800(Client.java:367)
    247     at org.apache.hadoop.ipc.Client.getConnection(Client.java:1462)
    248     at org.apache.hadoop.ipc.Client.call(Client.java:1381)
    249     ... 66 more
    250 
    251 java.lang.NullPointerException
    252     at org.apache.spark.sql.execution.ui.SQLListener.<init>(SQLListener.scala:34)
    253     at org.apache.spark.sql.SQLContext.<init>(SQLContext.scala:77)
    254     at org.apache.spark.sql.hive.HiveContext.<init>(HiveContext.scala:72)
    255     at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    256     at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
    257     at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    258     at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
    259     at org.apache.spark.repl.SparkILoop.createSQLContext(SparkILoop.scala:1028)
    260     at $iwC$$iwC.<init>(<console>:9)
    261     at $iwC.<init>(<console>:18)
    262     at <init>(<console>:20)
    263     at .<init>(<console>:24)
    264     at .<clinit>(<console>)
    265     at .<init>(<console>:7)
    266     at .<clinit>(<console>)
    267     at $print(<console>)
    268     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    269     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    270     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    271     at java.lang.reflect.Method.invoke(Method.java:606)
    272     at org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:1065)
    273     at org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1340)
    274     at org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:840)
    275     at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:871)
    276     at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:819)
    277     at org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:857)
    278     at org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:902)
    279     at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:814)
    280     at org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark$1.apply(SparkILoopInit.scala:132)
    281     at org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark$1.apply(SparkILoopInit.scala:124)
    282     at org.apache.spark.repl.SparkIMain.beQuietDuring(SparkIMain.scala:324)
    283     at org.apache.spark.repl.SparkILoopInit$class.initializeSpark(SparkILoopInit.scala:124)
    284     at org.apache.spark.repl.SparkILoop.initializeSpark(SparkILoop.scala:64)
    285     at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1$$anonfun$apply$mcZ$sp$5.apply$mcV$sp(SparkILoop.scala:974)
    286     at org.apache.spark.repl.SparkILoopInit$class.runThunks(SparkILoopInit.scala:159)
    287     at org.apache.spark.repl.SparkILoop.runThunks(SparkILoop.scala:64)
    288     at org.apache.spark.repl.SparkILoopInit$class.postInitialization(SparkILoopInit.scala:108)
    289     at org.apache.spark.repl.SparkILoop.postInitialization(SparkILoop.scala:64)
    290     at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply$mcZ$sp(SparkILoop.scala:991)
    291     at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
    292     at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
    293     at scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)
    294     at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$process(SparkILoop.scala:945)
    295     at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:1059)
    296     at org.apache.spark.repl.Main$.main(Main.scala:31)
    297     at org.apache.spark.repl.Main.main(Main.scala)
    298     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    299     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    300     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    301     at java.lang.reflect.Method.invoke(Method.java:606)
    302     at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:672)
    303     at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:180)
    304     at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:205)
    305     at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:120)
    306     at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
    307 
    308 <console>:10: error: not found: value sqlContext
    309        import sqlContext.implicits._
    310               ^
    311 <console>:10: error: not found: value sqlContext
    312        import sqlContext.sql
    313               ^
    314 
    315 scala> 
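
    Why a missing Hadoop cluster kills spark-shell here: the stack trace shows SparkContext failing inside EventLoggingListener.start, which calls DistributedFileSystem.getFileStatus. In other words, Spark's event logging is enabled and its log directory lives on HDFS at slaver1:9000, so with the NameNode down the connection is refused and SparkContext never comes up. The subsequent SQLContext/HiveContext construction then dies with the NullPointerException above, and the REPL's automatic "import sqlContext.implicits._" reports "not found: value sqlContext". The import errors are only a symptom; the Connection refused is the cause.

    The configuration that creates this HDFS dependency presumably looks something like this in conf/spark-defaults.conf (a guess at this cluster's setup; the exact directory path is hypothetical):

      spark.eventLog.enabled   true
      spark.eventLog.dir       hdfs://slaver1:9000/spark-logs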

    2. Then, after starting the Hadoop cluster (the start commands are sketched below), spark-shell came up cleanly, as shown in the listing that follows:
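
    Bringing the cluster up amounts to something like this (a sketch, assuming the stock Hadoop 2.x sbin scripts and that HADOOP_HOME points at this install; YARN is not needed for Spark standalone, so start-dfs.sh alone should suffice):

      # start HDFS: the NameNode on slaver1:9000 plus the DataNodes
      $HADOOP_HOME/sbin/start-dfs.sh
      # confirm the daemons are up before retrying spark-shell
      jps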

      1 [hadoop@slaver1 spark-1.5.1-bin-hadoop2.4]$ bin/spark-shell --master spark://slaver1:7077 --executor-memory 512M
      2 18/05/24 11:04:27 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
      3 18/05/24 11:04:27 INFO SecurityManager: Changing view acls to: hadoop
      4 18/05/24 11:04:27 INFO SecurityManager: Changing modify acls to: hadoop
      5 18/05/24 11:04:27 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(hadoop); users with modify permissions: Set(hadoop)
      6 18/05/24 11:04:28 INFO HttpServer: Starting HTTP Server
      7 18/05/24 11:04:28 INFO Utils: Successfully started service 'HTTP class server' on port 49309.
      8 Welcome to
      9       ____              __
     10      / __/__  ___ _____/ /__
     11     _\ \/ _ \/ _ `/ __/  '_/
     12    /___/ .__/\_,_/_/ /_/\_\   version 1.5.1
     13       /_/
     14 
     15 Using Scala version 2.10.4 (Java HotSpot(TM) 64-Bit Server VM, Java 1.7.0_79)
     16 Type in expressions to have them evaluated.
     17 Type :help for more information.
     18 18/05/24 11:04:45 INFO SparkContext: Running Spark version 1.5.1
     19 18/05/24 11:04:45 WARN SparkConf: 
     20 SPARK_WORKER_INSTANCES was detected (set to '1').
     21 This is deprecated in Spark 1.0+.
     22 
     23 Please instead use:
     24  - ./spark-submit with --num-executors to specify the number of executors
     25  - Or set SPARK_EXECUTOR_INSTANCES
     26  - spark.executor.instances to configure the number of instances in the spark config.
     27         
     28 18/05/24 11:04:45 INFO SecurityManager: Changing view acls to: hadoop
     29 18/05/24 11:04:45 INFO SecurityManager: Changing modify acls to: hadoop
     30 18/05/24 11:04:45 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(hadoop); users with modify permissions: Set(hadoop)
     31 18/05/24 11:04:47 INFO Slf4jLogger: Slf4jLogger started
     32 18/05/24 11:04:47 INFO Remoting: Starting remoting
     33 18/05/24 11:04:48 INFO Remoting: Remoting started; listening on addresses :[akka.tcp://sparkDriver@192.168.19.131:42322]
     34 18/05/24 11:04:48 INFO Utils: Successfully started service 'sparkDriver' on port 42322.
     35 18/05/24 11:04:48 INFO SparkEnv: Registering MapOutputTracker
     36 18/05/24 11:04:48 INFO SparkEnv: Registering BlockManagerMaster
     37 18/05/24 11:04:48 INFO DiskBlockManager: Created local directory at /tmp/blockmgr-d2a42fd5-466c-4cfc-88bf-8e8e44a14268
     38 18/05/24 11:04:48 INFO MemoryStore: MemoryStore started with capacity 534.5 MB
     39 18/05/24 11:04:49 INFO HttpFileServer: HTTP File server directory is /tmp/spark-2d383016-d4d2-43d2-984b-198f36b5241d/httpd-9d7feaf4-6227-4ba3-8d61-991dbbc30d27
     40 18/05/24 11:04:49 INFO HttpServer: Starting HTTP Server
     41 18/05/24 11:04:49 INFO Utils: Successfully started service 'HTTP file server' on port 48428.
     42 18/05/24 11:04:49 INFO SparkEnv: Registering OutputCommitCoordinator
     43 18/05/24 11:04:50 INFO Utils: Successfully started service 'SparkUI' on port 4040.
     44 18/05/24 11:04:50 INFO SparkUI: Started SparkUI at http://192.168.19.131:4040
     45 18/05/24 11:04:51 WARN MetricsSystem: Using default name DAGScheduler for source because spark.app.id is not set.
     46 18/05/24 11:04:52 INFO AppClient$ClientEndpoint: Connecting to master spark://slaver1:7077...
     47 18/05/24 11:04:57 INFO SparkDeploySchedulerBackend: Connected to Spark cluster with app ID app-20180524110456-0001
     48 18/05/24 11:04:57 INFO AppClient$ClientEndpoint: Executor added: app-20180524110456-0001/0 on worker-20180524105217-192.168.19.132-7078 (192.168.19.132:7078) with 1 cores
     49 18/05/24 11:04:57 INFO SparkDeploySchedulerBackend: Granted executor ID app-20180524110456-0001/0 on hostPort 192.168.19.132:7078 with 1 cores, 512.0 MB RAM
     50 18/05/24 11:04:57 INFO AppClient$ClientEndpoint: Executor added: app-20180524110456-0001/1 on worker-20180524105216-192.168.19.133-7078 (192.168.19.133:7078) with 1 cores
     51 18/05/24 11:04:57 INFO SparkDeploySchedulerBackend: Granted executor ID app-20180524110456-0001/1 on hostPort 192.168.19.133:7078 with 1 cores, 512.0 MB RAM
     52 18/05/24 11:05:02 INFO AppClient$ClientEndpoint: Executor updated: app-20180524110456-0001/0 is now LOADING
     53 18/05/24 11:05:02 INFO AppClient$ClientEndpoint: Executor updated: app-20180524110456-0001/1 is now LOADING
     54 18/05/24 11:05:02 INFO AppClient$ClientEndpoint: Executor updated: app-20180524110456-0001/0 is now RUNNING
     55 18/05/24 11:05:02 INFO AppClient$ClientEndpoint: Executor updated: app-20180524110456-0001/1 is now RUNNING
     56 18/05/24 11:05:04 INFO Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 54059.
     57 18/05/24 11:05:04 INFO NettyBlockTransferService: Server created on 54059
     58 18/05/24 11:05:04 INFO BlockManagerMaster: Trying to register BlockManager
     59 18/05/24 11:05:05 INFO BlockManagerMasterEndpoint: Registering block manager 192.168.19.131:54059 with 534.5 MB RAM, BlockManagerId(driver, 192.168.19.131, 54059)
     60 18/05/24 11:05:05 INFO BlockManagerMaster: Registered BlockManager
     61 18/05/24 11:05:11 INFO SparkDeploySchedulerBackend: SchedulerBackend is ready for scheduling beginning after reached minRegisteredResourcesRatio: 0.0
     62 18/05/24 11:05:11 INFO SparkILoop: Created spark context..
     63 Spark context available as sc.
     64 18/05/24 11:05:24 INFO SparkDeploySchedulerBackend: Registered executor: AkkaRpcEndpointRef(Actor[akka.tcp://sparkExecutor@192.168.19.133:52946/user/Executor#659827168]) with ID 1
     65 18/05/24 11:05:25 INFO SparkDeploySchedulerBackend: Registered executor: AkkaRpcEndpointRef(Actor[akka.tcp://sparkExecutor@192.168.19.132:32851/user/Executor#2084326958]) with ID 0
     66 18/05/24 11:05:25 INFO HiveContext: Initializing execution hive, version 1.2.1
     67 18/05/24 11:05:26 INFO BlockManagerMasterEndpoint: Registering block manager 192.168.19.133:48833 with 267.3 MB RAM, BlockManagerId(1, 192.168.19.133, 48833)
     68 18/05/24 11:05:26 INFO BlockManagerMasterEndpoint: Registering block manager 192.168.19.132:55208 with 267.3 MB RAM, BlockManagerId(0, 192.168.19.132, 55208)
     69 18/05/24 11:05:27 INFO ClientWrapper: Inspected Hadoop version: 2.4.0
     70 18/05/24 11:05:27 INFO ClientWrapper: Loaded org.apache.hadoop.hive.shims.Hadoop23Shims for Hadoop version 2.4.0
     71 18/05/24 11:05:33 INFO HiveMetaStore: 0: Opening raw store with implemenation class:org.apache.hadoop.hive.metastore.ObjectStore
     72 18/05/24 11:05:33 INFO ObjectStore: ObjectStore, initialize called
     73 18/05/24 11:05:36 INFO Persistence: Property datanucleus.cache.level2 unknown - will be ignored
     74 18/05/24 11:05:36 INFO Persistence: Property hive.metastore.integral.jdo.pushdown unknown - will be ignored
     75 18/05/24 11:05:37 WARN Connection: BoneCP specified but not present in CLASSPATH (or one of dependencies)
     76 18/05/24 11:05:45 WARN Connection: BoneCP specified but not present in CLASSPATH (or one of dependencies)
     77 18/05/24 11:05:50 INFO ObjectStore: Setting MetaStore object pin classes with hive.metastore.cache.pinobjtypes="Table,StorageDescriptor,SerDeInfo,Partition,Database,Type,FieldSchema,Order"
     78 18/05/24 11:05:53 INFO Datastore: The class "org.apache.hadoop.hive.metastore.model.MFieldSchema" is tagged as "embedded-only" so does not have its own datastore table.
     79 18/05/24 11:05:53 INFO Datastore: The class "org.apache.hadoop.hive.metastore.model.MOrder" is tagged as "embedded-only" so does not have its own datastore table.
     80 18/05/24 11:05:56 INFO Datastore: The class "org.apache.hadoop.hive.metastore.model.MFieldSchema" is tagged as "embedded-only" so does not have its own datastore table.
     81 18/05/24 11:05:56 INFO Datastore: The class "org.apache.hadoop.hive.metastore.model.MOrder" is tagged as "embedded-only" so does not have its own datastore table.
     82 18/05/24 11:05:58 INFO MetaStoreDirectSql: Using direct SQL, underlying DB is DERBY
     83 18/05/24 11:05:58 INFO ObjectStore: Initialized ObjectStore
     84 18/05/24 11:05:59 WARN ObjectStore: Version information not found in metastore. hive.metastore.schema.verification is not enabled so recording the schema version 1.2.0
     85 18/05/24 11:05:59 WARN ObjectStore: Failed to get database default, returning NoSuchObjectException
     86 18/05/24 11:06:00 INFO HiveMetaStore: Added admin role in metastore
     87 18/05/24 11:06:00 INFO HiveMetaStore: Added public role in metastore
     88 18/05/24 11:06:00 INFO HiveMetaStore: No user is added in admin role, since config is empty
     89 18/05/24 11:06:01 INFO HiveMetaStore: 0: get_all_databases
     90 18/05/24 11:06:01 INFO audit: ugi=hadoop    ip=unknown-ip-addr    cmd=get_all_databases    
     91 18/05/24 11:06:01 INFO HiveMetaStore: 0: get_functions: db=default pat=*
     92 18/05/24 11:06:01 INFO audit: ugi=hadoop    ip=unknown-ip-addr    cmd=get_functions: db=default pat=*    
     93 18/05/24 11:06:01 INFO Datastore: The class "org.apache.hadoop.hive.metastore.model.MResourceUri" is tagged as "embedded-only" so does not have its own datastore table.
     94 18/05/24 11:06:12 INFO SessionState: Created local directory: /tmp/a2f92e9b-0be3-4c46-9637-f3813cc42f3b_resources
     95 18/05/24 11:06:13 INFO SessionState: Created HDFS directory: /tmp/hive/hadoop/a2f92e9b-0be3-4c46-9637-f3813cc42f3b
     96 18/05/24 11:06:13 INFO SessionState: Created local directory: /tmp/hadoop/a2f92e9b-0be3-4c46-9637-f3813cc42f3b
     97 18/05/24 11:06:13 INFO SessionState: Created HDFS directory: /tmp/hive/hadoop/a2f92e9b-0be3-4c46-9637-f3813cc42f3b/_tmp_space.db
     98 18/05/24 11:06:13 INFO HiveContext: default warehouse location is /user/hive/warehouse
     99 18/05/24 11:06:13 INFO HiveContext: Initializing HiveMetastoreConnection version 1.2.1 using Spark classes.
    100 18/05/24 11:06:14 INFO ClientWrapper: Inspected Hadoop version: 2.4.0
    101 18/05/24 11:06:14 INFO ClientWrapper: Loaded org.apache.hadoop.hive.shims.Hadoop23Shims for Hadoop version 2.4.0
    102 18/05/24 11:06:15 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
    103 18/05/24 11:06:16 INFO HiveMetaStore: 0: Opening raw store with implemenation class:org.apache.hadoop.hive.metastore.ObjectStore
    104 18/05/24 11:06:16 INFO ObjectStore: ObjectStore, initialize called
    105 18/05/24 11:06:17 INFO Persistence: Property datanucleus.cache.level2 unknown - will be ignored
    106 18/05/24 11:06:17 INFO Persistence: Property hive.metastore.integral.jdo.pushdown unknown - will be ignored
    107 18/05/24 11:06:17 WARN Connection: BoneCP specified but not present in CLASSPATH (or one of dependencies)
    108 18/05/24 11:06:18 WARN Connection: BoneCP specified but not present in CLASSPATH (or one of dependencies)
    109 18/05/24 11:06:23 INFO ObjectStore: Setting MetaStore object pin classes with hive.metastore.cache.pinobjtypes="Table,StorageDescriptor,SerDeInfo,Partition,Database,Type,FieldSchema,Order"
    110 18/05/24 11:06:25 INFO Datastore: The class "org.apache.hadoop.hive.metastore.model.MFieldSchema" is tagged as "embedded-only" so does not have its own datastore table.
    111 18/05/24 11:06:25 INFO Datastore: The class "org.apache.hadoop.hive.metastore.model.MOrder" is tagged as "embedded-only" so does not have its own datastore table.
    112 18/05/24 11:06:26 INFO Datastore: The class "org.apache.hadoop.hive.metastore.model.MFieldSchema" is tagged as "embedded-only" so does not have its own datastore table.
    113 18/05/24 11:06:26 INFO Datastore: The class "org.apache.hadoop.hive.metastore.model.MOrder" is tagged as "embedded-only" so does not have its own datastore table.
    114 18/05/24 11:06:26 INFO Query: Reading in results for query "org.datanucleus.store.rdbms.query.SQLQuery@0" since the connection used is closing
    115 18/05/24 11:06:26 INFO MetaStoreDirectSql: Using direct SQL, underlying DB is DERBY
    116 18/05/24 11:06:26 INFO ObjectStore: Initialized ObjectStore
    117 18/05/24 11:06:27 INFO HiveMetaStore: Added admin role in metastore
    118 18/05/24 11:06:27 INFO HiveMetaStore: Added public role in metastore
    119 18/05/24 11:06:27 INFO HiveMetaStore: No user is added in admin role, since config is empty
    120 18/05/24 11:06:28 INFO HiveMetaStore: 0: get_all_databases
    121 18/05/24 11:06:28 INFO audit: ugi=hadoop    ip=unknown-ip-addr    cmd=get_all_databases    
    122 18/05/24 11:06:28 INFO HiveMetaStore: 0: get_functions: db=default pat=*
    123 18/05/24 11:06:28 INFO audit: ugi=hadoop    ip=unknown-ip-addr    cmd=get_functions: db=default pat=*    
    124 18/05/24 11:06:28 INFO Datastore: The class "org.apache.hadoop.hive.metastore.model.MResourceUri" is tagged as "embedded-only" so does not have its own datastore table.
    125 18/05/24 11:06:31 INFO SessionState: Created local directory: /tmp/9203a30a-f597-4e60-bdcc-a546b37e9066_resources
    126 18/05/24 11:06:31 INFO SessionState: Created HDFS directory: /tmp/hive/hadoop/9203a30a-f597-4e60-bdcc-a546b37e9066
    127 18/05/24 11:06:31 INFO SessionState: Created local directory: /tmp/hadoop/9203a30a-f597-4e60-bdcc-a546b37e9066
    128 18/05/24 11:06:31 INFO SessionState: Created HDFS directory: /tmp/hive/hadoop/9203a30a-f597-4e60-bdcc-a546b37e9066/_tmp_space.db
    129 18/05/24 11:06:31 INFO SparkILoop: Created sql context (with Hive support)..
    130 SQL context available as sqlContext.
    131 
    132 scala> 
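
    With sqlContext now available, the imports that failed earlier succeed. A minimal smoke test at the prompt (the toDF line is illustrative, not from the original session; executor INFO logs are omitted):

      scala> import sqlContext.implicits._
      import sqlContext.implicits._

      scala> import sqlContext.sql
      import sqlContext.sql

      scala> sc.parallelize(Seq(1, 2, 3)).toDF("n").count()
      res0: Long = 3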