  • Spark Notes: Debugging Spark Jobs

    Running a Spark program against a remote cluster directly from the local IDE

    There are generally two ways to run a Spark job:

    • Local debugging: set the master to local mode and run the job in-process. This is mainly for debugging and needs no connection to a remote cluster.

    • Cluster execution: once local debugging passes, package the job into a jar and submit it with spark-submit. Production environments generally use this mode.
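    The cluster path above can be sketched as a spark-submit call. This is only an illustration: the master URL matches the post's example cluster, while the jar location and flags are assumptions to adjust for your own project.

```shell
# Compose a hypothetical cluster submission (echoed rather than executed,
# since spark-submit and the cluster exist only on a real setup).
MASTER=spark://192.168.66.66:7077   # the post's example master URL
APP_JAR=/opt/jobs/scala.jar         # hypothetical location of the built jar
echo "spark-submit --master $MASTER --class sparkPi --executor-memory 1g $APP_JAR"
```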

    Procedure

    1. Set the master

    Two ways to do it:

    • Set it in code:
    SparkConf conf = new SparkConf()
                    .setAppName("helloworld")
                    .setMaster("spark://192.168.66.66:7077");
    • Set it in the run configuration
    Add to VM options:
    -Dspark.master="spark://192.168.66.66:7077"

    2. Configure HDFS

    If the program references HDFS paths, you may hit a "wrong filesystem" error because the local driver defaults to the local filesystem instead of hdfs. Copy core-site.xml and hdfs-site.xml from the cluster's Hadoop configuration into the project's src/main/resources directory.
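    As a sketch, that copy can be scripted; the cluster login and Hadoop config directory below are assumptions, so adjust them for your environment (the commands are echoed, not executed):

```shell
CLUSTER=user@192.168.66.66      # hypothetical login for a cluster node
CONF_DIR=/etc/hadoop/conf       # hypothetical Hadoop config directory
# Pull the two client config files into the project's resources folder.
for f in core-site.xml hdfs-site.xml; do
  echo scp "$CLUSTER:$CONF_DIR/$f" src/main/resources/
done
```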

    3. Ship the jars

    If the program uses custom operators or depends on external jars, copy the project jar and its dependency jars to the SPARK_HOME/jars directory on the cluster (maven-assembly can build a single jar with dependencies). Spark's jars directory effectively serves as its library repository, much like a local Maven repo.

    Note that the jars must be placed in the jars directory on every node of the cluster.
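    The per-node copy can be sketched as a loop; the node list, user, and SPARK_HOME below are assumptions, and the commands are echoed rather than run:

```shell
NODES="192.168.66.66 192.168.66.67"   # hypothetical worker nodes
SPARK_HOME=/opt/spark                 # hypothetical Spark install path
# Every node needs its own copy of the job jar under SPARK_HOME/jars.
for node in $NODES; do
  echo scp out/scala.jar "user@$node:$SPARK_HOME/jars/"
done
```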

    Possible problems

    If you run into inter-node communication errors, the jars may not have been placed correctly on every node.

    Errors such as "incompatible ... loaded" mean the Spark version in your dependencies does not match the cluster; adjust the dependency versions in pom.xml.
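    A quick sanity check before editing the dependency: extract the Spark version pinned in the POM and compare it by eye with the "Running Spark version ..." line the driver logs at startup. A fragment of the dependency section is inlined here so the sketch is self-contained.

```shell
# Write a fragment of the POM's dependency section to a temp file.
cat > /tmp/pom-fragment.xml <<'EOF'
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.10</artifactId>
    <version>1.6.3</version>
</dependency>
EOF
# Pull out the pinned version; it must match the cluster's Spark build.
sed -n 's:.*<version>\(.*\)</version>.*:\1:p' /tmp/pom-fragment.xml
```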

    With all this in place, the job can be run directly from the IDE.

    Hands-on walkthrough

    1. Install JDK 1.8, IntelliJ IDEA 2017 and Maven 3; install the Scala plugin in IDEA.

    2. Create a new Maven project for Scala.

    3. Configure pom.xml and let Maven download the dependencies:

    <?xml version="1.0" encoding="UTF-8"?>
    <!--
      Licensed to the Apache Software Foundation (ASF) under one
      or more contributor license agreements.  See the NOTICE file
      distributed with this work for additional information
      regarding copyright ownership.  The ASF licenses this file
      to you under the Apache License, Version 2.0 (the
      "License"); you may not use this file except in compliance
      with the License.  You may obtain a copy of the License at
    
       http://www.apache.org/licenses/LICENSE-2.0
    
      Unless required by applicable law or agreed to in writing,
      software distributed under the License is distributed on an
      "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
      KIND, either express or implied.  See the License for the
      specific language governing permissions and limitations
      under the License.
    -->
    <!-- $Id: pom.xml 642118 2008-03-28 08:04:16Z reinhard $ -->
    <project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/maven-v4_0_0.xsd">
    
        <modelVersion>4.0.0</modelVersion>
        <packaging>jar</packaging>
    
        <name>scala</name>
        <groupId>scala</groupId>
        <artifactId>scala</artifactId>
        <version>1.0-SNAPSHOT</version>
    
        <build>
            <plugins>
                <plugin>
                    <groupId>org.mortbay.jetty</groupId>
                    <artifactId>maven-jetty-plugin</artifactId>
                    <version>6.1.7</version>
                    <configuration>
                        <connectors>
                            <connector implementation="org.mortbay.jetty.nio.SelectChannelConnector">
                                <port>8888</port>
                                <maxIdleTime>30000</maxIdleTime>
                            </connector>
                        </connectors>
                        <webAppSourceDirectory>${project.build.directory}/${pom.artifactId}-${pom.version}</webAppSourceDirectory>
                        <contextPath>/</contextPath>
                    </configuration>
                </plugin>
            </plugins>
        </build>
        <properties>
            <scala.version>2.10.5</scala.version>
            <hadoop.version>2.7.3</hadoop.version>
        </properties>
    
        <repositories>
            <repository>
                <id>scala-tools.org</id>
                <name>Scala-Tools Maven2 Repository</name>
                <url>http://scala-tools.org/repo-releases</url>
            </repository>
        </repositories>
    
        <dependencies>
            <!--dependency>
              <groupId>scala</groupId>
              <artifactId>[the artifact id of the block to be mounted]</artifactId>
              <version>1.0-SNAPSHOT</version>
            </dependency-->
            <dependency>
                <groupId>org.apache.spark</groupId>
                <artifactId>spark-core_2.10</artifactId>
                <version>1.6.3</version>
            </dependency>
            <dependency>
                <groupId>org.apache.spark</groupId>
                <artifactId>spark-sql_2.10</artifactId>
                <version>1.6.3</version>
            </dependency>
            <dependency>
                <groupId>org.apache.spark</groupId>
                <artifactId>spark-streaming_2.10</artifactId>
                <version>1.6.3</version>
            </dependency>
            <dependency>
                <groupId>org.apache.hadoop</groupId>
                <artifactId>hadoop-client</artifactId>
                <version>${hadoop.version}</version>
            </dependency>
            <dependency>
                <groupId>org.apache.hadoop</groupId>
                <artifactId>hadoop-common</artifactId>
                <version>${hadoop.version}</version>
            </dependency>
            <dependency>
                <groupId>org.apache.hadoop</groupId>
                <artifactId>hadoop-hdfs</artifactId>
                <version>${hadoop.version}</version>
            </dependency>
        </dependencies>
    
    </project>

    4. Open File > Project Structure > Modules; under src/main create a new scala directory and mark it as Sources.

    5. Open File > Project Structure > Libraries and add the Scala SDK (2.10.5).

    6. Copy a Hadoop installation from the cluster to the D: drive, download https://github.com/srccodes/hadoop-common-2.2.0-bin (it supplies the Windows native binaries such as winutils.exe), extract it and merge its bin folder into the copy, then set the HADOOP_HOME user environment variable.

    7. Back in the main window, create a new object named sparkPi under src/main/scala:

    import scala.math.random
    import org.apache.spark._
    
    object sparkPi {
      def main(args: Array[String]) {
        // Backslashes in Windows paths must be escaped in string literals.
        System.setProperty("hadoop.home.dir", "D:\\hadoop")
        val conf = new SparkConf()
          .setAppName("Spark Pi")
          .setMaster("spark://192.168.66.66:7077")
          .set("spark.executor.memory", "1g")
          .set("spark.serializer", "org.apache.spark.serializer.KryoSerializer")
          .setJars(Seq("D:\\workspace\\scala\\out\\scala.jar"))
        val spark = new SparkContext(conf)
        val slices = if (args.length > 0) args(0).toInt else 2
        println("Time:" + spark.startTime)
        val n = math.min(1000L * slices, Int.MaxValue).toInt // avoid overflow
        val count = spark.parallelize(1 until n, slices).map { i =>
          val x = random * 2 - 1
          val y = random * 2 - 1
          if (x * x + y * y < 1) 1 else 0
        }.reduce(_ + _)
        println("Pi is roughly " + 4.0 * count / n)
        spark.stop()
      }
    }

    8. Open File > Project Structure > Artifacts; add "JAR > From modules with dependencies", select sparkPi as the main class, set the output path, and remove unneeded jars.

    9. Build > Build Artifacts > scala.jar

    10. Run > Run 'sparkPi'. Console output:

    "C:Program FilesJavajdk1.8.0_121injava" "-javaagent:D:Program Files (x86)JetBrainsIntelliJ IDEA 173.3302.5libidea_rt.jar=53908:D:Program Files (x86)JetBrainsIntelliJ IDEA 173.3302.5in" -Dfile.encoding=UTF-8 -classpath "C:Program FilesJavajdk1.8.0_121jrelibcharsets.jar;C:Program FilesJavajdk1.8.0_121jrelibdeploy.jar;C:Program FilesJavajdk1.8.0_121jrelibextaccess-bridge-64.jar;C:Program FilesJavajdk1.8.0_121jrelibextcldrdata.jar;C:Program FilesJavajdk1.8.0_121jrelibextdnsns.jar;C:Program FilesJavajdk1.8.0_121jrelibextjaccess.jar;C:Program FilesJavajdk1.8.0_121jrelibextjfxrt.jar;C:Program FilesJavajdk1.8.0_121jrelibextlocaledata.jar;C:Program FilesJavajdk1.8.0_121jrelibext
    ashorn.jar;C:Program FilesJavajdk1.8.0_121jrelibextsunec.jar;C:Program FilesJavajdk1.8.0_121jrelibextsunjce_provider.jar;C:Program FilesJavajdk1.8.0_121jrelibextsunmscapi.jar;C:Program FilesJavajdk1.8.0_121jrelibextsunpkcs11.jar;C:Program FilesJavajdk1.8.0_121jrelibextzipfs.jar;C:Program FilesJavajdk1.8.0_121jrelibjavaws.jar;C:Program FilesJavajdk1.8.0_121jrelibjce.jar;C:Program FilesJavajdk1.8.0_121jrelibjfr.jar;C:Program FilesJavajdk1.8.0_121jrelibjfxswt.jar;C:Program FilesJavajdk1.8.0_121jrelibjsse.jar;C:Program FilesJavajdk1.8.0_121jrelibmanagement-agent.jar;C:Program FilesJavajdk1.8.0_121jrelibplugin.jar;C:Program FilesJavajdk1.8.0_121jrelib
    esources.jar;C:Program FilesJavajdk1.8.0_121jrelib
    t.jar;D:workspacescala	argetclasses;C:Usersxinfang.m2
    epositoryorgscala-langscala-library2.10.5scala-library-2.10.5.jar;C:Usersxinfang.m2
    epositoryorgscala-langscala-reflect2.10.5scala-reflect-2.10.5.jar;C:Usersxinfang.m2
    epositoryorgapachesparkspark-core_2.101.6.3spark-core_2.10-1.6.3.jar;C:Usersxinfang.m2
    epositoryorgapacheavroavro-mapred1.7.7avro-mapred-1.7.7-hadoop2.jar;C:Usersxinfang.m2
    epositoryorgapacheavroavro-ipc1.7.7avro-ipc-1.7.7.jar;C:Usersxinfang.m2
    epositoryorgapacheavroavro-ipc1.7.7avro-ipc-1.7.7-tests.jar;C:Usersxinfang.m2
    epositorycom	witterchill_2.10.5.0chill_2.10-0.5.0.jar;C:Usersxinfang.m2
    epositorycomesotericsoftwarekryokryo2.21kryo-2.21.jar;C:Usersxinfang.m2
    epositorycomesotericsoftware
    eflectasm
    eflectasm1.07
    eflectasm-1.07-shaded.jar;C:Usersxinfang.m2
    epositorycomesotericsoftwareminlogminlog1.2minlog-1.2.jar;C:Usersxinfang.m2
    epositoryorgobjenesisobjenesis1.2objenesis-1.2.jar;C:Usersxinfang.m2
    epositorycom	witterchill-java.5.0chill-java-0.5.0.jar;C:Usersxinfang.m2
    epositoryorgapachexbeanxbean-asm5-shaded4.4xbean-asm5-shaded-4.4.jar;C:Usersxinfang.m2
    epositoryorgapachesparkspark-launcher_2.101.6.3spark-launcher_2.10-1.6.3.jar;C:Usersxinfang.m2
    epositoryorgapachesparkspark-network-common_2.101.6.3spark-network-common_2.10-1.6.3.jar;C:Usersxinfang.m2
    epositoryorgapachesparkspark-network-shuffle_2.101.6.3spark-network-shuffle_2.10-1.6.3.jar;C:Usersxinfang.m2
    epositorycomfasterxmljacksoncorejackson-annotations2.4.4jackson-annotations-2.4.4.jar;C:Usersxinfang.m2
    epositoryorgapachesparkspark-unsafe_2.101.6.3spark-unsafe_2.10-1.6.3.jar;C:Usersxinfang.m2
    epository
    etjavadevjets3tjets3t.7.1jets3t-0.7.1.jar;C:Usersxinfang.m2
    epositoryorgapachecuratorcurator-recipes2.4.0curator-recipes-2.4.0.jar;C:Usersxinfang.m2
    epositoryorgapachecuratorcurator-framework2.4.0curator-framework-2.4.0.jar;C:Usersxinfang.m2
    epositoryorgeclipsejettyorbitjavax.servlet3.0.0.v201112011016javax.servlet-3.0.0.v201112011016.jar;C:Usersxinfang.m2
    epositoryorgapachecommonscommons-lang33.3.2commons-lang3-3.3.2.jar;C:Usersxinfang.m2
    epositoryorgapachecommonscommons-math33.4.1commons-math3-3.4.1.jar;C:Usersxinfang.m2
    epositorycomgooglecodefindbugsjsr3051.3.9jsr305-1.3.9.jar;C:Usersxinfang.m2
    epositoryorgslf4jslf4j-api1.7.10slf4j-api-1.7.10.jar;C:Usersxinfang.m2
    epositoryorgslf4jjul-to-slf4j1.7.10jul-to-slf4j-1.7.10.jar;C:Usersxinfang.m2
    epositoryorgslf4jjcl-over-slf4j1.7.10jcl-over-slf4j-1.7.10.jar;C:Usersxinfang.m2
    epositorylog4jlog4j1.2.17log4j-1.2.17.jar;C:Usersxinfang.m2
    epositoryorgslf4jslf4j-log4j121.7.10slf4j-log4j12-1.7.10.jar;C:Usersxinfang.m2
    epositorycom
    ingcompress-lzf1.0.3compress-lzf-1.0.3.jar;C:Usersxinfang.m2
    epositoryorgxerialsnappysnappy-java1.1.2.6snappy-java-1.1.2.6.jar;C:Usersxinfang.m2
    epository
    etjpountzlz4lz41.3.0lz4-1.3.0.jar;C:Usersxinfang.m2
    epositoryorg
    oaringbitmapRoaringBitmap.5.11RoaringBitmap-0.5.11.jar;C:Usersxinfang.m2
    epositorycommons-netcommons-net2.2commons-net-2.2.jar;C:Usersxinfang.m2
    epositorycom	ypesafeakkaakka-remote_2.102.3.11akka-remote_2.10-2.3.11.jar;C:Usersxinfang.m2
    epositorycom	ypesafeakkaakka-actor_2.102.3.11akka-actor_2.10-2.3.11.jar;C:Usersxinfang.m2
    epositorycom	ypesafeconfig1.2.1config-1.2.1.jar;C:Usersxinfang.m2
    epositoryorguncommonsmathsuncommons-maths1.2.2auncommons-maths-1.2.2a.jar;C:Usersxinfang.m2
    epositorycom	ypesafeakkaakka-slf4j_2.102.3.11akka-slf4j_2.10-2.3.11.jar;C:Usersxinfang.m2
    epositoryorgjson4sjson4s-jackson_2.103.2.10json4s-jackson_2.10-3.2.10.jar;C:Usersxinfang.m2
    epositoryorgjson4sjson4s-core_2.103.2.10json4s-core_2.10-3.2.10.jar;C:Usersxinfang.m2
    epositoryorgjson4sjson4s-ast_2.103.2.10json4s-ast_2.10-3.2.10.jar;C:Usersxinfang.m2
    epositoryorgscala-langscalap2.10.0scalap-2.10.0.jar;C:Usersxinfang.m2
    epositoryorgscala-langscala-compiler2.10.0scala-compiler-2.10.0.jar;C:Usersxinfang.m2
    epositorycomsunjerseyjersey-server1.9jersey-server-1.9.jar;C:Usersxinfang.m2
    epositoryasmasm3.1asm-3.1.jar;C:Usersxinfang.m2
    epositorycomsunjerseyjersey-core1.9jersey-core-1.9.jar;C:Usersxinfang.m2
    epositoryorgapachemesosmesos.21.1mesos-0.21.1-shaded-protobuf.jar;C:Usersxinfang.m2
    epositoryio
    etty
    etty-all4.0.29.Final
    etty-all-4.0.29.Final.jar;C:Usersxinfang.m2
    epositorycomclearspringanalyticsstream2.7.0stream-2.7.0.jar;C:Usersxinfang.m2
    epositoryiodropwizardmetricsmetrics-core3.1.2metrics-core-3.1.2.jar;C:Usersxinfang.m2
    epositoryiodropwizardmetricsmetrics-jvm3.1.2metrics-jvm-3.1.2.jar;C:Usersxinfang.m2
    epositoryiodropwizardmetricsmetrics-json3.1.2metrics-json-3.1.2.jar;C:Usersxinfang.m2
    epositoryiodropwizardmetricsmetrics-graphite3.1.2metrics-graphite-3.1.2.jar;C:Usersxinfang.m2
    epositorycomfasterxmljacksoncorejackson-databind2.4.4jackson-databind-2.4.4.jar;C:Usersxinfang.m2
    epositorycomfasterxmljacksoncorejackson-core2.4.4jackson-core-2.4.4.jar;C:Usersxinfang.m2
    epositorycomfasterxmljacksonmodulejackson-module-scala_2.102.4.4jackson-module-scala_2.10-2.4.4.jar;C:Usersxinfang.m2
    epositoryorgscala-langscala-reflect2.10.4scala-reflect-2.10.4.jar;C:Usersxinfang.m2
    epositorycom	houghtworksparanamerparanamer2.6paranamer-2.6.jar;C:Usersxinfang.m2
    epositoryorgapacheivyivy2.4.0ivy-2.4.0.jar;C:Usersxinfang.m2
    epositoryorooro2.0.8oro-2.0.8.jar;C:Usersxinfang.m2
    epositoryorg	achyonproject	achyon-client.8.2	achyon-client-0.8.2.jar;C:Usersxinfang.m2
    epositoryorg	achyonproject	achyon-underfs-hdfs.8.2	achyon-underfs-hdfs-0.8.2.jar;C:Usersxinfang.m2
    epositoryorg	achyonproject	achyon-underfs-s3.8.2	achyon-underfs-s3-0.8.2.jar;C:Usersxinfang.m2
    epositoryorg	achyonproject	achyon-underfs-local.8.2	achyon-underfs-local-0.8.2.jar;C:Usersxinfang.m2
    epository
    et
    azorvinepyrolite4.9pyrolite-4.9.jar;C:Usersxinfang.m2
    epository
    etsfpy4jpy4j.9py4j-0.9.jar;C:Usersxinfang.m2
    epositoryorgspark-projectsparkunused1.0.0unused-1.0.0.jar;C:Usersxinfang.m2
    epositoryorgapachesparkspark-sql_2.101.6.3spark-sql_2.10-1.6.3.jar;C:Usersxinfang.m2
    epositoryorgapachesparkspark-catalyst_2.101.6.3spark-catalyst_2.10-1.6.3.jar;C:Usersxinfang.m2
    epositoryorgcodehausjaninojanino2.7.8janino-2.7.8.jar;C:Usersxinfang.m2
    epositoryorgcodehausjaninocommons-compiler2.7.8commons-compiler-2.7.8.jar;C:Usersxinfang.m2
    epositoryorgapacheparquetparquet-column1.7.0parquet-column-1.7.0.jar;C:Usersxinfang.m2
    epositoryorgapacheparquetparquet-common1.7.0parquet-common-1.7.0.jar;C:Usersxinfang.m2
    epositoryorgapacheparquetparquet-encoding1.7.0parquet-encoding-1.7.0.jar;C:Usersxinfang.m2
    epositoryorgapacheparquetparquet-generator1.7.0parquet-generator-1.7.0.jar;C:Usersxinfang.m2
    epositoryorgapacheparquetparquet-hadoop1.7.0parquet-hadoop-1.7.0.jar;C:Usersxinfang.m2
    epositoryorgapacheparquetparquet-format2.3.0-incubatingparquet-format-2.3.0-incubating.jar;C:Usersxinfang.m2
    epositoryorgapacheparquetparquet-jackson1.7.0parquet-jackson-1.7.0.jar;C:Usersxinfang.m2
    epositoryorgapachesparkspark-streaming_2.101.6.3spark-streaming_2.10-1.6.3.jar;C:Usersxinfang.m2
    epositoryorgapachehadoophadoop-client2.7.3hadoop-client-2.7.3.jar;C:Usersxinfang.m2
    epositoryorgapachehadoophadoop-mapreduce-client-app2.7.3hadoop-mapreduce-client-app-2.7.3.jar;C:Usersxinfang.m2
    epositoryorgapachehadoophadoop-mapreduce-client-common2.7.3hadoop-mapreduce-client-common-2.7.3.jar;C:Usersxinfang.m2
    epositoryorgapachehadoophadoop-yarn-client2.7.3hadoop-yarn-client-2.7.3.jar;C:Usersxinfang.m2
    epositoryorgapachehadoophadoop-yarn-server-common2.7.3hadoop-yarn-server-common-2.7.3.jar;C:Usersxinfang.m2
    epositoryorgapachehadoophadoop-mapreduce-client-shuffle2.7.3hadoop-mapreduce-client-shuffle-2.7.3.jar;C:Usersxinfang.m2
    epositoryorgapachehadoophadoop-yarn-api2.7.3hadoop-yarn-api-2.7.3.jar;C:Usersxinfang.m2
    epositoryorgapachehadoophadoop-mapreduce-client-core2.7.3hadoop-mapreduce-client-core-2.7.3.jar;C:Usersxinfang.m2
    epositoryorgapachehadoophadoop-yarn-common2.7.3hadoop-yarn-common-2.7.3.jar;C:Usersxinfang.m2
    epositoryjavaxxmlindjaxb-api2.2.2jaxb-api-2.2.2.jar;C:Usersxinfang.m2
    epositoryjavaxxmlstreamstax-api1.0-2stax-api-1.0-2.jar;C:Usersxinfang.m2
    epositoryjavaxactivationactivation1.1activation-1.1.jar;C:Usersxinfang.m2
    epositorycomsunjerseyjersey-client1.9jersey-client-1.9.jar;C:Usersxinfang.m2
    epositoryorgapachehadoophadoop-mapreduce-client-jobclient2.7.3hadoop-mapreduce-client-jobclient-2.7.3.jar;C:Usersxinfang.m2
    epositoryorgapachehadoophadoop-annotations2.7.3hadoop-annotations-2.7.3.jar;C:Usersxinfang.m2
    epositoryorgapachehadoophadoop-common2.7.3hadoop-common-2.7.3.jar;C:Usersxinfang.m2
    epositorycomgoogleguavaguava11.0.2guava-11.0.2.jar;C:Usersxinfang.m2
    epositorycommons-clicommons-cli1.2commons-cli-1.2.jar;C:Usersxinfang.m2
    epositoryxmlencxmlenc.52xmlenc-0.52.jar;C:Usersxinfang.m2
    epositorycommons-httpclientcommons-httpclient3.1commons-httpclient-3.1.jar;C:Usersxinfang.m2
    epositorycommons-codeccommons-codec1.4commons-codec-1.4.jar;C:Usersxinfang.m2
    epositorycommons-iocommons-io2.4commons-io-2.4.jar;C:Usersxinfang.m2
    epositorycommons-collectionscommons-collections3.2.2commons-collections-3.2.2.jar;C:Usersxinfang.m2
    epositoryjavaxservletservlet-api2.5servlet-api-2.5.jar;C:Usersxinfang.m2
    epositoryorgmortbayjettyjetty6.1.26jetty-6.1.26.jar;C:Usersxinfang.m2
    epositoryorgmortbayjettyjetty-util6.1.26jetty-util-6.1.26.jar;C:Usersxinfang.m2
    epositoryjavaxservletjspjsp-api2.1jsp-api-2.1.jar;C:Usersxinfang.m2
    epositorycomsunjerseyjersey-json1.9jersey-json-1.9.jar;C:Usersxinfang.m2
    epositoryorgcodehausjettisonjettison1.1jettison-1.1.jar;C:Usersxinfang.m2
    epositorycomsunxmlindjaxb-impl2.2.3-1jaxb-impl-2.2.3-1.jar;C:Usersxinfang.m2
    epositoryorgcodehausjacksonjackson-jaxrs1.8.3jackson-jaxrs-1.8.3.jar;C:Usersxinfang.m2
    epositoryorgcodehausjacksonjackson-xc1.8.3jackson-xc-1.8.3.jar;C:Usersxinfang.m2
    epositorycommons-loggingcommons-logging1.1.3commons-logging-1.1.3.jar;C:Usersxinfang.m2
    epositorycommons-langcommons-lang2.6commons-lang-2.6.jar;C:Usersxinfang.m2
    epositorycommons-configurationcommons-configuration1.6commons-configuration-1.6.jar;C:Usersxinfang.m2
    epositorycommons-digestercommons-digester1.8commons-digester-1.8.jar;C:Usersxinfang.m2
    epositorycommons-beanutilscommons-beanutils1.7.0commons-beanutils-1.7.0.jar;C:Usersxinfang.m2
    epositorycommons-beanutilscommons-beanutils-core1.8.0commons-beanutils-core-1.8.0.jar;C:Usersxinfang.m2
    epositoryorgcodehausjacksonjackson-core-asl1.9.13jackson-core-asl-1.9.13.jar;C:Usersxinfang.m2
    epositoryorgcodehausjacksonjackson-mapper-asl1.9.13jackson-mapper-asl-1.9.13.jar;C:Usersxinfang.m2
    epositoryorgapacheavroavro1.7.4avro-1.7.4.jar;C:Usersxinfang.m2
    epositorycomgoogleprotobufprotobuf-java2.5.0protobuf-java-2.5.0.jar;C:Usersxinfang.m2
    epositorycomgooglecodegsongson2.2.4gson-2.2.4.jar;C:Usersxinfang.m2
    epositoryorgapachehadoophadoop-auth2.7.3hadoop-auth-2.7.3.jar;C:Usersxinfang.m2
    epositoryorgapachehttpcomponentshttpclient4.2.5httpclient-4.2.5.jar;C:Usersxinfang.m2
    epositoryorgapachehttpcomponentshttpcore4.2.4httpcore-4.2.4.jar;C:Usersxinfang.m2
    epositoryorgapachedirectoryserverapacheds-kerberos-codec2.0.0-M15apacheds-kerberos-codec-2.0.0-M15.jar;C:Usersxinfang.m2
    epositoryorgapachedirectoryserverapacheds-i18n2.0.0-M15apacheds-i18n-2.0.0-M15.jar;C:Usersxinfang.m2
    epositoryorgapachedirectoryapiapi-asn1-api1.0.0-M20api-asn1-api-1.0.0-M20.jar;C:Usersxinfang.m2
    epositoryorgapachedirectoryapiapi-util1.0.0-M20api-util-1.0.0-M20.jar;C:Usersxinfang.m2
    epositorycomjcraftjsch.1.42jsch-0.1.42.jar;C:Usersxinfang.m2
    epositoryorgapachecuratorcurator-client2.7.1curator-client-2.7.1.jar;C:Usersxinfang.m2
    epositoryorgapachehtracehtrace-core3.1.0-incubatinghtrace-core-3.1.0-incubating.jar;C:Usersxinfang.m2
    epositoryorgapachezookeeperzookeeper3.4.6zookeeper-3.4.6.jar;C:Usersxinfang.m2
    epositoryorgapachecommonscommons-compress1.4.1commons-compress-1.4.1.jar;C:Usersxinfang.m2
    epositoryorg	ukaanixz1.0xz-1.0.jar;C:Usersxinfang.m2
    epositoryorgapachehadoophadoop-hdfs2.7.3hadoop-hdfs-2.7.3.jar;C:Usersxinfang.m2
    epositorycommons-daemoncommons-daemon1.0.13commons-daemon-1.0.13.jar;C:Usersxinfang.m2
    epositoryio
    etty
    etty3.6.2.Final
    etty-3.6.2.Final.jar;C:Usersxinfang.m2
    epositoryxercesxercesImpl2.9.1xercesImpl-2.9.1.jar;C:Usersxinfang.m2
    epositoryxml-apisxml-apis1.3.04xml-apis-1.3.04.jar;C:Usersxinfang.m2
    epositoryorgfusesourceleveldbjnileveldbjni-all1.8leveldbjni-all-1.8.jar" sparkPi
    Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
    17/11/07 15:34:22 INFO SparkContext: Running Spark version 1.6.3
    17/11/07 15:34:24 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
    17/11/07 15:34:24 INFO SecurityManager: Changing view acls to: xinfang
    17/11/07 15:34:24 INFO SecurityManager: Changing modify acls to: xinfang
    17/11/07 15:34:24 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(xinfang); users with modify permissions: Set(xinfang)
    17/11/07 15:34:26 INFO Utils: Successfully started service 'sparkDriver' on port 53931.
    17/11/07 15:34:26 INFO Slf4jLogger: Slf4jLogger started
    17/11/07 15:34:26 INFO Remoting: Starting remoting
    17/11/07 15:34:27 INFO Remoting: Remoting started; listening on addresses :[akka.tcp://sparkDriverActorSystem@172.20.107.151:53944]
    17/11/07 15:34:27 INFO Utils: Successfully started service 'sparkDriverActorSystem' on port 53944.
    17/11/07 15:34:27 INFO SparkEnv: Registering MapOutputTracker
    17/11/07 15:34:27 INFO SparkEnv: Registering BlockManagerMaster
    17/11/07 15:34:27 INFO DiskBlockManager: Created local directory at C:\Users\xinfang\AppData\Local\Temp\blockmgr-d4ba2426-7f9b-47f1-800e-2aad1aa75f70
    17/11/07 15:34:27 INFO MemoryStore: MemoryStore started with capacity 1122.0 MB
    17/11/07 15:34:27 INFO SparkEnv: Registering OutputCommitCoordinator
    17/11/07 15:34:27 INFO Utils: Successfully started service 'SparkUI' on port 4040.
    17/11/07 15:34:27 INFO SparkUI: Started SparkUI at http://172.20.107.151:4040
    17/11/07 15:34:27 INFO HttpFileServer: HTTP File server directory is C:\Users\xinfang\AppData\Local\Temp\spark-150f8c4e-c5df-4254-a9f0-a8158b4caff0\httpd-b0aaa30f-cd92-4fbd-b064-6284fa604359
    17/11/07 15:34:27 INFO HttpServer: Starting HTTP Server
    17/11/07 15:34:28 INFO Utils: Successfully started service 'HTTP file server' on port 53947.
    17/11/07 15:34:28 INFO SparkContext: Added JAR D:\workspace\scala\out\scala.jar at http://172.20.107.151:53947/jars/scala.jar with timestamp 1510040068032
    17/11/07 15:34:28 INFO AppClient$ClientEndpoint: Connecting to master spark://192.168.66.66:7077...
    17/11/07 15:34:30 INFO SparkDeploySchedulerBackend: Connected to Spark cluster with app ID app-20171107153300-0010
    17/11/07 15:34:30 INFO AppClient$ClientEndpoint: Executor added: app-20171107153300-0010/0 on worker-20171106165832-192.168.66.66-7078 (192.168.66.66:7078) with 2 cores
    17/11/07 15:34:30 INFO SparkDeploySchedulerBackend: Granted executor ID app-20171107153300-0010/0 on hostPort 192.168.66.66:7078 with 2 cores, 1024.0 MB RAM
    17/11/07 15:34:30 INFO AppClient$ClientEndpoint: Executor updated: app-20171107153300-0010/0 is now RUNNING
    17/11/07 15:34:30 INFO Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 53967.
    17/11/07 15:34:30 INFO NettyBlockTransferService: Server created on 53967
    17/11/07 15:34:30 INFO BlockManagerMaster: Trying to register BlockManager
    17/11/07 15:34:30 INFO BlockManagerMasterEndpoint: Registering block manager 172.20.107.151:53967 with 1122.0 MB RAM, BlockManagerId(driver, 172.20.107.151, 53967)
    17/11/07 15:34:30 INFO BlockManagerMaster: Registered BlockManager
    17/11/07 15:34:31 INFO SparkDeploySchedulerBackend: SchedulerBackend is ready for scheduling beginning after reached minRegisteredResourcesRatio: 0.0
    Time:1510040062460
    17/11/07 15:34:31 INFO SparkContext: Starting job: reduce at sparkPi.scala:17
    17/11/07 15:34:31 INFO DAGScheduler: Got job 0 (reduce at sparkPi.scala:17) with 2 output partitions
    17/11/07 15:34:31 INFO DAGScheduler: Final stage: ResultStage 0 (reduce at sparkPi.scala:17)
    17/11/07 15:34:31 INFO DAGScheduler: Parents of final stage: List()
    17/11/07 15:34:31 INFO DAGScheduler: Missing parents: List()
    17/11/07 15:34:32 INFO DAGScheduler: Submitting ResultStage 0 (MapPartitionsRDD[1] at map at sparkPi.scala:13), which has no missing parents
    17/11/07 15:34:32 INFO MemoryStore: Block broadcast_0 stored as values in memory (estimated size 1848.0 B, free 1122.0 MB)
    17/11/07 15:34:32 INFO MemoryStore: Block broadcast_0_piece0 stored as bytes in memory (estimated size 1206.0 B, free 1122.0 MB)
    17/11/07 15:34:32 INFO BlockManagerInfo: Added broadcast_0_piece0 in memory on 172.20.107.151:53967 (size: 1206.0 B, free: 1122.0 MB)
    17/11/07 15:34:32 INFO SparkContext: Created broadcast 0 from broadcast at DAGScheduler.scala:1006
    17/11/07 15:34:32 INFO DAGScheduler: Submitting 2 missing tasks from ResultStage 0 (MapPartitionsRDD[1] at map at sparkPi.scala:13)
    17/11/07 15:34:32 INFO TaskSchedulerImpl: Adding task set 0.0 with 2 tasks
    17/11/07 15:34:36 INFO SparkDeploySchedulerBackend: Registered executor NettyRpcEndpointRef(null) (xinfang:53977) with ID 0
    17/11/07 15:34:36 INFO TaskSetManager: Starting task 0.0 in stage 0.0 (TID 0, xinfang, partition 0,PROCESS_LOCAL, 2130 bytes)
    17/11/07 15:34:36 INFO TaskSetManager: Starting task 1.0 in stage 0.0 (TID 1, xinfang, partition 1,PROCESS_LOCAL, 2130 bytes)
    17/11/07 15:34:37 INFO BlockManagerMasterEndpoint: Registering block manager xinfang:64456 with 511.1 MB RAM, BlockManagerId(0, xinfang, 64456)
    17/11/07 15:34:42 INFO BlockManagerInfo: Added broadcast_0_piece0 in memory on xinfang:64456 (size: 1206.0 B, free: 511.1 MB)
    17/11/07 15:34:44 INFO TaskSetManager: Finished task 1.0 in stage 0.0 (TID 1) in 7589 ms on xinfang (1/2)
    17/11/07 15:34:44 INFO TaskSetManager: Finished task 0.0 in stage 0.0 (TID 0) in 7654 ms on xinfang (2/2)
    17/11/07 15:34:44 INFO TaskSchedulerImpl: Removed TaskSet 0.0, whose tasks have all completed, from pool 
    17/11/07 15:34:44 INFO DAGScheduler: ResultStage 0 (reduce at sparkPi.scala:17) finished in 11.849 s
    17/11/07 15:34:44 INFO DAGScheduler: Job 0 finished: reduce at sparkPi.scala:17, took 12.553598 s
    Pi is roughly 3.15
    17/11/07 15:34:44 INFO SparkUI: Stopped Spark web UI at http://172.20.107.151:4040
    17/11/07 15:34:44 INFO SparkDeploySchedulerBackend: Shutting down all executors
    17/11/07 15:34:44 INFO SparkDeploySchedulerBackend: Asking each executor to shut down
    17/11/07 15:34:44 INFO MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
    17/11/07 15:34:44 INFO MemoryStore: MemoryStore cleared
    17/11/07 15:34:44 INFO BlockManager: BlockManager stopped
    17/11/07 15:34:44 INFO BlockManagerMaster: BlockManagerMaster stopped
    17/11/07 15:34:44 INFO OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
    17/11/07 15:34:44 INFO SparkContext: Successfully stopped SparkContext
    17/11/07 15:34:44 INFO RemoteActorRefProvider$RemotingTerminator: Shutting down remote daemon.
    17/11/07 15:34:44 INFO RemoteActorRefProvider$RemotingTerminator: Remote daemon shut down; proceeding with flushing remote transports.
    17/11/07 15:34:44 INFO ShutdownHookManager: Shutdown hook called
    17/11/07 15:34:44 INFO ShutdownHookManager: Deleting directory C:\Users\xinfang\AppData\Local\Temp\spark-150f8c4e-c5df-4254-a9f0-a8158b4caff0\httpd-b0aaa30f-cd92-4fbd-b064-6284fa604359
    17/11/07 15:34:44 INFO ShutdownHookManager: Deleting directory C:\Users\xinfang\AppData\Local\Temp\spark-150f8c4e-c5df-4254-a9f0-a8158b4caff0
    
    Process finished with exit code 0
    

      

  • Original post: https://www.cnblogs.com/xinfang520/p/7798095.html