  • The pom.xml configuration file for a Spark project created with Maven (illustrated walkthrough)

      Without further ado, straight to the point!

    http://mvnrepository.com/

      For how to create the project itself, see:

    Setting up a Spark programming environment (based on the IntelliJ IDEA Ultimate edition) (including Java and Scala versions of WordCount) (strongly recommended)

      Here I will focus on the Spark project, because I have already written plenty of posts about Hadoop and the like.

       For example, spark-mllib, which I currently use a lot.

      The suffix in spark-mllib_2.10 means the artifact is built for the Scala 2.10.x series; I usually use Scala 2.10.4, for example.

      Likewise, spark-mllib_2.11 means it is built for the Scala 2.11.x series (an example of those coordinates follows after the snippet below).

      Also, get into the habit of following the standard form. On http://mvnrepository.com the dependency is shown like this:

    <!-- https://mvnrepository.com/artifact/org.apache.spark/spark-mllib_2.10 -->
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-mllib_2.10</artifactId>
        <version>2.2.0</version>
        <scope>provided</scope>
    </dependency>
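
      For reference, the same dependency built against Scala 2.11 is published under the _2.11 suffix; the coordinates below simply follow the same pattern as the snippet above, and should only be used if you actually compile with Scala 2.11.x:

    <!-- https://mvnrepository.com/artifact/org.apache.spark/spark-mllib_2.11 -->
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-mllib_2.11</artifactId>
        <version>2.2.0</version>
        <scope>provided</scope>
    </dependency>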

      

       However, I do not recommend hard-coding the version like this. Instead, extract the version number into a property.

       Below is the pom.xml of the Spark project I built with Maven; you can use it as a reference. Of course, it is not the most standard setup.

    <project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/maven-v4_0_0.xsd">
      <modelVersion>4.0.0</modelVersion>
      <groupId>zhouls.bigdata</groupId>
      <artifactId>SparkMllibBook</artifactId>
      <version>1.0-SNAPSHOT</version>
      <inceptionYear>2008</inceptionYear>
    
      <properties>
        <scala.version>2.10.4</scala.version>
        <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
        <hadoop.version>2.6.0</hadoop.version>
        <spark.version>2.2.0</spark.version>
      </properties>
    
      <repositories>
        <repository>
          <id>scala-tools.org</id>
          <name>Scala-Tools Maven2 Repository</name>
          <url>http://scala-tools.org/repo-releases</url>
        </repository>
      </repositories>
    
      <pluginRepositories>
        <pluginRepository>
          <id>scala-tools.org</id>
          <name>Scala-Tools Maven2 Repository</name>
          <url>http://scala-tools.org/repo-releases</url>
        </pluginRepository>
      </pluginRepositories>
    
      <dependencies>
        <dependency>
          <groupId>org.scala-lang</groupId>
          <artifactId>scala-library</artifactId>
          <version>${scala.version}</version>
        </dependency>
        <dependency>
          <groupId>junit</groupId>
          <artifactId>junit</artifactId>
          <version>4.4</version>
          <scope>test</scope>
        </dependency>
        <dependency>
          <groupId>org.specs</groupId>
          <artifactId>specs</artifactId>
          <version>1.2.5</version>
          <scope>test</scope>
        </dependency>
    
        <dependency>
          <groupId>org.apache.hadoop</groupId>
          <artifactId>hadoop-common</artifactId>
          <version>${hadoop.version}</version>
          <scope>provided</scope>
        </dependency>
    
        <dependency>
          <groupId>org.apache.hadoop</groupId>
          <artifactId>hadoop-hdfs</artifactId>
          <version>${hadoop.version}</version>
          <scope>provided</scope>
        </dependency>
    
        <dependency>
          <groupId>org.apache.hadoop</groupId>
          <artifactId>hadoop-mapreduce-client-common</artifactId>
          <version>${hadoop.version}</version>
          <scope>provided</scope>
        </dependency>
    
        <dependency>
          <groupId>org.apache.hadoop</groupId>
          <artifactId>hadoop-mapreduce-client-core</artifactId>
          <version>${hadoop.version}</version>
          <scope>provided</scope>
        </dependency>
    
        <dependency>
          <groupId>org.apache.spark</groupId>
          <artifactId>spark-core_2.10</artifactId>
          <version>${spark.version}</version>
          <scope>provided</scope>
        </dependency>
    
        <dependency>
          <groupId>org.apache.spark</groupId>
          <artifactId>spark-streaming_2.10</artifactId>
          <version>${spark.version}</version>
          <scope>provided</scope>
        </dependency>
    
        <dependency>
          <groupId>org.apache.spark</groupId>
          <artifactId>spark-sql_2.10</artifactId>
          <version>${spark.version}</version>
          <scope>provided</scope>
        </dependency>
    
    
        <dependency>
          <groupId>org.apache.spark</groupId>
          <artifactId>spark-mllib_2.10</artifactId>
          <version>${spark.version}</version>
          <scope>provided</scope>
        </dependency>
      </dependencies>
    
    
      <build>
        <sourceDirectory>src/main/scala</sourceDirectory>
        <testSourceDirectory>src/test/scala</testSourceDirectory>
        <plugins>
          <plugin>
            <groupId>org.scala-tools</groupId>
            <artifactId>maven-scala-plugin</artifactId>
            <executions>
              <execution>
                <goals>
                  <goal>compile</goal>
                  <goal>testCompile</goal>
                </goals>
              </execution>
            </executions>
            <configuration>
              <scalaVersion>${scala.version}</scalaVersion>
              <args>
                <arg>-target:jvm-1.5</arg>
              </args>
            </configuration>
          </plugin>
          <plugin>
            <groupId>org.apache.maven.plugins</groupId>
            <artifactId>maven-eclipse-plugin</artifactId>
            <configuration>
              <downloadSources>true</downloadSources>
              <buildcommands>
                <buildcommand>ch.epfl.lamp.sdt.core.scalabuilder</buildcommand>
              </buildcommands>
              <additionalProjectnatures>
                <projectnature>ch.epfl.lamp.sdt.core.scalanature</projectnature>
              </additionalProjectnatures>
              <classpathContainers>
                <classpathContainer>org.eclipse.jdt.launching.JRE_CONTAINER</classpathContainer>
                <classpathContainer>ch.epfl.lamp.sdt.launching.SCALA_CONTAINER</classpathContainer>
              </classpathContainers>
            </configuration>
          </plugin>
        </plugins>
      </build>
      <reporting>
        <plugins>
          <plugin>
            <groupId>org.scala-tools</groupId>
            <artifactId>maven-scala-plugin</artifactId>
            <configuration>
              <scalaVersion>${scala.version}</scalaVersion>
            </configuration>
          </plugin>
        </plugins>
      </reporting>
    </project>
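
      A side note on the build section: the org.scala-tools maven-scala-plugin and the scala-tools.org repositories used above are no longer maintained. As a hedged alternative, not used in the pom above, the same compile/testCompile goals can be bound through the newer scala-maven-plugin, for example:

    <!-- maintained successor of maven-scala-plugin; sketch only, not part of the pom above -->
    <plugin>
      <groupId>net.alchim31.maven</groupId>
      <artifactId>scala-maven-plugin</artifactId>
      <version>3.2.2</version>
      <executions>
        <execution>
          <goals>
            <goal>compile</goal>
            <goal>testCompile</goal>
          </goals>
        </execution>
      </executions>
    </plugin>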

      When you actually run the code, you may hit problems like the following (a sketch of one common cause follows below):

    How to fix "Caused by: java.lang.ClassNotFoundException: org.apache.log4j.Logger" when running code in IDEA (illustrated walkthrough)

    How to fix "Error: scalac: error while loading JUnit4, Scala signature JUnit4 has wrong version expected: 5.0 found: 4.1 in JUnit4.class" when running code in IDEA (illustrated walkthrough)
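
      One common cause of the ClassNotFoundException above (my assumption here, not something taken from the linked posts) is that the Spark and Hadoop dependencies are declared with <scope>provided</scope>, so they are missing from the runtime classpath when the application is launched directly from the IDE. A hedged workaround is to move the scope into a property and override it in a profile used only for local runs; the spark.scope property and the local profile id below are illustrative names, not part of the pom above:

    <properties>
        <!-- default: keep Spark/Hadoop out of the packaged artifact -->
        <spark.scope>provided</spark.scope>
    </properties>

    <!-- each Spark/Hadoop dependency then declares <scope>${spark.scope}</scope> -->

    <profiles>
        <profile>
            <!-- activate with "mvn ... -Plocal" (or tick the profile in the IDE's Maven panel)
                 so the provided dependencies land on the runtime classpath -->
            <id>local</id>
            <properties>
                <spark.scope>compile</spark.scope>
            </properties>
        </profile>
    </profiles>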

  • Original post: https://www.cnblogs.com/zlslch/p/7446002.html