  • Spark 0.9.0 startup scripts: bin/compute-classpath.sh

    1. Set SCALA_VERSION.

    2. Source conf/spark-env.sh (if present) to pick up user environment settings.

    3. Initialize CLASSPATH=<conf directory>.

    4. If assembly/target/scala-$SCALA_VERSION/spark-assembly*hadoop*-deps.jar exists (a source build), add

    [core|repl|mllib|bagel|graphx|streaming]/target/scala-$SCALA_VERSION/classes:assembly/target/scala-$SCALA_VERSION/spark-assembly*hadoop*-deps.jar

    Otherwise, check for the RELEASE directory: if it exists, add jars/spark-assembly*.jar; if not, add assembly/target/scala-$SCALA_VERSION/spark-assembly*hadoop*.jar.

    5. If SPARK_TESTING is 1, add

    [core|repl|mllib|bagel|graphx|streaming]/target/scala-$SCALA_VERSION/test-classes

    6. Append HADOOP_CONF_DIR and YARN_CONF_DIR.
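
    The six steps above can be sketched as a standalone script. This is a minimal sketch of the branching logic, not the actual Spark 0.9.0 compute-classpath.sh; the FWDIR variable, the mktemp stand-in for the Spark home directory, and the DEPS_JAR helper variable are assumptions for illustration.

    ```shell
    # Step 1: pin the Scala version used in all target paths
    SCALA_VERSION=2.10

    # Stand-in for the Spark home directory (assumption, for a runnable sketch)
    FWDIR="$(mktemp -d)"
    mkdir -p "$FWDIR/conf"

    # Step 2: source conf/spark-env.sh if it exists
    [ -e "$FWDIR/conf/spark-env.sh" ] && . "$FWDIR/conf/spark-env.sh"

    # Step 3: start the classpath with the conf directory
    CLASSPATH="$FWDIR/conf"

    # Step 4: prefer per-module classes plus the *-deps assembly (source build);
    # otherwise fall back to a full assembly jar
    DEPS_JAR=$(ls "$FWDIR"/assembly/target/scala-$SCALA_VERSION/spark-assembly*hadoop*-deps.jar 2>/dev/null | head -n1)
    if [ -n "$DEPS_JAR" ]; then
      for mod in core repl mllib bagel graphx streaming; do
        CLASSPATH="$CLASSPATH:$FWDIR/$mod/target/scala-$SCALA_VERSION/classes"
      done
      CLASSPATH="$CLASSPATH:$DEPS_JAR"
    elif [ -e "$FWDIR/RELEASE" ]; then
      CLASSPATH="$CLASSPATH:$(ls "$FWDIR"/jars/spark-assembly*.jar 2>/dev/null | head -n1)"
    else
      CLASSPATH="$CLASSPATH:$(ls "$FWDIR"/assembly/target/scala-$SCALA_VERSION/spark-assembly*hadoop*.jar 2>/dev/null | head -n1)"
    fi

    # Step 5: add per-module test classes when SPARK_TESTING=1
    if [ "$SPARK_TESTING" = "1" ]; then
      for mod in core repl mllib bagel graphx streaming; do
        CLASSPATH="$CLASSPATH:$FWDIR/$mod/target/scala-$SCALA_VERSION/test-classes"
      done
    fi

    # Step 6: append Hadoop/YARN configuration directories if set
    [ -n "$HADOOP_CONF_DIR" ] && CLASSPATH="$CLASSPATH:$HADOOP_CONF_DIR"
    [ -n "$YARN_CONF_DIR" ] && CLASSPATH="$CLASSPATH:$YARN_CONF_DIR"

    echo "$CLASSPATH"
    rm -rf "$FWDIR"
    ```

    With the empty stand-in directory the script falls through to the final assembly-jar branch; in a real tree the first matching branch wins, which is why a source build shadows an installed release.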

  • Original post: https://www.cnblogs.com/hujunfei/p/3624712.html