Spark Standalone Mode Configuration

    Spark is a popular distributed computing framework. This post walks through the steps to configure Spark standalone mode across several machines.

    It is easy to configure from scratch. The notes below use spark-2.0.2-bin-hadoop2.7 as the example, on Linux (Debian) machines, with Scala as the programming language.

    Assume you have two machines with IPs 192.168.0.51 and 192.168.0.52.

    1. Preinstall Java, Scala, and sbt

      See:
      https://www.scala-lang.org/download/install.html
      http://www.scala-sbt.org/0.13/docs/Installing-sbt-on-Linux.html
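
    A quick way to verify the prerequisites are in place (the exact versions are just examples; any JDK 7+, Scala 2.11, and sbt 0.13 work with Spark 2.0.2):

    $ java -version
    $ scala -version
    $ sbt sbtVersion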

    2. Download a prebuilt Spark version with Hadoop, or compile it on your own

    The downloads page: https://spark.apache.org/downloads.html
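
    For example, to fetch the package used in these notes from the Apache archive (the exact URL is an assumption; take the mirror link shown on the downloads page):

    $ wget https://archive.apache.org/dist/spark/spark-2.0.2/spark-2.0.2-bin-hadoop2.7.tgz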

    3. Unpack the archive and create a symlink for easy access later

    e.g.   execute: ln -s /usr/local/spark-2.0.2-bin-hadoop2.7 /usr/local/spark
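
    A sketch of the unpack step, assuming the archive sits in the current directory. The later steps also refer to $SPARK_HOME, so set it now (e.g. in ~/.bashrc):

    $ tar -xzf spark-2.0.2-bin-hadoop2.7.tgz -C /usr/local
    $ export SPARK_HOME=/usr/local/spark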
     
    4. Configure the spark environments:
     (1) Configure the slaves file: /usr/local/spark-2.0.2-bin-hadoop2.7/conf/slaves (if it does not exist, copy it from conf/slaves.template)
    # A Spark Worker will be started on each of the machines listed below.
    192.168.0.51
    192.168.0.52
     
    (2) Configure spark-env.sh in the same conf directory, e.g.:
    # spark-env.sh
    export SCALA_HOME=/usr/local/scala
    export JAVA_HOME=/home/local/jdk
    #export SPARK_LOCAL_IP=localhost      # bind address; only for a single-machine setup
    export SPARK_EXECUTOR_MEMORY=6g       # memory used per executor
    export SPARK_EXECUTOR_CORES=6         # cores used per executor
    export SPARK_MASTER_IP=192.168.0.51   # which machine runs the master
    export SPARK_MASTER_PORT=8070         # port the master listens on (Spark's default is 7077)
    export SPARK_MASTER_WEBUI_PORT=8080   # master web UI port
    #export SPARK_WORKER_INSTANCES=1      # number of worker processes per machine
    export SPARK_WORKER_PORT=8092         # port each worker listens on
    #export SPARK_WORKER_MEMORY=4g        # total memory a worker can hand out to executors
    #export SPARK_WORKER_CORES=4          # total cores a worker can hand out to executors
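
    Both machines need Spark unpacked at the same path with the same configuration, since start-all.sh logs into each machine listed in slaves and starts a worker from that path. A minimal sketch to push the two conf files to the second machine:

    $ scp /usr/local/spark/conf/slaves /usr/local/spark/conf/spark-env.sh 192.168.0.52:/usr/local/spark/conf/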
     
    5. Set up passwordless SSH access keys
      (1) Generate an SSH key without a passphrase
    $ ssh-keygen -t rsa -P ""
    

    (2) Append id_rsa.pub to authorized_keys

    $  cat $HOME/.ssh/id_rsa.pub >> $HOME/.ssh/authorized_keys
    

    (3) Test with ssh localhost; this alone is sufficient only if you run Spark standalone on a single machine (for the two-machine setup, also copy the key over as shown below)

     $ ssh localhost
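
    For the two-machine setup, the master (192.168.0.51) must also be able to log into the worker machine without a password, so copy the public key over as well (a sketch, assuming the same user account exists on both machines):

    $ ssh-copy-id 192.168.0.52
    $ ssh 192.168.0.52        # should log in without a password prompt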
     
    6. Start Spark
    $SPARK_HOME/sbin/start-all.sh
     Execute jps to check that the Master and Worker processes are up.
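
    If startup succeeded, jps on the master machine lists both daemons (the PIDs are examples), and the master web UI is reachable at http://192.168.0.51:8080, matching SPARK_MASTER_WEBUI_PORT above:

    $ jps
    12345 Master
    12346 Worker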
     
    7. Write and run your application
     
    execute:  sbt package   (from your application's project root)
     
    execute:

    $SPARK_HOME/bin/spark-submit \
      --class "main.scala.MainAppTest" \
      --master local[4] \
      xxxxxxxx.jar
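
    A minimal sketch of the application itself, matching the --class name in the spark-submit call above (the package/object layout and the build settings are assumptions; adapt them to your project). build.sbt:

    name := "MainAppTest"
    version := "1.0"
    scalaVersion := "2.11.8"
    // "provided": spark-submit supplies the Spark jars at runtime
    libraryDependencies += "org.apache.spark" %% "spark-core" % "2.0.2" % "provided"

    src/main/scala/MainAppTest.scala:

    package main.scala

    import org.apache.spark.{SparkConf, SparkContext}

    object MainAppTest {
      def main(args: Array[String]): Unit = {
        // The master is passed by spark-submit (--master), so it is not hard-coded here.
        val conf = new SparkConf().setAppName("MainAppTest")
        val sc = new SparkContext(conf)
        val sum = sc.parallelize(1 to 1000).reduce(_ + _)   // tiny sanity-check job
        println(s"sum = $sum")
        sc.stop()
      }
    }

    Note that --master local[4] runs the job locally on 4 threads; to run it on the standalone cluster configured above, use --master spark://192.168.0.51:8070 instead (matching SPARK_MASTER_PORT).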