
    Spark cluster on Mesos


    Official site: http://spark.apache.org/


    Environment:
    CentOS 7
    spark-2.0
    jdk-1.8
    mesos-1.0


    I. Mesos

    zk1:      192.168.8.101
    zk2:      192.168.8.102
    zk3:      192.168.8.103
    mesos-m1: 192.168.8.101
    mesos-m2: 192.168.8.102
    mesos-m3: 192.168.8.103
    mesos-a1: 192.168.8.101
    mesos-a2: 192.168.8.102
    mesos-a3: 192.168.8.103

    For the full Mesos/ZooKeeper installation, see the earlier mesos+marathon+docker article; the key settings are recapped below.
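    As a quick recap (assuming the Mesosphere packages, which read their flags from files under /etc/mesos*), every master and agent points at the same ZooKeeper ensemble, and three masters use a quorum of 2:

    # /etc/mesos/zk (shared by masters and agents; Mesosphere package layout assumed)
    zk://192.168.8.101:2181,192.168.8.102:2181,192.168.8.103:2181/mesos
    # /etc/mesos-master/quorum (majority of the 3 masters)
    2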


    II. Spark
    1. Configure the JDK
    JAVA_HOME=/opt/jdk
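    A minimal way to make this persistent, mirroring the SPARK_HOME snippet used below (the /opt/jdk path is the one assumed above):

    cat >/etc/profile.d/jdk.sh <<HERE
    export JAVA_HOME=/opt/jdk
    export PATH=\$JAVA_HOME/bin:\$PATH
    HERE
    source /etc/profile
    java -version    # verify the JDK is found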
    2. Install Spark
    tar -xvf spark-2.0.0-bin-hadoop2.7.tgz -C /opt/
    mv /opt/spark-2.0.0-bin-hadoop2.7 /opt/spark
    cp /opt/spark/conf/log4j.properties.template /opt/spark/conf/log4j.properties
    sed -i 's/INFO/WARN/g' /opt/spark/conf/log4j.properties
    Set the Spark environment variables:
    cat >/etc/profile.d/spark.sh <<HERE
    export SPARK_HOME=/opt/spark
    HERE
    source /etc/profile

    root@router:~# ls /opt/spark/bin/
    beeline       run-example   sparkR        spark-sql
    pyspark       spark-class   spark-shell   spark-submit


    Note: make sure each node's hostname resolves, e.g. via /etc/hosts entries like the ones below.
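    For example, identical entries on every node (the spark1-3 hostnames are assumed here, matching the node list in section 3 below):

    # /etc/hosts on every node (hostnames assumed)
    192.168.8.101   spark1
    192.168.8.102   spark2
    192.168.8.103   spark3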

    pyspark

    root@router:~# /opt/spark/bin/pyspark
    Python 2.7.5 (default, Nov 20 2015, 02:00:19)
    [GCC 4.8.5 20150623 (Red Hat 4.8.5-4)] on linux2
    Type "help", "copyright", "credits" or "license" for more information.
    16/08/02 17:55:07 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
    Welcome to
          ____              __
         / __/__  ___ _____/ /__
        _\ \/ _ \/ _ `/ __/  '_/
       /__ / .__/\_,_/_/ /_/\_\   version 2.0.0
          /_/

    Using Python version 2.7.5 (default, Nov 20 2015 02:00:19)
    SparkSession available as 'spark'.
    >>>
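    As a quick smoke test, a trivial RDD job (note that the shell above is still running with a local master, not Mesos):

    >>> sc.parallelize(range(100)).sum()
    4950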

    spark-shell

    root@router:~# /opt/spark/bin/spark-shell
    16/08/02 18:00:25 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
    16/08/02 18:00:26 WARN SparkContext: Use an existing SparkContext, some configuration may not take effect.
    Spark context Web UI available at http://192.168.8.254:4040
    Spark context available as 'sc' (master = local[*], app id = local-1470132026404).
    Spark session available as 'spark'.
    Welcome to
          ____              __
         / __/__  ___ _____/ /__
        _\ \/ _ \/ _ `/ __/  '_/
       /___/ .__/\_,_/_/ /_/\_\   version 2.0.0
          /_/

    scala>


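    The same smoke test from spark-shell:

    scala> sc.parallelize(1 to 100).sum
    res0: Double = 5050.0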

    3. Configure the Spark cluster

    http://spark.apache.org/docs/latest/running-on-mesos.html

    spark1: 192.168.8.101
    spark2: 192.168.8.102
    spark3: 192.168.8.103

    /opt/spark/sbin/start-mesos-dispatcher.sh --master mesos://zk://192.168.8.101:2181,192.168.8.102:2181,192.168.8.103:2181/mesos 
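    start-mesos-dispatcher.sh launches the MesosClusterDispatcher, which registers with the Mesos masters as a framework. By default it accepts spark-submit connections on port 7077 and serves a status web UI on port 8081, so a quick sanity check (default ports assumed) is:

    root@router:~# ss -tlnp | grep -E ':7077|:8081'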


    4. Submit a job to the Spark cluster

    Here we simply reuse the examples that ship with Spark: first the Python Pi example (a Python application takes no --class), then the jar version.

    /opt/spark/bin/spark-submit \
      --master mesos://192.168.8.102:7077 \
      --deploy-mode cluster \
      --supervise \
      --executor-memory 1G \
      --total-executor-cores 100 \
      http://192.168.8.254/ftp/examples/src/main/python/pi.py \
      1000

    /opt/spark/bin/spark-submit \
      --class org.apache.spark.examples.SparkPi \
      --master mesos://192.168.8.102:7077 \
      --deploy-mode cluster \
      --supervise \
      --executor-memory 1G \
      --total-executor-cores 100 \
      http://192.168.8.254/ftp/examples/jars/spark-examples_2.11-2.0.0.jar \
      1000


    Note: in cluster deploy mode the application file (jar or .py) must be given as a URI every node can fetch, e.g. http:// or hdfs://; a local path on the submitting machine will not work.
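    spark-submit in cluster mode prints a submission ID for the driver. Assuming the dispatcher exposes the same REST submission API as Spark standalone on its submission port, the driver's state can then be polled like this (the driver ID below is hypothetical):

    # driver-20160802180000-0001 is a made-up example ID
    curl http://192.168.8.102:7077/v1/submissions/status/driver-20160802180000-0001

    The running job is also visible in the Mesos web UI (port 5050) and the dispatcher UI (port 8081).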

