1 Installation
Reference: http://www.kankanews.com/ICkengine/archives/106212.shtml
1.1 Installing Scala
wget http://www.scala-lang.org/files/archive/scala-2.9.3.tgz
tar -zxf scala-2.9.3.tgz
export SCALA_HOME=yourpath   # path to the extracted scala-2.9.3 directory
export PATH=$PATH:$SCALA_HOME/bin
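After editing PATH it is easy to verify that a binary actually resolves. A small sketch (the `check_cmd` helper is not part of this guide; `sh` is used in the demo call because Scala may not be installed on the machine running it):

```shell
# Report whether a command is reachable on the current PATH.
check_cmd() {
  if command -v "$1" >/dev/null 2>&1; then
    echo "ok: $1 -> $(command -v "$1")"
  else
    echo "missing: $1 (is PATH exported correctly?)"
  fi
}

check_cmd scala   # prints "missing: ..." until Scala is on PATH
check_cmd sh      # sh exists on any POSIX system, so this prints "ok: ..."
```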
1.2 Installing Spark
Find the matching Spark release at http://archive.apache.org/dist/spark/spark-0.8.1-incubating/
wget http://archive.apache.org/dist/spark/spark-0.8.1-incubating/spark-0.8.1-incubating-bin-cdh4.tgz
tar -zxf spark-0.8.1-incubating-bin-cdh4.tgz
export SPARK_HOME=/home/hadoop/soft/spark-0.8.1   # path to the extracted Spark directory
export PATH=$PATH:$SPARK_HOME/bin
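Exports typed at the prompt vanish when the shell exits, so the SCALA_HOME/SPARK_HOME lines above are usually appended to the shell profile. An idempotent sketch (the `append_once` helper is illustrative, and a temporary file stands in for ~/.bashrc so the demo does not touch the real profile):

```shell
# Append a line to a profile file only if it is not already there,
# so re-running the setup does not accumulate duplicates.
append_once() {
  line=$1; file=$2
  grep -qxF "$line" "$file" 2>/dev/null || echo "$line" >> "$file"
}

PROFILE=$(mktemp)   # stand-in for ~/.bashrc in this demo
append_once 'export SPARK_HOME=/home/hadoop/soft/spark-0.8.1' "$PROFILE"
append_once 'export PATH=$PATH:$SPARK_HOME/bin' "$PROFILE"
# A second run is a no-op:
append_once 'export PATH=$PATH:$SPARK_HOME/bin' "$PROFILE"
```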
cd $SPARK_HOME/conf
vim spark-env.sh
export JAVA_HOME=/home/hadoop/soft/jdk1.7.0_51
export SCALA_HOME=/home/hadoop/soft/scala-2.9.3
export HADOOP_HOME=/home/hadoop/hadoop-2.0.0-cdh4.5.0
Cluster mode
export SPARK_MASTER_IP=masterIP   # the master URL then takes the form spark://hostname:7077
Also list every worker hostname in the conf/slaves file, one per line.
Start the cluster
cd $SPARK_HOME/bin
sh start-all.sh
Verify (e.g. with jps) that the Master and Worker processes are running.
Web UI: http://localhost:8080/
1.3 Testing the installation
Local test
cd $SPARK_HOME
./run-example org.apache.spark.examples.SparkPi local
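SparkPi estimates pi by Monte Carlo sampling: it throws random points into the unit square and counts the fraction that land inside the quarter circle. The same idea in a few lines of awk, with no Spark involved, just to show what the example computes (the 100000-sample count is arbitrary):

```shell
# Monte Carlo estimate of pi: the fraction of random points in the unit
# square that fall inside the unit quarter-circle, times 4.
PI_EST=$(awk 'BEGIN {
  srand()
  n = 100000; hits = 0
  for (i = 0; i < n; i++) {
    x = rand(); y = rand()
    if (x*x + y*y <= 1) hits++
  }
  printf "%.4f", 4 * hits / n
}')
echo "pi is roughly $PI_EST"
```

With 100000 samples the estimate typically lands within a few hundredths of 3.1416; SparkPi does the same computation, but distributes the sampling across the cluster.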
Cluster test
With SPARK_MASTER_IP set in spark-env.sh as above, pass the master URL instead of local:
./run-example org.apache.spark.examples.SparkPi spark://masterIP:7077