1. Install Docker; see the previous post.
2. Start an Ubuntu container: docker run -dit --name dcSpark ubuntu (the -dit flags keep the container alive in the background; without them the default bash process exits immediately)
3. Open a Bash shell inside it: docker exec -ti dcSpark /bin/bash
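Before exec'ing in, it is worth confirming the container is actually up (standard Docker CLI; the name filter matches the container started above):

docker ps --filter name=dcSpark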
4. Refresh the package index: apt-get update
5. Install the PPA helper: apt-get install software-properties-common
6. Add the PPA repository: add-apt-repository ppa:webupd8team/java (this PPA has since been discontinued; the default-jdk route below is the more reliable path today)
7. Refresh the package index again to pick up the new source: apt-get update
8. Install the JDK: apt-get install oracle-java8-installer
9. Check the version: java -version
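The oracle-java8-installer stops at an interactive license prompt. For unattended installs it was commonly pre-accepted through debconf; the key below is the one that installer historically used (a sketch, verify against your installer version):

echo oracle-java8-installer shared/accepted-oracle-license-v1-1 select true | debconf-set-selections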
Alternatively, install Java, Scala, and SBT in one pass:

## Java
sudo apt-get update
sudo apt-get install default-jdk

## Scala
sudo apt-get remove scala-library scala
sudo wget http://scala-lang.org/files/archive/scala-2.12.1.deb
sudo dpkg -i scala-2.12.1.deb
sudo apt-get update
sudo apt-get install scala

## SBT
echo "deb https://dl.bintray.com/sbt/debian /" | sudo tee -a /etc/apt/sources.list.d/sbt.list
sudo apt-key adv --keyserver hkp://keyserver.ubuntu.com:80 --recv 2EE0EA64E40A89B84B2DF73499E82A75642AC823
sudo apt-get update
sudo apt-get install sbt

(Note: Bintray has since shut down; sbt's Debian packages now live at https://repo.scala-sbt.org/scalasbt/debian.)
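To confirm the Scala toolchain works end to end, a minimal check can be compiled and run (the file name Hello.scala is my own choice, not part of the original steps); compile with scalac Hello.scala, run with scala Hello:

// Hello.scala — prints the Scala version to verify scalac/scala are on the PATH
object Hello {
  def main(args: Array[String]): Unit = {
    println(s"Hello from Scala ${scala.util.Properties.versionNumberString}")
  }
}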
Download Spark:
mkdir download
cd download
wget https://archive.apache.org/dist/spark/spark-2.3.0/spark-2.3.0-bin-hadoop2.7.tgz
Extract it:
sudo tar -zxf ~/download/spark-2.3.0-bin-hadoop2.7.tgz -C /usr/local/
cd /usr/local
sudo mv ./spark-2.3.0-bin-hadoop2.7/ ./spark
sudo chown -R hadoop:hadoop ./spark   # only needed when running Spark as a dedicated hadoop user; skip as root in the container
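Optionally (a convenience addition, not required by the steps below), put Spark on the PATH so its commands resolve from anywhere:

export SPARK_HOME=/usr/local/spark
export PATH=$SPARK_HOME/bin:$PATH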
Run the Spark shell: change into the Spark directory and execute
./bin/spark-shell
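By default spark-shell starts a local master using all available cores; the core count can be pinned with the standard --master flag, e.g.:

./bin/spark-shell --master local[2]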
Test (in Scala):
val textFile = sc.textFile("file:///usr/local/spark/README.md")

textFile.count()  // number of items in the RDD; for a text file, the total line count
// res0: Long = 95

textFile.first()  // first item in the RDD; for a text file, the first line
// res1: String = # Apache Spark
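A couple more operations in the same vein, following the official quick start (counts vary with the README version):

// keep only the lines that mention Spark
val linesWithSpark = textFile.filter(line => line.contains("Spark"))
linesWithSpark.count()

// length of the longest line, via map + reduce
textFile.map(line => line.split(" ").size).reduce((a, b) => if (a > b) a else b)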
Reference: http://www.powerxing.com/spark-quick-start-guide/