1. Install Docker (see the previous post).
2. Start an Ubuntu container and keep it running in the background: docker run -dit --name dcSpark ubuntu (without -dit the container's default shell exits immediately and step 3 would fail).
3. Open a Bash shell inside the container: docker exec -ti dcSpark /bin/bash
4. apt-get update
5. apt-get install software-properties-common
6. Add the PPA repository: add-apt-repository ppa:webupd8team/java
7. Refresh the package lists again: apt-get update
8. Install the JDK: apt-get install oracle-java8-installer (note: the WebUpd8 Oracle Java PPA has since been discontinued, so the default-jdk route below is the more reliable option today)
9. Check the version: java -version
Alternatively, install Java (OpenJDK), Scala, and SBT as follows. Inside the container you are already root, so the sudo prefixes can be dropped (the stock ubuntu image does not even ship sudo); they are kept here as in the original commands. A minimal sbt project sketch follows the block.

## Java
sudo apt-get update
sudo apt-get install default-jdk

## Scala
sudo apt-get remove scala-library scala
sudo wget http://scala-lang.org/files/archive/scala-2.12.1.deb
sudo dpkg -i scala-2.12.1.deb
sudo apt-get update
sudo apt-get install scala

## SBT
echo "deb https://dl.bintray.com/sbt/debian /" | sudo tee -a /etc/apt/sources.list.d/sbt.list
sudo apt-key adv --keyserver hkp://keyserver.ubuntu.com:80 --recv 2EE0EA64E40A89B84B2DF73499E82A75642AC823
sudo apt-get update
sudo apt-get install sbt
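What SBT buys you here: it can package standalone Spark applications into jars for spark-submit. As a minimal sketch (the project name and layout are illustrative, not part of this guide), a build.sbt for a Spark 2.3.0 job could look like the one below. Note that the prebuilt Spark 2.3.0 binaries target Scala 2.11, so the project pins scalaVersion to 2.11.x even though the REPL installed above is Scala 2.12.1.

// build.sbt -- minimal sketch of an sbt build for a Spark 2.3.0 job
// (illustrative names; adjust to your own project)
name := "spark-test"
version := "0.1"
// Spark 2.3.0's prebuilt binaries are compiled against Scala 2.11
scalaVersion := "2.11.12"
// "provided" because spark-submit supplies the Spark jars at run time
libraryDependencies += "org.apache.spark" %% "spark-core" % "2.3.0" % "provided"

Running sbt package in such a project directory produces a jar under target/scala-2.11/ that ./bin/spark-submit can run.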
Download Spark (if wget is missing in the container, apt-get install wget first):
mkdir download
cd download
wget https://archive.apache.org/dist/spark/spark-2.3.0/spark-2.3.0-bin-hadoop2.7.tgz
(Note: the www.apache.org/dyn/closer.lua link is a mirror-selector page that returns HTML, not the tarball, so wget needs a direct mirror or the archive URL above.)
Extract:

sudo tar -zxf ~/download/spark-2.3.0-bin-hadoop2.7.tgz -C /usr/local/
cd /usr/local
sudo mv ./spark-2.3.0-bin-hadoop2.7/ ./spark

If Spark will run under a dedicated user (as in the referenced guide's hadoop setup), also change ownership: sudo chown -R hadoop:hadoop ./spark. Inside this container everything runs as root, so the chown can be skipped.
Run the Spark shell: change into the Spark directory (/usr/local/spark) and execute
./bin/spark-shell
Test (Scala; spark-shell already provides a SparkContext bound to the variable sc):

val textFile = sc.textFile("file:///usr/local/spark/README.md")

textFile.count() // number of items in the RDD; for a text file, the total line count
// res0: Long = 95

textFile.first() // the first item in the RDD; for a text file, the first line
// res1: String = # Apache Spark
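The same RDD supports transformations as well as actions. A short sketch in the same spark-shell session, following the standard Spark quick-start examples (the counts depend on the README shipped with your Spark version):

// filter is a transformation (lazy); count is an action that triggers it
val linesWithSpark = textFile.filter(line => line.contains("Spark"))
linesWithSpark.count()

// word count of the longest line: map each line to its number of words,
// then reduce pairwise to the maximum
textFile.map(line => line.split(" ").size).reduce((a, b) => if (a > b) a else b)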
Reference: http://www.powerxing.com/spark-quick-start-guide/