Spark Application Development Workflow


Configuration file:

pom.xml

  <properties>
    <scala.version>2.11.8</scala.version>
    <spark.version>2.2.0</spark.version>
    <hadoop.version>2.6.0-cdh5.7.0</hadoop.version>
  </properties>

  <repositories>
    <!-- Add the Cloudera repository; the CDH-flavoured artifacts are hosted there -->
    <repository>
      <id>cloudera</id>
      <name>cloudera</name>
      <url>https://repository.cloudera.com/artifactory/cloudera-repos/</url>
    </repository>
  </repositories>
  
  <dependencies>

    <!-- Scala library dependency -->
    <dependency>
      <groupId>org.scala-lang</groupId>
      <artifactId>scala-library</artifactId>
      <version>${scala.version}</version>
    </dependency>

    <!-- spark-core dependency (built for Scala 2.11) -->
    <dependency>
      <groupId>org.apache.spark</groupId>
      <artifactId>spark-core_2.11</artifactId>
      <version>${spark.version}</version>
    </dependency>

    <!-- hadoop-client dependency -->
    <dependency>
      <groupId>org.apache.hadoop</groupId>
      <artifactId>hadoop-client</artifactId>
      <version>${hadoop.version}</version>
    </dependency>

  </dependencies>



Test code:

Program arguments (the input and output paths are passed in rather than hard-coded):
[Screenshot: run configuration with the program arguments]

WordCountApp.scala

package com.ruozedata

import org.apache.spark.{SparkConf, SparkContext}

object WordCountApp extends App {

  val conf = new SparkConf()
  val sc = new SparkContext(conf)

  // Input (path passed in via args(), not hard-coded)
  val dataFile = sc.textFile(args(0))

  // Business logic: word count
  val outputFile = dataFile.flatMap(_.split(",")).map((_,1)).reduceByKey(_+_)

  // Write the output files
  outputFile.saveAsTextFile(args(1))

  // Stop the SparkContext
  sc.stop()
}
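
The class above relies on spark-submit to supply the master and app name. Spark's documentation also recommends defining a main() method rather than extending scala.App, since App subclasses may not work correctly. A minimal sketch of such a variant, which additionally falls back to local values so it can be run straight from the IDE (WordCountAppLocal is an illustrative name, not part of the original project):

package com.ruozedata

import org.apache.spark.{SparkConf, SparkContext}

// Hypothetical variant with an explicit main method; it falls back to a
// local master and a default app name when spark-submit does not set them.
object WordCountAppLocal {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf()
      .setIfMissing("spark.master", "local[2]")
      .setIfMissing("spark.app.name", "WordCountApp")
    val sc = new SparkContext(conf)

    sc.textFile(args(0))
      .flatMap(_.split(","))
      .map((_, 1))
      .reduceByKey(_ + _)
      .saveAsTextFile(args(1))

    sc.stop()
  }
}

Because setIfMissing only fills in absent keys, the --master and --name flags passed to spark-submit still take precedence on the cluster.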



Testing in the CLI:

[Screenshot: CLI test]
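
The same logic can also be checked interactively before packaging. A minimal sketch, assuming the CLI test was done in spark-shell (which provides sc) and reusing the example paths from the spark-submit command below:

// spark-shell session (sketch); `sc` is provided by the shell
val dataFile = sc.textFile("/wc_input/")
val counts = dataFile.flatMap(_.split(",")).map((_, 1)).reduceByKey(_ + _)
counts.collect().foreach(println)   // print to the console instead of writing files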



Package, upload to the server, and run:

[Screenshots: building the jar, uploading it to the server, and running the job]



Submitting in local mode on Linux (put the command in a script):

$ /home/hadoop/app/spark/bin/spark-submit \
--class com.ruozedata.WordCountApp \
--master local[2] \
--name WordCountApp \
/home/hadoop/lib/spark/SparkCodeApp-1.0.jar \
/wc_input/ /wc_output
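
Everything after the jar path is forwarded to the application, so /wc_input/ becomes args(0) (the input path) and /wc_output becomes args(1) (the output path). Note that saveAsTextFile fails if the output directory already exists, so remove it before re-running.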

For detailed configuration options, see the official Spark documentation:

http://spark.apache.org/docs/2.2.0/rdd-programming-guide.html
http://spark.apache.org/docs/2.2.0/configuration.html
http://spark.apache.org/docs/2.2.0/submitting-applications.html

