Development environment:
Windows 10 + IntelliJ IDEA + JDK 1.8 + Scala 2.12.4
Steps:
- Write the Scala test class
```scala
import org.apache.spark.{SparkConf, SparkContext}

object MyTest {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf()
    conf.setAppName("MyTest")
    conf.setMaster("local")
    val sc = new SparkContext(conf)
    val input = sc.textFile("file:///F:/sparktest/catalina.out")
    val count = input.filter(_.contains("java.lang.NullPointerException")).count()
    println("NullPointerException count: " + count)
    sc.stop()
  }
}
```
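Stripped of Spark, the job's core logic is just a filter-and-count over lines of text. A minimal plain-Java sketch of the same logic (the sample log lines are made up for illustration; `CountNpeLines` is a hypothetical name, not part of the project):

```java
import java.util.Arrays;
import java.util.List;

// Sketch of the job's filter-and-count logic without Spark,
// applied to a few hypothetical catalina.out-style lines.
public class CountNpeLines {
    public static void main(String[] args) {
        List<String> lines = Arrays.asList(
                "INFO  Server startup in 1234 ms",
                "SEVERE java.lang.NullPointerException at com.example.Foo",
                "WARN  something else",
                "Caused by: java.lang.NullPointerException");
        // Count lines containing the exception string, as the Spark job does.
        long count = lines.stream()
                .filter(line -> line.contains("java.lang.NullPointerException"))
                .count();
        System.out.println("NullPointerException count: " + count);
    }
}
```

Spark distributes this same computation across the file's partitions; `Arrays.asList` is used rather than `List.of` to stay compatible with the JDK 1.8 environment above.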
- Set the project output path
- Configure the JAR artifact
- Write the Java driver class (it depends on the Spark packages; you can add all the related JARs to the lib dependencies)
```java
import org.apache.spark.deploy.SparkSubmit;

public class SubmitScalaJobToSpark {
    public static void main(String[] args) {
        String[] arg0 = new String[]{
                "--master", "spark://node101:7077",
                "--deploy-mode", "client",
                "--name", "test java submit job to spark",
                "--class", "MyTest",       // class containing the Spark job's entry point
                "--executor-memory", "1G", // executor memory
                "E:\\其他代碼倉庫\\spark\\out\\artifacts\\unnamed\\unnamed.jar" // path to the JAR
        };
        SparkSubmit.main(arg0);
    }
}
```
- Run the test
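For reference, the `SparkSubmit.main` call above is equivalent to invoking the `spark-submit` script directly with the same arguments (a sketch only; the master host and JAR path are taken from the example and must be adjusted to your cluster):

```shell
spark-submit \
  --master spark://node101:7077 \
  --deploy-mode client \
  --name "test java submit job to spark" \
  --class MyTest \
  --executor-memory 1G \
  "E:\其他代碼倉庫\spark\out\artifacts\unnamed\unnamed.jar"
```

Running the CLI first is a convenient way to check that the JAR and class name are correct before wiring the same arguments into the Java driver.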