Java Spark remote submission problem


When the application is submitted from IntelliJ IDEA (submit Application), the Spark Web UI shows the cluster endlessly adding and removing executors, and the job never runs to completion. The code and screenshots are shown below:

(Screenshots of the Spark Web UI omitted.)

Code:

SparkConf conf = new SparkConf()
                .setSparkHome(sparkHome)
                .setAppName(appName)
                .setMaster(master);
        conf.set("spark.serializer", "org.apache.spark.serializer.KryoSerializer");
or
        SparkSession spark = SparkSession.builder().master("spark://server01:7077").appName("HBASEDATA")
                .getOrCreate();
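For context, a minimal, hypothetical driver built around a conf like the one above might look as follows; the sparkHome/appName/master values and the trivial count job are illustrative only, not the original application:

import java.util.Arrays;
import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;

public class RemoteSubmitSketch {
    public static void main(String[] args) {
        // Illustrative values -- replace with the real installation path and master URL.
        String sparkHome = "/opt/spark";
        String appName = "HBASEDATA";
        String master = "spark://server01:7077";

        SparkConf conf = new SparkConf()
                .setSparkHome(sparkHome)
                .setAppName(appName)
                .setMaster(master);
        conf.set("spark.serializer", "org.apache.spark.serializer.KryoSerializer");

        // JavaSparkContext is Closeable, so try-with-resources shuts it down cleanly.
        try (JavaSparkContext sc = new JavaSparkContext(conf)) {
            // Any action works as a probe; it only finishes once executors register.
            long n = sc.parallelize(Arrays.asList(1, 2, 3, 4)).count();
            System.out.println("count = " + n);
        }
    }
}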

Console output:

20/07/21 10:29:06 INFO SparkUI: Bound SparkUI to 0.0.0.0, and started at http://DESKTOP-A56927L:4040
20/07/21 10:29:06 INFO StandaloneAppClient$ClientEndpoint: Connecting to master spark://server01:7077...
20/07/21 10:29:06 INFO TransportClientFactory: Successfully created connection to bikini-bottom/192.168.0.91:7077 after 149 ms (0 ms spent in bootstraps)
... ...
20/07/21 10:29:34 INFO CoarseGrainedSchedulerBackend$DriverEndpoint: Asked to remove non-existent executor 13
20/07/21 10:29:34 INFO StandaloneSchedulerBackend: Granted executor ID app-20190721102906-0002/14 on hostPort 192.168.0.91:46381 with 1 core(s), 800.0 MB RAM
20/07/21 10:29:34 INFO StandaloneAppClient$ClientEndpoint: Executor updated: app-20190721102906-0002/14 is now RUNNING
20/07/21 10:29:34 INFO StandaloneAppClient$ClientEndpoint: Executor updated: app-20190721102906-0002/12 is now EXITED (Command exited with code 1)
20/07/21 10:29:34 INFO StandaloneSchedulerBackend: Executor app-20190721102906-0002/12 removed: Command exited with code 1
20/07/21 10:29:34 INFO BlockManagerMaster: Removal of executor 12 requested

Executor allocation on the Spark cluster:

(Screenshot omitted.)

Executor error message:

(Screenshot omitted.)

 ---------------------------------------------------------------------------------------------------------------------

Solution:

Configure the hosts file (original screenshot omitted): every node in the cluster needs to be able to resolve the driver machine's hostname, DESKTOP-T5HC2II in the code below.
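A sketch of what that entry usually looks like, assuming the driver machine sits on the cluster's LAN (the IP below is a placeholder, not taken from the original post):

# /etc/hosts on every Spark worker node -- map the driver hostname to its real LAN IP
192.168.0.x    DESKTOP-T5HC2II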

 

Modify the code:

SparkConf conf = new SparkConf()
                .setSparkHome(sparkHome)
                .setAppName(appName)
                // hostname of the driver machine
                .set("spark.driver.host","DESKTOP-T5HC2II")
                // port the driver listens on
                .set("spark.driver.port","9095")
                // executor memory
                .set("spark.executor.memory","800m")
                // number of driver cores
                .set("spark.driver.cores","1")
                .setMaster(master);
        conf.set("spark.serializer", "org.apache.spark.serializer.KryoSerializer");
or
        SparkSession spark = SparkSession.builder().master("spark://server01:7077").appName("HBASEDATA")
                // hostname of the driver machine
                .config("spark.driver.host","DESKTOP-T5HC2II")
                // port the driver listens on
                .config("spark.driver.port","9092")
                .getOrCreate();
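A likely explanation, consistent with this fix: each executor has to connect back to the driver at spark.driver.host:spark.driver.port, so that hostname must resolve from the worker nodes and the chosen port (9095 or 9092 above) must not be blocked by a firewall; otherwise each executor exits with code 1 and the master keeps launching replacements, which is the add/remove loop seen in the log. As an illustrative smoke test (not part of the original post), a trivial job against the fixed session should now complete:

        // Illustrative check: this only finishes once executors can register
        // with the driver at spark.driver.host:spark.driver.port.
        org.apache.spark.sql.Dataset<Long> probe = spark.range(100);
        System.out.println("probe count = " + probe.count());
        spark.stop();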

 

