Using the spark.yarn.jar Property in Spark on YARN


Today, while testing spark-sql running on YARN, I happened to notice a problem in the logs:

spark-sql --master yarn
14/12/29 15:23:17 INFO Client: Requesting a new application from cluster with 1 NodeManagers
14/12/29 15:23:17 INFO Client: Verifying our application has not requested more than the maximum memory capability of the cluster (8192 MB per container)
14/12/29 15:23:17 INFO Client: Will allocate AM container, with 896 MB memory including 384 MB overhead
14/12/29 15:23:17 INFO Client: Setting up container launch context for our AM
14/12/29 15:23:17 INFO Client: Preparing resources for our AM container
14/12/29 15:23:17 INFO Client: Uploading resource file:/home/spark/software/source/compile/deploy_spark/assembly/target/scala-2.10/spark-assembly-1.3.0-SNAPSHOT-hadoop2.3.0-cdh5.0.0.jar -> hdfs://hadoop000:8020/user/spark/.sparkStaging/application_1416381870014_0093/spark-assembly-1.3.0-SNAPSHOT-hadoop2.3.0-cdh5.0.0.jar
14/12/29 15:23:18 INFO Client: Setting up the launch environment for our AM container

Opening a second spark-sql session, the logs show the same upload again:

14/12/29 15:24:03 INFO Client: Requesting a new application from cluster with 1 NodeManagers
14/12/29 15:24:03 INFO Client: Verifying our application has not requested more than the maximum memory capability of the cluster (8192 MB per container)
14/12/29 15:24:03 INFO Client: Will allocate AM container, with 896 MB memory including 384 MB overhead
14/12/29 15:24:03 INFO Client: Setting up container launch context for our AM
14/12/29 15:24:03 INFO Client: Preparing resources for our AM container
14/12/29 15:24:03 INFO Client: Uploading resource file:/home/spark/software/source/compile/deploy_spark/assembly/target/scala-2.10/spark-assembly-1.3.0-SNAPSHOT-hadoop2.3.0-cdh5.0.0.jar -> hdfs://hadoop000:8020/user/spark/.sparkStaging/application_1416381870014_0094/spark-assembly-1.3.0-SNAPSHOT-hadoop2.3.0-cdh5.0.0.jar
14/12/29 15:24:05 INFO Client: Setting up the launch environment for our AM container

Then check the files on HDFS:

hadoop fs -ls hdfs://hadoop000:8020/user/spark/.sparkStaging/
drwx------   - spark supergroup          0 2014-12-29 15:23 hdfs://hadoop000:8020/user/spark/.sparkStaging/application_1416381870014_0093
drwx------   - spark supergroup          0 2014-12-29 15:24 hdfs://hadoop000:8020/user/spark/.sparkStaging/application_1416381870014_0094

Every application uploads its own copy of spark-assembly-x.x.x-SNAPSHOT-hadoopx.x.x-cdhx.x.x.jar, which hurts HDFS performance and wastes HDFS space.
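
A quick way to see how much space these per-application copies consume, using the staging path listed above (hadoop fs -du is a standard FsShell command):

hadoop fs -du -h hdfs://hadoop000:8020/user/spark/.sparkStaging/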


The Spark documentation (http://spark.apache.org/docs/latest/running-on-yarn.html) describes a spark.yarn.jar property: store spark-assembly-xxxxx.jar at a fixed HDFS location, in this case hdfs://hadoop000:8020/spark_lib/.
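
A minimal sketch of that staging step, assuming the local assembly path shown in the upload logs above:

hadoop fs -mkdir -p hdfs://hadoop000:8020/spark_lib/
hadoop fs -put /home/spark/software/source/compile/deploy_spark/assembly/target/scala-2.10/spark-assembly-1.3.0-SNAPSHOT-hadoop2.3.0-cdh5.0.0.jar hdfs://hadoop000:8020/spark_lib/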

Add the property to spark-defaults.conf:

spark.yarn.jar hdfs://hadoop000:8020/spark_lib/spark-assembly-1.3.0-SNAPSHOT-hadoop2.3.0-cdh5.0.0.jar
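
The same setting can also be passed per invocation via --conf rather than in spark-defaults.conf:

spark-sql --master yarn --conf spark.yarn.jar=hdfs://hadoop000:8020/spark_lib/spark-assembly-1.3.0-SNAPSHOT-hadoop2.3.0-cdh5.0.0.jar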

Start spark-sql --master yarn again and watch the logs:

14/12/29 15:39:02 INFO Client: Requesting a new application from cluster with 1 NodeManagers
14/12/29 15:39:02 INFO Client: Verifying our application has not requested more than the maximum memory capability of the cluster (8192 MB per container)
14/12/29 15:39:02 INFO Client: Will allocate AM container, with 896 MB memory including 384 MB overhead
14/12/29 15:39:02 INFO Client: Setting up container launch context for our AM
14/12/29 15:39:02 INFO Client: Preparing resources for our AM container
14/12/29 15:39:02 INFO Client: Source and destination file systems are the same. Not copying hdfs://hadoop000:8020/spark_lib/spark-assembly-1.3.0-SNAPSHOT-hadoop2.3.0-cdh5.0.0.jar
14/12/29 15:39:02 INFO Client: Setting up the launch environment for our AM container

Check the files on HDFS again:

hadoop fs -ls hdfs://hadoop000:8020/user/spark/.sparkStaging/application_1416381870014_0097

The application's staging directory no longer contains spark-assembly-xxxxx.jar, which saves the assembly upload on every submission as well as the HDFS space it would occupy.


During testing I hit an error similar to the following:

Application application_xxxxxxxxx_yyyy failed 2 times due to AM Container for application_xxxxxxxxx_yyyy exited with exitCode: -1000 due to: java.io.FileNotFoundException: File /tmp/hadoop-spark/nm-local-dir/filecache does not exist

Creating a filecache directory under /tmp/hadoop-spark/nm-local-dir resolves the error, as sketched below.
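
A sketch of the fix; run it on every NodeManager host, as the user the NodeManager runs as (the path comes from the error message above):

mkdir -p /tmp/hadoop-spark/nm-local-dir/filecache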


