Spark2 History Server fails to start after upgrading Spark to Spark2 on CDH 5.16


Error message

21/04/22 09:53:15 INFO history.HistoryServer: Started daemon with process name: 10548@cloudera-02
21/04/22 09:53:15 INFO util.SignalUtils: Registered signal handler for TERM
21/04/22 09:53:15 INFO util.SignalUtils: Registered signal handler for HUP
21/04/22 09:53:15 INFO util.SignalUtils: Registered signal handler for INT
21/04/22 09:53:16 WARN spark.SparkConf: The configuration key 'spark.history.fs.update.interval.seconds' has been deprecated as of Spark 1.4 and may be removed in the future. Please use the new key 'spark.history.fs.update.interval' instead.
21/04/22 09:53:16 INFO spark.SecurityManager: Changing view acls to: root
21/04/22 09:53:16 INFO spark.SecurityManager: Changing modify acls to: root
21/04/22 09:53:16 INFO spark.SecurityManager: Changing view acls groups to: 
21/04/22 09:53:16 INFO spark.SecurityManager: Changing modify acls groups to: 
21/04/22 09:53:16 INFO spark.SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users  with view permissions: Set(root); groups with view permissions: Set(); users  with modify permissions: Set(root); groups with modify permissions: Set()
21/04/22 09:53:16 INFO history.FsHistoryProvider: History server ui acls disabled; users with admin permissions: ; groups with admin permissions
Exception in thread "main" java.lang.reflect.InvocationTargetException
	at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
	at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
	at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
	at org.apache.spark.deploy.history.HistoryServer$.main(HistoryServer.scala:280)
	at org.apache.spark.deploy.history.HistoryServer.main(HistoryServer.scala)
Caused by: java.io.FileNotFoundException: Log directory specified does not exist: hdfs://cloudera-01:8020/user/spark/spark2ApplicationHistory
	at org.apache.spark.deploy.history.FsHistoryProvider.org$apache$spark$deploy$history$FsHistoryProvider$$startPolling(FsHistoryProvider.scala:267)
	at org.apache.spark.deploy.history.FsHistoryProvider.initialize(FsHistoryProvider.scala:211)
	at org.apache.spark.deploy.history.FsHistoryProvider.<init>(FsHistoryProvider.scala:207)
	at org.apache.spark.deploy.history.FsHistoryProvider.<init>(FsHistoryProvider.scala:86)
	... 6 more
Caused by: java.io.FileNotFoundException: File does not exist: hdfs://cloudera-01:8020/user/spark/spark2ApplicationHistory
	at org.apache.hadoop.hdfs.DistributedFileSystem$20.doCall(DistributedFileSystem.java:1270)
	at org.apache.hadoop.hdfs.DistributedFileSystem$20.doCall(DistributedFileSystem.java:1262)
	at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
	at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1262)
	at org.apache.spark.deploy.history.FsHistoryProvider.org$apache$spark$deploy$history$FsHistoryProvider$$startPolling(FsHistoryProvider.scala:257)
	... 9 more

Solution

It turns out that after the Spark upgrade, the spark2ApplicationHistory directory was never created on HDFS, as the error message indicates:

Caused by: java.io.FileNotFoundException: Log directory specified does not exist: hdfs://cloudera-01:8020/user/spark/spark2ApplicationHistory

The directory has to be created manually on HDFS.
Running hadoop dfs -mkdir /user/spark/spark2ApplicationHistory (hdfs dfs is the preferred form of this now-deprecated command) fails with a permission error, however.
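(As an aside, a sketch of an alternative that leaves permission checking enabled: run the command as the HDFS superuser, which on CDH is typically the hdfs service account, then hand the directory over to the spark user. The spark:spark owner/group below is an assumption based on the default CDH service-account names.)

```shell
# Create the Spark2 event-log directory as the HDFS superuser ("hdfs" on CDH),
# then chown it to the spark service user so the History Server can read it.
sudo -u hdfs hdfs dfs -mkdir -p /user/spark/spark2ApplicationHistory
sudo -u hdfs hdfs dfs -chown -R spark:spark /user/spark/spark2ApplicationHistory
```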

The reason is that CDH 5.16 enables HDFS permission checking by default, and the HDFS superuser is hdfs rather than root, so the mkdir fails when run as root. To create the directory, the permission check has to be relaxed temporarily:

1) To create the directory, open the HDFS configuration and uncheck dfs.permissions (CDH enables permission checking by default).

2) Restart the cluster.
3) Run hadoop dfs -mkdir /user/spark/spark2ApplicationHistory again to create the directory.
4) Once the directory is created, start Spark2 and test.

5) After Spark2 runs successfully, remember to check dfs.permissions again.

6) Restart the cluster. Problem solved!
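With permission checking re-enabled, it is worth confirming from the shell that the directory survived at the exact path the error message named:

```shell
# List the directory itself (-d) to confirm it exists
# and to check its owner and permissions.
hdfs dfs -ls -d /user/spark/spark2ApplicationHistory
```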

