java.io.IOException: Could not locate executable null\bin\winutils.exe in the Hadoop binaries


With a working cluster already set up (CentOS 6.6 + Hadoop 2.7 + HBase 0.98 + Spark 1.3.1), I was debugging a Spark job that reads from HBase, using IntelliJ on Windows 7. Running it fails immediately with:

15/06/11 15:35:50 ERROR Shell: Failed to locate the winutils binary in the hadoop binary path
java.io.IOException: Could not locate executable null\bin\winutils.exe in the Hadoop binaries.
    at org.apache.hadoop.util.Shell.getQualifiedBinPath(Shell.java:356)
    at org.apache.hadoop.util.Shell.getWinUtilsPath(Shell.java:371)
    at org.apache.hadoop.util.Shell.<clinit>(Shell.java:364)
    at org.apache.hadoop.util.StringUtils.<clinit>(StringUtils.java:80)
    at org.apache.hadoop.security.SecurityUtil.getAuthenticationMethod(SecurityUtil.java:611)
    at org.apache.hadoop.security.UserGroupInformation.initialize(UserGroupInformation.java:272)
    at org.apache.hadoop.security.UserGroupInformation.ensureInitialized(UserGroupInformation.java:260)
    at org.apache.hadoop.security.UserGroupInformation.loginUserFromSubject(UserGroupInformation.java:790)
    at org.apache.hadoop.security.UserGroupInformation.getLoginUser(UserGroupInformation.java:760)
    at org.apache.hadoop.security.UserGroupInformation.getCurrentUser(UserGroupInformation.java:633)
    at org.apache.spark.util.Utils$$anonfun$getCurrentUserName$1.apply(Utils.scala:2001)
    at org.apache.spark.util.Utils$$anonfun$getCurrentUserName$1.apply(Utils.scala:2001)
    at scala.Option.getOrElse(Option.scala:120)
    at org.apache.spark.util.Utils$.getCurrentUserName(Utils.scala:2001)
    at org.apache.spark.SecurityManager.<init>(SecurityManager.scala:207)
    at org.apache.spark.SparkEnv$.create(SparkEnv.scala:218)
    at org.apache.spark.SparkEnv$.createDriverEnv(SparkEnv.scala:163)
    at org.apache.spark.SparkContext.createSparkEnv(SparkContext.scala:269)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:272)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:154)
    at SparkFromHbase$.main(SparkFromHbase.scala:15)
    at SparkFromHbase.main(SparkFromHbase.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at com.intellij.rt.execution.application.AppMain.main(AppMain.java:134)

A look at the Hadoop source code turns up the following:

public static final String getQualifiedBinPath(String executable)
    throws IOException {
  // construct hadoop bin path to the specified executable
  String fullExeName = HADOOP_HOME_DIR + File.separator + "bin"
      + File.separator + executable;

  File exeFile = new File(fullExeName);
  if (!exeFile.exists()) {
    throw new IOException("Could not locate executable " + fullExeName
        + " in the Hadoop binaries.");
  }

  return exeFile.getCanonicalPath();
}

private static String HADOOP_HOME_DIR = checkHadoopHome();

private static String checkHadoopHome() {
  // first check the Dflag hadoop.home.dir with JVM scope
  String home = System.getProperty("hadoop.home.dir");

  // fall back to the system/user-global env variable
  if (home == null) {
    home = System.getenv("HADOOP_HOME");
  }
  ...
}
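The literal "null" in the error message comes from Java's string concatenation rules: concatenating a null reference to a string renders it as the text "null" rather than throwing. A minimal sketch showing how an unset HADOOP_HOME produces exactly the path from the stack trace:

```java
// Demonstrates why the error message contains the literal "null":
// when HADOOP_HOME_DIR is null, string concatenation turns it into
// the four characters "null" instead of failing.
public class NullPathDemo {
    public static void main(String[] args) {
        String hadoopHome = null;  // neither -Dhadoop.home.dir nor HADOOP_HOME is set
        String sep = "\\";         // File.separator on Windows
        String fullExeName = hadoopHome + sep + "bin" + sep + "winutils.exe";
        System.out.println(fullExeName);  // prints: null\bin\winutils.exe
    }
}
```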

Clearly this is a HADOOP_HOME problem: if HADOOP_HOME is unset, fullExeName inevitably becomes null\bin\winutils.exe. The fix is simple: set the environment variable, or, if you don't want to restart, add this to your program:

System.setProperty("hadoop.home.dir", "E:\\Program Files\\hadoop-2.7.0");

Note: E:\\Program Files\\hadoop-2.7.0 is the path where I unpacked Hadoop on my machine.
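One detail matters for placement: Shell reads the property in a static initializer, so the call has to run before the first Spark or Hadoop class is touched. A minimal sketch of where it belongs (the path is just this machine's example; the class name is hypothetical):

```java
// Sketch: set hadoop.home.dir as the very first statement of main,
// before any SparkContext/Hadoop class loads, because Shell.<clinit>
// resolves the winutils.exe path exactly once at class-load time.
public class WinutilsFix {
    public static void main(String[] args) {
        System.setProperty("hadoop.home.dir", "E:\\Program Files\\hadoop-2.7.0");
        // ... only now create the SparkContext and read from HBase
        System.out.println(System.getProperty("hadoop.home.dir"));
    }
}
```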

Run it again and you may still hit the same error, and at this point you might want to blame me. Actually, I saw this coming: look inside your hadoop-x.x.x/bin directory and you'll find there is no winutils.exe there at all.

So here's the tip: you can download one from GitHub, at the address everyone knows.

Address: https://github.com/srccodes/hadoop-common-2.2.0-bin

Don't worry about its version; it works fine even with the latest hadoop-2.7.0 I'm using. After downloading, copy winutils.exe into your hadoop-x.x.x/bin directory.
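After copying the file, a quick sanity check can confirm it landed where Hadoop will look. This is a hypothetical helper class, not part of any library:

```java
import java.io.File;

// Sanity check: verify winutils.exe sits under <hadoop home>\bin,
// resolving the home the same way checkHadoopHome() does.
public class CheckWinutils {
    public static void main(String[] args) {
        String home = System.getProperty("hadoop.home.dir",
                System.getenv("HADOOP_HOME"));
        File exe = new File(home, "bin" + File.separator + "winutils.exe");
        System.out.println(exe.exists()
                ? "winutils.exe found: " + exe.getPath()
                : "winutils.exe missing under " + home);
    }
}
```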

At this point the problem is solved. If it still isn't for you, then you're a rare case indeed, and you're welcome to add me on QQ!

