Compiling Hadoop 3.2.1 on Windows 10 (64-bit) with Visual Studio 2015



1        Environment setup

1.1   JDK download and installation

1.1.1         Download

JDK 1.8 (jdk1.8.0_102)

Download: http://www.oracle.com/technetwork/java/javase/downloads/jdk8-downloads-2133151.html (choose the build matching your OS)

1.1.2         Installation

Extract to a folder of your choice, then:

(1) Install the JDK.
(2) Create a system variable JAVA_HOME=D:\Program Files\Java\jdk1.8.0_102.
(3) Edit the system Path variable and append %JAVA_HOME%\bin and %JAVA_HOME%\jre\bin as two separate entries.
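A quick way to confirm the variables took effect (a sketch; run in a freshly opened Git Bash window, since windows opened before the change keep the old environment; Git is installed in section 1.4):

```shell
# Minimal sanity check for the JDK setup described above.
check_java_home() {
  if [ -z "$JAVA_HOME" ]; then
    echo "JAVA_HOME is not set"
    return 1
  fi
  if [ ! -d "$JAVA_HOME" ]; then
    echo "JAVA_HOME points at a missing directory: $JAVA_HOME"
    return 1
  fi
  echo "JAVA_HOME OK: $JAVA_HOME"
}
check_java_home && java -version
```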

1.2   Maven download and installation

1.2.1         Download

Installation guide (reference): https://blog.csdn.net/changge458/article/details/53576178

Download from http://maven.apache.org/download.cgi

1.2.2         Installation

Extract to the folder D:\marven\apache-maven-3.6.3

Add a system environment variable MARVEN_HOME with the value:

D:\marven\apache-maven-3.6.3\

Then append the Maven bin directory to the system Path variable.

To test whether mvn is installed, open cmd and run it as administrator; otherwise you are likely to get the error: 'mvn' is not recognized as an internal or external command, operable program or batch file.

To verify the configuration, run mvn -v in the console; if the Maven version information is printed, the setup succeeded.

Open D:\marven\apache-maven-3.6.3\conf\settings.xml and add the Aliyun mirror: locate the mirrors tag and insert the entry below. Maven downloads dependencies much faster from this mirror.

<mirror>
    <id>nexus-aliyun</id>
    <mirrorOf>central</mirrorOf>
    <name>Nexus aliyun</name>
    <url>http://maven.aliyun.com/nexus/content/groups/public</url>
</mirror>
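To confirm the mirror is actually picked up, Maven can print its merged configuration (help:effective-settings is part of the standard maven-help-plugin):

```shell
# Print the effective settings and show the mirror entry if it was applied.
mvn help:effective-settings | grep -A 2 "nexus-aliyun"
```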

 

1.3   Building and installing protobuf

Protobuf is a data-serialization library; the Hadoop build depends on it.

1.3.1         Download

ProtocolBuffer 2.5.0 (two files: protobuf-2.5.0.zip and protoc-2.5.0-win32.zip)

Download: https://github.com/google/protobuf/releases/tag/v2.5.0

Note: besides the protobuf source, you also need the matching prebuilt Windows protoc command (protoc-2.5.0-win32.zip), which converts .proto files into Java or C++ source files.

1.3.2         Installation

① Extract ProtocolBuffer to a directory of your choice.

② Extract protoc-2.5.0-win32.zip and copy protoc.exe into C:\WorkSpace\protobuf-2.5.0\src.

③ Build and install ProtocolBuffer from a cmd window:

cd C:\WorkSpace\protobuf-2.5.0\java

mvn test

mvn install

The command protoc --version fails at this point because protoc.exe is not on the Path. Add its folder, C:\WorkSpace\protobuf-2.5.0\src, to the system Path variable.

1.4   Git download and installation

Git provides bash, so the Linux commands used by the build scripts can run from the Windows command line. Without it, the build fails with:

Failed to execute goal org.codehaus.mojo:exec-maven-plugin:1.3.1:exec (pre-dist) on project hadoop-project-dist: Command execution failed.: Cannot run program "bash" (in directory "D:\hadoop\hadoop-3.2.1-src\hadoop-project-dist\target"): CreateProcess error=2, The system cannot find the file specified. -> [Help 1]

1.4.1         Download

 https://git-for-windows.github.io/

1.4.2         Installation

Run the downloaded installer and step through its wizard (steps (1)-(5) in the original were installer screenshots).

1.5   CMake installation

1.5.1         Download

 https://cmake.org/download/

Windows win64-x64 ZIP  cmake-3.16.0-win64-x64.zip

1.5.2         Installation

Just extract it to a folder and add the bin directory to the Path environment variable:

D:\hadoop\cmake-3.16.0-win64-x64\bin

1.6   Zlib download and installation

1.6.1         Download

http://jaist.dl.sourceforge.net/project/libpng/zlib/1.2.8/

 

 

1.6.2         Installation

Extract to a folder and add a zlib system environment variable pointing to it.
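With everything above installed, it is worth confirming that the build tools all resolve from the shell before starting the long build (a sketch run in Git Bash; the tool names come from the sections above):

```shell
# Report which of the required build tools are reachable on PATH.
for tool in java mvn protoc cmake git bash; do
  if command -v "$tool" >/dev/null 2>&1; then
    echo "OK: $tool"
  else
    echo "MISSING: $tool"
  fi
done
```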

 

 

 

 

2        Building Hadoop

2.1   Upgrading the Visual Studio project version

Hadoop releases for all versions can be downloaded from:

http://archive.apache.org/dist/hadoop/core/

When building Hadoop, Maven locates the Visual Studio compiler through the native project files. Hadoop 3.2.1 ships these projects for Visual Studio 2010 (the officially supported version is Visual Studio 2010 Professional), so to build with VS2015 the project files winutils.sln and native.sln must be upgraded: open each in Visual Studio 2015, confirm the upgrade dialog that pops up, and save.

(1)       winutils (the Windows tools):

D:\hadoop\hadoop-3.2.1-src\hadoop-common-project\hadoop-common\src\main\winutils

(2) hadoop.dll:

D:\hadoop\hadoop-3.2.1-src\hadoop-common-project\hadoop-common\src\main\native

This project produces the hadoop.dll file. If you click Build on it directly, you will get many header-file errors, because the project includes headers from ..\..\..\target\native\javah; during a full Hadoop build the generated headers are copied into that folder first and only then compiled, so the headers do not exist outside a full Maven build.

Open a cmd window, change to D:\hadoop\hadoop-3.2.1-src\, and run the following command:

mvn package -Pdist,native-win -DskipTests -Dtar

2.2   Apache Hadoop Common build errors

2.2.1         convert-ms-winutils error

Error description

[INFO] Apache Hadoop Auth Examples ........................ SUCCESS [ 12.916 s]

[INFO] Apache Hadoop Common ............................... FAILURE [02:06 min]

[INFO] Apache Hadoop NFS .................................. SKIPPED

[INFO] Apache Hadoop KMS .................................. SKIPPED

[INFO] ------------------------------------------------------------------------

[ERROR] Failed to execute goal org.codehaus.mojo:exec-maven-plugin:1.3.1:exec (convert-ms-winutils) on project hadoop-common: Command execution failed.: Process exited with an error: 1 (Exit value: 1) -> [Help 1]

[ERROR]

[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.

[ERROR] Re-run Maven using the -X switch to enable full debug logging.

[ERROR]

[ERROR] For more information about the errors and possible solutions, please read the following articles:

[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException

[ERROR]

[ERROR] After correcting the problems, you can resume the build with the command

[ERROR]   mvn <args> -rf :hadoop-common

 

Solution

(1)   Confirm that both projects from section 2.1 were upgraded successfully; their .sln files should show Visual Studio 14.

(2)   Run the build from a VS2015 x64 Native Tools command prompt: mvn package -Pdist,native-win -DskipTests -Dtar -e -X

2.2.2         javah error

Error description

[ERROR] Failed to execute goal org.codehaus.mojo:native-maven-plugin:1.0-alpha-8:javah (default) on project hadoop-common: Error running javah command: Error executing command line. Exit code:1 -> [Help 1]

org.apache.maven.lifecycle.LifecycleExecutionException: Failed to execute goal org.codehaus.mojo:native-maven-plugin:1.0-alpha-8:javah (default) on project hadoop-common: Error running javah command

 

 

 

Solution

In D:\hadoop\hadoop-3.2.1-src\hadoop-common-project\hadoop-common\pom.xml, <javahPath>${env.JAVA_HOME}/bin/javah</javahPath> fails because env.JAVA_HOME is not resolved. Replace it with the absolute javah path, here D:\Java\jdk1.8.0_181\bin\; there are two occurrences.
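The change looks roughly like this (a sketch of each of the two spots in hadoop-common/pom.xml; the JDK path below is this machine's and must match your own install):

```xml
<!-- Before: fails because env.JAVA_HOME is not resolved -->
<javahPath>${env.JAVA_HOME}/bin/javah</javahPath>

<!-- After: absolute path to javah (adjust to your JDK location) -->
<javahPath>D:\Java\jdk1.8.0_181\bin\javah</javahPath>
```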

 

 

 

Then rerun the build with -rf :hadoop-common, which resumes from the hadoop-common module:

mvn package -Pdist,native-win -DskipTests -Dtar -rf :hadoop-common

2.3  HDFS Native Client build error

2.3.1         Error description

[INFO] Apache Hadoop HDFS Native Client ................... FAILURE [02:26 min]

[ERROR] Failed to execute goal org.apache.maven.plugins:maven-antrun-plugin:1.7:run (make) on project hadoop-hdfs-native-client: An Ant BuildException has occured: exec returned: 1

[ERROR] around Ant part ...<exec failonerror="true" dir="D:\hadoop\hadoop-3.2.1-src\hadoop-hdfs-project\hadoop-hdfs-native-client\target/native" executable="cmake">... @ 5:140 in D:\hadoop\hadoop-3.2.1-src\hadoop-hdfs-project\hadoop-hdfs-native-client\target\antrun\build-main.xml

[ERROR] -> [Help 1]

org.apache.maven.lifecycle.LifecycleExecutionException: Failed to execute goal org.apache.maven.plugins:maven-antrun-plugin:1.7:run (make) on project hadoop-hdfs-native-client: An Ant BuildException has occured: exec returned: 1

around Ant part ...<exec failonerror="true" dir="D:\hadoop\hadoop-3.2.1-src\hadoop-hdfs-project\hadoop-hdfs-native-client\target/native" executable="cmake">... @ 5:140 in D:\hadoop\hadoop-3.2.1-src\hadoop-hdfs-project\hadoop-hdfs-native-client\target\antrun\build-main.xml

 

 

 

 

2.3.2         Solution

Open the file D:\hadoop\hadoop-3.2.1-src\hadoop-hdfs-project\hadoop-hdfs-native-client\pom.xml and change the indicated true to false (the exact element was shown in the original screenshots).

2.4  hadoop-hdfs-native-client RelWithDebInfo build error

2.4.1         Error description

Finished at: 2019-12-01T18:20:30+08:00

[INFO] ------------------------------------------------------------------------

[ERROR] Failed to execute goal org.apache.maven.plugins:maven-antrun-plugin:1.7:run (make) on project hadoop-hdfs-native-client: An Ant BuildException has occured: D:\hadoop\hadoop-3.2.1-src\hadoop-hdfs-project\hadoop-hdfs-native-client\target\native\bin\RelWithDebInfo does not exist.

[ERROR] around Ant part ...<copy todir="D:\hadoop\hadoop-3.2.1-src\hadoop-hdfs-project\hadoop-hdfs-native-client\target/bin">... @ 13:101 in D:\hadoop\hadoop-3.2.1-src\hadoop-hdfs-project\hadoop-hdfs-native-client\target\antrun\build-main.xml

[ERROR] -> [Help 1]

 

2.4.2         Solution

The error says the directory D:\hadoop\hadoop-3.2.1-src\hadoop-hdfs-project\hadoop-hdfs-native-client\target\native\bin\RelWithDebInfo does not exist, so simply create it.
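Under Git Bash or Cygwin the directory can be created in one line (the path comes from the error message above, in bash notation; adjust it to your source location):

```shell
# Create the output directory the copy step expects; -p also creates parents
# and does not fail if the directory already exists.
mkdir -p "/d/hadoop/hadoop-3.2.1-src/hadoop-hdfs-project/hadoop-hdfs-native-client/target/native/bin/RelWithDebInfo"
```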

2.5  exec-maven-plugin:1.3.1:exec (pre-dist) failure

2.5.1         Error description

Failed to execute goal org.codehaus.mojo:exec-maven-plugin:1.3.1:exec (pre-dist) on project hadoop-hdfs-native-client: Command execution failed.: Process exited with an error: 1 (Exit value: 1) -> [Help 1]

org.apache.maven.lifecycle.LifecycleExecutionException: Failed to execute goal org.codehaus.mojo:exec-maven-plugin:1.3.1:exec (pre-dist) on project hadoop-hdfs-native-client: Command execution failed.

2.5.2         Solution

Download and install Cygwin:
Download http://www.cygwin.com/setup-x86_64.exe and run the installer.
Add D:\cygwin64\bin to PATH.

Then open a Cygwin64 terminal and run: mvn package -Pdist,native-win -DskipTests -Dtar -e -X -rf :hadoop-hdfs-native-client

2.6  MSBuild build error

2.6.1         Error description

Failed to execute goal org.apache.maven.plugins:maven-antrun-plugin:1.7:run (make) on project hadoop-hdfs-native-client: An Ant BuildException has occured: Execute failed: java.io.IOException: Cannot run program "msbuild" (in directory "D:\hadoop\hadoop-3.2.1-src\hadoop-hdfs-project\hadoop-hdfs-native-client\target\native"): CreateProcess error=2, The system cannot find the file specified.

[ERROR] around Ant part ...<exec failonerror="false" dir="D:\hadoop\hadoop-3.2.1-src\hadoop-hdfs-project\hadoop-hdfs-native-client\target/native" executable="msbuild">... @ 9:143 in D:\hadoop\hadoop-3.2.1-src\hadoop-hdfs-project\hadoop-hdfs-native-client\target\antrun\build-main.xml

 

2.6.2         Solution

Download and install VS Code: https://code.visualstudio.com/docs/?dv=win

Add the path C:\Program Files (x86)\MSBuild\14.0\Bin to the Path environment variable.

2.7  maven-surefire-plugin build error

2.7.1         Error description

Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:3.0.0-M1:test (default-test)

2.7.2         Solution

In D:\hadoop\hadoop-3.2.1-src\hadoop-common-project\hadoop-common\pom.xml, locate the maven-surefire-plugin entry under org.apache.maven.plugins and add <testFailureIgnore>true</testFailureIgnore> to its configuration so that test failures do not abort the build:

<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-surefire-plugin</artifactId>
  <configuration>
    <testFailureIgnore>true</testFailureIgnore>
    <forkCount>${testsThreadCount}</forkCount>
    <reuseForks>false</reuseForks>
    <argLine>${maven-surefire-plugin.argLine} -DminiClusterDedicatedDirs=true</argLine>
    <systemPropertyVariables>
      <testsThreadCount>${testsThreadCount}</testsThreadCount>
      <test.build.data>${test.build.data}/${surefire.forkNumber}</test.build.data>
      <test.build.dir>${test.build.dir}/${surefire.forkNumber}</test.build.dir>
      <hadoop.tmp.dir>${hadoop.tmp.dir}/${surefire.forkNumber}</hadoop.tmp.dir>
      <!-- Due to a Maven quirk, setting this to just surefire.forkNumber
           won't do the parameter substitution. Putting a prefix in front
           of it like "fork-" makes it work. -->
      <test.unique.fork.id>fork-${surefire.forkNumber}</test.unique.fork.id>
    </systemPropertyVariables>
  </configuration>
</plugin>

 

2.8  Apache Hadoop Distribution build error

2.8.1         Error description

[ERROR] Failed to execute goal org.codehaus.mojo:exec-maven-plugin:1.3.1:exec (dist) on project hadoop-dist: Command execution failed.: Process exited with an error: 1 (Exit value: 1) -> [Help 1]

 

 

 

2.8.2         Solution

Run mvn package -Pdist,native-win -DskipTests -Dtar -e -X -rf :hadoop-dist; the -e and -X flags print detailed errors, which reveal a directory-layout problem: the path below cannot be found. The fix is to create the missing directory and copy the built jar artifacts into it:

[DEBUG] Executing command line: [bash, D:\hadoop\hadoop-3.2.1-src\hadoop-dist/../dev-support/bin/dist-layout-stitching, 3.2.1, D:\hadoop\hadoop-3.2.1-src\hadoop-dist\target]

cp: cannot stat '/d/hadoop/hadoop-3.2.1-src/hadoop-common-project/hadoop-kms/target/hadoop-kms-3.2.1/*': No such file or directory

2.9  hadoop-dist exec (toolshooks) build error

2.9.1         Error description

Executing command line: [bash, D:\hadoop\hadoop-3.2.1-src\hadoop-dist/../dev-support/bin/dist-tools-hooks-maker, 3.2.1, D:\hadoop\hadoop-3.2.1-src\hadoop-dist\target, D:\hadoop\hadoop-3.2.1-src\hadoop-dist/../hadoop-tools]

File not found - *.tools-builtin.txt

D:\hadoop\hadoop-3.2.1-src\hadoop-dist/../dev-support/bin/dist-tools-hooks-maker: line 137: D:\hadoop\hadoop-3.2.1-src\hadoop-dist\target/hadoop-3.2.1/etc/hadoop/hadoop-env.sh.new: No such file or directory

mv: cannot stat 'D:\hadoop\hadoop-3.2.1-src\hadoop-dist\target/hadoop-3.2.1/etc/hadoop/hadoop-env.sh.new': No such file or directory

Rewriting D:\hadoop\hadoop-3.2.1-src\hadoop-dist\target/hadoop-3.2.1/etc/hadoop/hadoop-env.sh

[INFO] ------------------------------------------------------------------------

[INFO] Reactor Summary for Apache Hadoop Distribution 3.2.1:

[INFO]

[INFO] Apache Hadoop Distribution ......................... FAILURE [ 25.920 s]

[INFO] Apache Hadoop Client Modules ....................... SKIPPED

[INFO] Apache Hadoop Cloud Storage ........................ SKIPPED

[INFO] Apache Hadoop Cloud Storage Project ................ SKIPPED

[INFO] ------------------------------------------------------------------------

[INFO] BUILD FAILURE

[INFO] ------------------------------------------------------------------------

[INFO] Total time:  33.530 s

[INFO] Finished at: 2019-12-08T11:56:12+08:00

[INFO] ------------------------------------------------------------------------

[ERROR] Failed to execute goal org.codehaus.mojo:exec-maven-plugin:1.3.1:exec (toolshooks) on project hadoop-dist: Command execution failed.: Process exited with an error: 1 (Exit value: 1) -> [Help 1]

org.apache.maven.lifecycle.LifecycleExecutionException: Failed to execute goal org.codehaus.mojo:exec-maven-plugin:1.3.1:exec (toolshooks) on project hadoop-dist: Command execution failed.

2.9.2         Solution

(1) The cause is similar to 2.8: folders and files are missing. Create the folders named in the error messages and put the missing files into them.

D:\hadoop\hadoop-3.2.1-src\hadoop-dist/../dev-support/bin/dist-tools-hooks-maker, 3.2.1, D:\hadoop\hadoop-3.2.1-src\hadoop-dist\target, D:\hadoop\hadoop-3.2.1-src\hadoop-dist/../hadoop-tools]

File not found - *.tools-builtin.txt

D:\hadoop\hadoop-3.2.1-src\hadoop-dist/../dev-support/bin/dist-tools-hooks-maker: line 137: D:\hadoop\hadoop-3.2.1-src\hadoop-dist\target/hadoop-3.2.1/etc/hadoop/hadoop-env.sh.new: No such file or directory

mv: cannot stat 'D:\hadoop\hadoop-3.2.1-src\hadoop-dist\target/hadoop-3.2.1/etc/hadoop/hadoop-env.sh.new': No such file or directory

Rewriting D:\hadoop\hadoop-3.2.1-src\hadoop-dist\target/hadoop-3.2.1/etc/hadoop/hadoop-env.sh

(2) Following those messages, I created the etc/hadoop folder and copied hadoop-env.sh into it from D:\hadoop\hadoop-3.2.1-src\hadoop-common-project\hadoop-common\src\main\conf\, but during the build the folder and file were always deleted again, by this command:

[DEBUG] Executing command line: [bash, D:\hadoop\hadoop-3.2.1-src\hadoop-dist/../dev-support/bin/dist-layout-stitching, 3.2.1, D:\hadoop\hadoop-3.2.1-src\hadoop-dist\target]

Current directory /d/hadoop/hadoop-3.2.1-src/hadoop-dist/target

$ rm -rf hadoop-3.2.1

$ mkdir hadoop-3.2.1

$ cd hadoop-3.2.1

(3) Opening the file D:\hadoop\hadoop-3.2.1-src\hadoop-dist/../dev-support/bin/dist-layout-stitching reveals why: the script contains run rm -rf "hadoop-${VERSION}", which deletes the hadoop-3.2.1 folder, recreates it, and then copies files into it, so any etc/hadoop folder added beforehand is always wiped out.

(4) Append the following to the end of dist-layout-stitching so the script itself creates the folder and copies hadoop-env.sh:

run mkdir "etc"

run cd "etc"

run mkdir "hadoop"

run cd "hadoop"

# the script runs under bash, which has no Windows "copy"; use cp with an
# explicit "." destination
run cp "${ROOT}/hadoop-common-project/hadoop-common/src/main/conf/hadoop-env.sh" .

(5) Run the command: mvn package -Pdist,native-win -DskipTests -Dtar -e -X -rf :hadoop-dist

3       Build success

[DEBUG]   (f) siteDirectory = D:\hadoop\hadoop-3.2.1-src\hadoop-cloud-storage-project\src\site

[DEBUG]   (f) skip = false

[DEBUG] -- end configuration --

[INFO] No site descriptor found: nothing to attach.

[INFO] ------------------------------------------------------------------------

[INFO] Reactor Summary for Apache Hadoop Distribution 3.2.1:

[INFO]

[INFO] Apache Hadoop Distribution ......................... SUCCESS [ 39.961 s]

[INFO] Apache Hadoop Client Modules ....................... SUCCESS [  5.721 s]

[INFO] Apache Hadoop Cloud Storage ........................ SUCCESS [  10:36 h]

[INFO] Apache Hadoop Cloud Storage Project ................ SUCCESS [  1.471 s]

[INFO] ------------------------------------------------------------------------

[INFO] BUILD SUCCESS

[INFO] ------------------------------------------------------------------------

[INFO] Total time:  10:37 h

[INFO] Finished at: 2019-12-09T10:05:07+08:00

[INFO] ------------------------------------------------------------------------

 

 

 


 

