Configuring a Local Hadoop Runtime Environment on Windows


 

Many people like to develop Hadoop programs locally on Windows, so here is a walkthrough for setting up Hadoop on a Windows machine.

First, download Hadoop from the official Apache site. You also need winutils, a set of Windows-native binaries (winutils.exe, hadoop.dll, etc.) that Hadoop depends on at runtime on Windows, because the official release does not ship Windows binaries; make sure the winutils build matches your Hadoop version. After downloading, extract the Hadoop archive and copy winutils.exe into Hadoop's bin directory (copying the matching hadoop.dll into bin as well usually avoids UnsatisfiedLinkError problems later).
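
Before editing the configs, HADOOP_HOME, PATH, and JAVA_HOME generally need to be set so that the Hadoop scripts can run. A minimal sketch for a cmd window, assuming Hadoop was extracted to D:\hadoop-2.7.3 (the paths and JDK version here are only examples):

rem point HADOOP_HOME at the extracted directory and put bin/sbin on PATH
setx HADOOP_HOME "D:\hadoop-2.7.3"
setx PATH "%PATH%;D:\hadoop-2.7.3\bin;D:\hadoop-2.7.3\sbin"

rem in etc\hadoop\hadoop-env.cmd, JAVA_HOME must point at a local JDK;
rem the 8.3 short form PROGRA~1 avoids trouble with the space in "Program Files"
set JAVA_HOME=C:\PROGRA~1\Java\jdk1.8.0_181

Open a new cmd window afterwards so that the setx changes take effect.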

Then edit the following files under hadoop/etc/hadoop:

core-site.xml (note that the com.hadoop.compression.lzo.* codec entries below require the separate hadoop-lzo library on the classpath; if you do not use LZO, those entries can simply be dropped):

<?xml version="1.0" encoding="UTF-8"?>
<?xml-stylesheet type="text/xsl" href="configuration.xsl"?>
<!--
  Licensed under the Apache License, Version 2.0 (the "License");
  you may not use this file except in compliance with the License.
  You may obtain a copy of the License at

    http://www.apache.org/licenses/LICENSE-2.0

  Unless required by applicable law or agreed to in writing, software
  distributed under the License is distributed on an "AS IS" BASIS,
  WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
  See the License for the specific language governing permissions and
  limitations under the License. See accompanying LICENSE file.
-->

<!-- Put site-specific property overrides in this file. -->

<configuration>
  <property>
    <name>fs.defaultFS</name>
    <value>hdfs://localhost:9000/</value>
  </property>
  <property>
    <name>io.native.lib.available</name>
    <value>false</value>
  </property>
  <property>
    <name>hadoop.native.lib</name>
    <value>false</value>
  </property>
  <property>
    <name>io.compression.codecs</name>
    <value>org.apache.hadoop.io.compress.GzipCodec,
           org.apache.hadoop.io.compress.DefaultCodec,
           com.hadoop.compression.lzo.LzoCodec,
           com.hadoop.compression.lzo.LzopCodec,
           org.apache.hadoop.io.compress.BZip2Codec,
           org.apache.hadoop.io.compress.SnappyCodec
        </value>
  </property>
  <property>
    <name>io.compression.codec.lzo.class</name>
    <value>com.hadoop.compression.lzo.LzoCodec</value>
  </property>

</configuration>

hdfs-site.xml:

<?xml version="1.0" encoding="UTF-8"?>
<?xml-stylesheet type="text/xsl" href="configuration.xsl"?>
<!--
  Licensed under the Apache License, Version 2.0 (the "License");
  you may not use this file except in compliance with the License.
  You may obtain a copy of the License at

    http://www.apache.org/licenses/LICENSE-2.0

  Unless required by applicable law or agreed to in writing, software
  distributed under the License is distributed on an "AS IS" BASIS,
  WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
  See the License for the specific language governing permissions and
  limitations under the License. See accompanying LICENSE file.
-->

<!-- Put site-specific property overrides in this file. -->

<configuration>
  <property>
    <name>dfs.replication</name>
    <value>1</value>
  </property>
  <property>
    <name>dfs.namenode.name.dir</name>
    <value>file:///D:/Hadoop/namenode</value>
  </property>
  <property>
    <name>dfs.datanode.data.dir</name>
    <value>file:///D:/Hadoop/datanode</value>
  </property>
</configuration>
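
The name and data directories configured above live on the local disk. Hadoop normally creates them itself when the NameNode is formatted and the DataNode starts, but they can also be created up front with the paths used above:

mkdir D:\Hadoop\namenode
mkdir D:\Hadoop\datanode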

mapred-site.xml:

<?xml version="1.0"?>
<?xml-stylesheet type="text/xsl" href="configuration.xsl"?>
<!--
  Licensed under the Apache License, Version 2.0 (the "License");
  you may not use this file except in compliance with the License.
  You may obtain a copy of the License at

    http://www.apache.org/licenses/LICENSE-2.0

  Unless required by applicable law or agreed to in writing, software
  distributed under the License is distributed on an "AS IS" BASIS,
  WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
  See the License for the specific language governing permissions and
  limitations under the License. See accompanying LICENSE file.
-->

<!-- Put site-specific property overrides in this file. -->

<configuration>
    <property>
        <name>mapreduce.framework.name</name>
        <value>yarn</value>
    </property>
    <property>
        <name>mapred.compress.map.output</name>
        <value>true</value>
    </property>
    <property>
        <name>mapred.map.output.compression.codec</name>
        <value>com.hadoop.compression.lzo.LzoCodec</value>
    </property> 
    <property> 
        <name>mapred.child.env</name> 
        <value>LD_LIBRARY_PATH=D:\hadoop-2.7.3-win64\lib</value> 
    </property>
</configuration>
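
The post does not show a yarn-site.xml, but since mapreduce.framework.name is set to yarn, a minimal yarn-site.xml (in the same etc/hadoop directory) is normally also needed so that the NodeManager runs the MapReduce shuffle service. A sketch with just the essential property:

<configuration>
  <property>
    <name>yarn.nodemanager.aux-services</name>
    <value>mapreduce_shuffle</value>
  </property>
</configuration>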

Then open a cmd window, cd into Hadoop's bin directory, and run:

hdfs namenode -format

Then, from the sbin directory, run:

start-all.cmd
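
start-all.cmd spawns a separate console window for each daemon. To verify that everything came up, jps (shipped with the JDK) can be run in another cmd window; with this setup it should normally list NameNode, DataNode, ResourceManager and NodeManager:

jps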

 

Then open http://localhost:8088 in a browser to check the YARN ResourceManager web UI (the NameNode web UI is at http://localhost:50070).

Now run a Hadoop command: hadoop fs -ls /

The root is empty, so create a directory: hadoop fs -mkdir /data

Then list it again and /data should show up: hadoop fs -ls /
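
As an extra check that writing to HDFS works, a local file can be uploaded and listed (D:\test.txt is just a placeholder name):

hadoop fs -put D:\test.txt /data
hadoop fs -ls /data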

With that, the local pseudo-distributed Hadoop environment is ready.

 

