Working with the Hadoop HDFS API from Java: Maven Environment Setup


1. Create a Java project and paste the following into pom.xml:

<properties>
  <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
  <hadoop.version>2.6.0</hadoop.version>
</properties>
<dependencies>
  <dependency>
    <groupId>junit</groupId>
    <artifactId>junit</artifactId>
    <version>4.10</version>
    <scope>test</scope>
  </dependency>
  <dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-client</artifactId>
    <version>${hadoop.version}</version>
  </dependency>
  <!--
  <dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-common</artifactId>
    <version>${hadoop.version}</version>
  </dependency>
  <dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-hdfs</artifactId>
    <version>${hadoop.version}</version>
  </dependency>
  -->
</dependencies>
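Note: hadoop-client already pulls in hadoop-common and hadoop-hdfs as transitive dependencies, which is why those two declarations can stay commented out above.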
2. Test code:
import java.io.IOException;
import java.net.URI;
import java.net.URISyntaxException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.junit.After;
import org.junit.Before;
import org.junit.Test;

public class HdfsApiTest { // wrapper class added so the snippet compiles; the name is arbitrary

    Configuration configuration = null;
    FileSystem fileSystem = null;
    String url = "hdfs://CentOS7:8020"; // NameNode address
    String user = "hadoop";             // HDFS user to connect as

    @Before
    public void before() throws URISyntaxException, IOException, InterruptedException {
        configuration = new Configuration();
        configuration.set("dfs.replication", "1"); // set the replication factor to 1
        fileSystem = FileSystem.get(new URI(url), configuration, user);
        System.out.println("before doing");
    }

    @After
    public void after() {
        System.out.println("after doing");
        configuration = null;
        try {
            fileSystem.close();
        } catch (IOException e) {
            fileSystem = null;
            e.printStackTrace();
        }
    }

    /**
     * Create a directory
     * @throws IOException
     */
    @Test
    public void mkdir() throws IOException {
        Path path = new Path("/mkdirTest1/123");
        boolean re = fileSystem.mkdirs(path);
        System.out.println(re);
    }
}
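The original Javadoc said "create file" even though the test above creates a directory. For completeness, here is a minimal sketch of actually writing a file with the same FileSystem handle. It is an illustration, not part of the original post: the path, file name, and content are assumptions.

    /**
     * Illustrative sketch (not from the original post): create a file and write a short string.
     * Requires an extra import: org.apache.hadoop.fs.FSDataOutputStream
     */
    @Test
    public void createFile() throws IOException {
        Path file = new Path("/mkdirTest1/hello.txt");                  // hypothetical target path
        try (FSDataOutputStream out = fileSystem.create(file, true)) { // true = overwrite if it exists
            out.writeUTF("hello hdfs");                                 // sample content
        }
        System.out.println(fileSystem.exists(file));                    // prints true if the write succeeded
    }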

3. Run it; the test succeeds!
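You can also double-check on the cluster with the HDFS shell, for example hdfs dfs -ls /mkdirTest1, to confirm the directory was actually created.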

