I. Uploading files from Windows to HDFS on Linux
1. First start HDFS on CentOS (for example with sbin/start-dfs.sh); running jps should then show the processes below, which means HDFS is up.

2. After configuring Hadoop on Windows (https://www.cnblogs.com/Jomini/p/11432484.html), place the two required support files (downloadable online; typically winutils.exe and hadoop.dll) into Hadoop's bin directory.

3. Create a Maven project and run the read/write program.
pom file:
<dependency>
    <groupId>org.apache.logging.log4j</groupId>
    <artifactId>log4j-core</artifactId>
    <version>2.8.2</version>
</dependency>
<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-common</artifactId>
    <version>2.7.2</version>
</dependency>
<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-client</artifactId>
    <version>2.7.2</version>
</dependency>
<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-hdfs</artifactId>
    <version>2.7.2</version>
</dependency>
Upload program:
import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileStatus;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class Apptest {

    public static void main(String[] args) throws IOException {
        upload();
    }

    public static void upload() throws IOException {
        Configuration conf = new Configuration();
        // address of the NameNode on the Linux machine
        conf.set("fs.defaultFS", "hdfs://192.168.121.133:9000");
        // get the HDFS client object
        FileSystem fs = FileSystem.get(conf);
        // copy the local Windows file to the HDFS root directory
        Path src = new Path("d://test.txt");
        Path dest = new Path("/");
        fs.copyFromLocalFile(src, dest);
        // list what is now under the root directory
        FileStatus[] fileStatus = fs.listStatus(dest);
        for (FileStatus file : fileStatus) {
            System.out.println(file.getPath());
        }
        // release resources
        fs.close();
        System.out.println("upload succeeded");
    }
}
Run it via a Run Configuration and configure the Linux user there, otherwise a user permission error is thrown.
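If you prefer not to touch the Run Configuration, the user can also be supplied from code. A minimal sketch, assuming the same NameNode address and that root owns the target HDFS directories (the class name is only for illustration):

import java.net.URI;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;

public class UploadAsRoot {

    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        // pass the Linux user name directly when obtaining the FileSystem,
        // so the client does not act as the local Windows user
        FileSystem fs = FileSystem.get(
                new URI("hdfs://192.168.121.133:9000"), conf, "root");
        // ... perform copyFromLocalFile / listStatus as in the upload() method above ...
        fs.close();
    }
}

Another common option is adding -DHADOOP_USER_NAME=root to the VM arguments of the Run Configuration; when security is disabled, the Hadoop client picks this up as the user name.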

console

hdfs
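Step 3 above mentions a read/write program, but only the write (upload) direction is shown. For completeness, here is a sketch of the download direction; the local target d://download.txt is just an example, and the NameNode address and root user are the same assumptions as before.

import java.net.URI;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class AppDownload {

    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        // connect as the Linux user to avoid the permission problem mentioned above
        FileSystem fs = FileSystem.get(
                new URI("hdfs://192.168.121.133:9000"), conf, "root");
        // copy the file uploaded earlier from HDFS back to the Windows disk;
        // if the local write fails, the overload
        // copyToLocalFile(false, src, dst, true) skips the checksum local filesystem
        fs.copyToLocalFile(new Path("/test.txt"), new Path("d://download.txt"));
        fs.close();
        System.out.println("download finished");
    }
}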

II. Creating directories in HDFS
2.1 Create a directory in HDFS
Program:
import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class Apptest {

    public static void main(String[] args) throws IOException {
        Configuration conf = new Configuration();
        conf.set("fs.defaultFS", "hdfs://192.168.121.133:9000");
        // get the HDFS client object
        FileSystem fs = FileSystem.get(conf);
        // create the directory on HDFS
        fs.mkdirs(new Path("/testPath"));
        // release resources
        fs.close();
        System.out.println("end");
    }
}
Result:

2.2 Create a subdirectory file under the path "/testPath" created above
Program:
import java.io.IOException;
import java.net.URI;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class Apptest {

    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        // get the HDFS client object, connecting as the Linux user root
        FileSystem fs = FileSystem.get(new URI("hdfs://192.168.121.133:9000"), conf, "root");
        // create the directory under /testPath
        fs.mkdirs(new Path("/testPath/file"));
        // release resources
        fs.close();
        System.out.println("end");
    }
}
Clicking the /testPath directory in the HDFS web UI now shows the /file subdirectory.
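The same check can also be done from code instead of the web UI; a small sketch reusing the connection settings from the programs above (the class name is only for illustration):

import java.net.URI;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileStatus;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class CheckPath {

    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        FileSystem fs = FileSystem.get(
                new URI("hdfs://192.168.121.133:9000"), conf, "root");
        // prints true if /testPath/file was created
        System.out.println(fs.exists(new Path("/testPath/file")));
        // list everything directly under /testPath
        for (FileStatus status : fs.listStatus(new Path("/testPath"))) {
            System.out.println(status.getPath());
        }
        fs.close();
    }
}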

