1. Install fuse
yum -y install hadoop-hdfs-fuse
2. Modify the environment variables
vi /etc/profile
Add the following configuration:
export JAVA_HOME=/usr/jdk64/jdk1.8.0_60
export HADOOP_HOME=/usr/hdp/2.4.0.0-169/hadoop
export PATH=$HADOOP_HOME/bin:$JAVA_HOME/bin:$PATH
export LD_LIBRARY_PATH=/usr/hdp/2.4.0.0-169/usr/lib/:/usr/local/lib:/usr/lib:$LD_LIBRARY_PATH:$HADOOP_HOME/build/c++/Linux-amd64-64/lib:${JAVA_HOME}/jre/lib/amd64/server
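After saving, reload the profile so the variables take effect in the current shell, and optionally verify they are set:
source /etc/profile
echo $JAVA_HOME
echo $LD_LIBRARY_PATH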
3. Create the mount point (a local Linux directory where HDFS will be mounted)
mkdir /hdfs
4. Mount
Method 1: mount manually:
hadoop-fuse-dfs dfs://ocdp /hdfs
[root@vmocdp125 lib]# hadoop-fuse-dfs dfs://ocdp /hdfs
INFO /grid/0/jenkins/workspace/HDP-build-centos6/bigtop/build/hadoop/rpm/BUILD/hadoop-2.7.1.2.4.0.0-src/hadoop-hdfs-project/hadoop-hdfs/src/main/native/fuse-dfs/fuse_options.c:164 Adding FUSE arg /hdfs
Here "ocdp" is the cluster name, i.e. the nameservice value (dfs.nameservices) configured in hdfs-site.xml.
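If you are unsure of the nameservice name on your cluster, it can be read directly from the client configuration:
hdfs getconf -confKey dfs.nameservices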
Method 2: automatic mount via /etc/fstab.
First check whether an entry already exists:
grep hadoop /etc/fstab
vi /etc/fstab
Add the following line (usetrash sends deletions to the HDFS trash instead of removing them immediately; the trailing 0 0 disables dump and fsck):
hadoop-fuse-dfs#dfs://ocdp /hdfs fuse usetrash,rw 0 0
Then mount everything listed in fstab:
mount -a
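To detach the filesystem later (for example, before changing the fstab entry), either standard command works for a FUSE mount:
umount /hdfs
fusermount -u /hdfs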
5. Verify
[root@vmocdp125 bin]# df -h
Filesystem Size Used Avail Use% Mounted on
/dev/mapper/vg_ocdp01-lv_root
50G 14G 34G 29% /
tmpfs 11G 8.0K 11G 1% /dev/shm
/dev/sda1 477M 33M 419M 8% /boot
/dev/mapper/vg_ocdp01-lv_home
948G 674M 900G 1% /home
fuse_dfs 337G 3.3G 334G 1% /hdfs
Change into the mount point and the directories stored on HDFS are visible under /hdfs:
[root@vmocdp125 bin]# cd /hdfs
[root@vmocdp125 hdfs]# ll
total 52
drwxrwxrwx  5 yarn   hadoop 4096 Oct 12 16:11 app-logs
drwxr-xr-x  4 hdfs   hdfs   4096 Sep 14 20:09 apps
drwxr-xr-x  4 yarn   hadoop 4096 Sep 14 19:48 ats
drwxr-xr-x  4 flume  hdfs   4096 Oct 31 18:55 flume
drwxr-xr-x  3 ocetl  hdfs   4096 Oct 13 14:52 ftp
drwxr-xr-x  3 hdfs   hdfs   4096 Sep 14 19:48 hdp
drwxr-xr-x  3 ocetl  hdfs   4096 Oct 21 16:05 hiveQuery
drwxrwxrwx  4 ocetl  hdfs   4096 Oct 18 17:45 home
drwxr-xr-x  3 mapred hdfs   4096 Sep 14 19:48 mapred
drwxrwxrwx  4 mapred hadoop 4096 Sep 14 19:48 mr-history
drwxrwxrwx 46 spark  hadoop 4096 Nov  1 18:26 spark-history
drwxrwxrwx  9 hdfs   hdfs   4096 Oct 14 17:22 tmp
drwxr-xr-x  9 hdfs   hdfs   4096 Oct 11 16:54 user
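From here ordinary POSIX tools operate on HDFS. A minimal round-trip check, assuming /tmp on HDFS is world-writable as in the listing above:
cp /etc/hosts /hdfs/tmp/
hdfs dfs -ls /tmp/hosts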

Troubleshooting:
1. The error "error while loading shared libraries: libjvm.so: cannot open shared object file: No such file or directory" is caused by a misconfigured environment.
Most likely LD_LIBRARY_PATH was never set, or does not include the JVM's server directory.
Adding the fuse shared-library and Java shared-library directories to LD_LIBRARY_PATH in /etc/profile fixes it.
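To locate the JVM library on your machine and confirm the directory to append:
find ${JAVA_HOME} -name libjvm.so
# typically ${JAVA_HOME}/jre/lib/amd64/server/libjvm.so for a 64-bit JDK 8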
2. The error "error while loading shared libraries: libhdfs.so.0.0.0: cannot open shared object file: No such file or directory"
Find the directory containing libhdfs.so.0.0.0: find / -name libhdfs.so.0.0.0
Then add that directory to LD_LIBRARY_PATH:

export LD_LIBRARY_PATH=/usr/hdp/2.4.0.0-169/usr/lib/:/usr/local/lib:/usr/lib:$LD_LIBRARY_PATH:$HADOOP_HOME/build/c++/Linux-amd64-64/lib:${JAVA_HOME}/jre/lib/amd64/server
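An alternative to extending LD_LIBRARY_PATH is registering the directory with the dynamic linker. The file name below is arbitrary, and the path is the HDP 2.4 location used throughout this article, so substitute whatever find reported:
echo "/usr/hdp/2.4.0.0-169/usr/lib" > /etc/ld.so.conf.d/hadoop-fuse.conf
ldconfig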
3. hadoop-fuse-dfs: command not found
After installing hadoop-fuse-dfs, an executable named hadoop-fuse-dfs is placed in HADOOP_HOME's bin directory; the command is "not found" because $HADOOP_HOME/bin was never added to PATH.
Add $HADOOP_HOME/bin to PATH, as shown below.
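A minimal fix for the current shell (make it permanent in /etc/profile as in step 2):
export PATH=$HADOOP_HOME/bin:$PATH
which hadoop-fuse-dfs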
