The complete fuse-dfs mount process
The fuse-dfs mount finally succeeded. It took me more than two weeks, on and off, and the very last step, the mount itself, cost me a whole week. Only after joining an HDFS QQ group and asking did I learn where I had gone wrong. Here is the full story, step by step.
Prerequisites:
CentOS 6.3, Hadoop 1.2.0, JDK 1.6.0_45, fuse 2.8.4, Ant 1.9.1
1. Install fuse
yum install fuse fuse-libs fuse-devel
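If you want to make sure the fuse kernel module is actually available before going further, a quick optional check looks like this (on CentOS 6 the module is usually loaded automatically):
modprobe fuse        # load the fuse kernel module if it is not already loaded
lsmod | grep fuse    # confirm the module is listed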
2. Install ant
Download it from the official Apache Ant site and extract it.
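For reference, a minimal sketch of this step; the mirror URL is an assumption, and /usr/ant is chosen so it matches the ANT_HOME used below:
wget http://archive.apache.org/dist/ant/binaries/apache-ant-1.9.1-bin.tar.gz
tar -zxf apache-ant-1.9.1-bin.tar.gz
mv apache-ant-1.9.1 /usr/ant    # so that ANT_HOME=/usr/ant in the next step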
3. System configuration
vi /etc/profile
Append at the end:
export OS_ARCH=i386   # use amd64 on a 64-bit machine
export OS_BIT=32      # use 64 on a 64-bit machine
export JAVA_HOME=/usr/java/jdk1.6.0_45
export ANT_HOME=/usr/ant
export HADOOP_HOME=/usr/hadoop   # must be set before the CLASSPATH line that uses it
export PATH=$JAVA_HOME/bin:$ANT_HOME/bin:$PATH
export CLASSPATH=.:$JAVA_HOME/lib/dt.jar:$JAVA_HOME/lib/tools.jar:$HADOOP_HOME/lib:$HADOOP_HOME:$CLASSPATH
export LD_LIBRARY_PATH=$JAVA_HOME/jre/lib/$OS_ARCH/server:$HADOOP_HOME/c++/Linux-$OS_ARCH-$OS_BIT/lib:/usr/local/lib:/usr/lib
Save and exit, then reload the file: source /etc/profile
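A quick way to confirm the new settings took effect in the current shell:
echo $JAVA_HOME $ANT_HOME $HADOOP_HOME
echo $OS_ARCH $OS_BIT
java -version
ant -version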
4. Build libhdfs
cd $HADOOP_HOME
ant compile-c++-libhdfs -Dlibhdfs=1 -Dcompile.c++=1
ln -s c++/Linux-$OS_ARCH-$OS_BIT/lib build/libhdfs
ant compile-contrib -Dlibhdfs=1 -Dfusedfs=1
(Tips: 1. If the build fails because of missing dependencies, run yum install automake autoconf m4 libtool pkgconfig fuse fuse-devel fuse-libs
2. gcc also needs to be installed along the way. A successful build ends with BUILD SUCCESSFUL, which is a very welcome sight.)
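Before moving on, it is worth checking that the build actually produced what the later steps rely on; a rough check, assuming the directory layout used in this post:
ls $HADOOP_HOME/build/libhdfs/               # should show libhdfs.so* via the symlink created above
ls $HADOOP_HOME/build/contrib/fuse-dfs/      # should contain fuse_dfs and fuse_dfs_wrapper.sh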
5. Environment configuration
cd $HADOOP_HOME/build/contrib/fuse-dfs
vi fuse_dfs_wrapper.sh
Add at the very top of the file:
export JAVA_HOME=/usr/java/jdk1.6.0_45
export HADOOP_HOME=/usr/hadoop
export HADOOP_CONF_DIR=/usr/hadoop/conf
export OS_ARCH=i386
export OS_BIT=32
Change the "./fuse_dfs $@" at the end of the script to "fuse_dfs $@" (see the sketch below), so the wrapper uses the fuse_dfs binary that gets symlinked onto the PATH in the next step.
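Putting step 5 together, the edited fuse_dfs_wrapper.sh ends up looking roughly like this sketch (the shipped middle part of the script, which assembles CLASSPATH and LD_LIBRARY_PATH, stays untouched):
export JAVA_HOME=/usr/java/jdk1.6.0_45
export HADOOP_HOME=/usr/hadoop
export HADOOP_CONF_DIR=/usr/hadoop/conf
export OS_ARCH=i386
export OS_BIT=32
# ... original wrapper content unchanged ...
fuse_dfs $@    # was: ./fuse_dfs $@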
6. Add permissions
chmod +x /usr/hadoop/build/contrib/fuse-dfs/fuse_dfs_wrapper.sh
chmod +x /usr/hadoop/build/contrib/fuse-dfs/fuse_dfs
ln -s /usr/hadoop/build/contrib/fuse-dfs/fuse_dfs_wrapper.sh /usr/local/bin/
ln -s /usr/hadoop/build/contrib/fuse-dfs/fuse_dfs /usr/local/bin/
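A quick sanity check that both commands are now executable and on the PATH:
which fuse_dfs fuse_dfs_wrapper.sh
ls -l /usr/local/bin/fuse_dfs /usr/local/bin/fuse_dfs_wrapper.sh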
7. Mount
mkdir /mnt/dfs
cd $HADOOP_HOME/build/contrib/fuse-dfs
fuse_dfs_wrapper.sh dfs://localhost:9000 /mnt/dfs
(This very last step is the one that stumped me for a week! For the URI that follows fuse_dfs_wrapper.sh, I kept using the value set in conf/core-site.xml, hdfs://localhost:9000, and kept getting the errors fuse-dfs didn't recognize hdfs://localhost:9000,-2 and fuse-dfs didn't recognize /mnt/dfs,-2. With this fuse-dfs, the scheme has to be dfs:// rather than hdfs://.)
Finally, ls /mnt/dfs shows the files stored in HDFS.
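A few follow-up commands that are handy once the mount works; treat them as a sketch (the write test assumes a writable /tmp directory already exists in HDFS):
df -h /mnt/dfs                  # confirm the mount is live
cp /etc/hosts /mnt/dfs/tmp/     # simple write test through the mount
fusermount -u /mnt/dfs          # unmount when finished (umount /mnt/dfs also works as root)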