Note: everything below is done as the root user; I don't like typing the word sudo.
Step 1: Configure the JDK environment
Installing the JDK automatically
root@master:~# javac
The program 'javac' can be found in the following packages:
* default-jdk
* ecj
* gcj-4.8-jdk
* openjdk-7-jdk
* gcj-4.6-jdk
* openjdk-6-jdk
Try: apt-get install
root@master:~#
So we can install the JDK either from a downloaded package (see my reference document for that route) or automatically. For the automatic route, let's begin.
Enter the command:
root@master:~# apt-get install openjdk-7-jdk
The installation takes quite a while; be patient.
Configure the JDK environment variables
First, locate the installed JRE:
root@master:~# find / -name 'jre*'
/usr/lib/jvm/java-7-openjdk-amd64/jre
root@master:~#
Configure Java's environment variables:
root@master:~# vi /etc/profile
Add these lines at the end:
export JAVA_HOME=/usr/lib/jvm/java-7-openjdk-amd64
export JRE_HOME=$JAVA_HOME/jre
export CLASSPATH=.:$JAVA_HOME/lib:$JRE_HOME/lib:$CLASSPATH
export PATH=$JAVA_HOME/bin:$JRE_HOME/bin:$PATH
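These exports can also be applied to the current shell without rebooting; a minimal sketch, assuming a Bash-compatible shell and that the lines above were already appended to /etc/profile:

```shell
# The same exports as in /etc/profile, applied to the running shell so a
# reboot is not required. After editing the file you can equivalently run:
#   source /etc/profile
export JAVA_HOME=/usr/lib/jvm/java-7-openjdk-amd64
export JRE_HOME=$JAVA_HOME/jre
export CLASSPATH=.:$JAVA_HOME/lib:$JRE_HOME/lib:$CLASSPATH
export PATH=$JAVA_HOME/bin:$JRE_HOME/bin:$PATH

# Quick sanity check: JRE_HOME should expand under JAVA_HOME
echo "$JRE_HOME"   # prints /usr/lib/jvm/java-7-openjdk-amd64/jre
```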
Restart the machine, then check the Java version information. If you see something like the following, the installation succeeded:
root@master:~# java -version
java version "1.7.0_101"
OpenJDK Runtime Environment (IcedTea 2.6.6) (7u101-2.6.6-0ubuntu0.14.04.1)
OpenJDK 64-Bit Server VM (build 24.95-b01, mixed mode)
root@master:~#
Install the Maven build tool
root@master:~# apt-get install maven
After a long wait, check the version information; if it prints the version, the installation succeeded:
root@master:~# mvn --version
Apache Maven 3.0.5
Maven home: /usr/share/maven
Java version: 1.7.0_101, vendor: Oracle Corporation
Java home: /usr/lib/jvm/java-7-openjdk-amd64/jre
Default locale: en_US, platform encoding: UTF-8
OS name: "linux", version: "3.13.0-24-generic", arch: "amd64", family: "unix"
root@master:~#
Install OpenSSH
Install the server and the client; another long wait:
root@master:~# apt-get install openssh-server openssh-client
Reading package lists... Done
Building dependency tree
Reading state information... Done
Suggested packages:
ssh-askpass libpam-ssh keychain monkeysphere rssh molly-guard
The following packages will be upgraded:
openssh-client openssh-server
2 upgraded, 0 newly installed, 0 to remove and 204 not upgraded.
Need to get 885 kB of archives.
After this operation, 4,096 B of additional disk space will be used.
Get:1 http://us.archive.ubuntu.com/ubuntu/ trusty-updates/main openssh-server amd64 1:6.6p1-2ubuntu2.7 [322 kB]
Get:2 http://us.archive.ubuntu.com/ubuntu/ trusty-updates/main openssh-client amd64 1:6.6p1-2ubuntu2.7 [564 kB]
Fetched 885 kB in 2min 44s (5,390 B/s)
Preconfiguring packages ...
(Reading database ... 65436 files and directories currently installed.)
Preparing to unpack .../openssh-server_1%3a6.6p1-2ubuntu2.7_amd64.deb ...
Unpacking openssh-server (1:6.6p1-2ubuntu2.7) over (1:6.6p1-2ubuntu1) ...
Preparing to unpack .../openssh-client_1%3a6.6p1-2ubuntu2.7_amd64.deb ...
Unpacking openssh-client (1:6.6p1-2ubuntu2.7) over (1:6.6p1-2ubuntu1) ...
Processing triggers for ureadahead (0.100.0-16) ...
ureadahead will be reprofiled on next reboot
Processing triggers for ufw (0.34~rc-0ubuntu2) ...
Processing triggers for man-db (2.6.7.1-1) ...
Setting up openssh-client (1:6.6p1-2ubuntu2.7) ...
Setting up openssh-server (1:6.6p1-2ubuntu2.7) ...
ssh stop/waiting
ssh start/running, process 4902
root@master:~#
Install protobuf-compiler
root@master:~# apt-get install protobuf-compiler
Reading package lists... Done
Building dependency tree
Reading state information... Done
The following extra packages will be installed:
libprotobuf8 libprotoc8
The following NEW packages will be installed:
libprotobuf8 libprotoc8 protobuf-compiler
0 upgraded, 3 newly installed, 0 to remove and 204 not upgraded.
Need to get 550 kB of archives.
After this operation, 2,133 kB of additional disk space will be used.
Do you want to continue? [Y/n] y
Get:1 http://us.archive.ubuntu.com/ubuntu/ trusty/main libprotobuf8 amd64 2.5.0-9ubuntu1 [296 kB]
Get:2 http://us.archive.ubuntu.com/ubuntu/ trusty/main libprotoc8 amd64 2.5.0-9ubuntu1 [235 kB]
Get:3 http://us.archive.ubuntu.com/ubuntu/ trusty/main protobuf-compiler amd64 2.5.0-9ubuntu1 [19.8 kB]
Fetched 550 kB in 3s (173 kB/s)
Selecting previously unselected package libprotobuf8:amd64.
(Reading database ... 65436 files and directories currently installed.)
Preparing to unpack .../libprotobuf8_2.5.0-9ubuntu1_amd64.deb ...
Unpacking libprotobuf8:amd64 (2.5.0-9ubuntu1) ...
Selecting previously unselected package libprotoc8:amd64.
Preparing to unpack .../libprotoc8_2.5.0-9ubuntu1_amd64.deb ...
Unpacking libprotoc8:amd64 (2.5.0-9ubuntu1) ...
Selecting previously unselected package protobuf-compiler.
Preparing to unpack .../protobuf-compiler_2.5.0-9ubuntu1_amd64.deb ...
Unpacking protobuf-compiler (2.5.0-9ubuntu1) ...
Processing triggers for man-db (2.6.7.1-1) ...
Setting up libprotobuf8:amd64 (2.5.0-9ubuntu1) ...
Setting up libprotoc8:amd64 (2.5.0-9ubuntu1) ...
Setting up protobuf-compiler (2.5.0-9ubuntu1) ...
Processing triggers for libc-bin (2.19-0ubuntu6) ...
root@master:~# protoc --version
libprotoc 2.5.0
root@master:~#
Install the dependency libraries
apt-get install g++ autoconf automake libtool cmake zlib1g-dev pkg-config libssl-dev
Install FindBugs
root@master:~# apt-get install findbugs
Start the build
root@master:~# wget http://mirrors.tuna.tsinghua.edu.cn/apache/hadoop/common/hadoop-2.6.4/hadoop-2.6.4-src.tar.gz
--2016-07-05 18:24:46-- http://mirrors.tuna.tsinghua.edu.cn/apache/hadoop/common/hadoop-2.6.4/hadoop-2.6.4-src.tar.gz
Resolving mirrors.tuna.tsinghua.edu.cn (mirrors.tuna.tsinghua.edu.cn)... 166.111.206.63, 2402:f000:1:416:166:111:206:63
Connecting to mirrors.tuna.tsinghua.edu.cn (mirrors.tuna.tsinghua.edu.cn)|166.111.206.63|:80... connected.
HTTP request sent, awaiting response... 200 OK
Length: 17282122 (16M) [application/octet-stream]
Saving to: ‘hadoop-2.6.4-src.tar.gz’
100%[========================================================================================>] 17,282,122 10.2MB/s in 1.6s
2016-07-05 18:24:48 (10.2 MB/s) - ‘hadoop-2.6.4-src.tar.gz’ saved [17282122/17282122]
root@master:~# ls
hadoop-2.6.4-src.tar.gz
Extract it into the current directory:
root@master:~# tar -zxvf hadoop-2.6.4-src.tar.gz
root@master:~# ls
hadoop-2.6.4-src hadoop-2.6.4-src.tar.gz
Change into the hadoop-2.6.4-src directory:
root@master:~# cd /root/xyj/hadoop-2.6.4-src
The exciting moment has arrived: time to compile:
root@master:~/hadoop-2.6.4-src# mvn clean package -Pdist,native -DskipTests -Dtar
OK, now we wait, for a long time. Make sure the network stays fast and the power stays on; the build normally finishes in one to two hours, and takes longer than two if the hardware or the network is slow.
How do I find the Hadoop download address?
Start at the official site, http://hadoop.apache.org/, find Getting Started and click Download, which takes you to the release download page at http://hadoop.apache.org/releases.html. You'll see download links for many versions; pick whichever one you like. I chose Hadoop 2.6.4 and clicked the source link under the Tarball column, which leads to http://www.apache.org/dyn/closer.cgi/hadoop/common/hadoop-2.6.4/hadoop-2.6.4-src.tar.gz. That page shows an HTTP address, and that address is the download URL; just put it after wget!
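The wget step above can be parameterized by version, so the same command works for other releases; a sketch, assuming the mirror layout shown in the transcript earlier:

```shell
# Build the source-tarball URL from a version number. The mirror host is
# taken from the wget transcript above; other mirrors use the same layout.
HADOOP_VERSION=2.6.4
MIRROR=http://mirrors.tuna.tsinghua.edu.cn/apache/hadoop/common
URL="$MIRROR/hadoop-$HADOOP_VERSION/hadoop-$HADOOP_VERSION-src.tar.gz"
echo "$URL"
# wget "$URL"   # uncomment to actually download
```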
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 54:47.276s
[INFO] Finished at: Tue Jul 05 07:16:18 EDT 2016
[INFO] Final Memory: 80M/473M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal on project hadoop-hdfs-httpfs: Could not resolve dependencies for project org.apache.hadoop:hadoop-hdfs-httpfs:war:2.6.4: Failed to collect dependencies for [junit:junit:jar:4.11 (test), org.mockito:mockito-all:jar:1.8.5 (test), org.apache.hadoop:hadoop-auth:jar:2.6.4 (compile), com.sun.jersey:jersey-core:jar:1.9 (compile), com.sun.jersey:jersey-server:jar:1.9 (compile), javax.servlet:servlet-api:jar:2.5 (provided), com.google.guava:guava:jar:11.0.2 (compile), com.googlecode.json-simple:json-simple:jar:1.1 (compile), org.mortbay.jetty:jetty:jar:6.1.26 (test), org.apache.hadoop:hadoop-common:jar:2.6.4 (compile), org.apache.hadoop:hadoop-hdfs:jar:2.6.4 (compile), org.apache.hadoop:hadoop-common:jar:tests:2.6.4 (test), org.apache.hadoop:hadoop-hdfs:jar:tests:2.6.4 (test), log4j:log4j:jar:1.2.17 (compile), org.slf4j:slf4j-api:jar:1.7.5 (compile), org.slf4j:slf4j-log4j12:jar:1.7.5 (runtime)]: Failed to read artifact descriptor for com.googlecode.json-simple:json-simple:jar:1.1: Could not transfer artifact com.googlecode.json-simple:json-simple:pom:1.1 from/to central (http://repo.maven.apache.org/maven2): Connection to http://repo.maven.apache.org refused: Connection refused -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/DependencyResolutionException
[ERROR]
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR] mvn -rf :hadoop-hdfs-httpfs
root@master:~/xyj/hadoop-2.6.4-src#
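The last [ERROR] lines above are Maven's own resume hint: after fixing the problem you can continue from the failed module instead of rebuilding everything with clean. A sketch that extracts the module token from that hint line (the hint string is copied from the log above) and prints the corresponding resume command:

```shell
# Pull the ':module' token out of Maven's resume hint and print the command
# that continues the build from the failed module instead of starting over.
hint='[ERROR]   mvn -rf :hadoop-hdfs-httpfs'   # last line of the error log
module=$(printf '%s\n' "$hint" | sed -n 's/.*-rf \(:[^ ]*\).*/\1/p')
echo "mvn package -Pdist,native -DskipTests -Dtar -rf $module"
```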
I went off for a meal and came back to find the build had failed. Fine, let's solve it. I searched the web for this error and found nothing, so I read the log myself. My rough reading was that the Tomcat server had a problem (though note the error above is actually a "Connection refused" while fetching a dependency from the central Maven repository, i.e. a network failure). Guessing that my freshly installed system was missing Tomcat, I ran:
root@master:~/xyj/hadoop-2.6.4-src# apt-get install tomcat7
Recompile:
root@master:~/xyj/hadoop-2.6.4-src# mvn clean package -Pdist,native -DskipTests -Dtar
After another long wait, I assumed the step above hadn't solved anything, but the part that failed last time actually compiled through. Past is past; getting past it is one way of solving a problem, which beats not solving it at all. The only drawback is that I truly don't know what was wrong; I was going purely on gut feeling and happened to stumble through. Let's hope it holds!
Error 2:
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 1:14:21.695s
[INFO] Finished at: Tue Jul 05 09:21:06 EDT 2016
[INFO] Final Memory: 76M/439M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal on project hadoop-yarn-server-nodemanager: Could not resolve dependencies for project org.apache.hadoop:hadoop-yarn-server-nodemanager:jar:2.6.4: Failed to collect dependencies for [org.apache.hadoop:hadoop-common:jar:2.6.4 (provided), org.apache.hadoop:hadoop-yarn-common:jar:2.6.4 (compile), org.apache.hadoop:hadoop-yarn-api:jar:2.6.4 (compile), javax.xml.bind:jaxb-api:jar:2.2.2 (compile), org.codehaus.jettison:jettison:jar:1.1 (compile), commons-lang:commons-lang:jar:2.6 (compile), javax.servlet:servlet-api:jar:2.5 (compile), commons-codec:commons-codec:jar:1.4 (compile), com.sun.jersey:jersey-core:jar:1.9 (compile), com.sun.jersey:jersey-client:jar:1.9 (compile), org.mortbay.jetty:jetty-util:jar:6.1.26 (compile), com.google.guava:guava:jar:11.0.2 (compile), commons-logging:commons-logging:jar:1.1.3 (compile), org.slf4j:slf4j-api:jar:1.7.5 (compile), org.apache.hadoop:hadoop-annotations:jar:2.6.4 (compile), org.apache.hadoop:hadoop-common:jar:tests:2.6.4 (test), com.google.inject.extensions:guice-servlet:jar:3.0 (compile), com.google.protobuf:protobuf-java:jar:2.5.0 (compile), junit:junit:jar:4.11 (test), org.mockito:mockito-all:jar:1.8.5 (test), com.google.inject:guice:jar:3.0 (compile), com.sun.jersey.jersey-test-framework:jersey-test-framework-grizzly2:jar:1.9 (test), com.sun.jersey:jersey-json:jar:1.9 (compile), com.sun.jersey.contribs:jersey-guice:jar:1.9 (compile), org.apache.hadoop:hadoop-yarn-common:jar:tests:2.6.4 (test), org.apache.hadoop:hadoop-yarn-server-common:jar:2.6.4 (compile), org.fusesource.leveldbjni:leveldbjni-all:jar:1.8 (compile)]: Failed to read artifact descriptor for org.glassfish.grizzly:grizzly-http:jar:2.1.2: Could not transfer artifact org.glassfish.grizzly:grizzly-http:pom:2.1.2 from/to apache.snapshots.https (https://repository.apache.org/content/repositories/snapshots): Read timed out -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/DependencyResolutionException
[ERROR]
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR] mvn -rf :hadoop-yarn-server-nodemanager
What a headache; time to search for a solution properly. Here's a tip for getting detailed error output: add a -X flag to the original command and you can watch the build in full debug detail:
root@master:~/xyj/hadoop-2.6.4-src# mvn clean package -Pdist,native -DskipTests -Dtar -X
I searched many sites for this one too and found no answer. After reading a lot of posts and threads, it boiled down to my network speed and the mirror sites: sometimes the network was too slow, or couldn't reach hosts outside the country at all. So I went ahead and downloaded Lantern, a circumvention tool. (It may also simply have been the network: if a large download stalls at some moment, or a file comes down incomplete, the build fails at that step, so circumvention isn't necessarily the cure. Perhaps Maven just failed to reach the repository the first time and found it the second, which makes it feel like a pure game of luck!)
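An alternative worth trying before reaching for a circumvention tool is pointing Maven at a repository mirror that is reachable from your network. A sketch of a ~/.m2/settings.xml entry; the mirror URL below is only a placeholder, substitute one that works for you:

```xml
<!-- ~/.m2/settings.xml : route requests for Maven central through a mirror.
     The URL below is an illustrative placeholder; use a mirror near you. -->
<settings>
  <mirrors>
    <mirror>
      <id>central-mirror</id>
      <name>Reachable mirror of Maven central</name>
      <mirrorOf>central</mirrorOf>
      <url>http://maven.example-mirror.org/maven2</url>
    </mirror>
  </mirrors>
</settings>
```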
Next, let me show you how to install Lantern.
Download from GitHub (the lantern-binaries repository, as in the URL below):
root@master:~/xyj# wget https://raw.githubusercontent.com/getlantern/lantern-binaries/master/lantern-installer-3.0.4-64-bit.deb
Install the supporting tools:
root@master:~/xyj# apt-get install gdebi-core
root@master:~/xyj# apt-get install libappindicator3-1
Install and launch:
root@master:~/xyj# gdebi lantern-installer-3.0.4-64-bit.deb
root@master:~/xyj# lantern
root@master:~# lantern
Running installation script...
/usr/lib/lantern/lantern-binary: OK
Jul 06 09:24:51.955 - 0m0s DEBUG flashlight: flashlight.go:49 ****************************** Package Version: 2.1.2
Jul 06 09:24:51.956 - 0m0s DEBUG flashlight.ui: ui.go:58 Creating tarfs filesystem that prefers local resources at /lantern/src/github.com/getlantern/lantern-ui/app
Jul 06 09:24:51.960 - 0m0s DEBUG flashlight: settings.go:57 Loading settings
Jul 06 09:24:51.960 - 0m0s DEBUG flashlight: settings.go:70 Could not read file open /root/.lantern/settings.yaml: no such file or directory
Jul 06 09:24:51.961 - 0m0s DEBUG flashlight.ui: service.go:134 Accepting websocket connections at: /data
Jul 06 09:24:51.962 - 0m0s DEBUG flashlight: settings.go:99 Sending Lantern settings to new client
Jul 06 09:24:51.965 - 0m0s DEBUG flashlight: settings.go:109 Reading settings messages!!
Jul 06 09:24:52.000 - 0m0s DEBUG flashlight: flashlight.go:49 ****************************** Package Version: 2.1.2
Jul 06 09:24:52.001 - 0m0s DEBUG flashlight.ui: ui.go:58 Creating tarfs filesystem that prefers local resources at /lantern/src/github.com/getlantern/lantern-ui/app
Jul 06 09:24:52.028 - 0m0s DEBUG flashlight: settings.go:57 Loading settings
Jul 06 09:24:52.028 - 0m0s DEBUG flashlight: settings.go:70 Could not read file open /root/.lantern/settings.yaml: no such file or directory
Jul 06 09:24:52.031 - 0m0s DEBUG flashlight.ui: service.go:134 Accepting websocket connections at: /data
Jul 06 09:24:52.032 - 0m0s DEBUG flashlight: settings.go:99 Sending Lantern settings to new client
Jul 06 09:24:52.037 - 0m0s DEBUG flashlight: settings.go:109 Reading settings messages!!
(lantern:18322): Gtk-WARNING **: cannot open display:
Check whether it is running:
root@master:~# ps -aux | grep lantern
root 18331 0.0 0.0 11740 940 pts/1 S+ 05:26 0:00 grep --color=auto lantern
OK, with that installed, let's carry on. I suspect it was only a psychological comfort; whether the rebuild would succeed came down to luck. (Note that the ps output above matches only the grep command itself, and the Gtk warning shows Lantern could not open a display on this headless machine, so it may not even have been running.) Happily, this time the build went through. Thank you, luck!
Locate the build output
[INFO] Reactor Summary:
[INFO]
[INFO] Apache Hadoop Main ................................ SUCCESS [4.493s]
[INFO] Apache Hadoop Project POM ......................... SUCCESS [3.007s]
[INFO] Apache Hadoop Annotations ......................... SUCCESS [5.774s]
[INFO] Apache Hadoop Assemblies .......................... SUCCESS [0.552s]
[INFO] Apache Hadoop Project Dist POM .................... SUCCESS [5.550s]
[INFO] Apache Hadoop Maven Plugins ....................... SUCCESS [12.361s]
[INFO] Apache Hadoop MiniKDC ............................. SUCCESS [15.264s]
[INFO] Apache Hadoop Auth ................................ SUCCESS [15.034s]
[INFO] Apache Hadoop Auth Examples ....................... SUCCESS [8.842s]
[INFO] Apache Hadoop Common .............................. SUCCESS [5:43.242s]
[INFO] Apache Hadoop NFS ................................. SUCCESS [31.949s]
[INFO] Apache Hadoop KMS ................................. SUCCESS [35.223s]
[INFO] Apache Hadoop Common Project ...................... SUCCESS [0.500s]
[INFO] Apache Hadoop HDFS ................................ SUCCESS [9:19.175s]
[INFO] Apache Hadoop HttpFS .............................. SUCCESS [1:12.408s]
[INFO] Apache Hadoop HDFS BookKeeper Journal ............. SUCCESS [25.086s]
[INFO] Apache Hadoop HDFS-NFS ............................ SUCCESS [14.865s]
[INFO] Apache Hadoop HDFS Project ........................ SUCCESS [0.188s]
[INFO] hadoop-yarn ....................................... SUCCESS [0.167s]
[INFO] hadoop-yarn-api ................................... SUCCESS [2:13.539s]
[INFO] hadoop-yarn-common ................................ SUCCESS [1:57.809s]
[INFO] hadoop-yarn-server ................................ SUCCESS [1.107s]
[INFO] hadoop-yarn-server-common ......................... SUCCESS [38.032s]
[INFO] hadoop-yarn-server-nodemanager .................... SUCCESS [1:03.906s]
[INFO] hadoop-yarn-server-web-proxy ...................... SUCCESS [9.610s]
[INFO] hadoop-yarn-server-applicationhistoryservice ...... SUCCESS [15.939s]
[INFO] hadoop-yarn-server-resourcemanager ................ SUCCESS [1:06.344s]
[INFO] hadoop-yarn-server-tests .......................... SUCCESS [23.420s]
[INFO] hadoop-yarn-client ................................ SUCCESS [18.195s]
[INFO] hadoop-yarn-applications .......................... SUCCESS [0.291s]
[INFO] hadoop-yarn-applications-distributedshell ......... SUCCESS [5.631s]
[INFO] hadoop-yarn-applications-unmanaged-am-launcher .... SUCCESS [3.816s]
[INFO] hadoop-yarn-site .................................. SUCCESS [0.152s]
[INFO] hadoop-yarn-registry .............................. SUCCESS [11.657s]
[INFO] hadoop-yarn-project ............................... SUCCESS [23.399s]
[INFO] hadoop-mapreduce-client ........................... SUCCESS [0.980s]
[INFO] hadoop-mapreduce-client-core ...................... SUCCESS [1:26.429s]
[INFO] hadoop-mapreduce-client-common .................... SUCCESS [1:00.837s]
[INFO] hadoop-mapreduce-client-shuffle ................... SUCCESS [18.942s]
[INFO] hadoop-mapreduce-client-app ....................... SUCCESS [28.096s]
[INFO] hadoop-mapreduce-client-hs ........................ SUCCESS [21.970s]
[INFO] hadoop-mapreduce-client-jobclient ................. SUCCESS [35.789s]
[INFO] hadoop-mapreduce-client-hs-plugins ................ SUCCESS [5.011s]
[INFO] Apache Hadoop MapReduce Examples .................. SUCCESS [11.536s]
[INFO] hadoop-mapreduce .................................. SUCCESS [17.663s]
[INFO] Apache Hadoop MapReduce Streaming ................. SUCCESS [19.450s]
[INFO] Apache Hadoop Distributed Copy .................... SUCCESS [59.560s]
[INFO] Apache Hadoop Archives ............................ SUCCESS [9.721s]
[INFO] Apache Hadoop Rumen ............................... SUCCESS [17.239s]
[INFO] Apache Hadoop Gridmix ............................. SUCCESS [11.690s]
[INFO] Apache Hadoop Data Join ........................... SUCCESS [11.958s]
[INFO] Apache Hadoop Ant Tasks ........................... SUCCESS [7.894s]
[INFO] Apache Hadoop Extras .............................. SUCCESS [6.610s]
[INFO] Apache Hadoop Pipes ............................... SUCCESS [29.171s]
[INFO] Apache Hadoop OpenStack support ................... SUCCESS [25.039s]
[INFO] Apache Hadoop Amazon Web Services support ......... SUCCESS [4:21.123s]
[INFO] Apache Hadoop Client .............................. SUCCESS [24.644s]
[INFO] Apache Hadoop Mini-Cluster ........................ SUCCESS [0.710s]
[INFO] Apache Hadoop Scheduler Load Simulator ............ SUCCESS [22.584s]
[INFO] Apache Hadoop Tools Dist .......................... SUCCESS [32.450s]
[INFO] Apache Hadoop Tools ............................... SUCCESS [1.933s]
[INFO] Apache Hadoop Distribution ........................ SUCCESS [3:01.581s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 45:05.494s
[INFO] Finished at: Wed Jul 06 06:14:20 EDT 2016
[INFO] Final Memory: 101M/473M
[INFO] ------------------------------------------------------------------------
The summary above lists every module in the build; they compile one by one. When a build fails partway, Maven shows which modules succeeded, which failed, and which were skipped after the failure. I forgot to screenshot that; you'll see it when you build it yourself.
After the build finishes, we need to find the compiled package. The path is:
root@master:~/xyj/hadoop-2.6.4-src/hadoop-dist/target# pwd
/root/xyj/hadoop-2.6.4-src/hadoop-dist/target
root@master:~/xyj/hadoop-2.6.4-src/hadoop-dist/target# ll
total 539268
drwxr-xr-x 7 root root 4096 Jul 6 06:13 ./
drwxr-xr-x 3 root root 4096 Jul 6 06:11 ../
drwxr-xr-x 2 root root 4096 Jul 6 06:11 antrun/
-rw-r--r-- 1 root root 1866 Jul 6 06:11 dist-layout-stitching.sh
-rw-r--r-- 1 root root 639 Jul 6 06:11 dist-tar-stitching.sh
drwxr-xr-x 9 root root 4096 Jul 6 06:11 hadoop-2.6.4/
-rw-r--r-- 1 root root 183757063 Jul 6 06:12 hadoop-2.6.4.tar.gz
-rw-r--r-- 1 root root 2779 Jul 6 06:11 hadoop-dist-2.6.4.jar
-rw-r--r-- 1 root root 368403396 Jul 6 06:14 hadoop-dist-2.6.4-javadoc.jar
drwxr-xr-x 2 root root 4096 Jul 6 06:13 javadoc-bundle-options/
drwxr-xr-x 2 root root 4096 Jul 6 06:11 maven-archiver/
drwxr-xr-x 2 root root 4096 Jul 6 06:11 test-dir/
The hadoop-2.6.4.tar.gz in there is the package we want, alongside the already-extracted hadoop-2.6.4 directory.
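When deploying elsewhere, that tarball is the thing to copy; a sketch of its full path in this build tree (the /usr/local destination in the comment is just an illustration, not a requirement):

```shell
# Path of the finished distribution tarball inside the build tree.
SRC_DIR=/root/xyj/hadoop-2.6.4-src
DIST_TARBALL=$SRC_DIR/hadoop-dist/target/hadoop-2.6.4.tar.gz
echo "$DIST_TARBALL"
# Example deployment (destination is illustrative):
# tar -zxvf "$DIST_TARBALL" -C /usr/local
```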
Hadoop's version information:
root@master:~/xyj/hadoop-2.6.4-src/hadoop-dist/target/hadoop-2.6.4/bin# ./hadoop version
Hadoop 2.6.4
Subversion Unknown -r Unknown
Compiled by root on 2016-07-06T09:30Z
Compiled with protoc 2.5.0
From source with checksum 8dee2286ecdbbbc930a6c87b65cbc010
This command was run using /root/xyj/hadoop-2.6.4-src/hadoop-dist/target/hadoop-2.6.4/share/hadoop/common/hadoop-common-2.6.4.jar
root@master:~/xyj/hadoop-2.6.4-src/hadoop-dist/target/hadoop-2.6.4/bin# pwd
/root/xyj/hadoop-2.6.4-src/hadoop-dist/target/hadoop-2.6.4/bin
Hadoop's native dynamic link libraries:
/root/xyj/hadoop-2.6.4-src/hadoop-dist/target/hadoop-2.6.4/lib/native
root@master:~/xyj/hadoop-2.6.4-src/hadoop-dist/target/hadoop-2.6.4/lib/native# file *
libhadoop.a: current ar archive
libhadooppipes.a: current ar archive
libhadoop.so: symbolic link to `libhadoop.so.1.0.0'
libhadoop.so.1.0.0: ELF 64-bit LSB shared object, x86-64, version 1 (SYSV), dynamically linked, BuildID[sha1]=2414b17dc9802b68da89538507e71ff61c8630c4, not stripped
libhadooputils.a: current ar archive
libhdfs.a: current ar archive
libhdfs.so: symbolic link to `libhdfs.so.0.0.0'
libhdfs.so.0.0.0: ELF 64-bit LSB shared object, x86-64, version 1 (SYSV), dynamically linked, BuildID[sha1]=a5aa61121dfb8d075dca4deab83067c812acd4c4, not stripped
root@master:~/xyj/hadoop-2.6.4-src/hadoop-dist/target/hadoop-2.6.4/lib/native# ll
total 4768
drwxr-xr-x 2 root root 4096 Jul 6 06:11 ./
drwxr-xr-x 3 root root 4096 Jul 6 06:11 ../
-rw-r--r-- 1 root root 1278070 Jul 6 06:11 libhadoop.a
-rw-r--r-- 1 root root 1632656 Jul 6 06:11 libhadooppipes.a
lrwxrwxrwx 1 root root 18 Jul 6 06:11 libhadoop.so -> libhadoop.so.1.0.0*
-rwxr-xr-x 1 root root 750783 Jul 6 06:11 libhadoop.so.1.0.0*
-rw-r--r-- 1 root root 476210 Jul 6 06:11 libhadooputils.a
-rw-r--r-- 1 root root 441046 Jul 6 06:11 libhdfs.a
lrwxrwxrwx 1 root root 16 Jul 6 06:11 libhdfs.so -> libhdfs.so.0.0.0*
-rwxr-xr-x 1 root root 282519 Jul 6 06:11 libhdfs.so.0.0.0*
root@master:~/xyj/hadoop-2.6.4-src/hadoop-dist/target/hadoop-2.6.4/lib/native#
The files in hadoop-2.6.4.tar.gz extract to exactly the same layout as this directory. When the time comes to use it, I'll copy the hadoop-2.6.4.tar.gz package; people just like the original packaging, heh.
Summary