
Compiling Hadoop 2.4.1, and downloads of the various Hadoop 2.4.1 packages

 
Posted by howtodown on 2014-8-6 11:19:29 in the Installation & Configuration (安装配置) section
Last edited by howtodown on 2014-8-19 11:49
Questions this post addresses:
1. How do you switch the web UI timestamps to the local time zone?
2. What software needs to be installed?
3. What is the difference between building with and without the docs profile?


-- Reference: http://hadoop.apache.org/docs/r2 ... ativeLibraries.html

-- Compiling Hadoop on CentOS

-- Before compiling Hadoop, first edit the following source file so the web UI displays timestamps in the local time zone
cd hadoop-2.4.1-src
vi ./hadoop-yarn-project/hadoop-yarn/hadoop-yarn-common/src/main/resources/webapps/static/yarn.dt.plugins.js

-- Change the following line
return new Date(parseInt(data)).toUTCString();

-- to:
return new Date(parseInt(data)).toLocaleString();
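
-- Alternatively, the same change can be made with a one-line sed command (a sketch, assuming the current directory is still hadoop-2.4.1-src):
sed -i 's/toUTCString()/toLocaleString()/' ./hadoop-yarn-project/hadoop-yarn/hadoop-yarn-common/src/main/resources/webapps/static/yarn.dt.plugins.js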



----------------------------------------------------------------------------------
Requirements:

* Unix System
* JDK 1.6+
* Maven 3.0 or later
* Findbugs 1.3.9 (if running findbugs)
* ProtocolBuffer 2.5.0
* CMake 2.6 or newer (if compiling native code)
* Internet connection for first build (to fetch all Maven and Hadoop dependencies)
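
-- On CentOS, most of the compiler toolchain listed above can be installed with yum (a minimal sketch, assuming CentOS 6; Maven 3, the JDK and protobuf are not in the base repositories and are installed separately, protobuf from source as shown below):
yum install -y gcc gcc-c++ make cmake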


-- Make sure the following packages are installed:
[root@funshion-hadoop192 hadoop-2.4.1-src]# rpm -qa|grep gzip
gzip-1.3.12-19.el6_4.x86_64
[root@funshion-hadoop192 hadoop-2.4.1-src]# rpm -qa|grep zlib
zlib-1.2.3-29.el6.x86_64
zlib-devel-1.2.3-29.el6.x86_64
zlib-static-1.2.3-29.el6.x86_64
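
-- If any of the gzip/zlib packages are missing, they can be installed from the standard repositories (a sketch; exact versions depend on your CentOS release, and zlib-static may live in an optional repository):
yum install -y gzip zlib zlib-devel zlib-static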


-- Install protobuf 2.5.0
-- Reference: http://www.cnblogs.com/Anker/p/3209764.html
mkdir -p /usr/local/protobuf
tar -xvf protobuf-2.5.0.tar.gz
cd protobuf-2.5.0
./configure --prefix=/usr/local/protobuf
make
make check
make install

-- Append the following two lines to /etc/profile:
vi + /etc/profile
export PATH=$PATH:/usr/local/protobuf/bin
export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:/usr/local/protobuf/lib

source /etc/profile
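
-- Verify that the newly installed protoc is picked up from the PATH; the Hadoop 2.4.1 build requires exactly this version:
protoc --version
-- expected output: libprotoc 2.5.0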

-- Start the build:
[root@funshion-hadoop192 hadoop-2.4.1-src]# mvn package -Pdist,native -DskipTests -Dtar

...

main:
     [exec] $ tar cf hadoop-2.4.1.tar hadoop-2.4.1
     [exec] $ gzip -f hadoop-2.4.1.tar
     [exec]
     [exec] Hadoop dist tar available at: /opt/software/hadoop-2.4.1-src/hadoop-dist/target/hadoop-2.4.1.tar.gz
     [exec]
[INFO] Executed tasks
[INFO]
[INFO] --- maven-javadoc-plugin:2.8.1:jar (module-javadocs) @ hadoop-dist ---
[INFO] Building jar: /opt/software/hadoop-2.4.1-src/hadoop-dist/target/hadoop-dist-2.4.1-javadoc.jar
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO]
[INFO] Apache Hadoop Main ................................ SUCCESS [  2.125 s]
[INFO] Apache Hadoop Project POM ......................... SUCCESS [  2.349 s]
[INFO] Apache Hadoop Annotations ......................... SUCCESS [  7.223 s]
[INFO] Apache Hadoop Assemblies .......................... SUCCESS [  0.494 s]
[INFO] Apache Hadoop Project Dist POM .................... SUCCESS [ 57.967 s]
[INFO] Apache Hadoop Maven Plugins ....................... SUCCESS [  5.738 s]
[INFO] Apache Hadoop MiniKDC ............................. SUCCESS [  6.730 s]
[INFO] Apache Hadoop Auth ................................ SUCCESS [01:24 min]
[INFO] Apache Hadoop Auth Examples ....................... SUCCESS [  5.100 s]
[INFO] Apache Hadoop Common .............................. SUCCESS [07:35 min]
[INFO] Apache Hadoop NFS ................................. SUCCESS [ 11.409 s]
[INFO] Apache Hadoop Common Project ...................... SUCCESS [  0.131 s]
[INFO] Apache Hadoop HDFS ................................ SUCCESS [15:32 min]
[INFO] Apache Hadoop HttpFS .............................. SUCCESS [ 34.246 s]
[INFO] Apache Hadoop HDFS BookKeeper Journal ............. SUCCESS [ 14.162 s]
[INFO] Apache Hadoop HDFS-NFS ............................ SUCCESS [  7.359 s]
[INFO] Apache Hadoop HDFS Project ........................ SUCCESS [  0.104 s]
[INFO] hadoop-yarn ....................................... SUCCESS [  0.093 s]
[INFO] hadoop-yarn-api ................................... SUCCESS [01:57 min]
[INFO] hadoop-yarn-common ................................ SUCCESS [ 50.802 s]
[INFO] hadoop-yarn-server ................................ SUCCESS [  0.063 s]
[INFO] hadoop-yarn-server-common ......................... SUCCESS [ 16.060 s]
[INFO] hadoop-yarn-server-nodemanager .................... SUCCESS [ 26.625 s]
[INFO] hadoop-yarn-server-web-proxy ...................... SUCCESS [  5.495 s]
[INFO] hadoop-yarn-server-applicationhistoryservice ...... SUCCESS [  9.923 s]
[INFO] hadoop-yarn-server-resourcemanager ................ SUCCESS [ 24.955 s]
[INFO] hadoop-yarn-server-tests .......................... SUCCESS [  1.274 s]
[INFO] hadoop-yarn-client ................................ SUCCESS [ 10.712 s]
[INFO] hadoop-yarn-applications .......................... SUCCESS [  0.080 s]
[INFO] hadoop-yarn-applications-distributedshell ......... SUCCESS [  4.875 s]
[INFO] hadoop-yarn-applications-unmanaged-am-launcher .... SUCCESS [  4.060 s]
[INFO] hadoop-yarn-site .................................. SUCCESS [  0.062 s]
[INFO] hadoop-yarn-project ............................... SUCCESS [  8.983 s]
[INFO] hadoop-mapreduce-client ........................... SUCCESS [  0.209 s]
[INFO] hadoop-mapreduce-client-core ...................... SUCCESS [ 36.397 s]
[INFO] hadoop-mapreduce-client-common .................... SUCCESS [ 29.828 s]
[INFO] hadoop-mapreduce-client-shuffle ................... SUCCESS [  5.075 s]
[INFO] hadoop-mapreduce-client-app ....................... SUCCESS [ 16.686 s]
[INFO] hadoop-mapreduce-client-hs ........................ SUCCESS [ 14.421 s]
[INFO] hadoop-mapreduce-client-jobclient ................. SUCCESS [ 11.174 s]
[INFO] hadoop-mapreduce-client-hs-plugins ................ SUCCESS [  4.673 s]
[INFO] Apache Hadoop MapReduce Examples .................. SUCCESS [ 12.380 s]
[INFO] hadoop-mapreduce .................................. SUCCESS [  7.728 s]
[INFO] Apache Hadoop MapReduce Streaming ................. SUCCESS [  9.712 s]
[INFO] Apache Hadoop Distributed Copy .................... SUCCESS [ 12.677 s]
[INFO] Apache Hadoop Archives ............................ SUCCESS [  4.079 s]
[INFO] Apache Hadoop Rumen ............................... SUCCESS [ 10.651 s]
[INFO] Apache Hadoop Gridmix ............................. SUCCESS [  8.470 s]
[INFO] Apache Hadoop Data Join ........................... SUCCESS [  5.563 s]
[INFO] Apache Hadoop Extras .............................. SUCCESS [  6.085 s]
[INFO] Apache Hadoop Pipes ............................... SUCCESS [ 13.207 s]
[INFO] Apache Hadoop OpenStack support ................... SUCCESS [ 10.080 s]
[INFO] Apache Hadoop Client .............................. SUCCESS [  9.622 s]
[INFO] Apache Hadoop Mini-Cluster ........................ SUCCESS [  0.245 s]
[INFO] Apache Hadoop Scheduler Load Simulator ............ SUCCESS [ 10.502 s]
[INFO] Apache Hadoop Tools Dist .......................... SUCCESS [  9.243 s]
[INFO] Apache Hadoop Tools ............................... SUCCESS [  0.086 s]
[INFO] Apache Hadoop Distribution ........................ SUCCESS [01:42 min]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 37:45 min
[INFO] Finished at: 2014-08-05T14:23:22+08:00
[INFO] Final Memory: 180M/917M
[INFO] ------------------------------------------------------------------------


-- Note: a build with the docs profile comes out at nearly 200 MB, while a build without docs is only about 120 MB
mvn package -Pdist,native,docs -DskipTests -Dtar
mvn package -Pdist,native -DskipTests -Dtar
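
-- A quick sanity check on the build result (a sketch; paths assume the mvn command above was run from hadoop-2.4.1-src): confirm the dist tarball exists and that the native library is a 64-bit ELF object.
ls -lh hadoop-dist/target/hadoop-2.4.1.tar.gz
file hadoop-dist/target/hadoop-2.4.1/lib/native/libhadoop.so.1.0.0
-- on a 64-bit build the second command should report "ELF 64-bit LSB shared object"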

-- ############################################################################################ --
--------------------------------------------------------------------------------------------------
-- LZO support:
-- Reference: http://www.iteblog.com/archives/992

wget http://www.oberhumer.com/opensource/lzo/download/lzo-2.06.tar.gz
tar -zxvf lzo-2.06.tar.gz
cd lzo-2.06
export CFLAGS=-m64
./configure --enable-shared --prefix=/usr/local/hadoop/lzo/
make && make install
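
-- After make install, the LZO headers and libraries should be under the chosen prefix (a sketch of what to expect):
ls /usr/local/hadoop/lzo/include/lzo /usr/local/hadoop/lzo/lib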


git clone https://github.com/twitter/hadoop-lzo.git
cd hadoop-lzo

export CFLAGS=-m64
export CXXFLAGS=-m64
export C_INCLUDE_PATH=/usr/local/hadoop/lzo/include
export LIBRARY_PATH=/usr/local/hadoop/lzo/lib
mvn clean package -Dmaven.test.skip=true
cd target/native/Linux-amd64-64
tar -cBf - -C lib . | tar -xBvf - -C ~
cp ~/libgplcompression* /usr/local/hadoop/lib/native/
cp /opt/software/hadoop-lzo/target/hadoop-lzo-0.4.20-SNAPSHOT.jar /usr/local/hadoop/share/hadoop/common/
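
-- Check that the native GPL compression library and the hadoop-lzo jar are where Hadoop will look for them (a sketch; /usr/local/hadoop is assumed to be $HADOOP_HOME, as in the cp commands above):
ls -l /usr/local/hadoop/lib/native/libgplcompression*
ls -l /usr/local/hadoop/share/hadoop/common/hadoop-lzo-*.jar
-- Note: to actually use LZO, the com.hadoop.compression.lzo.LzoCodec and LzopCodec classes still have to be registered via io.compression.codecs in core-site.xml; see the iteblog reference above for the full configuration.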







Native library package for Hadoop 2.4.1 on 64-bit CentOS
Link: http://pan.baidu.com/s/1eQioTd4  Password: 4pgb

64-bit compiled package of Hadoop 2.4.1
Link: http://pan.baidu.com/s/1mgxBN3U  Password:
(Guests: reply to this post to view the hidden password.)



32-bit compiled package of Hadoop 2.4.1
Link: http://pan.baidu.com/s/1mg2v4es  Password: krdx

Official source package (code not downloaded):
http://pan.baidu.com/s/1dDgphJz


More resources:
Consolidated downloads of jars and install packages for the Hadoop family, Storm, Spark, Linux, Flume, and more (continuously updated)



