
Hadoop 2.2.0 reports errors on a 64-bit OS: recompiling Hadoop and the problems encountered along the way


Environment: Oracle Linux 6.3 64-bit
Software involved: hadoop, cmake, maven, protobuf

Problem description

On a 64-bit Linux installation of Hadoop you run into warnings in many places that libhadoop.so.1.0.0 "might have disabled stack guard". The cause is that the bundled native library is 32-bit, so Hadoop has to be recompiled by hand.

Hadoop is 2.2.0 and the operating system is Oracle Linux 6.3 64-bit.
The example and the resolution process follow.

The problem encountered
[hadoop@hadoop01 input]$ hadoop dfs -put ./in

DEPRECATED: Use of this script to execute hdfs command is deprecated.

Instead use the hdfs command for it.



Java HotSpot(TM) 64-Bit Server VM warning: You have loaded library /app/hadoop/hadoop-2.2.0/lib/native/libhadoop.so.1.0.0 which might have disabled stack guard. The VM will try to fix the stack guard now.

It's highly recommended that you fix the library with 'execstack -c <libfile>', or link it with '-z noexecstack'.

13/10/24 04:08:55 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable

put: `in': No such file or directory



Check the native library file:
[hadoop@hadoop01 input]$ file /app/hadoop/hadoop-2.2.0/lib/native/libhadoop.so.1.0.0

/app/hadoop/hadoop-2.2.0/lib/native/libhadoop.so.1.0.0: ELF 32-bit LSB shared object, Intel 80386, version 1 (SYSV), dynamically linked, not stripped



It looks like a 32-bit vs 64-bit mismatch; see:
  1. http://mail-archives.apache.org/ ... -user/201208.mbox/%3C19AD42E3F64F0F468A305399D0DF39D92EA4521578@winops07.win.compete.com%3E
  2. http://www.mail-archive.com/comm ... e.org/msg52576.html
The operating system is 64-bit but the software is 32-bit. Bad news... the freshly installed cluster cannot be used as-is.
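A quick way to confirm the mismatch before rebuilding anything is to compare the machine architecture with the bitness of every bundled native library. This is a small check of my own, using the same /app/hadoop/hadoop-2.2.0 path as above:

# kernel architecture; x86_64 means a 64-bit OS
uname -m

# inspect all bundled native libraries; "ELF 32-bit" confirms the mismatch
file /app/hadoop/hadoop-2.2.0/lib/native/*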





Solution: recompile Hadoop
The solution is to rebuild the Hadoop software from source.

Download the source code
The machine must have Internet access. If it does not, you can fetch the source on a networked machine, but the build itself still downloads artifacts, so the best option is to do the work below on a networked machine of the same platform (a virtual machine is fine) and copy the result back afterwards.



# svn checkout 'http://svn.apache.org/repos/asf/hadoop/common/tags/release-2.2.0'



Everything is checked out to this directory:

[hadoop@hadoop01 hadoop]$ ls

BUILDING.txt       hadoop-common-project     hadoop-maven-plugins  hadoop-tools

dev-support        hadoop-dist               hadoop-minicluster    hadoop-yarn-project

hadoop-assemblies  hadoop-hdfs-project       hadoop-project        pom.xml

hadoop-client      hadoop-mapreduce-project  hadoop-project-dist

Set up the build environment
1. Required packages

[root@hadoop01 /]# yum install svn

[root@hadoop01 ~]# yum install autoconf automake libtool cmake

[root@hadoop01 ~]# yum install ncurses-devel

[root@hadoop01 ~]# yum install openssl-devel

[root@hadoop01 ~]# yum install gcc*

2. Install Maven

Download and unpack:

http://maven.apache.org/download.cgi
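A minimal sketch of fetching and unpacking the binary distribution; the exact archive URL is an assumption based on the usual Apache archive layout, so take whatever the download page above actually gives you:

# download and unpack Maven 3.1.1 (URL assumed)
wget http://archive.apache.org/dist/maven/maven-3/3.1.1/binaries/apache-maven-3.1.1-bin.tar.gz
tar -zxvf apache-maven-3.1.1-bin.tar.gz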



[root@hadoop01 stable]# mv apache-maven-3.1.1 /usr/local/

Add /usr/local/apache-maven-3.1.1/bin to the PATH environment variable.
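One way to do that, sketched here against /etc/profile (whether you use /etc/profile or a per-user profile is up to you; the post does not say):

# append Maven to PATH and reload the profile
echo 'export MAVEN_HOME=/usr/local/apache-maven-3.1.1' >> /etc/profile
echo 'export PATH=$PATH:$MAVEN_HOME/bin' >> /etc/profile
source /etc/profile

# verify that Maven is found
mvn -version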

3. Install protobuf



Without protobuf installed, the build cannot finish; it fails like this:

[INFO] --- hadoop-maven-plugins:2.2.0:protoc (compile-protoc) @ hadoop-common ---

[WARNING] [protoc, --version] failed: java.io.IOException: Cannot run program "protoc": error=2, No such file or directory

[ERROR] stdout: []

……………………

[INFO] Apache Hadoop Main................................ SUCCESS [5.672s]

[INFO] Apache Hadoop Project POM......................... SUCCESS [3.682s]

[INFO] Apache Hadoop Annotations......................... SUCCESS [8.921s]

[INFO] Apache Hadoop Assemblies.......................... SUCCESS [0.676s]

[INFO] Apache Hadoop Project Dist POM.................... SUCCESS [4.590s]

[INFO] Apache Hadoop Maven Plugins....................... SUCCESS [9.172s]

[INFO] Apache Hadoop Auth................................ SUCCESS [10.123s]

[INFO] Apache Hadoop Auth Examples....................... SUCCESS [5.170s]

[INFO] Apache Hadoop Common .............................. FAILURE [1.224s]

[INFO] Apache Hadoop NFS................................. SKIPPED

[INFO] Apache Hadoop Common Project...................... SKIPPED

[INFO] Apache Hadoop HDFS................................ SKIPPED

[INFO] Apache Hadoop HttpFS.............................. SKIPPED

[INFO] Apache Hadoop HDFS BookKeeper Journal ............. SKIPPED

[INFO] Apache Hadoop HDFS-NFS............................ SKIPPED

[INFO] Apache Hadoop HDFS Project........................ SKIPPED

Installing protobuf

Download: https://protobuf.googlecode.com/files/protobuf-2.5.0.tar.gz

https://code.google.com/p/protobuf/downloads/list
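The next prompt is already inside /soft/protobuf-2.5.0, so the tarball has presumably been unpacked under /soft first; a minimal sketch of that step (the /soft location is taken from the prompt below):

# unpack the protobuf source under /soft
cd /soft
tar -zxvf protobuf-2.5.0.tar.gz
cd protobuf-2.5.0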

[root@hadoop01 protobuf-2.5.0]# pwd

/soft/protobuf-2.5.0

Run the following commands in order:

./configure

make

make check

make install

[root@hadoop01 protobuf-2.5.0]# protoc --version

libprotoc 2.5.0
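If protoc instead complains that it cannot find libprotoc.so (not shown in this post, but a common symptom because make install puts the library under /usr/local/lib), refreshing the dynamic linker cache usually helps; the file name protobuf.conf below is just an illustrative choice:

# make /usr/local/lib visible to the dynamic linker, then refresh the cache
echo '/usr/local/lib' > /etc/ld.so.conf.d/protobuf.conf
ldconfig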



4. cmake errors

The cmake step reported an error:

main:

   [mkdir] Created dir: /soft/hadoop/hadoop-tools/hadoop-pipes/target/native

    [exec] -- The C compiler identification is GNU

    [exec] -- The CXX compiler identification is GNU

    [exec] -- Check for working C compiler: /usr/bin/gcc

    [exec] -- Check for working C compiler: /usr/bin/gcc -- works

    [exec] -- Detecting C compiler ABI info

    [exec] -- Detecting C compiler ABI info - done

    [exec] -- Check for working CXX compiler: /usr/bin/c++

    [exec] -- Check for working CXX compiler: /usr/bin/c++ -- works

    [exec] -- Detecting CXX compiler ABI info

    [exec] -- Detecting CXX compiler ABI info - done

    [exec] CMake Error at /usr/share/cmake/Modules/FindOpenSSL.cmake:66 (MESSAGE):

    [exec]   Could NOT find OpenSSL

    [exec] Call Stack (most recent call first):

    [exec]   CMakeLists.txt:20 (find_package)

    [exec]

    [exec]

    [exec] -- Configuring incomplete, errors occurred!

[INFO] Apache Hadoop Gridmix............................. SUCCESS [12.062s]

[INFO] Apache Hadoop Data Join........................... SUCCESS [8.694s]

[INFO] Apache Hadoop Extras.............................. SUCCESS [6.877s]

[INFO] Apache Hadoop Pipes .............................. FAILURE [5.295s]

[INFO] Apache Hadoop Tools Dist.......................... SKIPPED

[INFO] Apache Hadoop Tools............................... SKIPPED

[INFO] Apache Hadoop Distribution........................ SKIPPED

[INFO] Apache Hadoop Client.............................. SKIPPED

[INFO] Apache Hadoop Mini-Cluster........................ SKIPPED



The fix is to install:

[root@hadoop01 ~]# yum install ncurses-devel

[root@hadoop01 ~]# yum install openssl-devel
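After installing these two packages you do not have to start the whole build from scratch: Maven can resume from the failed module. The :hadoop-pipes id matches the hadoop-pipes path in the log above, but treat this as a hint of mine rather than part of the original walkthrough:

# resume the build from the module that failed
mvn package -Pdist,native -DskipTests -Dtar -rf :hadoop-pipes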



Compile Hadoop


[hadoop@hadoop01 hadoop]$ pwd

/soft/hadoop

[hadoop@hadoop01 hadoop]$ ls

BUILDING.txt       hadoop-client          hadoop-hdfs-project       hadoop-minicluster   hadoop-tools

dev-support        hadoop-common-project  hadoop-mapreduce-project  hadoop-project       hadoop-yarn-project

hadoop-assemblies  hadoop-dist            hadoop-maven-plugins      hadoop-project-dist  pom.xml

[hadoop@hadoop01 hadoop]$ mvn package -Pdist,native -DskipTests -Dtar



Compilation is quite a time-consuming job...
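If you want to shave some time off, one commonly used tweak (my addition, not from the post) is to skip javadoc generation as well:

# same build, but without generating javadoc
mvn package -Pdist,native -DskipTests -Dtar -Dmaven.javadoc.skip=true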



Below is the output of a successful build:

[INFO] Reactor Summary:

[INFO]

[INFO] Apache Hadoop Main................................ SUCCESS [6.600s]

[INFO] Apache Hadoop Project POM......................... SUCCESS [3.974s]

[INFO] Apache Hadoop Annotations......................... SUCCESS [9.878s]

[INFO] Apache Hadoop Assemblies.......................... SUCCESS [0.856s]

[INFO] Apache Hadoop Project Dist POM.................... SUCCESS [4.750s]

[INFO] Apache Hadoop Maven Plugins....................... SUCCESS [8.720s]

[INFO] Apache Hadoop Auth................................ SUCCESS [10.107s]

[INFO] Apache Hadoop Auth Examples....................... SUCCESS [5.734s]

[INFO] Apache Hadoop Common.............................. SUCCESS [4:32.636s]

[INFO] Apache Hadoop NFS................................. SUCCESS [29.700s]

[INFO] Apache Hadoop Common Project...................... SUCCESS [0.090s]

[INFO] Apache Hadoop HDFS................................ SUCCESS [6:15.394s]

[INFO] Apache Hadoop HttpFS.............................. SUCCESS [1:09.238s]

[INFO] Apache Hadoop HDFS BookKeeper Journal ............. SUCCESS [27.676s]

[INFO] Apache Hadoop HDFS-NFS............................ SUCCESS [13.954s]

[INFO] Apache Hadoop HDFS Project........................ SUCCESS [0.212s]

[INFO] hadoop-yarn....................................... SUCCESS [0.962s]

[INFO] hadoop-yarn-api................................... SUCCESS [1:48.066s]

[INFO] hadoop-yarn-common................................ SUCCESS [1:37.543s]

[INFO] hadoop-yarn-server................................ SUCCESS [4.301s]

[INFO] hadoop-yarn-server-common......................... SUCCESS [29.502s]

[INFO] hadoop-yarn-server-nodemanager.................... SUCCESS [36.593s]

[INFO] hadoop-yarn-server-web-proxy...................... SUCCESS [13.273s]

[INFO] hadoop-yarn-server-resourcemanager................ SUCCESS [30.612s]

[INFO] hadoop-yarn-server-tests.......................... SUCCESS [4.374s]

[INFO] hadoop-yarn-client................................ SUCCESS [14.115s]

[INFO] hadoop-yarn-applications.......................... SUCCESS [0.218s]

[INFO] hadoop-yarn-applications-distributedshell ......... SUCCESS [9.871s]

[INFO] hadoop-mapreduce-client........................... SUCCESS [1.095s]

[INFO] hadoop-mapreduce-client-core...................... SUCCESS [1:30.650s]

[INFO] hadoop-yarn-applications-unmanaged-am-launcher .... SUCCESS [15.089s]

[INFO] hadoop-yarn-site.................................. SUCCESS [0.637s]

[INFO] hadoop-yarn-project............................... SUCCESS [25.809s]

[INFO] hadoop-mapreduce-client-common.................... SUCCESS [45.919s]

[INFO] hadoop-mapreduce-client-shuffle................... SUCCESS [14.693s]

[INFO] hadoop-mapreduce-client-app....................... SUCCESS [39.562s]

[INFO] hadoop-mapreduce-client-hs........................ SUCCESS [19.299s]

[INFO] hadoop-mapreduce-client-jobclient................. SUCCESS [18.549s]

[INFO] hadoop-mapreduce-client-hs-plugins................ SUCCESS [5.134s]

[INFO] Apache Hadoop MapReduce Examples.................. SUCCESS [17.823s]

[INFO] hadoop-mapreduce.................................. SUCCESS [12.726s]

[INFO] Apache Hadoop MapReduce Streaming................. SUCCESS [19.760s]

[INFO] Apache Hadoop Distributed Copy.................... SUCCESS [33.332s]

[INFO] Apache Hadoop Archives............................ SUCCESS [9.522s]

[INFO] Apache Hadoop Rumen............................... SUCCESS [15.141s]

[INFO] Apache Hadoop Gridmix............................. SUCCESS [15.052s]

[INFO] Apache Hadoop Data Join........................... SUCCESS [8.621s]

[INFO] Apache Hadoop Extras.............................. SUCCESS [8.744s]

[INFO] Apache Hadoop Pipes............................... SUCCESS [28.645s]

[INFO] Apache Hadoop Tools Dist.......................... SUCCESS [6.238s]

[INFO] Apache Hadoop Tools............................... SUCCESS [0.126s]

[INFO] Apache Hadoop Distribution........................ SUCCESS [1:20.132s]

[INFO] Apache Hadoop Client.............................. SUCCESS [18.820s]

[INFO] Apache Hadoop Mini-Cluster........................ SUCCESS [2.151s]

[INFO] ------------------------------------------------------------------------

[INFO] BUILD SUCCESS

[INFO] ------------------------------------------------------------------------

[INFO] Total time: 29:07.811s

[INFO] Finished at: Thu Oct 24 09:43:18 CST 2013

[INFO] Final Memory: 78M/239M

[INFO] ------------------------------------------------------------------------
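The post jumps straight to re-running the command, so here is a hedged sketch of putting the newly built native libraries in place. With -Pdist,native -Dtar the packaged distribution normally ends up under hadoop-dist/target; the exact paths below (including the /app/hadoop/hadoop-2.2.0 install location from earlier) are assumptions, so check them on your machine:

# back up the 32-bit native libraries in the installed cluster
mv /app/hadoop/hadoop-2.2.0/lib/native /app/hadoop/hadoop-2.2.0/lib/native.32bit

# copy in the 64-bit libraries produced by the build
cp -r /soft/hadoop/hadoop-dist/target/hadoop-2.2.0/lib/native /app/hadoop/hadoop-2.2.0/lib/

# confirm the installed library is now 64-bit
file /app/hadoop/hadoop-2.2.0/lib/native/libhadoop.so.1.0.0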

Run the same command again with the recompiled software:
[hadoop@hadoop01 input]$ hadoop dfs -put ./in

DEPRECATED: Use of this script to execute hdfs command is deprecated.

Instead use the hdfs command for it.



13/10/24 15:12:53 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable

put: `in': No such file or directory
