Hadoop 2.2.0 errors on 64-bit platforms: recompiling Hadoop and the problems encountered along the way
Environment: Oracle Linux 6.3, 64-bit
Keywords: hadoop, cmake, maven, protobuf
Problem description
Hadoop installed on 64-bit Linux hits warnings such as "libhadoop.so.1.0.0 which might have disabled stack guard" in many places. The cause is that the bundled native library is 32-bit, so Hadoop has to be recompiled by hand.
Hadoop is 2.2.0; the operating system is Oracle Linux 6.3, 64-bit.
Below is a concrete example and the resolution process.
The problem encountered
$ hadoop dfs -put ./in
DEPRECATED: Use of this script to execute hdfs command is deprecated.
Instead use the hdfs command for it.
Java HotSpot(TM) 64-Bit Server VM warning: You have loaded library /app/hadoop/hadoop-2.2.0/lib/native/libhadoop.so.1.0.0 which might have disabled stack guard. The VM will try to fix the stack guard now.
It's highly recommended that you fix the library with 'execstack -c <libfile>', or link it with '-z noexecstack'.
13/10/24 04:08:55 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
put: `in': No such file or directory
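As an aside, the execstack hint in the warning only clears the executable-stack flag on the library; it would not fix the architecture mismatch diagnosed below. A sketch, in case you only want to silence that warning (execstack ships in the prelink package on RHEL-family systems, an assumption for Oracle Linux 6.3):
# execstack -c /app/hadoop/hadoop-2.2.0/lib/native/libhadoop.so.1.0.0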
Inspect the local library
$ file /app/hadoop/hadoop-2.2.0/lib/native/libhadoop.so.1.0.0
/app/hadoop/hadoop-2.2.0/lib/native/libhadoop.so.1.0.0: ELF 32-bit LSB shared object, Intel 80386, version 1 (SYSV), dynamically linked, not stripped
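A quick cross-check against the kernel architecture confirms the mismatch (a sketch using standard tools; on this 64-bit Oracle Linux box uname should print x86_64):
$ uname -m
x86_64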
This looks like a 32-bit vs. 64-bit mismatch; see:
http://mail-archives.apache.org/ ... -user/201208.mbox/%3C19AD42E3F64F0F468A305399D0DF39D92EA4521578@winops07.win.compete.com%3E
http://www.mail-archive.com/comm ... e.org/msg52576.html
The operating system is 64-bit but the software is 32-bit. Bad luck... the freshly installed cluster is unusable.
Solution: recompile Hadoop
The fix is to recompile the Hadoop software.
Download the source code
The machine must have internet access. Even if you download the source on another machine, the build itself still fetches dependencies, so if the build machine cannot be connected, the most practical option is to find an internet-connected machine of the same platform (a virtual machine is fine), do the work below there, and copy the result back.
# svn checkout 'http://svn.apache.org/repos/asf/hadoop/common/tags/release-2.2.0'
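If svn access is not available, the same source can be fetched as a release tarball (a sketch; the URL points at the Apache archive and should be verified against its index):
# wget http://archive.apache.org/dist/hadoop/common/hadoop-2.2.0/hadoop-2.2.0-src.tar.gz
# tar -xzf hadoop-2.2.0-src.tar.gz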
Everything is downloaded to this directory:
$ ls
BUILDING.txt hadoop-common-project hadoop-maven-plugins hadoop-tools
dev-support hadoop-dist hadoop-minicluster hadoop-yarn-project
hadoop-assemblies hadoop-hdfs-project hadoop-project pom.xml
hadoop-client hadoop-mapreduce-project hadoop-project-dist
Set up the development environment
1. Required packages
# yum install svn
# yum install autoconf automake libtool cmake
[root@hadoop01 ~]# yum install ncurses-devel
[root@hadoop01 ~]# yum install openssl-devel
[root@hadoop01 ~]# yum install gcc*
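Depending on the profiles used, the native build may also want the zlib headers (an assumption based on BUILDING.txt for this release):
# yum install zlib-devel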
2. Install Maven
Download and unpack:
http://maven.apache.org/download.cgi
# mv apache-maven-3.1.1 /usr/local/
Add /usr/local/apache-maven-3.1.1/bin to the PATH environment variable, as sketched below.
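A minimal sketch, assuming a bash shell and the /usr/local/apache-maven-3.1.1 location from the step above. Append to /etc/profile (or ~/.bash_profile):
export MAVEN_HOME=/usr/local/apache-maven-3.1.1
export PATH=$MAVEN_HOME/bin:$PATH
Then reload and verify:
# source /etc/profile
# mvn -version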
3. Install protobuf
Without protobuf installed, the build cannot finish; it fails like this:
--- hadoop-maven-plugins:2.2.0:protoc (compile-protoc) @ hadoop-common ---
failed: java.io.IOException: Cannot run program "protoc": error=2, No such file or directory
stdout: []
……………………
Apache Hadoop Main................................ SUCCESS
Apache Hadoop Project POM......................... SUCCESS
Apache Hadoop Annotations......................... SUCCESS
Apache Hadoop Assemblies.......................... SUCCESS
Apache Hadoop Project Dist POM.................... SUCCESS
Apache Hadoop Maven Plugins....................... SUCCESS
Apache Hadoop Auth................................ SUCCESS
Apache Hadoop Auth Examples....................... SUCCESS
Apache Hadoop Common .............................. FAILURE
Apache Hadoop NFS................................. SKIPPED
Apache Hadoop Common Project...................... SKIPPED
Apache Hadoop HDFS................................ SKIPPED
Apache Hadoop HttpFS.............................. SKIPPED
Apache Hadoop HDFS BookKeeper Journal ............. SKIPPED
Apache Hadoop HDFS-NFS............................ SKIPPED
Apache Hadoop HDFS Project........................ SKIPPED
Installing protobuf
Download: https://protobuf.googlecode.com/files/protobuf-2.5.0.tar.gz
https://code.google.com/p/protobuf/downloads/list
# pwd
/soft/protobuf-2.5.0
Run the following commands in order:
./configure
make
make check
make install
# protoc --version
libprotoc 2.5.0
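If protoc instead fails to start because it cannot find the libprotoc shared library, the usual cause is that make install put the libraries under /usr/local/lib, which the dynamic linker may not search by default. A sketch, assuming configure's default prefix:
# echo '/usr/local/lib' > /etc/ld.so.conf.d/protobuf.conf
# ldconfig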
4. CMake errors
During the build, CMake reported an error:
main:
Created dir: /soft/hadoop/hadoop-tools/hadoop-pipes/target/native
-- The C compiler identification is GNU
-- The CXX compiler identification is GNU
-- Check for working C compiler: /usr/bin/gcc
-- Check for working C compiler: /usr/bin/gcc -- works
-- Detecting C compiler ABI info
-- Detecting C compiler ABI info - done
-- Check for working CXX compiler: /usr/bin/c++
-- Check for working CXX compiler: /usr/bin/c++ -- works
-- Detecting CXX compiler ABI info
-- Detecting CXX compiler ABI info - done
CMake Error at /usr/share/cmake/Modules/FindOpenSSL.cmake:66 (MESSAGE):
Could NOT find OpenSSL
Call Stack (most recent call first):
CMakeLists.txt:20 (find_package)
-- Configuring incomplete, errors occurred!
Apache Hadoop Gridmix............................. SUCCESS
Apache Hadoop Data Join........................... SUCCESS
Apache Hadoop Extras.............................. SUCCESS
Apache Hadoop Pipes ............................... FAILURE
Apache Hadoop Tools Dist.......................... SKIPPED
Apache Hadoop Tools............................... SKIPPED
Apache Hadoop Distribution........................ SKIPPED
Apache Hadoop Client.............................. SKIPPED
Apache Hadoop Mini-Cluster........................ SKIPPED
The fix is to install the missing development packages:
[root@hadoop01 ~]# yum install ncurses-devel
[root@hadoop01 ~]# yum install openssl-devel
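After installing the packages, the build does not have to start over; Maven can resume from the failed module (a sketch using Maven's --resume-from option, with the module id taken from the failure above):
$ mvn package -Pdist,native -DskipTests -Dtar -rf :hadoop-pipes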
Compile Hadoop
$ pwd
/soft/hadoop
$ ls
BUILDING.txt hadoop-client hadoop-hdfs-project hadoop-minicluster hadoop-tools
dev-support hadoop-common-project hadoop-mapreduce-project hadoop-project hadoop-yarn-project
hadoop-assemblies hadoop-dist hadoop-maven-plugins hadoop-project-dist pom.xml
$ mvn package -Pdist,native -DskipTests -Dtar
Compiling is a time-consuming job (-Pdist,native builds the distribution layout plus the native C libraries, -DskipTests skips the test suite, and -Dtar additionally produces a .tar.gz under hadoop-dist/target).
Here is the output of a successful build:
Reactor Summary:
Apache Hadoop Main................................ SUCCESS
Apache Hadoop Project POM......................... SUCCESS
Apache Hadoop Annotations......................... SUCCESS
Apache Hadoop Assemblies.......................... SUCCESS
Apache Hadoop Project Dist POM.................... SUCCESS
Apache Hadoop Maven Plugins....................... SUCCESS
Apache Hadoop Auth................................ SUCCESS
Apache Hadoop Auth Examples....................... SUCCESS
Apache Hadoop Common.............................. SUCCESS
Apache Hadoop NFS................................. SUCCESS
Apache Hadoop Common Project...................... SUCCESS
Apache Hadoop HDFS................................ SUCCESS
Apache Hadoop HttpFS.............................. SUCCESS
Apache Hadoop HDFS BookKeeper Journal ............. SUCCESS
Apache Hadoop HDFS-NFS............................ SUCCESS
Apache Hadoop HDFS Project........................ SUCCESS
hadoop-yarn....................................... SUCCESS
hadoop-yarn-api................................... SUCCESS
hadoop-yarn-common................................ SUCCESS
hadoop-yarn-server................................ SUCCESS
hadoop-yarn-server-common......................... SUCCESS
hadoop-yarn-server-nodemanager.................... SUCCESS
hadoop-yarn-server-web-proxy...................... SUCCESS
hadoop-yarn-server-resourcemanager................ SUCCESS
hadoop-yarn-server-tests.......................... SUCCESS
hadoop-yarn-client................................ SUCCESS
hadoop-yarn-applications.......................... SUCCESS
hadoop-yarn-applications-distributedshell ......... SUCCESS
hadoop-mapreduce-client........................... SUCCESS
hadoop-mapreduce-client-core...................... SUCCESS
hadoop-yarn-applications-unmanaged-am-launcher .... SUCCESS
hadoop-yarn-site.................................. SUCCESS
hadoop-yarn-project............................... SUCCESS
hadoop-mapreduce-client-common.................... SUCCESS
hadoop-mapreduce-client-shuffle................... SUCCESS
hadoop-mapreduce-client-app....................... SUCCESS
hadoop-mapreduce-client-hs........................ SUCCESS
hadoop-mapreduce-client-jobclient................. SUCCESS
hadoop-mapreduce-client-hs-plugins................ SUCCESS
Apache Hadoop MapReduce Examples.................. SUCCESS
hadoop-mapreduce.................................. SUCCESS
Apache Hadoop MapReduce Streaming................. SUCCESS
Apache Hadoop Distributed Copy.................... SUCCESS
Apache Hadoop Archives............................ SUCCESS
Apache Hadoop Rumen............................... SUCCESS
Apache Hadoop Gridmix............................. SUCCESS
Apache Hadoop Data Join........................... SUCCESS
Apache Hadoop Extras.............................. SUCCESS
Apache Hadoop Pipes............................... SUCCESS
Apache Hadoop Tools Dist.......................... SUCCESS
Apache Hadoop Tools............................... SUCCESS
Apache Hadoop Distribution........................ SUCCESS
Apache Hadoop Client.............................. SUCCESS
Apache Hadoop Mini-Cluster........................ SUCCESS
------------------------------------------------------------------------
BUILD SUCCESS
------------------------------------------------------------------------
Total time: 29:07.811s
Finished at: Thu Oct 24 09:43:18 CST 2013
Final Memory: 78M/239M
------------------------------------------------------------------------
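With the build finished, the 64-bit native libraries have to be put where the running installation looks for them. A sketch, assuming the standard -Pdist output location under the source tree and the /app/hadoop install path used earlier:
$ mv /app/hadoop/hadoop-2.2.0/lib/native /app/hadoop/hadoop-2.2.0/lib/native.32bit
$ cp -r /soft/hadoop/hadoop-dist/target/hadoop-2.2.0/lib/native /app/hadoop/hadoop-2.2.0/lib/
$ file /app/hadoop/hadoop-2.2.0/lib/native/libhadoop.so.1.0.0
The file command should now report a 64-bit ELF object.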
Run it again with the recompiled software:
$ hadoop dfs -put ./in
DEPRECATED: Use of this script to execute hdfs command is deprecated.
Instead use the hdfs command for it.
13/10/24 15:12:53 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
put: `in': No such file or directory
Resolving the Hadoop 2.2.0 NativeLibraries error
Problem description: while testing a freshly installed Hadoop, I ran into the problem below. Hadoop is 2.2.0; the operating system is Oracle Linux 6.3, 64-bit.
$ hadoop dfs -put ./in
DEPRECATED: Use of this script to execute hdfs command is deprecated.
Instead use the hdfs command for it.
13/10/24 15:12:53 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
put: `in': No such file or directory
Set the last line, "put: `in': No such file or directory", aside for now; that is just a problem with the command syntax.
First fix "WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable".
Note: my Hadoop environment is self-compiled, because the 2.2.0 release apparently ships only 32-bit native libraries and my operating system is 64-bit. For the 64-bit build, see:
http://blog.csdn.net/bamuta/article/details/13506893
Resolution process
1. Enable debug logging
$ export HADOOP_ROOT_LOGGER=DEBUG,console
$ hadoop dfs -put ./in
DEPRECATED: Use of this script to execute hdfs command is deprecated.
Instead use the hdfs command for it.
13/10/24 16:11:31 DEBUG util.Shell: setsid exited with exit code 0
13/10/24 16:11:31 DEBUG lib.MutableMetricsFactory: field org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.UserGroupInformation$UgiMetrics.loginSuccess with annotation @org.apache.hadoop.metrics2.annotation.Metric(valueName=Time, value=, about=, type=DEFAULT, always=false, sampleName=Ops)
13/10/24 16:11:31 DEBUG lib.MutableMetricsFactory: field org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.UserGroupInformation$UgiMetrics.loginFailure with annotation @org.apache.hadoop.metrics2.annotation.Metric(valueName=Time, value=, about=, type=DEFAULT, always=false, sampleName=Ops)
13/10/24 16:11:31 DEBUG impl.MetricsSystemImpl: UgiMetrics, User and group related metrics
13/10/24 16:11:32 DEBUG security.Groups: Creating new Groups object
13/10/24 16:11:32 DEBUG util.NativeCodeLoader: Trying to load the custom-built native-hadoop library...
13/10/24 16:11:32 DEBUG util.NativeCodeLoader: Failed to load native-hadoop with error: java.lang.UnsatisfiedLinkError: no hadoop in java.library.path
13/10/24 16:11:32 DEBUG util.NativeCodeLoader: java.library.path=/usr/java/jdk1.7.0_45/lib:/app/hadoop/hadoop-2.2.0/lib/native:/app/hadoop/hadoop-2.2.0/lib/native
13/10/24 16:11:32 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
13/10/24 16:11:32 DEBUG security.JniBasedUnixGroupsMappingWithFallback: Falling back to shell based
13/10/24 16:11:32 DEBUG security.JniBasedUnixGroupsMappingWithFallback: Group mapping impl=org.apache.hadoop.security.ShellBasedUnixGroupsMapping
13/10/24 16:11:32 DEBUG security.Groups: Group mapping impl=org.apache.hadoop.security.JniBasedUnixGroupsMappingWithFallback; cacheTimeout=300000
13/10/24 16:11:32 DEBUG security.UserGroupInformation: hadoop login
13/10/24 16:11:32 DEBUG security.UserGroupInformation: hadoop login commit
13/10/24 16:11:32 DEBUG security.UserGroupInformation: using local user: UnixPrincipal: hadoop
13/10/24 16:11:32 DEBUG security.UserGroupInformation: UGI loginUser: hadoop (auth:SIMPLE)
13/10/24 16:11:33 DEBUG hdfs.BlockReaderLocal: dfs.client.use.legacy.blockreader.local = false
13/10/24 16:11:33 DEBUG hdfs.BlockReaderLocal: dfs.client.read.shortcircuit = false
13/10/24 16:11:33 DEBUG hdfs.BlockReaderLocal: dfs.client.domain.socket.data.traffic = false
13/10/24 16:11:33 DEBUG hdfs.BlockReaderLocal: dfs.domain.socket.path =
13/10/24 16:11:33 DEBUG impl.MetricsSystemImpl: StartupProgress, NameNode startup progress
13/10/24 16:11:33 DEBUG retry.RetryUtils: multipleLinearRandomRetry = null
13/10/24 16:11:33 DEBUG ipc.Server: rpcKind=RPC_PROTOCOL_BUFFER, rpcRequestWrapperClass=class org.apache.hadoop.ipc.ProtobufRpcEngine$RpcRequestWrapper, rpcInvoker=org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker@2e41d9a2
13/10/24 16:11:34 DEBUG hdfs.BlockReaderLocal: Both short-circuit local reads and UNIX domain socket are disabled.
13/10/24 16:11:34 DEBUG ipc.Client: The ping interval is 60000 ms.
13/10/24 16:11:34 DEBUG ipc.Client: Connecting to localhost/127.0.0.1:8020
13/10/24 16:11:34 DEBUG ipc.Client: IPC Client (2141757401) connection to localhost/127.0.0.1:8020 from hadoop: starting, having connections 1
13/10/24 16:11:34 DEBUG ipc.Client: IPC Client (2141757401) connection to localhost/127.0.0.1:8020 from hadoop sending #0
13/10/24 16:11:34 DEBUG ipc.Client: IPC Client (2141757401) connection to localhost/127.0.0.1:8020 from hadoop got value #0
13/10/24 16:11:34 DEBUG ipc.ProtobufRpcEngine: Call: getFileInfo took 82ms
13/10/24 16:11:34 DEBUG ipc.Client: IPC Client (2141757401) connection to localhost/127.0.0.1:8020 from hadoop sending #1
13/10/24 16:11:34 DEBUG ipc.Client: IPC Client (2141757401) connection to localhost/127.0.0.1:8020 from hadoop got value #1
13/10/24 16:11:34 DEBUG ipc.ProtobufRpcEngine: Call: getFileInfo took 4ms
put: `.': No such file or directory
13/10/24 16:11:34 DEBUG ipc.Client: Stopping client
13/10/24 16:11:34 DEBUG ipc.Client: IPC Client (2141757401) connection to localhost/127.0.0.1:8020 from hadoop: closed
13/10/24 16:11:34 DEBUG ipc.Client: IPC Client (2141757401) connection to localhost/127.0.0.1:8020 from hadoop: stopped, remaining connections 0
The key error in the debug output above:
Failed to load native-hadoop with error: java.lang.UnsatisfiedLinkError: no hadoop in java.library.path
I tried many things to fix this error, most of them changes to environment variables, and got nowhere.
2. The final fix
In the end I read the official Native Libraries documentation carefully:
http://hadoop.apache.org/docs/r2 ... ativeLibraries.html
Either download a hadoop release, which will include a pre-built version of the native hadoop library, or build your own version of the native hadoop library. Whether you download or build, the name for the library is the same: libhadoop.so
So the library is expected to be named libhadoop.so. Checking my directory, I only had libhadoop.so.1.0.0. In the officially built release there is indeed a libhadoop.so file, but it is just a symlink, so I did the same:
$ ln -s libhadoop.so.1.0.0 libhadoop.so
$ ln -s libhdfs.so.0.0.0 libhdfs.so
That solved the problem:
$ hadoop dfs -put ./in
DEPRECATED: Use of this script to execute hdfs command is deprecated.
Instead use the hdfs command for it.
put: `.': No such file or directory
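Two closing notes, offered as hedged suggestions rather than as part of the original fix. An alternative to the symlinks is pointing the JVM at the native directory explicitly in etc/hadoop/hadoop-env.sh (assuming HADOOP_HOME is set in the environment):
export HADOOP_OPTS="$HADOOP_OPTS -Djava.library.path=$HADOOP_HOME/lib/native"
Hadoop 2.x also ships a checker for verifying that the native libraries actually load:
$ hadoop checknative -a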