hadoop+hbase HA cluster: master fails to start

Harbin_LN posted on 2015-6-3 16:07:16
Using hadoop 2.5.2 and hbase 0.98.12.1.
hbase-site.xml
<configuration>
        <property>
                <name>hbase.rootdir</name>
                <value>hdfs://hadoop-all:8020/hbase</value>
        </property>
        <property>
                <name>hbase.cluster.distributed</name>
                <value>true</value>
        </property>
        <property>
                <name>hbase.master</name>
                <value>server6:60000</value>
        </property>
        <property>
                <name>hbase.zookeeper.quorum</name>
                <value>server1:2181,server2:2181,server3:2181</value>
        </property>
        <property>
                <name>hbase.regionserver.restart.on.zk.expire</name>
                <value>true</value>
        </property>
</configuration>
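One thing worth checking with a setup like this: `hbase.rootdir` above uses the HA-style authority `hadoop-all`, which HBase can only resolve if the cluster's `core-site.xml` and `hdfs-site.xml` are visible on its classpath. Two common ways to arrange that (the install paths here are assumptions, adjust to your layout):

```shell
# In hbase-env.sh: put the hadoop conf directory on HBase's classpath ...
export HBASE_CLASSPATH=/usr/local/hadoop/etc/hadoop

# ... or, alternatively, link the two files into hbase/conf directly:
ln -s /usr/local/hadoop/etc/hadoop/core-site.xml /usr/local/hbase/conf/
ln -s /usr/local/hadoop/etc/hadoop/hdfs-site.xml /usr/local/hbase/conf/
```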

The hadoop jars under hbase/lib were version 2.2 as bundled with the hbase build, so I replaced them with the 2.5.2 jars from the hadoop/share directory.
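That jar swap can be sketched as a script. The scratch directories below only stand in for the real install paths (which vary by layout, e.g. /usr/local/hbase/lib and /usr/local/hadoop/share/hadoop):

```shell
# Scratch-directory demo of the jar swap; substitute your real
# hbase/lib and hadoop/share/hadoop paths in an actual install.
hbase_lib=$(mktemp -d)
hadoop_share=$(mktemp -d)
touch "$hbase_lib/hadoop-common-2.2.0.jar" "$hbase_lib/hadoop-hdfs-2.2.0.jar"
mkdir -p "$hadoop_share/common" "$hadoop_share/hdfs"
touch "$hadoop_share/common/hadoop-common-2.5.2.jar"
touch "$hadoop_share/hdfs/hadoop-hdfs-2.5.2.jar"

# The swap: remove the hadoop 2.2 jars HBase shipped with, then copy in
# every matching 2.5.2 jar from the hadoop distribution (skip test jars).
rm -f "$hbase_lib"/hadoop-*2.2*.jar
find "$hadoop_share" -name 'hadoop-*2.5.2.jar' ! -name '*test*' \
    -exec cp {} "$hbase_lib"/ \;

ls "$hbase_lib"
```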

hbase-daemon.sh start master fails with an error.
Partial log:
2015-06-03 14:45:00,418 DEBUG [main] security.UserGroupInformation: hadoop login
2015-06-03 14:45:00,419 DEBUG [main] security.UserGroupInformation: hadoop login commit
2015-06-03 14:45:00,427 DEBUG [main] security.UserGroupInformation: using local user:UnixPrincipal: hadoop
2015-06-03 14:45:00,429 DEBUG [main] security.UserGroupInformation: UGI loginUser:hadoop (auth:SIMPLE)
2015-06-03 14:45:00,663 DEBUG [main] hdfs.BlockReaderLocal: dfs.client.use.legacy.blockreader.local = false
2015-06-03 14:45:00,663 DEBUG [main] hdfs.BlockReaderLocal: dfs.client.read.shortcircuit = false
2015-06-03 14:45:00,663 DEBUG [main] hdfs.BlockReaderLocal: dfs.client.domain.socket.data.traffic = false
2015-06-03 14:45:00,663 DEBUG [main] hdfs.BlockReaderLocal: dfs.domain.socket.path =
2015-06-03 14:45:00,712 DEBUG [main] hdfs.NameNodeProxies: Couldn't create proxy provider null
java.lang.RuntimeException: java.lang.RuntimeException: java.lang.ClassNotFoundException: Class org.apache.Hadoop.hdfs.server.namenode.ha.ConfiguredFailoverProxyProvider not found
        at org.apache.hadoop.conf.Configuration.getClass(Configuration.java:1937)
        at org.apache.hadoop.hdfs.NameNodeProxies.getFailoverProxyProviderClass(NameNodeProxies.java:427)
        at org.apache.hadoop.hdfs.NameNodeProxies.createFailoverProxyProvider(NameNodeProxies.java:453)
        at org.apache.hadoop.hdfs.NameNodeProxies.createProxy(NameNodeProxies.java:148)
        at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:602)
        at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:547)
        at org.apache.hadoop.hdfs.DistributedFileSystem.initialize(DistributedFileSystem.java:139)
        at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:2591)
        at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:89)
        at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:2625)
        at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:2607)
        at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:368)
        at org.apache.hadoop.fs.Path.getFileSystem(Path.java:296)
        at org.apache.hadoop.hbase.util.FSUtils.getRootDir(FSUtils.java:942)
        at org.apache.hadoop.hbase.master.HMaster.<init>(HMaster.java:532)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
        at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
        at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
        at org.apache.hadoop.hbase.master.HMaster.constructMaster(HMaster.java:3010)
        at org.apache.hadoop.hbase.master.HMasterCommandLine.startMaster(HMasterCommandLine.java:193)
        at org.apache.hadoop.hbase.master.HMasterCommandLine.run(HMasterCommandLine.java:135)
        at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
        at org.apache.hadoop.hbase.util.ServerCommandLine.doMain(ServerCommandLine.java:126)
        at org.apache.hadoop.hbase.master.HMaster.main(HMaster.java:3029)
Caused by: java.lang.RuntimeException: java.lang.ClassNotFoundException: Class org.apache.Hadoop.hdfs.server.namenode.ha.ConfiguredFailoverProxyProvider not found
        at org.apache.hadoop.conf.Configuration.getClass(Configuration.java:1905)
        at org.apache.hadoop.conf.Configuration.getClass(Configuration.java:1929)
        ... 24 more
Caused by: java.lang.ClassNotFoundException: Class org.apache.Hadoop.hdfs.server.namenode.ha.ConfiguredFailoverProxyProvider not found
        at org.apache.hadoop.conf.Configuration.getClassByName(Configuration.java:1811)
        at org.apache.hadoop.conf.Configuration.getClass(Configuration.java:1903)
        ... 25 more
2015-06-03 14:45:00,717 ERROR [main] master.HMasterCommandLine: Master exiting
java.lang.RuntimeException: Failed construction of Master: class org.apache.hadoop.hbase.master.HMaster
        at org.apache.hadoop.hbase.master.HMaster.constructMaster(HMaster.java:3015)
        at org.apache.hadoop.hbase.master.HMasterCommandLine.startMaster(HMasterCommandLine.java:193)
        at org.apache.hadoop.hbase.master.HMasterCommandLine.run(HMasterCommandLine.java:135)
        at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
        at org.apache.hadoop.hbase.util.ServerCommandLine.doMain(ServerCommandLine.java:126)
        at org.apache.hadoop.hbase.master.HMaster.main(HMaster.java:3029)
Caused by: java.lang.RuntimeException: java.lang.RuntimeException: java.lang.ClassNotFoundException: Class org.apache.Hadoop.hdfs.server.namenode.ha.ConfiguredFailoverProxyProvider not found
        at org.apache.hadoop.conf.Configuration.getClass(Configuration.java:1937)
        at org.apache.hadoop.hdfs.NameNodeProxies.getFailoverProxyProviderClass(NameNodeProxies.java:427)
        at org.apache.hadoop.hdfs.NameNodeProxies.createFailoverProxyProvider(NameNodeProxies.java:453)
        at org.apache.hadoop.hdfs.NameNodeProxies.createProxy(NameNodeProxies.java:148)
        at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:602)
        at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:547)
        at org.apache.hadoop.hdfs.DistributedFileSystem.initialize(DistributedFileSystem.java:139)
        at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:2591)
        at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:89)
        at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:2625)
        at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:2607)
        at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:368)
        at org.apache.hadoop.fs.Path.getFileSystem(Path.java:296)
        at org.apache.hadoop.hbase.util.FSUtils.getRootDir(FSUtils.java:942)
        at org.apache.hadoop.hbase.master.HMaster.<init>(HMaster.java:532)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
        at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
        at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
        at org.apache.hadoop.hbase.master.HMaster.constructMaster(HMaster.java:3010)
        ... 5 more
Caused by: java.lang.RuntimeException: java.lang.ClassNotFoundException: Class org.apache.Hadoop.hdfs.server.namenode.ha.ConfiguredFailoverProxyProvider not found
        at org.apache.hadoop.conf.Configuration.getClass(Configuration.java:1905)
        at org.apache.hadoop.conf.Configuration.getClass(Configuration.java:1929)
        ... 24 more
Caused by: java.lang.ClassNotFoundException: Class org.apache.Hadoop.hdfs.server.namenode.ha.ConfiguredFailoverProxyProvider not found
        at org.apache.hadoop.conf.Configuration.getClassByName(Configuration.java:1811)
        at org.apache.hadoop.conf.Configuration.getClass(Configuration.java:1903)
        ... 25 more
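One detail in the log worth a second look: the class it cannot find is spelled `org.apache.Hadoop.hdfs...` with a capital "H", while the real Hadoop class name is all lowercase. That string comes straight out of configuration (`Configuration.getClass`), so a likely culprit is a typo in the failover proxy provider property in the hdfs-site.xml that HBase reads. A sketch of the expected property, assuming the nameservice is named `hadoop-all`:

```xml
<property>
        <name>dfs.client.failover.proxy.provider.hadoop-all</name>
        <!-- all-lowercase "hadoop": a capital "Hadoop" in this value
             produces exactly the ClassNotFoundException above -->
        <value>org.apache.hadoop.hdfs.server.namenode.ha.ConfiguredFailoverProxyProvider</value>
</property>
```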

3 replies
starrycheng posted on 2015-6-3 16:26:08
Check the configuration below:
<property>
        <name>hbase.rootdir</name>
        <value>hdfs://Cluster/hbase</value><!-- Cluster is the name of the hadoop HA nameservice -->
</property>

<property>
        <name>hbase.master</name>
        <value>Cluster:9000</value>
</property>


Harbin_LN posted on 2015-6-4 10:25:26
Quoting starrycheng (2015-6-3 16:26): "Check the configuration below ... hbase.rootdir"

That part of the configuration is fine. It feels like some environment variable is missing, but I can't find which one.

starrycheng posted on 2015-6-8 15:53:11
Quoting Harbin_LN (2015-6-4 10:25): "That part of the configuration is fine. It feels like some environment variable is missing, but I can't find which one."

<property>
        <name>hbase.rootdir</name>
        <value>hdfs://hadoop-all:8020/hbase</value>
</property>
<property>
        <name>hbase.cluster.distributed</name>
        <value>true</value>
</property>
<property>
        <name>hbase.master</name>
        <value>server6:60000</value>
</property>

For HA, remember to change hbase.master to the hadoop-all (nameservice) style.
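A sketch of that suggestion (assuming `hadoop-all` is the HA nameservice defined in hdfs-site.xml; note that a nameservice URI carries no port):

```xml
<property>
        <name>hbase.rootdir</name>
        <!-- nameservice authority, no port: the client resolves the
             active namenode through the failover proxy provider -->
        <value>hdfs://hadoop-all/hbase</value>
</property>
```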



