HBase client fails to connect to HBase.
Details below:
I have an HBase client project that needs to connect to an HBase database and run queries.
There are two HBase clusters, A and B:
Cluster A: Hadoop 1.1.2, HBase 0.94.7
Cluster B: Hadoop-2.0.0-cdh4.4.0, HBase-0.94.6-cdh4.4.0
In both cases the project runs on datanode01 inside the cluster. On cluster A everything works; on cluster B I get:
[DEBUG] [2015-09-21 19:57:31] [org.apache.hadoop.hbase.ipc.SecureClient$SecureConnection.receiveResponse(SecureClient.java:368)] - IPC Client (1877700835) connection to datanode01/192.168.1.3:60020 from yaxin got value #-1
which looks like the HBase client failing to connect to the HBase server.
Could anyone help me figure out where the problem lies? Is it related to HBase Security?
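For reference, my client initialization is roughly the following (a simplified sketch rather than the exact HadoopJavaClient code; the table name is taken from the logs further down and the row key is just an example):
[mw_shl_code=java,true]import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.client.Get;
import org.apache.hadoop.hbase.client.HTable;
import org.apache.hadoop.hbase.client.Result;
import org.apache.hadoop.hbase.util.Bytes;

// Simplified stand-in for HadoopJavaClient.init(): load hbase-site.xml from the
// classpath, open a table, and run a single Get.
public class HBaseClientSketch {
    public static void main(String[] args) throws Exception {
        Configuration conf = HBaseConfiguration.create();   // picks up hbase-site.xml on the classpath
        HTable table = new HTable(conf, "DPI_ACCESS");       // 0.94-era client API; table name is an example
        Result result = table.get(new Get(Bytes.toBytes("example-row-key")));
        System.out.println("row empty: " + result.isEmpty());
        table.close();
    }
}[/mw_shl_code]
The same jar and the same code run on both clusters; only the hbase-site.xml on the classpath differs.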
The HBase configuration file on cluster A is as follows:
[mw_shl_code=xml,true]<configuration>
<!-- Configure HBase to use the HA NameNode nameservice -->
<property>
<name>hbase.rootdir</name>
<value>hdfs://hadooptest/hbase</value>
</property>
<property>
<name>hbase.cluster.distributed</name>
<value>true</value>
</property>
<property>
<name>hbase.zookeeper.quorum</name>
<value>node01,node02,node03</value>
</property>
<property>
<name>hbase.zookeeper.property.dataDir</name>
<value>/home/hadoop/dfsdata/zookeeper</value>
</property>
</configuration>[/mw_shl_code]
The configuration file on cluster B is as follows:
[mw_shl_code=xml,true]<configuration>
<property>
<name>hbase.rootdir</name>
<value>hdfs://namenode00:8020/hbase</value>
</property>
<property>
<name>hbase.client.write.buffer</name>
<value>2097152</value>
</property>
<property>
<name>hbase.client.pause</name>
<value>1000</value>
</property>
<property>
<name>hbase.client.retries.number</name>
<value>10</value>
</property>
<property>
<name>hbase.client.scanner.caching</name>
<value>1</value>
</property>
<property>
<name>hbase.client.keyvalue.maxsize</name>
<value>10485760</value>
</property>
<property>
<name>hbase.rpc.timeout</name>
<value>60000</value>
</property>
<property>
<name>zookeeper.session.timeout</name>
<value>60000</value>
</property>
<property>
<name>zookeeper.znode.parent</name>
<value>/hbase</value>
</property>
<property>
<name>zookeeper.znode.rootserver</name>
<value>root-region-server</value>
</property>
<property>
<name>hbase.zookeeper.quorum</name>
<value>namenode01,namenode02,datanode01,datanode02,datanode03,datanode04,datanode05,datanode06,datanode07,datanode08</value>
</property>
<property>
<name>hbase.zookeeper.property.clientPort</name>
<value>2181</value>
</property>
<property>
<name>hbase.security.authentication</name>
<value>kerberos</value>
</property>
<property>
<name>hbase.rpc.engine</name>
<value>org.apache.hadoop.hbase.ipc.SecureRpcEngine</value>
</property>
<property>
<name>hbase.regionserver.kerberos.principal</name>
<value>hbase/datanode00@EXAMPLE.COM</value>
</property>
<property>
<name>hbase.regionserver.keytab.file</name>
<value>/etc/hbase/conf/hbase.keytab</value>
</property>
<property>
<name>hbase.master.kerberos.principal</name>
<value>hbase/namenode00@EXAMPLE.COM</value>
</property>
<property>
<name>hbase.master.keytab.file</name>
<value>/etc/hbase/conf/hbase.keytab</value>
</property>
<property>
<name>hbase.cluster.distributed</name>
<value>true</value>
</property>
<property>
<name>hbase.coprocessor.master.classes</name>
<value>org.apache.hadoop.hbase.security.access.AccessController</value>
</property>
<property>
<name>hbase.coprocessor.region.classes</name>
<value>org.apache.hadoop.hbase.security.token.TokenProvider,org.apache.hadoop.hbase.security.access.AccessController</value>
</property>
</configuration>[/mw_shl_code]
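Comparing the two files, the obvious difference is that B enables Kerberos (hbase.security.authentication, hbase.rpc.engine, the principals and keytab files), while A sets no security properties at all. To confirm which settings my client on datanode01 actually picks up, I plan to dump the relevant values with something like this (just a sketch; the property names are the ones from B's file above):
[mw_shl_code=java,true]import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;

// Sketch: print the settings the client actually loads from the hbase-site.xml on its
// classpath, to verify that the security and retry values from cluster B are seen.
public class ClientConfCheck {
    public static void main(String[] args) {
        Configuration conf = HBaseConfiguration.create();
        String[] keys = {
            "hbase.zookeeper.quorum",
            "hbase.security.authentication",
            "hbase.rpc.engine",
            "hbase.client.retries.number",
            "hbase.client.pause"
        };
        for (String key : keys) {
            System.out.println(key + " = " + conf.get(key));
        }
    }
}[/mw_shl_code]
If hbase.client.pause really comes back as 1000 ms, that would also match the once-per-second retry loop I show at the end of this post.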
The log output on cluster A is as follows:
[mw_shl_code=text,true]
[INFO ] [2015-09-21 17:27:54] [XXXXXXX.report.system.hadoop.HadoopJavaClient.init(HadoopJavaClient.java:73)] - === Start initilization of hbase client! ===
[DEBUG] [2015-09-21 17:27:54] [org.apache.hadoop.metrics2.lib.MutableMetricsFactory.newForField(MutableMetricsFactory.java:42)] - field org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.UserGroupInformation$UgiMetrics.loginSuccess with annotation @org.apache.hadoop.metrics2.annotation.Metric(valueName=Time, value=[Rate of successful kerberos logins and latency (milliseconds)], about=, type=DEFAULT, always=false, sampleName=Ops)
[DEBUG] [2015-09-21 17:27:54] [org.apache.hadoop.metrics2.lib.MutableMetricsFactory.newForField(MutableMetricsFactory.java:42)] - field org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.UserGroupInformation$UgiMetrics.loginFailure with annotation @org.apache.hadoop.metrics2.annotation.Metric(valueName=Time, value=[Rate of failed kerberos logins and latency (milliseconds)], about=, type=DEFAULT, always=false, sampleName=Ops)
[DEBUG] [2015-09-21 17:27:55] [org.apache.hadoop.metrics2.impl.MetricsSystemImpl.register(MetricsSystemImpl.java:220)] - UgiMetrics, User and group related metrics
[DEBUG] [2015-09-21 17:27:55] [org.apache.hadoop.security.Groups.getUserToGroupsMappingService(Groups.java:180)] - Creating new Groups object
[DEBUG] [2015-09-21 17:27:55] [org.apache.hadoop.util.NativeCodeLoader.<clinit>(NativeCodeLoader.java:46)] - Trying to load the custom-built native-hadoop library...
[DEBUG] [2015-09-21 17:27:55] [org.apache.hadoop.util.NativeCodeLoader.<clinit>(NativeCodeLoader.java:55)] - Failed to load native-hadoop with error: java.lang.UnsatisfiedLinkError: no hadoop in java.library.path
[DEBUG] [2015-09-21 17:27:55] [org.apache.hadoop.util.NativeCodeLoader.<clinit>(NativeCodeLoader.java:56)] - java.library.path=/usr/java/packages/lib/amd64:/usr/lib64:/lib64:/lib:/usr/lib
[WARN ] [2015-09-21 17:27:55] [org.apache.hadoop.util.NativeCodeLoader.<clinit>(NativeCodeLoader.java:62)] - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
[DEBUG] [2015-09-21 17:27:55] [org.apache.hadoop.security.JniBasedUnixGroupsMappingWithFallback.<init>(JniBasedUnixGroupsMappingWithFallback.java:40)] - Falling back to shell based
[DEBUG] [2015-09-21 17:27:55] [org.apache.hadoop.security.JniBasedUnixGroupsMappingWithFallback.<init>(JniBasedUnixGroupsMappingWithFallback.java:44)] - Group mapping impl=org.apache.hadoop.security.ShellBasedUnixGroupsMapping
[DEBUG] [2015-09-21 17:27:55] [org.apache.hadoop.security.Groups.<init>(Groups.java:66)] - Group mapping impl=org.apache.hadoop.security.JniBasedUnixGroupsMappingWithFallback; cacheTimeout=300000
[DEBUG] [2015-09-21 17:27:55] [org.apache.hadoop.security.UserGroupInformation$HadoopLoginModule.login(UserGroupInformation.java:175)] - hadoop login
[DEBUG] [2015-09-21 17:27:55] [org.apache.hadoop.security.UserGroupInformation$HadoopLoginModule.commit(UserGroupInformation.java:124)] - hadoop login commit
[DEBUG] [2015-09-21 17:27:55] [org.apache.hadoop.security.UserGroupInformation$HadoopLoginModule.commit(UserGroupInformation.java:154)] - using local user:UnixPrincipal: javadev
[DEBUG] [2015-09-21 17:27:55] [org.apache.hadoop.security.UserGroupInformation.getLoginUser(UserGroupInformation.java:697)] - UGI loginUser:javadev (auth:SIMPLE)
[DEBUG] [2015-09-21 17:27:55] [org.apache.hadoop.hbase.zookeeper.ZKUtil.connect(ZKUtil.java:120)] - hconnection opening connection to ZooKeeper with ensemble (10.1.252.126:2181)
[INFO ] [2015-09-21 17:27:56] [org.apache.hadoop.hbase.zookeeper.RecoverableZooKeeper.<init>(RecoverableZooKeeper.java:104)] - The identifier of this process is 2393@rhel61-1
[DEBUG] [2015-09-21 17:27:56] [org.apache.hadoop.hbase.zookeeper.ZooKeeperWatcher.process(ZooKeeperWatcher.java:273)] - hconnection Received ZooKeeper Event, type=None, state=SyncConnected, path=null
[DEBUG] [2015-09-21 17:27:56] [org.apache.hadoop.hbase.zookeeper.ZooKeeperWatcher.connectionEvent(ZooKeeperWatcher.java:350)] - hconnection-0x4fef46b5ca0006 connected
[DEBUG] [2015-09-21 17:27:56] [org.apache.hadoop.hbase.zookeeper.ZKUtil.logRetrievedMsg(ZKUtil.java:1599)] - hconnection-0x4fef46b5ca0006 Retrieved 36 byte(s) of data from znode /hbase/hbaseid; data=5538af66-0d89-4b82-a5bc-5b14c...
[DEBUG] [2015-09-21 17:27:56] [org.apache.hadoop.hbase.zookeeper.ZKUtil.watchAndCheckExists(ZKUtil.java:423)] - hconnection-0x4fef46b5ca0006 Set watcher on existing znode /hbase/master
[DEBUG] [2015-09-21 17:27:56] [org.apache.hadoop.hbase.zookeeper.ZKUtil.logRetrievedMsg(ZKUtil.java:1599)] - hconnection-0x4fef46b5ca0006 Retrieved 30 byte(s) of data from znode /hbase/master and set watcher; \x00\x00dell-126,60000,144282...
[DEBUG] [2015-09-21 17:27:56] [org.apache.hadoop.hbase.zookeeper.ZKUtil.watchAndCheckExists(ZKUtil.java:423)] - hconnection-0x4fef46b5ca0006 Set watcher on existing znode /hbase/root-region-server
[DEBUG] [2015-09-21 17:27:56] [org.apache.hadoop.hbase.zookeeper.ZKUtil.logRetrievedMsg(ZKUtil.java:1599)] - hconnection-0x4fef46b5ca0006 Retrieved 30 byte(s) of data from znode /hbase/root-region-server and set watcher; datanode02,60020,1442828467011
[DEBUG] [2015-09-21 17:27:56] [org.apache.hadoop.hbase.ipc.HBaseRPC.getProtocolEngine(HBaseRPC.java:102)] - Using RpcEngine: org.apache.hadoop.hbase.ipc.WritableRpcEngine
[DEBUG] [2015-09-21 17:27:56] [org.apache.hadoop.hbase.ipc.HBaseClient.<init>(HBaseClient.java:866)] - The ping interval is60000ms.
[DEBUG] [2015-09-21 17:27:56] [org.apache.hadoop.hbase.zookeeper.ZKUtil.logRetrievedMsg(ZKUtil.java:1599)] - hconnection-0x4fef46b5ca0006 Retrieved 30 byte(s) of data from znode /hbase/root-region-server and set watcher; datanode02,60020,1442828467011
[DEBUG] [2015-09-21 17:27:56] [org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.locateRegion(HConnectionManager.java:875)] - Looked up root region location, connection=org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation@5aa8a47b; serverName=datanode02,60020,1442828467011
[DEBUG] [2015-09-21 17:27:56] [org.apache.hadoop.hbase.ipc.HBaseClient$Connection.setupIOstreams(HBaseClient.java:434)] - Connecting to org.apache.hadoop.hbase.ipc.HBaseClient$ConnectionId@d21b918b
[DEBUG] [2015-09-21 17:27:56] [org.apache.hadoop.hbase.ipc.HBaseClient$Connection.sendParam(HBaseClient.java:614)] - IPC Client (832921053) connection to datanode02/10.1.198.144:60020 from javadev sending #0
[DEBUG] [2015-09-21 17:27:56] [org.apache.hadoop.hbase.ipc.HBaseClient$Connection.run(HBaseClient.java:581)] - IPC Client (832921053) connection to datanode02/10.1.198.144:60020 from javadev: starting, having connections 1
[DEBUG] [2015-09-21 17:27:56] [org.apache.hadoop.hbase.ipc.HBaseClient$Connection.receiveResponse(HBaseClient.java:655)] - IPC Client (832921053) connection to datanode02/10.1.198.144:60020 from javadev got value #0
[DEBUG] [2015-09-21 17:27:56] [org.apache.hadoop.hbase.ipc.WritableRpcEngine$Invoker.invoke(WritableRpcEngine.java:92)] - Call: getProtocolVersion 175
[DEBUG] [2015-09-21 17:27:56] [org.apache.hadoop.hbase.ipc.HBaseClient$Connection.sendParam(HBaseClient.java:614)] - IPC Client (832921053) connection to datanode02/10.1.198.144:60020 from javadev sending #1
[DEBUG] [2015-09-21 17:27:56] [org.apache.hadoop.hbase.ipc.HBaseClient$Connection.receiveResponse(HBaseClient.java:655)] - IPC Client (832921053) connection to datanode02/10.1.198.144:60020 from javadev got value #1
[DEBUG] [2015-09-21 17:27:56] [org.apache.hadoop.hbase.ipc.WritableRpcEngine$Invoker.invoke(WritableRpcEngine.java:92)] - Call: getClosestRowBefore 9
[DEBUG] [2015-09-21 17:27:56] [org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.cacheLocation(HConnectionManager.java:1268)] - Cached location for .META.,,1.1028785192 is datanode02:60020
[WARN ] [2015-09-21 17:27:56] [org.apache.hadoop.conf.Configuration.warnOnceIfDeprecated(Configuration.java:824)] - hadoop.native.lib is deprecated. Instead, use io.native.lib.available
[DEBUG] [2015-09-21 17:27:56] [org.apache.hadoop.hbase.ipc.HBaseClient$Connection.sendParam(HBaseClient.java:614)] - IPC Client (832921053) connection to datanode02/10.1.198.144:60020 from javadev sending #2
[DEBUG] [2015-09-21 17:27:56] [org.apache.hadoop.hbase.ipc.HBaseClient$Connection.receiveResponse(HBaseClient.java:655)] - IPC Client (832921053) connection to datanode02/10.1.198.144:60020 from javadev got value #2
[DEBUG] [2015-09-21 17:27:56] [org.apache.hadoop.hbase.ipc.WritableRpcEngine$Invoker.invoke(WritableRpcEngine.java:92)] - Call: getClosestRowBefore 4
[DEBUG] [2015-09-21 17:27:56] [org.apache.hadoop.hbase.client.MetaScanner.metaScan(MetaScanner.java:201)] - Scanning .META. starting at row=DPI_ACCESS,,00000000000000 for max=10 rows using org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation@5aa8a47b
[DEBUG] [2015-09-21 17:27:56] [org.apache.hadoop.hbase.ipc.HBaseClient$Connection.sendParam(HBaseClient.java:614)] - IPC Client (832921053) connection to datanode02/10.1.198.144:60020 from javadev sending #3
[DEBUG] [2015-09-21 17:27:56] [org.apache.hadoop.hbase.ipc.HBaseClient$Connection.receiveResponse(HBaseClient.java:655)] - IPC Client (832921053) connection to datanode02/10.1.198.144:60020 from javadev got value #3
[DEBUG] [2015-09-21 17:27:56] [org.apache.hadoop.hbase.ipc.WritableRpcEngine$Invoker.invoke(WritableRpcEngine.java:92)] - Call: openScanner 2
[DEBUG] [2015-09-21 17:27:56] [org.apache.hadoop.hbase.ipc.HBaseClient$Connection.sendParam(HBaseClient.java:614)] - IPC Client (832921053) connection to datanode02/10.1.198.144:60020 from javadev sending #4
[DEBUG] [2015-09-21 17:27:56] [org.apache.hadoop.hbase.ipc.HBaseClient$Connection.receiveResponse(HBaseClient.java:655)] - IPC Client (832921053) connection to datanode02/10.1.198.144:60020 from javadev got value #4
[DEBUG] [2015-09-21 17:27:56] [org.apache.hadoop.hbase.ipc.WritableRpcEngine$Invoker.invoke(WritableRpcEngine.java:92)] - Call: next 5
[DEBUG] [2015-09-21 17:27:56] [org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.cacheLocation(HConnectionManager.java:1268)] - Cached location for DPI_ACCESS,,1440200307281.ea747027a0fcc04283527c615629d0af. is datanode02:60020
[DEBUG] [2015-09-21 17:27:56] [org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.cacheLocation(HConnectionManager.java:1268)] - Cached location for DPI_ACCESS,10.1.18.112_110.1.18.112_(18882060)_1438482000000,1440240837834.911addee0caade74032d89f2c1fbbe57. is datanode02:60020
[DEBUG] [2015-09-21 17:27:56] [org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.cacheLocation(HConnectionManager.java:1268)] - Cached location for DPI_ACCESS,10.1.27.109_110.1.27.109_(28085962)_1438476120000,1440240837834.f82e261873a96eb3436c341f5f93505d. is datanode02:60020
[DEBUG] [2015-09-21 17:27:56] [org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.cacheLocation(HConnectionManager.java:1268)] - Cached location for DPI_ACCESS,10.1.4.214_110.1.4.214_(4953887)_1438471620000,1440160161211.90b4ea8faabd26fdfa69c4419b8ab2df. is datanode02:60020
[DEBUG] [2015-09-21 17:27:56] [org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.cacheLocation(HConnectionManager.java:1268)] - Cached location for DPI_ACCESS,172.1.0.0_60.1.0.0_(104590970)_1438949370000,1440120207077.a7f4172a81fe60e6515020cd4c9f9934. is datanode02:60020
[DEBUG] [2015-09-21 17:27:56] [org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.cacheLocation(HConnectionManager.java:1268)] - Cached location for DPI_ACCESS,172.1.0.0_60.1.0.0_(11389509)_1439747909000,1439896037479.7fbb0775c2174b2fa05935ef43505ca1. is dell-126:60020
[DEBUG] [2015-09-21 17:27:56] [org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.cacheLocation(HConnectionManager.java:1268)] - Cached location for DPI_ACCESS,172.1.0.0_60.1.0.0_(178553)_1438536953000,1439874643210.94c3e5e6204d23d1b0fd98b6557bc151. is datanode02:60020
[DEBUG] [2015-09-21 17:27:56] [org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.cacheLocation(HConnectionManager.java:1268)] - Cached location for DPI_ACCESS,172.1.0.0_60.1.0.0_(2105745)_1438464145000,1439875985257.d5921609e2b3c00cd2e731904b666376. is datanode02:60020
[DEBUG] [2015-09-21 17:27:56] [org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.cacheLocation(HConnectionManager.java:1268)] - Cached location for DPI_ACCESS,172.1.0.0_60.1.0.0_(2350831)_1438709231000,1439922741266.495d0d35ad7f3cb92dc6fcd711a6b178. is datanode02:60020
[DEBUG] [2015-09-21 17:27:56] [org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.cacheLocation(HConnectionManager.java:1268)] - Cached location for DPI_ACCESS,172.1.0.0_60.1.0.0_(24916509)_1439274909000,1439922741266.59114ccbe299fffee78e83f5d7f155f1. is datanode02:60020
[DEBUG] [2015-09-21 17:27:56] [org.apache.hadoop.hbase.ipc.HBaseClient$Connection.sendParam(HBaseClient.java:614)] - IPC Client (832921053) connection to datanode02/10.1.198.144:60020 from javadev sending #5
[DEBUG] [2015-09-21 17:27:56] [org.apache.hadoop.hbase.ipc.HBaseClient$Connection.receiveResponse(HBaseClient.java:655)] - IPC Client (832921053) connection to datanode02/10.1.198.144:60020 from javadev got value #5
[DEBUG] [2015-09-21 17:27:56] [org.apache.hadoop.hbase.ipc.WritableRpcEngine$Invoker.invoke(WritableRpcEngine.java:92)] - Call: close 3
[INFO ] [2015-09-21 17:27:56] [XXXXXXX.report.system.hadoop.HadoopJavaClient.init(HadoopJavaClient.java:100)] - === End initilization of hbase client! ===[/mw_shl_code]
The log output on cluster B is as follows:
[mw_shl_code=text,true]
[INFO ] [2015-09-21 19:57:29] [XXXXXXX.report.system.hadoop.HadoopJavaClient.init(HadoopJavaClient.java:73)] - === Start initilization of hbase client! ===
[DEBUG] [2015-09-21 19:57:29] [org.apache.hadoop.metrics2.lib.MutableMetricsFactory.newForField(MutableMetricsFactory.java:42)] - field org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.UserGroupInformation$UgiMetrics.loginSuccess with annotation @org.apache.hadoop.metrics2.annotation.Metric(valueName=Time, value=[Rate of successful kerberos logins and latency (milliseconds)], about=, type=DEFAULT, always=false, sampleName=Ops)
[DEBUG] [2015-09-21 19:57:29] [org.apache.hadoop.metrics2.lib.MutableMetricsFactory.newForField(MutableMetricsFactory.java:42)] - field org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.UserGroupInformation$UgiMetrics.loginFailure with annotation @org.apache.hadoop.metrics2.annotation.Metric(valueName=Time, value=[Rate of failed kerberos logins and latency (milliseconds)], about=, type=DEFAULT, always=false, sampleName=Ops)
[DEBUG] [2015-09-21 19:57:29] [org.apache.hadoop.metrics2.impl.MetricsSystemImpl.register(MetricsSystemImpl.java:220)] - UgiMetrics, User and group related metrics
[DEBUG] [2015-09-21 19:57:29] [org.apache.hadoop.security.Groups.getUserToGroupsMappingService(Groups.java:180)] - Creating new Groups object
[DEBUG] [2015-09-21 19:57:29] [org.apache.hadoop.util.NativeCodeLoader.<clinit>(NativeCodeLoader.java:46)] - Trying to load the custom-built native-hadoop library...
[DEBUG] [2015-09-21 19:57:29] [org.apache.hadoop.util.NativeCodeLoader.<clinit>(NativeCodeLoader.java:55)] - Failed to load native-hadoop with error: java.lang.UnsatisfiedLinkError: no hadoop in java.library.path
[DEBUG] [2015-09-21 19:57:29] [org.apache.hadoop.util.NativeCodeLoader.<clinit>(NativeCodeLoader.java:56)] - java.library.path=/usr/java/packages/lib/amd64:/usr/lib64:/lib64:/lib:/usr/lib
[WARN ] [2015-09-21 19:57:29] [org.apache.hadoop.util.NativeCodeLoader.<clinit>(NativeCodeLoader.java:62)] - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
[DEBUG] [2015-09-21 19:57:29] [org.apache.hadoop.security.JniBasedUnixGroupsMappingWithFallback.<init>(JniBasedUnixGroupsMappingWithFallback.java:40)] - Falling back to shell based
[DEBUG] [2015-09-21 19:57:29] [org.apache.hadoop.security.JniBasedUnixGroupsMappingWithFallback.<init>(JniBasedUnixGroupsMappingWithFallback.java:44)] - Group mapping impl=org.apache.hadoop.security.ShellBasedUnixGroupsMapping
[DEBUG] [2015-09-21 19:57:29] [org.apache.hadoop.security.Groups.<init>(Groups.java:66)] - Group mapping impl=org.apache.hadoop.security.JniBasedUnixGroupsMappingWithFallback; cacheTimeout=300000
[DEBUG] [2015-09-21 19:57:29] [org.apache.hadoop.security.UserGroupInformation$HadoopLoginModule.login(UserGroupInformation.java:175)] - hadoop login
[DEBUG] [2015-09-21 19:57:29] [org.apache.hadoop.security.UserGroupInformation$HadoopLoginModule.commit(UserGroupInformation.java:124)] - hadoop login commit
[DEBUG] [2015-09-21 19:57:29] [org.apache.hadoop.security.UserGroupInformation$HadoopLoginModule.commit(UserGroupInformation.java:154)] - using local user:UnixPrincipal: yaxin
[DEBUG] [2015-09-21 19:57:29] [org.apache.hadoop.security.UserGroupInformation.getLoginUser(UserGroupInformation.java:697)] - UGI loginUser:yaxin (auth:SIMPLE)
[DEBUG] [2015-09-21 19:57:30] [org.apache.hadoop.hbase.zookeeper.ZKUtil.connect(ZKUtil.java:120)] - hconnection opening connection to ZooKeeper with ensemble (datanode02:2181,datanode01:2181,namenode02:2181,namenode01:2181,datanode08:2181,datanode07:2181,datanode06:2181,datanode05:2181,datanode04:2181,datanode03:2181)
[INFO ] [2015-09-21 19:57:30] [org.apache.hadoop.hbase.zookeeper.RecoverableZooKeeper.<init>(RecoverableZooKeeper.java:104)] - The identifier of this process is 11501@datanode01
[DEBUG] [2015-09-21 19:57:30] [org.apache.hadoop.hbase.zookeeper.ZooKeeperWatcher.process(ZooKeeperWatcher.java:273)] - hconnection Received ZooKeeper Event, type=None, state=SyncConnected, path=null
[DEBUG] [2015-09-21 19:57:30] [org.apache.hadoop.hbase.zookeeper.ZooKeeperWatcher.connectionEvent(ZooKeeperWatcher.java:350)] - hconnection-0x1473e51bb280014 connected
[DEBUG] [2015-09-21 19:57:30] [org.apache.hadoop.hbase.zookeeper.ZKUtil.logRetrievedMsg(ZKUtil.java:1599)] - hconnection-0x1473e51bb280014 Retrieved 36 byte(s) of data from znode /hbase/hbaseid; data=b99caec7-0459-4d27-af97-dbef9...
[DEBUG] [2015-09-21 19:57:30] [org.apache.hadoop.hbase.zookeeper.ZKUtil.watchAndCheckExists(ZKUtil.java:423)] - hconnection-0x1473e51bb280014 Set watcher on existing znode /hbase/master
[DEBUG] [2015-09-21 19:57:30] [org.apache.hadoop.hbase.zookeeper.ZKUtil.logRetrievedMsg(ZKUtil.java:1599)] - hconnection-0x1473e51bb280014 Retrieved 32 byte(s) of data from znode /hbase/master and set watcher; \x00\x00namenode01,60000,1423...
[DEBUG] [2015-09-21 19:57:30] [org.apache.hadoop.hbase.zookeeper.ZKUtil.watchAndCheckExists(ZKUtil.java:423)] - hconnection-0x1473e51bb280014 Set watcher on existing znode /hbase/root-region-server
[DEBUG] [2015-09-21 19:57:30] [org.apache.hadoop.hbase.zookeeper.ZKUtil.logRetrievedMsg(ZKUtil.java:1599)] - hconnection-0x1473e51bb280014 Retrieved 30 byte(s) of data from znode /hbase/root-region-server and set watcher; datanode01,60020,1423485784289
[DEBUG] [2015-09-21 19:57:30] [org.apache.hadoop.hbase.ipc.HBaseRPC.getProtocolEngine(HBaseRPC.java:102)] - Using RpcEngine: org.apache.hadoop.hbase.ipc.SecureRpcEngine
[DEBUG] [2015-09-21 19:57:30] [org.apache.hadoop.hbase.ipc.HBaseClient.<init>(HBaseClient.java:866)] - The ping interval is60000ms.
[DEBUG] [2015-09-21 19:57:30] [org.apache.hadoop.hbase.ipc.SecureClient.<init>(SecureClient.java:474)] - fallbackAllowed=false
[DEBUG] [2015-09-21 19:57:30] [org.apache.hadoop.hbase.zookeeper.ZKUtil.logRetrievedMsg(ZKUtil.java:1599)] - hconnection-0x1473e51bb280014 Retrieved 30 byte(s) of data from znode /hbase/root-region-server and set watcher; datanode01,60020,1423485784289
[DEBUG] [2015-09-21 19:57:30] [org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.locateRegion(HConnectionManager.java:875)] - Looked up root region location, connection=org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation@13a32288; serverName=datanode01,60020,1423485784289
[DEBUG] [2015-09-21 19:57:30] [org.apache.hadoop.hbase.ipc.SecureClient$SecureConnection.<init>(SecureClient.java:146)] - Use SIMPLE authentication for protocol HRegionInterface
[DEBUG] [2015-09-21 19:57:30] [org.apache.hadoop.hbase.ipc.SecureClient$SecureConnection.setupIOstreams(SecureClient.java:262)] - Connecting to datanode01/192.168.1.3:60020
[DEBUG] [2015-09-21 19:57:30] [org.apache.hadoop.hbase.ipc.HBaseClient$Connection.sendParam(HBaseClient.java:614)] - IPC Client (1877700835) connection to datanode01/192.168.1.3:60020 from yaxin sending #0
[DEBUG] [2015-09-21 19:57:30] [org.apache.hadoop.hbase.ipc.HBaseClient$Connection.run(HBaseClient.java:581)] - IPC Client (1877700835) connection to datanode01/192.168.1.3:60020 from yaxin: starting, having connections 1
[DEBUG] [2015-09-21 19:57:30] [org.apache.hadoop.hbase.ipc.SecureClient$SecureConnection.receiveResponse(SecureClient.java:368)] - IPC Client (1877700835) connection to datanode01/192.168.1.3:60020 from yaxin got value #-1
[DEBUG] [2015-09-21 19:57:30] [org.apache.hadoop.hbase.ipc.SecureClient$SecureConnection.receiveResponse(SecureClient.java:374)] - call #-1 state is -1
[DEBUG] [2015-09-21 19:57:30] [org.apache.hadoop.hbase.ipc.SecureClient$SecureConnection.close(SecureClient.java:446)] - closing ipc connection to datanode01/192.168.1.3:60020: Authentication is required
org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.security.AccessControlException): Authentication is required
at org.apache.hadoop.hbase.ipc.SecureClient$SecureConnection.receiveResponse(SecureClient.java:394)
at org.apache.hadoop.hbase.ipc.HBaseClient$Connection.run(HBaseClient.java:586)
[DEBUG] [2015-09-21 19:57:30] [org.apache.hadoop.hbase.ipc.SecureClient$SecureConnection.close(SecureClient.java:454)] - IPC Client (1877700835) connection to datanode01/192.168.1.3:60020 from yaxin: closed
[DEBUG] [2015-09-21 19:57:30] [org.apache.hadoop.hbase.ipc.HBaseClient$Connection.run(HBaseClient.java:596)] - IPC Client (1877700835) connection to datanode01/192.168.1.3:60020 from yaxin: stopped, remaining connections 0
[DEBUG] [2015-09-21 19:57:31] [org.apache.hadoop.hbase.ipc.SecureClient$SecureConnection.<init>(SecureClient.java:146)] - Use SIMPLE authentication for protocol HRegionInterface
[DEBUG] [2015-09-21 19:57:31] [org.apache.hadoop.hbase.ipc.SecureClient$SecureConnection.setupIOstreams(SecureClient.java:262)] - Connecting to datanode01/192.168.1.3:60020
[DEBUG] [2015-09-21 19:57:31] [org.apache.hadoop.hbase.ipc.HBaseClient$Connection.sendParam(HBaseClient.java:614)] - IPC Client (1877700835) connection to datanode01/192.168.1.3:60020 from yaxin sending #1
[DEBUG] [2015-09-21 19:57:31] [org.apache.hadoop.hbase.ipc.HBaseClient$Connection.run(HBaseClient.java:581)] - IPC Client (1877700835) connection to datanode01/192.168.1.3:60020 from yaxin: starting, having connections 1
[DEBUG] [2015-09-21 19:57:31] [org.apache.hadoop.hbase.ipc.SecureClient$SecureConnection.receiveResponse(SecureClient.java:368)] - IPC Client (1877700835) connection to datanode01/192.168.1.3:60020 from yaxin got value #-1
[DEBUG] [2015-09-21 19:57:31] [org.apache.hadoop.hbase.ipc.SecureClient$SecureConnection.receiveResponse(SecureClient.java:374)] - call #-1 state is -1
[DEBUG] [2015-09-21 19:57:31] [org.apache.hadoop.hbase.ipc.SecureClient$SecureConnection.close(SecureClient.java:446)] - closing ipc connection to datanode01/192.168.1.3:60020: Authentication is required
org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.security.AccessControlException): Authentication is required
at org.apache.hadoop.hbase.ipc.SecureClient$SecureConnection.receiveResponse(SecureClient.java:394)
at org.apache.hadoop.hbase.ipc.HBaseClient$Connection.run(HBaseClient.java:586)
[DEBUG] [2015-09-21 19:57:31] [org.apache.hadoop.hbase.ipc.SecureClient$SecureConnection.close(SecureClient.java:454)] - IPC Client (1877700835) connection to datanode01/192.168.1.3:60020 from yaxin: closed
[DEBUG] [2015-09-21 19:57:31] [org.apache.hadoop.hbase.ipc.HBaseClient$Connection.run(HBaseClient.java:596)] - IPC Client (1877700835) connection to datanode01/192.168.1.3:60020 from yaxin: stopped, remaining connections 0[/mw_shl_code]
After that, the following messages keep repeating, about once per second:
[mw_shl_code=text,true]
[DEBUG] [2015-09-21 19:57:31] [org.apache.hadoop.hbase.ipc.SecureClient$SecureConnection.<init>(SecureClient.java:146)] - Use SIMPLE authentication for protocol HRegionInterface
[DEBUG] [2015-09-21 19:57:31] [org.apache.hadoop.hbase.ipc.SecureClient$SecureConnection.setupIOstreams(SecureClient.java:262)] - Connecting to datanode01/192.168.1.3:60020
[DEBUG] [2015-09-21 19:57:31] [org.apache.hadoop.hbase.ipc.HBaseClient$Connection.sendParam(HBaseClient.java:614)] - IPC Client (1877700835) connection to datanode01/192.168.1.3:60020 from yaxin sending #1
[DEBUG] [2015-09-21 19:57:31] [org.apache.hadoop.hbase.ipc.HBaseClient$Connection.run(HBaseClient.java:581)] - IPC Client (1877700835) connection to datanode01/192.168.1.3:60020 from yaxin: starting, having connections 1
[DEBUG] [2015-09-21 19:57:31] [org.apache.hadoop.hbase.ipc.SecureClient$SecureConnection.receiveResponse(SecureClient.java:368)] - IPC Client (1877700835) connection to datanode01/192.168.1.3:60020 from yaxin got value #-1
[DEBUG] [2015-09-21 19:57:31] [org.apache.hadoop.hbase.ipc.SecureClient$SecureConnection.receiveResponse(SecureClient.java:374)] - call #-1 state is -1
[DEBUG] [2015-09-21 19:57:31] [org.apache.hadoop.hbase.ipc.SecureClient$SecureConnection.close(SecureClient.java:446)] - closing ipc connection to datanode01/192.168.1.3:60020: Authentication is required
org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.security.AccessControlException): Authentication is required
at org.apache.hadoop.hbase.ipc.SecureClient$SecureConnection.receiveResponse(SecureClient.java:394)
at org.apache.hadoop.hbase.ipc.HBaseClient$Connection.run(HBaseClient.java:586)
[DEBUG] [2015-09-21 19:57:31] [org.apache.hadoop.hbase.ipc.SecureClient$SecureConnection.close(SecureClient.java:454)] - IPC Client (1877700835) connection to datanode01/192.168.1.3:60020 from yaxin: closed
[DEBUG] [2015-09-21 19:57:31] [org.apache.hadoop.hbase.ipc.HBaseClient$Connection.run(HBaseClient.java:596)] - IPC Client (1877700835) connection to datanode01/192.168.1.3:60020 from yaxin: stopped, remaining connections 0[/mw_shl_code]
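The server keeps answering "Authentication is required" while my client logs in as "yaxin (auth:SIMPLE)", so I suspect the client side needs an explicit Kerberos login before opening the connection. Is something like the following what is missing? This is only my guess and untested; the principal and keytab path are placeholders, not values from my environment:
[mw_shl_code=java,true]import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.security.UserGroupInformation;

// My guess at a Kerberos-aware client login (untested sketch).
public class SecureLoginSketch {
    public static void main(String[] args) throws Exception {
        Configuration conf = HBaseConfiguration.create();
        // Mirror the server-side security settings from cluster B's hbase-site.xml.
        conf.set("hbase.security.authentication", "kerberos");
        conf.set("hbase.rpc.engine", "org.apache.hadoop.hbase.ipc.SecureRpcEngine");
        // Hadoop-level counterpart so UserGroupInformation treats the cluster as secure.
        conf.set("hadoop.security.authentication", "kerberos");

        // Log in from a keytab before creating any HBase connection.
        // The principal and keytab path below are placeholders.
        UserGroupInformation.setConfiguration(conf);
        UserGroupInformation.loginUserFromKeytab("yaxin@EXAMPLE.COM", "/path/to/yaxin.keytab");
        System.out.println("Logged in as " + UserGroupInformation.getLoginUser());
    }
}[/mw_shl_code]
Could someone confirm whether this is the right direction, or whether something else in the cluster B setup is causing the "Authentication is required" error?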