
Hive/HBase integration NullPointerException

Posted by zhouguanwu on 2014-8-11 15:33:44 · 9 replies · 18950 views
Hive and HBase were upgraded together, both to matching CDH4 releases.
After the upgrade, creating new integrated tables and writing data both work fine.
The one problem: tables that were mapped to HBase under the previous version now fail on select with the following error:

Failed with exception java.io.IOException:java.lang.NullPointerException

I tried dropping the Hive table and re-mapping the existing HBase table from Hive, but the error is the same.

Has anyone hit the same problem at work or in testing? Any advice would be appreciated.
======
hbase-0.94.15-cdh4.7.0
hive-0.10.0-cdh4.7.0

The HBase-related jars have already been copied into hive/lib.
---
Below is the Hive/HBase integration config; the metastore is stored in MySQL.
<property>
  <name>hive.aux.jars.path</name>
  <value>file:///home/hadoop/hdfs/hive/lib/hive-hbase-handler-0.10.0-cdh4.7.0.jar,file:///home/hadoop/hdfs/hive/lib/protobuf-java-2.4.0a.jar,file:///home/hadoop/hdfs/hive/lib/hbase-0.94.15-cdh4.7.0-security.jar,file:///home/hadoop/hdfs/hive/lib/zookeeper-3.4.5-cdh4.7.0.jar,file:///home/hadoop/hdfs/hive/lib/guava-11.0.2.jar</value>
</property>

<property>
  <name>hive.zookeeper.quorum</name>
  <value>nd2,dn1,dn2</value>
</property>

<property>
  <name>hive.zookeeper.client.port</name>
  <value>2181</value>
</property>

<property>
  <name>hive.metastore.uris</name>
  <value>thrift://nd1:9083</value>
</property>

9 replies so far

howtodown replied on 2014-8-11 16:07:08
Have you configured Hive logging? Check what the specific error is.
Also check whether any environment variables have changed.

zhouguanwu replied on 2014-8-11 16:12:21
Quoting howtodown (2014-8-11 16:07):
Have you configured Hive logging? Check what the specific error is.
Also check whether any environment variables have changed.

2014-08-11 15:54:34,028 INFO  ql.Driver (Driver.java:execute(1099)) - Starting command: select * from fftest
2014-08-11 15:54:34,029 INFO  ql.Driver (PerfLogger.java:PerfLogEnd(115)) - </PERFLOG method=TimeToSubmit start=1407743673943 end=1407743674029 duration=86>
2014-08-11 15:54:34,029 INFO  ql.Driver (PerfLogger.java:PerfLogEnd(115)) - </PERFLOG method=Driver.execute start=1407743674028 end=1407743674029 duration=1>
2014-08-11 15:54:34,029 INFO  ql.Driver (SessionState.java:printInfo(418)) - OK
2014-08-11 15:54:34,029 INFO  ql.Driver (PerfLogger.java:PerfLogBegin(88)) - <PERFLOG method=releaseLocks>
2014-08-11 15:54:34,030 INFO  ql.Driver (PerfLogger.java:PerfLogEnd(115)) - </PERFLOG method=releaseLocks start=1407743674029 end=1407743674030 duration=1>
2014-08-11 15:54:34,030 INFO  ql.Driver (PerfLogger.java:PerfLogEnd(115)) - </PERFLOG method=Driver.run start=1407743673942 end=1407743674030 duration=88>
2014-08-11 15:54:34,062 WARN  conf.Configuration (Configuration.java:warnOnceIfDeprecated(981)) - io.bytes.per.checksum is deprecated. Instead, use dfs.bytes-per-checksum
2014-08-11 15:54:34,080 ERROR CliDriver (SessionState.java:printError(427)) - Failed with exception java.io.IOException:java.lang.NullPointerException
java.io.IOException: java.lang.NullPointerException
        at org.apache.hadoop.hive.ql.exec.FetchOperator.getNextRow(FetchOperator.java:552)
        at org.apache.hadoop.hive.ql.exec.FetchOperator.pushRow(FetchOperator.java:496)
        at org.apache.hadoop.hive.ql.exec.FetchTask.fetch(FetchTask.java:137)
        at org.apache.hadoop.hive.ql.Driver.getResults(Driver.java:1474)
        at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:270)
        at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:216)
        at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:412)
        at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:759)
        at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:613)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at org.apache.hadoop.util.RunJar.main(RunJar.java:208)
Caused by: java.lang.NullPointerException
        at org.apache.hadoop.net.DNS.reverseDns(DNS.java:93)
        at org.apache.hadoop.hbase.mapreduce.TableInputFormatBase.reverseDNS(TableInputFormatBase.java:219)
        at org.apache.hadoop.hbase.mapreduce.TableInputFormatBase.getSplits(TableInputFormatBase.java:184)
        at org.apache.hadoop.hive.hbase.HiveHBaseTableInputFormat.getSplits(HiveHBaseTableInputFormat.java:489)
        at org.apache.hadoop.hive.ql.exec.FetchOperator.getRecordReader(FetchOperator.java:388)
        at org.apache.hadoop.hive.ql.exec.FetchOperator.getNextRow(FetchOperator.java:516)
        ... 13 more

2014-08-11 15:54:34,088 INFO  ql.Driver (PerfLogger.java:PerfLogBegin(88)) - <PERFLOG method=releaseLocks>
2014-08-11 15:54:34,088 INFO  ql.Driver (PerfLogger.java:PerfLogEnd(115)) - </PERFLOG method=releaseLocks start=1407743674088 end=1407743674088 duration=0>
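The Caused by frames above (DNS.reverseDns called from TableInputFormatBase.reverseDNS during getSplits) suggest that split calculation fails on a reverse DNS lookup of a region server address, before any rows are read. A hedged diagnostic sketch, using the node IP that appears in this thread's logs (substitute your own region server IPs):

```shell
# Hadoop's DNS.reverseDns needs a working reverse (PTR) lookup for each
# region server IP. Check that every node IP resolves back to a hostname.
# 10.10.4.207 is the IP seen in this thread's logs; replace it with your
# own region server IPs.
for ip in 10.10.4.207; do
    if getent hosts "$ip" >/dev/null; then
        echo "reverse lookup OK for $ip"
    else
        echo "reverse lookup FAILED for $ip"
    fi
done
```

If the reverse lookup fails, fixing the PTR records (or the cluster's DNS/hosts setup) is a commonly reported remedy for this particular NPE; a broken lookup would also explain why the error survives dropping and re-creating the Hive table.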


howtodown replied on 2014-8-11 16:32:06
Quoting zhouguanwu (2014-8-11 16:12):
2014-08-11 15:54:34,028 INFO  ql.Driver (Driver.java:execute(1099)) - Starting command: select * fr ...
That is not the right log; I need the log from your configuration.
You said creating tables and writing data works fine. Is that new table an integrated Hive-HBase table, or a standalone Hive table?

sstutu replied on 2014-8-11 16:33:40
Quoting zhouguanwu (2014-8-11 16:12):
2014-08-11 15:54:34,028 INFO  ql.Driver (Driver.java:execute(1099)) - Starting command: select * fr ...
If nothing else works, just create a Hive external table and re-map the existing HBase table.
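A minimal sketch of that suggestion as a Hive session. Table and column names follow the ones used elsewhere in this thread (HBase table 'fog', column family cf, qualifier cf1); adjust to your own schema:

```sql
-- Drop the stale mapping, then re-create an external table over the
-- existing HBase table 'fog' (rowkey -> key, cf:cf1 -> cf).
DROP TABLE IF EXISTS test;

CREATE EXTERNAL TABLE test (key string, cf string)
STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler'
WITH SERDEPROPERTIES ("hbase.columns.mapping" = ":key,cf:cf1")
TBLPROPERTIES ("hbase.table.name" = "fog");

SELECT * FROM test LIMIT 5;
```

Dropping an EXTERNAL table only removes the Hive metadata; the HBase data stays intact, so re-creating the mapping is safe to retry.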

zhouguanwu replied on 2014-8-11 17:11:36
Quoting howtodown (2014-8-11 16:32):
That is not the right log; I need the log from your configuration.
You said creating tables and writing data works fine. Is that new table an integrated table, or a standalone ...

SessionStart SESSION_ID="hadoop_201408111703" TIME="1407747834691"


QueryStart QUERY_STRING="CREATE EXTERNAL TABLE test (key string ,cf string)    STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler'   WITH SERDEPROPERTIES ("hbase.columns.mapping" = ":key,cf:cf1")   TBLPROPERTIES ("hbase.table.name" = "fog")" QUERY_ID="hadoop_20140811170404_b617e7c5-b135-48ea-8371-eb6cfdeae640" TIME="1407747890441"
Counters plan="{"queryId":"hadoop_20140811170404_b617e7c5-b135-48ea-8371-eb6cfdeae640","queryType":null,"queryAttributes":{"queryString":"CREATE EXTERNAL TABLE test (key string ,cf string)    STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler'   WITH SERDEPROPERTIES ("hbase.columns.mapping" = ":key,cf:cf1")   TBLPROPERTIES ("hbase.table.name" = "fog")"},"queryCounters":"null","stageGraph":{"nodeType":"STAGE","roots":"null","adjacencyList":"]"},"stageList":[{"stageId":"Stage-0","stageType":"DDL","stageAttributes":"null","stageCounters":"}","taskList":[{"taskId":"Stage-0_OTHER","taskType":"OTHER","taskAttributes":"null","taskCounters":"null","operatorGraph":"null","operatorList":"]","done":"false","started":"false"}],"done":"false","started":"false"}],"done":"false","started":"true"}" TIME="1407747890470"
TaskStart TASK_NAME="org.apache.hadoop.hive.ql.exec.DDLTask" TASK_ID="Stage-0" QUERY_ID="hadoop_20140811170404_b617e7c5-b135-48ea-8371-eb6cfdeae640" TIME="1407747890476"
Counters plan="{"queryId":"hadoop_20140811170404_b617e7c5-b135-48ea-8371-eb6cfdeae640","queryType":null,"queryAttributes":{"queryString":"CREATE EXTERNAL TABLE test (key string ,cf string)    STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler'   WITH SERDEPROPERTIES ("hbase.columns.mapping" = ":key,cf:cf1")   TBLPROPERTIES ("hbase.table.name" = "fog")"},"queryCounters":"null","stageGraph":{"nodeType":"STAGE","roots":"null","adjacencyList":"]"},"stageList":[{"stageId":"Stage-0","stageType":"DDL","stageAttributes":"null","stageCounters":"}","taskList":[{"taskId":"Stage-0_OTHER","taskType":"OTHER","taskAttributes":"null","taskCounters":"null","operatorGraph":"null","operatorList":"]","done":"false","started":"true"}],"done":"false","started":"true"}],"done":"false","started":"true"}" TIME="1407747890482"
Counters plan="{"queryId":"hadoop_20140811170404_b617e7c5-b135-48ea-8371-eb6cfdeae640","queryType":null,"queryAttributes":{"queryString":"CREATE EXTERNAL TABLE test (key string ,cf string)    STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler'   WITH SERDEPROPERTIES ("hbase.columns.mapping" = ":key,cf:cf1")   TBLPROPERTIES ("hbase.table.name" = "fog")"},"queryCounters":"null","stageGraph":{"nodeType":"STAGE","roots":"null","adjacencyList":"]"},"stageList":[{"stageId":"Stage-0","stageType":"DDL","stageAttributes":"null","stageCounters":"}","taskList":[{"taskId":"Stage-0_OTHER","taskType":"OTHER","taskAttributes":"null","taskCounters":"null","operatorGraph":"null","operatorList":"]","done":"true","started":"true"}],"done":"true","started":"true"}],"done":"false","started":"true"}" TIME="1407747894570"
TaskEnd TASK_RET_CODE="0" TASK_NAME="org.apache.hadoop.hive.ql.exec.DDLTask" TASK_ID="Stage-0" QUERY_ID="hadoop_20140811170404_b617e7c5-b135-48ea-8371-eb6cfdeae640" TIME="1407747894570"
QueryEnd QUERY_STRING="CREATE EXTERNAL TABLE test (key string ,cf string)    STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler'   WITH SERDEPROPERTIES ("hbase.columns.mapping" = ":key,cf:cf1")   TBLPROPERTIES ("hbase.table.name" = "fog")" QUERY_ID="hadoop_20140811170404_b617e7c5-b135-48ea-8371-eb6cfdeae640" QUERY_RET_CODE="0" QUERY_NUM_TASKS="0" TIME="1407747894570"
Counters plan="{"queryId":"hadoop_20140811170404_b617e7c5-b135-48ea-8371-eb6cfdeae640","queryType":null,"queryAttributes":{"queryString":"CREATE EXTERNAL TABLE test (key string ,cf string)    STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler'   WITH SERDEPROPERTIES ("hbase.columns.mapping" = ":key,cf:cf1")   TBLPROPERTIES ("hbase.table.name" = "fog")"},"queryCounters":"null","stageGraph":{"nodeType":"STAGE","roots":"null","adjacencyList":"]"},"stageList":[{"stageId":"Stage-0","stageType":"DDL","stageAttributes":"null","stageCounters":"}","taskList":[{"taskId":"Stage-0_OTHER","taskType":"OTHER","taskAttributes":"null","taskCounters":"null","operatorGraph":"null","operatorList":"]","done":"true","started":"true"}],"done":"true","started":"true"}],"done":"true","started":"true"}" TIME="1407747894571"
QueryStart QUERY_STRING="select * from test limit 5" QUERY_ID="hadoop_20140811170505_804fc1b0-b201-445d-9df1-3573dd7a1980" TIME="1407747956375"
Counters plan="{"queryId":"hadoop_20140811170505_804fc1b0-b201-445d-9df1-3573dd7a1980","queryType":null,"queryAttributes":{"queryString":"select * from test limit 5"},"queryCounters":"null","stageGraph":{"nodeType":"STAGE","roots":"null","adjacencyList":"]"},"stageList":"]","done":"false","started":"true"}" TIME="1407747956375"
QueryEnd QUERY_STRING="select * from test limit 5" QUERY_ID="hadoop_20140811170505_804fc1b0-b201-445d-9df1-3573dd7a1980" QUERY_RET_CODE="0" QUERY_NUM_TASKS="0" TIME="1407747956375"
Counters plan="{"queryId":"hadoop_20140811170505_804fc1b0-b201-445d-9df1-3573dd7a1980","queryType":null,"queryAttributes":{"queryString":"select * from test limit 5"},"queryCounters":"null","stageGraph":{"nodeType":"STAGE","roots":"null","adjacencyList":"]"},"stageList":"]","done":"true","started":"true"}" TIME="1407747956375"

Is this the kind of log you meant?
The log above is from Hive creating an EXTERNAL table mapped to an existing HBase table.


Hive query result:
  hive > select * from test limit 5;                                    
14/08/11 17:05:55 INFO metastore.HiveMetaStore: 3: source:/10.10.4.207 get_table : db=default tbl=test
14/08/11 17:05:55 INFO HiveMetaStore.audit: ugi=hadoop        ip=/10.10.4.207        cmd=source:/10.10.4.207 get_table : db=default tbl=test       
OK
Failed with exception java.io.IOException:java.lang.NullPointerException
Time taken: 1.074 seconds




zhouguanwu replied on 2014-8-11 17:15:14
Quoting sstutu (2014-8-11 16:33):
If nothing else works, just create a Hive external table and re-map the existing HBase table.

I tried re-creating the mapping from Hive to the existing HBase table. Creating the external table succeeds, but the query still fails with:
Failed with exception java.io.IOException:java.lang.NullPointerException

Have you seen anything like this before? (Both Hive and HBase were upgraded; could a change in the on-disk data structures be related?)

My upgrade was from cdh3u5 to CDH4. Thanks.

bioger_hit replied on 2014-8-12 01:13:57
Quoting zhouguanwu (2014-8-11 17:15):
I tried re-creating the mapping from Hive to the existing HBase table. Creating the external table succeeds, but the query still fails with:
Failed with exception java.io.IOExce ...
The log shows that everything you queried is null:
"taskId":"Stage-0_OTHER","taskType":"OTHER","taskAttributes":"null","taskCounters":"null","operatorGraph":"null","
which is why you are getting this error.

zhouguanwu replied on 2014-8-12 09:31:40
Quoting bioger_hit (2014-8-12 01:13):
The log shows that everything you queried is null:
"taskId":"Stage-0_OTHER","taskType":"OTHER","taskAttributes":"n ...

# The HBase table structure is as follows
hbase(main):002:0> describe 'fog'
DESCRIPTION                                                                                                   ENABLED                                                   
'fog', {NAME => 'cf', BLOOMFILTER => 'NONE', REPLICATION_SCOPE => '0', VERSIONS => '3', COMPRESSION => 'NONE true                                                      
', TTL => '2147483647', BLOCKSIZE => '65536', IN_MEMORY => 'false', BLOCKCACHE => 'true'}                                                                              
1 row(s) in 0.1430 seconds

hbase(main):003:0> scan 'fog'
ROW                                         COLUMN+CELL                                                                                                                  
00000016-3ea9-4e5c-8df4-31f5c19e716e-add-6 column=cf:cf1, timestamp=1406884408436, value=1                                                                              
-2445172                                                                                                                                                               
00000029-8abd-4881-982c-40036e5f0a16-add-0 column=cf:cf1, timestamp=1406884420293, value=1                                                                              
-2459511                                                                                                                                                               
00000066-bfdc-4858-ae89-90057b4168ac-add-9 column=cf:cf1, timestamp=1406883736524, value=1                                                                              
-478842                                                                                                                                                                 
00000072-c987-4a87-86a4-8a18cfe74948-add-1 column=cf:cf1, timestamp=1406883965119, value=1                                                                              
-1154017
...

# The Hive mapping table
hive> CREATE EXTERNAL TABLE test(key string ,cf string) STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler' WITH SERDEPROPERTIES ("hbase.columns.mapping" = ":key,cf:cf1") TBLPROPERTIES ("hbase.table.name" = "fog");

Creating the mapping in Hive succeeds; it is the query that fails.

What does this error actually indicate?

# Below is the Hive log from creating the mapping table and running the query. Please take a look; thanks.
14/08/12 09:26:27 INFO metastore.HiveMetaStore: 1: source:/10.10.4.207 create_table: Table(tableName:test, dbName:default, owner:hadoop, createTime:1407806786, lastAccessTime:0, retention:0, sd:StorageDescriptor(cols:[FieldSchema(name:key, type:string, comment:null), FieldSchema(name:cf, type:string, comment:null)], location:null, inputFormat:org.apache.hadoop.hive.hbase.HiveHBaseTableInputFormat, outputFormat:org.apache.hadoop.hive.hbase.HiveHBaseTableOutputFormat, compressed:false, numBuckets:-1, serdeInfo:SerDeInfo(name:null, serializationLib:org.apache.hadoop.hive.hbase.HBaseSerDe, parameters:{hbase.columns.mapping=:key,cf:cf1, serialization.format=1}), bucketCols:[], sortCols:[], parameters:{}, skewedInfo:SkewedInfo(skewedColNames:[], skewedColValues:[], skewedColValueLocationMaps:{}), storedAsSubDirectories:false), partitionKeys:[], parameters:{EXTERNAL=TRUE, hbase.table.name=fog, storage_handler=org.apache.hadoop.hive.hbase.HBaseStorageHandler}, viewOriginalText:null, viewExpandedText:null, tableType:EXTERNAL_TABLE, privileges:PrincipalPrivilegeSet(userPrivileges:null, groupPrivileges:null, rolePrivileges:null))
14/08/12 09:26:27 INFO HiveMetaStore.audit: ugi=hadoop        ip=/10.10.4.207        cmd=source:/10.10.4.207 create_table: Table(tableName:test, dbName:default, owner:hadoop, createTime:1407806786, lastAccessTime:0, retention:0, sd:StorageDescriptor(cols:[FieldSchema(name:key, type:string, comment:null), FieldSchema(name:cf, type:string, comment:null)], location:null, inputFormat:org.apache.hadoop.hive.hbase.HiveHBaseTableInputFormat, outputFormat:org.apache.hadoop.hive.hbase.HiveHBaseTableOutputFormat, compressed:false, numBuckets:-1, serdeInfo:SerDeInfo(name:null, serializationLib:org.apache.hadoop.hive.hbase.HBaseSerDe, parameters:{hbase.columns.mapping=:key,cf:cf1, serialization.format=1}), bucketCols:[], sortCols:[], parameters:{}, skewedInfo:SkewedInfo(skewedColNames:[], skewedColValues:[], skewedColValueLocationMaps:{}), storedAsSubDirectories:false), partitionKeys:[], parameters:{EXTERNAL=TRUE, hbase.table.name=fog, storage_handler=org.apache.hadoop.hive.hbase.HBaseStorageHandler}, viewOriginalText:null, viewExpandedText:null, tableType:EXTERNAL_TABLE, privileges:PrincipalPrivilegeSet(userPrivileges:null, groupPrivileges:null, rolePrivileges:null))       
14/08/12 09:26:27 DEBUG ipc.Client: The ping interval is 60000 ms.
14/08/12 09:26:27 DEBUG ipc.Client: Use SIMPLE authentication for protocol ClientNamenodeProtocolPB
14/08/12 09:26:27 DEBUG ipc.Client: Connecting to nd1/10.10.4.207:9000
14/08/12 09:26:27 DEBUG ipc.Client: IPC Client (722629582) connection to nd1/10.10.4.207:9000 from hadoop: starting, having connections 1
14/08/12 09:26:27 DEBUG ipc.Client: IPC Client (722629582) connection to nd1/10.10.4.207:9000 from hadoop sending #0
14/08/12 09:26:27 DEBUG ipc.Client: IPC Client (722629582) connection to nd1/10.10.4.207:9000 from hadoop got value #0
14/08/12 09:26:27 DEBUG ipc.ProtobufRpcEngine: Call: getFileInfo took 75ms
14/08/12 09:26:37 DEBUG ipc.Client: IPC Client (722629582) connection to nd1/10.10.4.207:9000 from hadoop: closed
14/08/12 09:26:37 DEBUG ipc.Client: IPC Client (722629582) connection to nd1/10.10.4.207:9000 from hadoop: stopped, remaining connections 0



14/08/12 09:28:56 INFO metastore.HiveMetaStore: 1: source:/10.10.4.207 get_table : db=default tbl=test
14/08/12 09:28:56 INFO HiveMetaStore.audit: ugi=hadoop        ip=/10.10.4.207        cmd=source:/10.10.4.207 get_table : db=default tbl=test       
2014-08-12 09:28:57,277 WARN  conf.Configuration (Configuration.java:warnOnceIfDeprecated(981)) - mapred.task.is.map is deprecated. Instead, use mapreduce.task.ismap
2014-08-12 09:28:57,301 WARN  conf.Configuration (Configuration.java:warnOnceIfDeprecated(981)) - mapred.jar is deprecated. Instead, use mapreduce.job.jar
2014-08-12 09:28:57,358 ERROR CliDriver (SessionState.java:printError(401)) - Failed with exception java.io.IOException:java.lang.NullPointerException
java.io.IOException: java.lang.NullPointerException
        at org.apache.hadoop.hive.ql.exec.FetchOperator.getNextRow(FetchOperator.java:521)
        at org.apache.hadoop.hive.ql.exec.FetchOperator.pushRow(FetchOperator.java:466)
        at org.apache.hadoop.hive.ql.exec.FetchTask.fetch(FetchTask.java:136)
        at org.apache.hadoop.hive.ql.Driver.getResults(Driver.java:1387)
        at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:270)
        at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:216)
        at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:412)
        at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:755)
        at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:613)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at org.apache.hadoop.util.RunJar.main(RunJar.java:208)
Caused by: java.lang.NullPointerException
        at org.apache.hadoop.net.DNS.reverseDns(DNS.java:93)
        at org.apache.hadoop.hbase.mapreduce.TableInputFormatBase.reverseDNS(TableInputFormatBase.java:219)
        at org.apache.hadoop.hbase.mapreduce.TableInputFormatBase.getSplits(TableInputFormatBase.java:184)
        at org.apache.hadoop.hive.hbase.HiveHBaseTableInputFormat.getSplits(HiveHBaseTableInputFormat.java:476)
        at org.apache.hadoop.hive.ql.exec.FetchOperator.getRecordReader(FetchOperator.java:373)
        at org.apache.hadoop.hive.ql.exec.FetchOperator.getNextRow(FetchOperator.java:486)
        ... 13 more                                                              


阿飞 replied on 2014-8-12 10:23:47
Quoting zhouguanwu (2014-8-12 09:31):
# The HBase table structure is as follows
hbase(main):002:0> describe 'fog'
DESCRIPTION                         ...
That is not how a plain external table is created; yours is still the integrated Hive-HBase table. A plain external table is created like this:
hive> create external table exter_table(
    > id int,
    > name string,
    > age int,
    > tel string)
    > location '/home/wyp/external';
OK
Time taken: 0.098 seconds


For details, see:
Hive internal vs. external tables explained

