
After integrating Hive with HBase, a full-table scan works, but queries with a WHERE condition fail

hive> select * from hive_hbase_member;
OK
bichiyang        {"city":"beijin"}        {"age":"100"}
scutshuxue        {"city":"hangzhou","contry":"china","province":"zhejiang"}        {"age":"26","birthday":"1987-06-17","company":"alibaba"}
Time taken: 0.346 seconds
------------------------------------------------------------------------------ the full-table scan above succeeds ------------------------------------------------------------------------------
hive> select * from hive_hbase_member where key='bichiyang';
Total MapReduce jobs = 1
Launching Job 1 out of 1
Number of reduce tasks is set to 0 since there's no reduce operator
Starting Job = job_201605272043_0023, Tracking URL = http://master.hadoop:50030/jobde ... b_201605272043_0023
Kill Command = /usr/lib/hadoop/bin/hadoop job  -kill job_201605272043_0023
Hadoop job information for Stage-1: number of mappers: 1; number of reducers: 0
2016-06-02 15:32:52,075 Stage-1 map = 0%,  reduce = 0%
2016-06-02 15:33:22,289 Stage-1 map = 100%,  reduce = 100%
Ended Job = job_201605272043_0023 with errors
Error during job, obtaining debugging information...
Job Tracking URL: http://master.hadoop:50030/jobde ... b_201605272043_0023
Examining task ID: task_201605272043_0023_m_000002 (and more) from job job_201605272043_0023

Task with the most failures(4):
-----
Task ID:
  task_201605272043_0023_m_000000

URL:
  http://master.hadoop:50030/taskd ... 72043_0023_m_000000
-----
Diagnostic Messages for this Task:
Error: java.lang.ClassNotFoundException: org.antlr.runtime.CommonToken
        at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
        at java.security.AccessController.doPrivileged(Native Method)
        at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
        at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
        at org.apache.hadoop.hive.ql.exec.Utilities.serializeExpression(Utilities.java:445)
        at org.apache.hadoop.hive.ql.io.HiveInputFormat.pushFilters(HiveInputFormat.java:359)
        at org.apache.hadoop.hive.ql.io.HiveInputFormat.pushProjectionsAndFilters(HiveInputFormat.java:432)
        at org.apache.hadoop.hive.ql.io.HiveInputFormat.getRecordReader(HiveInputFormat.java:233)
        at org.apache.hadoop.hive.ql.io.CombineHiveInputFormat.getRecordReader(CombineHiveInputFormat.java:522)
        at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:394)
        at org.apache.hadoop.mapred.MapTask.run(MapTask.

FAILED: Execution Error, return code 2 from org.apache.hadoop.hive.ql.exec.MapRedTask
MapReduce Jobs Launched:
Job 0: Map: 1   HDFS Read: 0 HDFS Write: 0 FAIL
Total MapReduce CPU Time Spent: 0 msec
---------------------------------------------------------------------------- the conditional query above fails -------------------------------------------------------------------------------
Please help.
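The stack trace is informative: the job dies in HiveInputFormat.pushFilters, i.e. while Hive serializes the WHERE expression to push the filter down to the storage handler, and the class it cannot load, org.antlr.runtime.CommonToken, lives in the ANTLR runtime jar. A full scan has no filter expression to serialize, which would explain why only conditional queries fail. A quick check is whether that jar is even visible to Hive; the paths below are assumptions based on the CDH4 layout used elsewhere in this thread:

```
# Look for an antlr-runtime jar in Hive's lib directory
# (the /usr/lib/hive/lib path is an assumption from the config below)
ls /usr/lib/hive/lib/ | grep -i antlr

# If the jar is there, confirm it actually contains the missing class
unzip -l /usr/lib/hive/lib/antlr-runtime-3.4.jar | grep CommonToken
```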

5 replies

请叫我野区养猪 posted on 2016-6-2 15:37:40
My config file hive-site.xml is as follows:
<?xml version="1.0" encoding="UTF-8"?>

<!--Autogenerated by Cloudera CM on 2016-05-18T06:17:52.292Z-->
<configuration>
  <property>
    <name>hive.metastore.local</name>
    <value>false</value>
  </property>
  <property>
    <name>hive.metastore.uris</name>
    <value>thrift://master.hadoop:9083</value>
  </property>
  <property>
    <name>hive.metastore.client.socket.timeout</name>
    <value>300</value>
  </property>
  <property>
    <name>hive.metastore.warehouse.dir</name>
    <value>/user/hive/warehouse</value>
  </property>
  <property>
    <name>hive.warehouse.subdir.inherit.perms</name>
    <value>true</value>
  </property>
  <property>
    <name>mapred.reduce.tasks</name>
    <value>-1</value>
  </property>
  <property>
    <name>hive.exec.reducers.bytes.per.reducer</name>
    <value>1073741824</value>
  </property>
  <property>
    <name>hive.exec.reducers.max</name>
    <value>999</value>
  </property>
  <property>
    <name>hive.metastore.execute.setugi</name>
    <value>true</value>
  </property>
  <property>
    <name>hive.support.concurrency</name>
    <value>true</value>
  </property>
  <property>
    <name>hive.zookeeper.quorum</name>
    <value>slave3.hadoop,slave2.hadoop,slave1.hadoop</value>
  </property>
  <property>
    <name>hive.zookeeper.client.port</name>
    <value>2181</value>
  </property>
  <property>
    <name>hive.zookeeper.namespace</name>
    <value>hive_zookeeper_namespace_hive1</value>
  </property>
  <property>
    <name>hive.server2.enable.SSL</name>
    <value>false</value>
  </property>

<property>
    <name>hive.aux.jars.path</name>
<value>file:///usr/lib/hive/lib/hive-hbase-handler-0.10.0-cdh4.7.0.jar,file:///usr/lib/hive/lib/hbase.jar,file:///usr/lib/hive/lib/zookeeper.jar,file:///usr/lib/hive/lib/protobuf-java-2.4.0a.jar,file:///usr/lib/hive/lib/guava-11.0.2.jar</value>
</property>

</configuration>
请叫我野区养猪 posted on 2016-6-2 15:45:01
Full scans work, and LIMIT 1 all the way up to LIMIT 10000000 works too; only queries with a WHERE condition fail.

starrycheng posted on 2016-6-2 18:31:21
Add the following jar to hive.aux.jars.path:
antlr-runtime-3.4.jar
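Concretely, that means appending the jar to the hive.aux.jars.path value shown in the hive-site.xml above, so it ships to the map tasks along with the HBase handler jars. A sketch, assuming the jar sits in the same lib directory as the others (verify the exact file name on your install):

```xml
<property>
  <name>hive.aux.jars.path</name>
  <value>file:///usr/lib/hive/lib/hive-hbase-handler-0.10.0-cdh4.7.0.jar,file:///usr/lib/hive/lib/hbase.jar,file:///usr/lib/hive/lib/zookeeper.jar,file:///usr/lib/hive/lib/protobuf-java-2.4.0a.jar,file:///usr/lib/hive/lib/guava-11.0.2.jar,file:///usr/lib/hive/lib/antlr-runtime-3.4.jar</value>
</property>
```

Restart the Hive session (and any HiveServer/metastore services managed by CM) after changing it so the new classpath takes effect.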

xw2016 posted on 2016-6-2 20:59:55
It may be a MapReduce problem. Try running a standalone MapReduce job, such as wordcount.
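To rule MapReduce itself in or out, something like the following works; the examples jar location and HDFS paths are assumptions for a CDH4 install:

```
# Stage a small input file in HDFS
echo "hello hadoop hello hive" > /tmp/words.txt
hadoop fs -mkdir /tmp/wc-in
hadoop fs -put /tmp/words.txt /tmp/wc-in/

# Run the stock wordcount example (jar path is an assumption)
hadoop jar /usr/lib/hadoop/hadoop-examples.jar wordcount /tmp/wc-in /tmp/wc-out

# Inspect the result
hadoop fs -cat /tmp/wc-out/part-*
```

If this succeeds, plain MapReduce is healthy, which points back at the Hive-side classpath rather than the cluster.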

请叫我野区养猪 posted on 2016-6-3 10:40:18
Quoting starrycheng (2016-6-2 18:31):
Add the following jar to hive.aux.jars.path:
antlr-runtime-3.4.jar

For now my workaround is to create a native Hive table alongside the external (HBase-backed) one, and then insert the external table's data into the Hive table.
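That workaround can be sketched in HiveQL; the copy table's schema below is an assumption that flattens the two column families seen in the scan output (the real hive_hbase_member columns may be maps):

```sql
-- Native (HDFS-backed) table to hold a copy of the HBase data;
-- the column list is an assumption based on the scan output above
CREATE TABLE member_copy (key STRING, city STRING, age STRING);

-- Pull the rows out of the HBase-backed table once
INSERT OVERWRITE TABLE member_copy
SELECT key, city, age FROM hive_hbase_member;

-- Conditional queries then run against plain HDFS data,
-- with no filter pushdown into the HBase handler
SELECT * FROM member_copy WHERE key = 'bichiyang';
```

This sidesteps the missing-class failure (no filter is pushed to the storage handler) at the cost of keeping a stale copy, so it is a stopgap rather than a fix.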
