
Hive query fails with java.io.FileNotFoundException: File does not exist

Posted by szcountryboy on 2014-8-22 15:14:19

Environment: hadoop-2.4.0
             hbase-0.96.2
             hive-0.13.1

Problem description

Start the metastore with hive --service metastore, then:

hive> select * from logtable;
*** results display normally
hive> select count(*) from logtable;
Total jobs = 1
Launching Job 1 out of 1
Number of reduce tasks determined at compile time: 1
In order to change the average load for a reducer (in bytes):
  set hive.exec.reducers.bytes.per.reducer=<number>
In order to limit the maximum number of reducers:
  set hive.exec.reducers.max=<number>
In order to set a constant number of reducers:
  set mapreduce.job.reduces=<number>
14/08/22 03:08:26 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
14/08/22 03:08:27 WARN conf.Configuration: file:/tmp/root/hive_2014-08-22_03-08-22_822_994997341822272893-1/-local-10003/jobconf.xml:an attempt to override final parameter: mapreduce.job.end-notification.max.retry.interval;  Ignoring.
14/08/22 03:08:27 WARN conf.Configuration: file:/tmp/root/hive_2014-08-22_03-08-22_822_994997341822272893-1/-local-10003/jobconf.xml:an attempt to override final parameter: mapreduce.job.end-notification.max.attempts;  Ignoring.
14/08/22 03:08:27 WARN conf.HiveConf: DEPRECATED: hive.metastore.ds.retry.* no longer has any effect.  Use hive.hmshandler.retry.* instead
Execution log at: /root/hive-0.13.1/log/root_20140822030808_3917199d-7f3c-4a92-8d31-19a9f15ec31f.log
java.io.FileNotFoundException: File does not exist: hdfs://172.16.28.77:9000/root/hive-0.13.1/lib/geronimo-annotation_1.0_spec-1.1.1.jar
        at org.apache.hadoop.hdfs.DistributedFileSystem$17.doCall(DistributedFileSystem.java:1128)
        at org.apache.hadoop.hdfs.DistributedFileSystem$17.doCall(DistributedFileSystem.java:1120)
        at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
        at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1120)
        at org.apache.hadoop.mapreduce.filecache.ClientDistributedCacheManager.getFileStatus(ClientDistributedCacheManager.java:288)
        at org.apache.hadoop.mapreduce.filecache.ClientDistributedCacheManager.getFileStatus(ClientDistributedCacheManager.java:224)
        at org.apache.hadoop.mapreduce.filecache.ClientDistributedCacheManager.determineTimestamps(ClientDistributedCacheManager.java:93)
        at org.apache.hadoop.mapreduce.filecache.ClientDistributedCacheManager.determineTimestampsAndCacheVisibilities(ClientDistributedCacheManager.java:57)
        at org.apache.hadoop.mapreduce.JobSubmitter.copyAndConfigureFiles(JobSubmitter.java:265)
        at org.apache.hadoop.mapreduce.JobSubmitter.copyAndConfigureFiles(JobSubmitter.java:301)
        at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:389)
        at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1285)
        at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1282)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:415)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1548)
        at org.apache.hadoop.mapreduce.Job.submit(Job.java:1282)
        at org.apache.hadoop.mapred.JobClient$1.run(JobClient.java:562)
        at org.apache.hadoop.mapred.JobClient$1.run(JobClient.java:557)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:415)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1548)
        at org.apache.hadoop.mapred.JobClient.submitJobInternal(JobClient.java:557)
        at org.apache.hadoop.mapred.JobClient.submitJob(JobClient.java:548)
        at org.apache.hadoop.hive.ql.exec.mr.ExecDriver.execute(ExecDriver.java:420)
        at org.apache.hadoop.hive.ql.exec.mr.ExecDriver.main(ExecDriver.java:740)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:601)
        at org.apache.hadoop.util.RunJar.main(RunJar.java:212)
Job Submission failed with exception 'java.io.FileNotFoundException(File does not exist: hdfs://172.16.28.77:9000/root/hive-0.13.1/lib/geronimo-annotation_1.0_spec-1.1.1.jar)'
Execution failed with exit status: 1
Obtaining error information

Task failed!
Task ID:
  Stage-1

Logs:

/root/hive-0.13.1/log/hive.log
FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.mr.MapRedTask

The error says: File does not exist: hdfs://172.16.28.77:9000/root/hive-0.13.1/lib/geronimo-annotation_1.0_spec-1.1.1.jar


Yet this jar does exist locally under hive/lib.
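What the trace suggests is that the job submitter resolves the jar path against the default filesystem (fs.defaultFS) and therefore looks for it on HDFS, while the file only exists on the local disk. That would also explain why select * succeeds (it reads the table directly) while count(*) fails (it submits a MapReduce job, which ships the aux jars through the distributed cache). A quick check of the mismatch, as a minimal sketch reusing the NameNode address and paths from the log above:

# the jar is present on the local filesystem
ls -l /root/hive-0.13.1/lib/geronimo-annotation_1.0_spec-1.1.1.jar

# but job submission resolves the same path against HDFS, where it is missing
hdfs dfs -ls hdfs://172.16.28.77:9000/root/hive-0.13.1/lib/geronimo-annotation_1.0_spec-1.1.1.jar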


Replies (3)

szcountryboy replied on 2014-8-22 17:18:51
My configuration is as follows:
<property>
  <name>hive.aux.jars.path</name>
  <value>file:///root/hive-0.13.1/lib/hive-hbase-handler-0.13.1.jar,file:///root/hive-0.13.1/lib/protobuf-java-2.5.0.jar,file:///root/hive-0.13.1/lib/hbase-client-0.96.2-hadoop2.jar,file:///root/hive-0.13.1/lib/hbase-common-0.96.2-hadoop2.jar,file:///root/hive-0.13.1/lib/zookeeper-3.4.6.jar,file:///root/hive-0.13.1/lib/guava-12.0.1.jar</value>
</property>


In the end, having run out of ideas, I just followed the error messages and uploaded every jar they complained about (Hadoop, HBase, and Hive jars) to the corresponding paths on HDFS (see the sketch below).

That made the problem go away.

But this workaround still doesn't feel like the right fix.
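For reference, the workaround amounts to mirroring the local lib directory onto HDFS at the exact path the error message names. A minimal sketch of that, assuming the NameNode and directory layout from the log above (repeat the put for each jar the errors mention):

# recreate on HDFS the directory structure the error message expects
hdfs dfs -mkdir -p /root/hive-0.13.1/lib

# upload the missing jar from the local lib directory
hdfs dfs -put /root/hive-0.13.1/lib/geronimo-annotation_1.0_spec-1.1.1.jar /root/hive-0.13.1/lib/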

nettman replied on 2014-8-22 15:26:07
This is usually a configuration error:
hdfs://172.16.28.77:9000/root/hive-0.13.1/lib/geronimo-annotation_1.0_spec-1.1.1.jar

Use the example below as a reference and adapt it to your own setup. The usual culprit is the config file:

Make sure hive.aux.jars.path is configured correctly: the value must not contain line breaks or spaces. Line breaks in particular are a trap many newcomers fall into, because plenty of articles split the value across several lines. A quick way to verify what Hive actually picked up is shown after the example.

<property>
  <name>hive.aux.jars.path</name>
  <value>file:///usr/hive/lib/hive-hbase-handler-0.13.0-SNAPSHOT.jar,file:///usr/hive/lib/protobuf-java-2.5.0.jar,file:///usr/hive/lib/hbase-client-0.96.0-hadoop2.jar,file:///usr/hive/lib/hbase-common-0.96.0-hadoop2.jar,file:///usr/hive/lib/zookeeper-3.4.5.jar,file:///usr/hive/lib/guava-11.0.2.jar</value>
</property>
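One easy way to catch an accidental line break or stray whitespace in this property is to have Hive echo back the value it actually parsed, and to inspect the raw XML. A minimal sketch, assuming the hive CLI is on the PATH and that hive-site.xml lives under $HIVE_HOME/conf:

# print the value Hive loaded; it should come back as a single comma-separated line
hive -e 'set hive.aux.jars.path;'

# look at the raw <value> element for stray newlines or spaces
grep -A 2 'hive.aux.jars.path' $HIVE_HOME/conf/hive-site.xml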

The following may also help:
HBase 0.96 and Hive 0.12 integration guide and troubleshooting notes
http://www.aboutyun.com/thread-7881-1-1.html


about-bigdata replied on 2016-5-5 02:08:51
OP, I'm now running into the same problem and can't get past it. Did you ever solve it? Any advice would be much appreciated!
