Hive query over JDBC fails, but the same query runs fine in beeline / Hive CLI

Versions in use:
hive-0.13.1-cdh5.3.2
hadoop-2.5.0-cdh5.3.2


HiveServer2 is started with: nohup hive --service hiveserver2 &
The Java application connects to Hive via JDBC, URL: jdbc:hive2://10.199.201.41:10000/default
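
For reference, a minimal sketch of how the query is issued from the Java side (the credentials and driver class here are assumptions; "apms" is only the user name that shows up in the server log below):
[mw_shl_code=java,true]
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class HiveJdbcQuery {
    public static void main(String[] args) throws Exception {
        // Standard HiveServer2 JDBC driver shipped with hive-0.13.1.
        Class.forName("org.apache.hive.jdbc.HiveDriver");
        try (Connection conn = DriverManager.getConnection(
                     "jdbc:hive2://10.199.201.41:10000/default", "apms", "");
             Statement stmt = conn.createStatement();
             ResultSet rs = stmt.executeQuery(
                     "select * from kylin_account where account_country='FR' LIMIT 64")) {
            // The connection and statement must stay open until the result set is drained.
            while (rs.next()) {
                System.out.println(rs.getLong(1) + "\t" + rs.getString(4));
            }
        }
    }
}
[/mw_shl_code]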


When executing select * from kylin_account where account_country='FR' LIMIT 64, the following error appears in the HiveServer2 log:
[mw_shl_code=bash,true]2018-04-08 14:59:38,298 WARN  [HiveServer2-Handler-Pool: Thread-50]: fair.AllocationFileLoaderService (AllocationFileLoaderService.java:getAllocationFile(179)) - fair-scheduler.xml not found on the classpath.
2018-04-08 14:59:38,303 INFO  [HiveServer2-Handler-Pool: Thread-50]: session.SessionState (SessionState.java:start(383)) - No Tez session required at this point. hive.execution.engine=mr.
2018-04-08 14:59:38,309 INFO  [HiveServer2-Handler-Pool: Thread-50]: session.SessionState (SessionState.java:start(383)) - No Tez session required at this point. hive.execution.engine=mr.
2018-04-08 14:59:38,318 INFO  [HiveServer2-Handler-Pool: Thread-50]: parse.SemanticAnalyzer (SemanticAnalyzer.java:analyzeInternal(9353)) - Starting Semantic Analysis
2018-04-08 14:59:38,318 INFO  [HiveServer2-Handler-Pool: Thread-50]: parse.SemanticAnalyzer (SemanticAnalyzer.java:analyzeInternal(9388)) - Completed phase 1 of Semantic Analysis
2018-04-08 14:59:38,318 INFO  [HiveServer2-Handler-Pool: Thread-50]: parse.SemanticAnalyzer (SemanticAnalyzer.java:getMetaData(1205)) - Get metadata for source tables
2018-04-08 14:59:38,319 WARN  [HiveServer2-Handler-Pool: Thread-50]: security.UserGroupInformation (UserGroupInformation.java:getGroupNames(1506)) - No groups available for user apms
2018-04-08 14:59:38,319 WARN  [HiveServer2-Handler-Pool: Thread-50]: security.UserGroupInformation (UserGroupInformation.java:getGroupNames(1506)) - No groups available for user apms
2018-04-08 14:59:38,393 INFO  [HiveServer2-Handler-Pool: Thread-50]: parse.SemanticAnalyzer (SemanticAnalyzer.java:getMetaData(1346)) - Get metadata for subqueries
2018-04-08 14:59:38,393 INFO  [HiveServer2-Handler-Pool: Thread-50]: parse.SemanticAnalyzer (SemanticAnalyzer.java:getMetaData(1370)) - Get metadata for destination tables
2018-04-08 14:59:38,418 INFO  [HiveServer2-Handler-Pool: Thread-50]: parse.SemanticAnalyzer (SemanticAnalyzer.java:analyzeInternal(9391)) - Completed getting MetaData in Semantic Analysis
2018-04-08 14:59:38,465 INFO  [HiveServer2-Handler-Pool: Thread-50]: parse.SemanticAnalyzer (SemanticAnalyzer.java:genFileSinkPlan(6065)) - Set stats collection dir : hdfs://ebs-spark-wlt-3dxqw.vclound.com:9000/tmp/hive-root/hive_2018-04-08_14-59-38_315_1352087243136786510-3/-mr-10000/.hive-staging_hive_2018-04-08_14-59-38_315_1352087243136786510-3/-ext-10002
2018-04-08 14:59:38,477 INFO  [HiveServer2-Handler-Pool: Thread-50]: parse.SemanticAnalyzer (SemanticAnalyzer.java:analyzeInternal(9473)) - Completed plan generation
2018-04-08 14:59:38,478 INFO  [HiveServer2-Handler-Pool: Thread-50]: ql.Driver (Driver.java:compile(446)) - Semantic Analysis Completed
2018-04-08 14:59:38,480 INFO  [HiveServer2-Handler-Pool: Thread-50]: ql.Driver (Driver.java:getSchema(245)) - Returning Hive schema: Schema(fieldSchemas:[FieldSchema(name:kylin_account.account_id, type:bigint, comment:null), FieldSchema(name:kylin_account.account_buyer_level, type:int, comment:null), FieldSchema(name:kylin_account.account_seller_level, type:int, comment:null), FieldSchema(name:kylin_account.account_country, type:string, comment:null), FieldSchema(name:kylin_account.account_contact, type:string, comment:null)], properties:null)
2018-04-08 14:59:38,483 INFO  [HiveServer2-Background-Pool: Thread-213]: ql.Driver (Driver.java:checkConcurrency(165)) - Concurrency mode is disabled, not creating a lock manager
2018-04-08 14:59:38,483 INFO  [HiveServer2-Background-Pool: Thread-213]: ql.Driver (Driver.java:execute(1243)) - Starting command: select * from kylin_account where account_country='FR' LIMIT 64
2018-04-08 14:59:38,484 INFO  [HiveServer2-Background-Pool: Thread-213]: ql.Driver (SessionState.java:printInfo(558)) - Total jobs = 1
2018-04-08 14:59:38,485 INFO  [HiveServer2-Background-Pool: Thread-213]: ql.Driver (SessionState.java:printInfo(558)) - Launching Job 1 out of 1
2018-04-08 14:59:38,486 INFO  [HiveServer2-Background-Pool: Thread-213]: exec.Task (SessionState.java:printInfo(558)) - Number of reduce tasks is set to 0 since there's no reduce operator
2018-04-08 14:59:38,508 INFO  [HiveServer2-Background-Pool: Thread-213]: mr.ExecDriver (ExecDriver.java:execute(285)) - Using org.apache.hadoop.hive.ql.io.CombineHiveInputFormat
2018-04-08 14:59:38,642 WARN  [HiveServer2-Background-Pool: Thread-213]: mapreduce.JobSubmitter (JobSubmitter.java:copyAndConfigureFiles(151)) - Hadoop command-line option parsing not performed. Implement the Tool interface and execute your application with ToolRunner to remedy this.
2018-04-08 14:59:38,963 INFO  [HiveServer2-Background-Pool: Thread-213]: exec.Task (SessionState.java:printInfo(558)) - Starting Job = job_1523169313235_0006, Tracking URL = http://ebs-spark-wlt-3dxqw.vclou ... 1523169313235_0006/
2018-04-08 14:59:38,964 INFO  [HiveServer2-Background-Pool: Thread-213]: exec.Task (SessionState.java:printInfo(558)) - Kill Command = /apps/svr/hadoop-2.5.0-cdh5.3.2/bin/hadoop job  -kill job_1523169313235_0006
2018-04-08 14:59:43,488 WARN  [HiveServer2-Handler-Pool: Thread-50]: ql.Driver (DriverContext.java:shutdown(137)) - Shutting down task : Stage-1:MAPRED
2018-04-08 14:59:43,736 WARN  [HiveServer2-Handler-Pool: Thread-50]: transport.TIOStreamTransport (TIOStreamTransport.java:close(112)) - Error closing output stream.
java.net.SocketException: Socket closed
        at java.net.SocketOutputStream.socketWrite(SocketOutputStream.java:116)
        at java.net.SocketOutputStream.write(SocketOutputStream.java:153)
        at java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:82)
        at java.io.BufferedOutputStream.flush(BufferedOutputStream.java:140)
        at java.io.FilterOutputStream.close(FilterOutputStream.java:158)
        at org.apache.thrift.transport.TIOStreamTransport.close(TIOStreamTransport.java:110)
        at org.apache.thrift.transport.TSocket.close(TSocket.java:194)
        at org.apache.thrift.transport.TSaslTransport.close(TSaslTransport.java:381)
        at org.apache.thrift.transport.TSaslServerTransport.close(TSaslServerTransport.java:41)
        at org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:261)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
        at java.lang.Thread.run(Thread.java:745)
2018-04-08 14:59:44,548 INFO  [HiveServer2-Background-Pool: Thread-213]: exec.Task (SessionState.java:printInfo(558)) - Hadoop job information for Stage-1: number of mappers: 0; number of reducers: 0
2018-04-08 14:59:44,578 WARN  [HiveServer2-Background-Pool: Thread-213]: mapreduce.Counters (AbstractCounters.java:getGroup(234)) - Group org.apache.hadoop.mapred.Task$Counter is deprecated. Use org.apache.hadoop.mapreduce.TaskCounter instead
2018-04-08 14:59:44,579 INFO  [HiveServer2-Background-Pool: Thread-213]: exec.Task (SessionState.java:printInfo(558)) - 2018-04-08 14:59:44,578 Stage-1 map = 0%,  reduce = 0%
2018-04-08 14:59:44,583 WARN  [HiveServer2-Background-Pool: Thread-213]: mapreduce.Counters (AbstractCounters.java:getGroup(234)) - Group org.apache.hadoop.mapred.Task$Counter is deprecated. Use org.apache.hadoop.mapreduce.TaskCounter instead
2018-04-08 14:59:44,587 ERROR [HiveServer2-Background-Pool: Thread-213]: exec.Task (SessionState.java:printError(567)) - Ended Job = job_1523169313235_0006 with errors
2018-04-08 14:59:44,588 ERROR [Thread-199]: exec.Task (SessionState.java:printError(567)) - Error during job, obtaining debugging information...
2018-04-08 14:59:44,595 ERROR [HiveServer2-Background-Pool: Thread-213]: ql.Driver (SessionState.java:printError(567)) - FAILED: Operation cancelled
2018-04-08 14:59:44,595 INFO  [HiveServer2-Background-Pool: Thread-213]: ql.Driver (SessionState.java:printInfo(558)) - MapReduce Jobs Launched:
2018-04-08 14:59:44,596 WARN  [HiveServer2-Background-Pool: Thread-213]: mapreduce.Counters (AbstractCounters.java:getGroup(234)) - Group FileSystemCounters is deprecated. Use org.apache.hadoop.mapreduce.FileSystemCounter instead
2018-04-08 14:59:44,596 INFO  [HiveServer2-Background-Pool: Thread-213]: ql.Driver (SessionState.java:printInfo(558)) - Stage-Stage-1:  HDFS Read: 0 HDFS Write: 0 FAIL
2018-04-08 14:59:44,596 INFO  [HiveServer2-Background-Pool: Thread-213]: ql.Driver (SessionState.java:printInfo(558)) - Total MapReduce CPU Time Spent: 0 msec
2018-04-08 14:59:44,596 ERROR [HiveServer2-Background-Pool: Thread-213]: operation.Operation (SQLOperation.java:run(203)) - Error running hive query:
org.apache.hive.service.cli.HiveSQLException: Illegal Operation state transition from CLOSED to ERROR
        at org.apache.hive.service.cli.OperationState.validateTransition(OperationState.java:91)
        at org.apache.hive.service.cli.OperationState.validateTransition(OperationState.java:97)
        at org.apache.hive.service.cli.operation.Operation.setState(Operation.java:116)
        at org.apache.hive.service.cli.operation.SQLOperation.runInternal(SQLOperation.java:159)
        at org.apache.hive.service.cli.operation.SQLOperation.access$000(SQLOperation.java:69)
        at org.apache.hive.service.cli.operation.SQLOperation$1$1.run(SQLOperation.java:200)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:422)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1642)
        at org.apache.hadoop.hive.shims.HadoopShimsSecure.doAs(HadoopShimsSecure.java:502)
        at org.apache.hive.service.cli.operation.SQLOperation$1.run(SQLOperation.java:213)
        at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
        at java.util.concurrent.FutureTask.run(FutureTask.java:266)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
        at java.lang.Thread.run(Thread.java:745)
[/mw_shl_code]

But the same statement runs correctly through beeline / Hive CLI, which has me completely stumped...

JacksonH10 posted on 2018-4-8 16:26:15
Here is the log from running the same query through beeline:
[mw_shl_code=text,true]2018-04-08 14:58:18,735 INFO  [HiveServer2-Handler-Pool: Thread-70]: parse.SemanticAnalyzer (SemanticAnalyzer.java:analyzeInternal(9353)) - Starting Semantic Analysis
2018-04-08 14:58:18,749 INFO  [HiveServer2-Handler-Pool: Thread-70]: parse.SemanticAnalyzer (SemanticAnalyzer.java:analyzeInternal(9388)) - Completed phase 1 of Semantic Analysis
2018-04-08 14:58:18,750 INFO  [HiveServer2-Handler-Pool: Thread-70]: parse.SemanticAnalyzer (SemanticAnalyzer.java:getMetaData(1205)) - Get metadata for source tables
2018-04-08 14:58:18,785 INFO  [HiveServer2-Handler-Pool: Thread-70]: parse.SemanticAnalyzer (SemanticAnalyzer.java:getMetaData(1346)) - Get metadata for subqueries
2018-04-08 14:58:18,785 INFO  [HiveServer2-Handler-Pool: Thread-70]: parse.SemanticAnalyzer (SemanticAnalyzer.java:getMetaData(1370)) - Get metadata for destination tables
2018-04-08 14:58:18,818 INFO  [HiveServer2-Handler-Pool: Thread-70]: parse.SemanticAnalyzer (SemanticAnalyzer.java:analyzeInternal(9391)) - Completed getting MetaData in Semantic Analysis
2018-04-08 14:58:18,873 INFO  [HiveServer2-Handler-Pool: Thread-70]: parse.SemanticAnalyzer (SemanticAnalyzer.java:genFileSinkPlan(6065)) - Set stats collection dir : hdfs://ebs-spark-wlt-3dxqw.vclound.com:9000/tmp/hive-root/hive_2018-04-08_14-58-18_733_422531375643354255-5/-mr-10000/.hive-staging_hive_2018-04-08_14-58-18_733_422531375643354255-5/-ext-10002
2018-04-08 14:58:18,878 INFO  [HiveServer2-Handler-Pool: Thread-70]: parse.SemanticAnalyzer (SemanticAnalyzer.java:analyzeInternal(9473)) - Completed plan generation
2018-04-08 14:58:18,878 INFO  [HiveServer2-Handler-Pool: Thread-70]: ql.Driver (Driver.java:compile(446)) - Semantic Analysis Completed
2018-04-08 14:58:18,879 INFO  [HiveServer2-Handler-Pool: Thread-70]: ql.Driver (Driver.java:getSchema(245)) - Returning Hive schema: Schema(fieldSchemas:[FieldSchema(name:kylin_account.account_id, type:bigint, comment:null), FieldSchema(name:kylin_account.account_buyer_level, type:int, comment:null), FieldSchema(name:kylin_account.account_seller_level, type:int, comment:null), FieldSchema(name:kylin_account.account_country, type:string, comment:null), FieldSchema(name:kylin_account.account_contact, type:string, comment:null)], properties:null)
2018-04-08 14:58:18,884 INFO  [HiveServer2-Background-Pool: Thread-179]: ql.Driver (Driver.java:checkConcurrency(165)) - Concurrency mode is disabled, not creating a lock manager
2018-04-08 14:58:18,885 INFO  [HiveServer2-Background-Pool: Thread-179]: ql.Driver (Driver.java:execute(1243)) - Starting command: select * from kylin_account where account_country='FR'
2018-04-08 14:58:18,885 INFO  [HiveServer2-Background-Pool: Thread-179]: ql.Driver (SessionState.java:printInfo(558)) - Total jobs = 1
2018-04-08 14:58:18,886 INFO  [HiveServer2-Background-Pool: Thread-179]: ql.Driver (SessionState.java:printInfo(558)) - Launching Job 1 out of 1
2018-04-08 14:58:18,888 INFO  [HiveServer2-Background-Pool: Thread-179]: exec.Task (SessionState.java:printInfo(558)) - Number of reduce tasks is set to 0 since there's no reduce operator
2018-04-08 14:58:18,911 INFO  [HiveServer2-Background-Pool: Thread-179]: mr.ExecDriver (ExecDriver.java:execute(285)) - Using org.apache.hadoop.hive.ql.io.CombineHiveInputFormat
2018-04-08 14:58:19,491 WARN  [HiveServer2-Background-Pool: Thread-179]: mapreduce.JobSubmitter (JobSubmitter.java:copyAndConfigureFiles(151)) - Hadoop command-line option parsing not performed. Implement the Tool interface and execute your application with ToolRunner to remedy this.
2018-04-08 14:58:19,829 INFO  [HiveServer2-Background-Pool: Thread-179]: exec.Task (SessionState.java:printInfo(558)) - Starting Job = job_1523169313235_0004, Tracking URL = http://ebs-spark-wlt-3dxqw.vclou ... 1523169313235_0004/
2018-04-08 14:58:19,830 INFO  [HiveServer2-Background-Pool: Thread-179]: exec.Task (SessionState.java:printInfo(558)) - Kill Command = /apps/svr/hadoop-2.5.0-cdh5.3.2/bin/hadoop job  -kill job_1523169313235_0004
2018-04-08 14:58:25,072 INFO  [HiveServer2-Background-Pool: Thread-179]: exec.Task (SessionState.java:printInfo(558)) - Hadoop job information for Stage-1: number of mappers: 1; number of reducers: 0
2018-04-08 14:58:25,105 WARN  [HiveServer2-Background-Pool: Thread-179]: mapreduce.Counters (AbstractCounters.java:getGroup(234)) - Group org.apache.hadoop.mapred.Task$Counter is deprecated. Use org.apache.hadoop.mapreduce.TaskCounter instead
2018-04-08 14:58:25,106 INFO  [HiveServer2-Background-Pool: Thread-179]: exec.Task (SessionState.java:printInfo(558)) - 2018-04-08 14:58:25,105 Stage-1 map = 0%,  reduce = 0%
2018-04-08 14:58:30,317 INFO  [HiveServer2-Background-Pool: Thread-179]: exec.Task (SessionState.java:printInfo(558)) - 2018-04-08 14:58:30,305 Stage-1 map = 100%,  reduce = 0%, Cumulative CPU 2.7 sec
2018-04-08 14:58:31,347 INFO  [HiveServer2-Background-Pool: Thread-179]: exec.Task (SessionState.java:printInfo(558)) - MapReduce Total cumulative CPU time: 2 seconds 700 msec
2018-04-08 14:58:31,366 INFO  [HiveServer2-Background-Pool: Thread-179]: exec.Task (SessionState.java:printInfo(558)) - Ended Job = job_1523169313235_0004
2018-04-08 14:58:31,399 INFO  [HiveServer2-Background-Pool: Thread-179]: ql.Driver (SessionState.java:printInfo(558)) - MapReduce Jobs Launched:
2018-04-08 14:58:31,399 INFO  [HiveServer2-Background-Pool: Thread-179]: ql.Driver (SessionState.java:printInfo(558)) - Stage-Stage-1: Map: 1   Cumulative CPU: 2.7 sec   HDFS Read: 200277 HDFS Write: 28740 SUCCESS
2018-04-08 14:58:31,399 INFO  [HiveServer2-Background-Pool: Thread-179]: ql.Driver (SessionState.java:printInfo(558)) - Total MapReduce CPU Time Spent: 2 seconds 700 msec
2018-04-08 14:58:31,399 INFO  [HiveServer2-Background-Pool: Thread-179]: ql.Driver (SessionState.java:printInfo(558)) - OK
[/mw_shl_code]

nextuser posted on 2018-4-8 16:30:29

See the screenshot below:

hiveserver.png

The server side should behave the same either way, so my guess is it's a client problem, i.e. something in how the Java side issues or handles the query. Try some other statements.

JacksonH10 posted on 2018-4-8 16:36:11

Queries without a condition work fine: select * from kylin_account LIMIT 64. But as soon as a condition is added and MapReduce gets involved, it fails. So it shouldn't be a problem with the SQL statement itself, right?

nextuser posted on 2018-4-8 18:14:38
Quoting JacksonH10 (2018-4-8 16:36):
Queries without a condition work fine: select * from kylin_account LIMIT 64, but as soon as a condition brings in MapReduce ...

That suggests the Hive configuration and runtime are fine. Look at resources and external factors, e.g. whether memory and HDFS space are sufficient. Also try to confirm whether every conditional SQL statement runs in beeline while failing over Java JDBC, so you can rule things out from multiple angles.

JacksonH10 posted on 2018-4-8 18:25:52
Quoting nextuser (2018-4-8 18:14):
That suggests the Hive configuration and runtime are fine. Look at resources and external factors, e.g. whether ...

Confirmed: every query with a condition, i.e. one that needs MapReduce, fails. The most suspicious part of the error is that a shutdown is executed out of nowhere; I don't know what triggers that call, and it never appears when the query is run from beeline.

2018-04-08 14:59:38,964 INFO  [HiveServer2-Background-Pool: Thread-213]: exec.Task (SessionState.java:printInfo(558)) - Kill Command = /apps/svr/hadoop-2.5.0-cdh5.3.2/bin/hadoop job  -kill job_1523169313235_0006
2018-04-08 14:59:43,488 WARN  [HiveServer2-Handler-Pool: Thread-50]: ql.Driver (DriverContext.java:shutdown(137)) - Shutting down task : Stage-1:MAPRED
2018-04-08 14:59:43,736 WARN  [HiveServer2-Handler-Pool: Thread-50]: transport.TIOStreamTransport (TIOStreamTransport.java:close(112)) - Error closing output stream.
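
For what it's worth, here is one hypothetical way to produce exactly this pattern from the client side (an illustration only, not the actual application code): if anything on the calling side closes the connection or statement while executeQuery() is still waiting for the MapReduce job, HiveServer2 cancels the running operation, which would match both the "Shutting down task : Stage-1:MAPRED" warning and the "Socket closed" error on the handler thread. The 5-second gap between job start (14:59:38) and shutdown (14:59:43) in the log would fit a short client-side timeout.
[mw_shl_code=java,true]
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.TimeoutException;

public class PrematureCloseDemo {
    public static void main(String[] args) throws Exception {
        Class.forName("org.apache.hive.jdbc.HiveDriver");
        // Credentials are assumptions; "apms" is the user seen in the server log.
        Connection conn = DriverManager.getConnection(
                "jdbc:hive2://10.199.201.41:10000/default", "apms", "");
        ExecutorService pool = Executors.newSingleThreadExecutor();
        Future<ResultSet> result = pool.submit(() -> {
            Statement stmt = conn.createStatement();
            // Blocks while the MapReduce job runs on the cluster.
            return stmt.executeQuery(
                    "select * from kylin_account where account_country='FR' LIMIT 64");
        });
        try {
            // A short client-side timeout: MR jobs rarely finish within 5 seconds.
            result.get(5, TimeUnit.SECONDS);
        } catch (TimeoutException e) {
            // Premature cleanup: HiveServer2 cancels the still-running operation.
            conn.close();
        }
        pool.shutdownNow();
    }
}
[/mw_shl_code]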




qcbb001 posted on 2018-4-8 19:53:24
Quoting JacksonH10 (2018-4-8 18:25):
Confirmed: every query with a condition, i.e. one that needs MapReduce, fails. The most suspicious part of the error is that a shutdown is executed out of nowhere ...

The problem isn't on the server but in the JDBC client connecting to Hive, e.g. whether some extra configuration was added in the code.

JacksonH10 posted on 2018-4-9 10:01:25
Quoting qcbb001 (2018-4-8 19:53):
The problem isn't on the server but in the JDBC client connecting to Hive, e.g. whether some extra configuration was added in the code.

But the connection only uses a few required parameters:
[mw_shl_code=java,true]
// url: jdbc:hive2://${IP}:${PORT}
public Connection getConnection() throws SQLException {
    try {
        // Load the Hive JDBC driver class, e.g. org.apache.hive.jdbc.HiveDriver.
        Class.forName(this.driverClassName);
    } catch (Exception e) {
        throw new SqlSessionException(e);
    }
    return DriverManager.getConnection(this.url, this.username, this.password);
}
[/mw_shl_code]

qcbb001 posted on 2018-4-9 13:02:24
Quoting JacksonH10 (2018-4-9 10:01):
But the connection only uses a few required parameters:
[mw_shl_code=java,true]
// url: jdbc:hive2://${IP}:${PORT}
...

Could it be a user-permission problem?

JacksonH10 posted on 2018-4-9 16:34:52
Quoting qcbb001 (2018-4-9 13:02):
Could it be a user-permission problem?

I don't see any permission-related errors in the log, and queries without a condition work fine.