Versions in use: hive-0.13.1-cdh5.3.2
hadoop-2.5.0-cdh5.3.2
HiveServer2 is started with: nohup hive --service hiveserver2 &
A Java application connects to Hive over JDBC; URL: jdbc:hive2://10.199.201.41:10000/default
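For reference, the client code is roughly along these lines — a minimal sketch, not the actual application; the class name and the empty password are placeholders I introduced, while the URL, the user "apms" (as seen in the server log below) and the query are the ones from this post:
[mw_shl_code=java,true]import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class HiveJdbcTest {
    public static void main(String[] args) throws Exception {
        // Standard HiveServer2 JDBC driver from the hive-jdbc jar
        Class.forName("org.apache.hive.jdbc.HiveDriver");

        // Same URL as above; "apms" is the user seen in the server log,
        // the empty password is a placeholder
        Connection conn = DriverManager.getConnection(
                "jdbc:hive2://10.199.201.41:10000/default", "apms", "");
        Statement stmt = conn.createStatement();

        // The exact statement that fails when issued from the application
        ResultSet rs = stmt.executeQuery(
                "select * from kylin_account where account_country='FR' LIMIT 64");
        while (rs.next()) {
            // account_id is column 1 (bigint), account_country is column 4 (string)
            System.out.println(rs.getLong(1) + "\t" + rs.getString(4));
        }

        rs.close();
        stmt.close();
        conn.close();
    }
}
[/mw_shl_code]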
When executing select * from kylin_account where account_country='FR' LIMIT 64, the following error is reported:
[mw_shl_code=bash,true]2018-04-08 14:59:38,298 WARN [HiveServer2-Handler-Pool: Thread-50]: fair.AllocationFileLoaderService (AllocationFileLoaderService.java:getAllocationFile(179)) - fair-scheduler.xml not found on the classpath.
2018-04-08 14:59:38,303 INFO [HiveServer2-Handler-Pool: Thread-50]: session.SessionState (SessionState.java:start(383)) - No Tez session required at this point. hive.execution.engine=mr.
2018-04-08 14:59:38,309 INFO [HiveServer2-Handler-Pool: Thread-50]: session.SessionState (SessionState.java:start(383)) - No Tez session required at this point. hive.execution.engine=mr.
2018-04-08 14:59:38,318 INFO [HiveServer2-Handler-Pool: Thread-50]: parse.SemanticAnalyzer (SemanticAnalyzer.java:analyzeInternal(9353)) - Starting Semantic Analysis
2018-04-08 14:59:38,318 INFO [HiveServer2-Handler-Pool: Thread-50]: parse.SemanticAnalyzer (SemanticAnalyzer.java:analyzeInternal(9388)) - Completed phase 1 of Semantic Analysis
2018-04-08 14:59:38,318 INFO [HiveServer2-Handler-Pool: Thread-50]: parse.SemanticAnalyzer (SemanticAnalyzer.java:getMetaData(1205)) - Get metadata for source tables
2018-04-08 14:59:38,319 WARN [HiveServer2-Handler-Pool: Thread-50]: security.UserGroupInformation (UserGroupInformation.java:getGroupNames(1506)) - No groups available for user apms
2018-04-08 14:59:38,319 WARN [HiveServer2-Handler-Pool: Thread-50]: security.UserGroupInformation (UserGroupInformation.java:getGroupNames(1506)) - No groups available for user apms
2018-04-08 14:59:38,393 INFO [HiveServer2-Handler-Pool: Thread-50]: parse.SemanticAnalyzer (SemanticAnalyzer.java:getMetaData(1346)) - Get metadata for subqueries
2018-04-08 14:59:38,393 INFO [HiveServer2-Handler-Pool: Thread-50]: parse.SemanticAnalyzer (SemanticAnalyzer.java:getMetaData(1370)) - Get metadata for destination tables
2018-04-08 14:59:38,418 INFO [HiveServer2-Handler-Pool: Thread-50]: parse.SemanticAnalyzer (SemanticAnalyzer.java:analyzeInternal(9391)) - Completed getting MetaData in Semantic Analysis
2018-04-08 14:59:38,465 INFO [HiveServer2-Handler-Pool: Thread-50]: parse.SemanticAnalyzer (SemanticAnalyzer.java:genFileSinkPlan(6065)) - Set stats collection dir : hdfs://ebs-spark-wlt-3dxqw.vclound.com:9000/tmp/hive-root/hive_2018-04-08_14-59-38_315_1352087243136786510-3/-mr-10000/.hive-staging_hive_2018-04-08_14-59-38_315_1352087243136786510-3/-ext-10002
2018-04-08 14:59:38,477 INFO [HiveServer2-Handler-Pool: Thread-50]: parse.SemanticAnalyzer (SemanticAnalyzer.java:analyzeInternal(9473)) - Completed plan generation
2018-04-08 14:59:38,478 INFO [HiveServer2-Handler-Pool: Thread-50]: ql.Driver (Driver.java:compile(446)) - Semantic Analysis Completed
2018-04-08 14:59:38,480 INFO [HiveServer2-Handler-Pool: Thread-50]: ql.Driver (Driver.java:getSchema(245)) - Returning Hive schema: Schema(fieldSchemas:[FieldSchema(name:kylin_account.account_id, type:bigint, comment:null), FieldSchema(name:kylin_account.account_buyer_level, type:int, comment:null), FieldSchema(name:kylin_account.account_seller_level, type:int, comment:null), FieldSchema(name:kylin_account.account_country, type:string, comment:null), FieldSchema(name:kylin_account.account_contact, type:string, comment:null)], properties:null)
2018-04-08 14:59:38,483 INFO [HiveServer2-Background-Pool: Thread-213]: ql.Driver (Driver.java:checkConcurrency(165)) - Concurrency mode is disabled, not creating a lock manager
2018-04-08 14:59:38,483 INFO [HiveServer2-Background-Pool: Thread-213]: ql.Driver (Driver.java:execute(1243)) - Starting command: select * from kylin_account where account_country='FR' LIMIT 64
2018-04-08 14:59:38,484 INFO [HiveServer2-Background-Pool: Thread-213]: ql.Driver (SessionState.java:printInfo(558)) - Total jobs = 1
2018-04-08 14:59:38,485 INFO [HiveServer2-Background-Pool: Thread-213]: ql.Driver (SessionState.java:printInfo(558)) - Launching Job 1 out of 1
2018-04-08 14:59:38,486 INFO [HiveServer2-Background-Pool: Thread-213]: exec.Task (SessionState.java:printInfo(558)) - Number of reduce tasks is set to 0 since there's no reduce operator
2018-04-08 14:59:38,508 INFO [HiveServer2-Background-Pool: Thread-213]: mr.ExecDriver (ExecDriver.java:execute(285)) - Using org.apache.hadoop.hive.ql.io.CombineHiveInputFormat
2018-04-08 14:59:38,642 WARN [HiveServer2-Background-Pool: Thread-213]: mapreduce.JobSubmitter (JobSubmitter.java:copyAndConfigureFiles(151)) - Hadoop command-line option parsing not performed. Implement the Tool interface and execute your application with ToolRunner to remedy this.
2018-04-08 14:59:38,963 INFO [HiveServer2-Background-Pool: Thread-213]: exec.Task (SessionState.java:printInfo(558)) - Starting Job = job_1523169313235_0006, Tracking URL = http://ebs-spark-wlt-3dxqw.vclou ... 1523169313235_0006/
2018-04-08 14:59:38,964 INFO [HiveServer2-Background-Pool: Thread-213]: exec.Task (SessionState.java:printInfo(558)) - Kill Command = /apps/svr/hadoop-2.5.0-cdh5.3.2/bin/hadoop job -kill job_1523169313235_0006
2018-04-08 14:59:43,488 WARN [HiveServer2-Handler-Pool: Thread-50]: ql.Driver (DriverContext.java:shutdown(137)) - Shutting down task : Stage-1:MAPRED
2018-04-08 14:59:43,736 WARN [HiveServer2-Handler-Pool: Thread-50]: transport.TIOStreamTransport (TIOStreamTransport.java:close(112)) - Error closing output stream.
java.net.SocketException: Socket closed
at java.net.SocketOutputStream.socketWrite(SocketOutputStream.java:116)
at java.net.SocketOutputStream.write(SocketOutputStream.java:153)
at java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:82)
at java.io.BufferedOutputStream.flush(BufferedOutputStream.java:140)
at java.io.FilterOutputStream.close(FilterOutputStream.java:158)
at org.apache.thrift.transport.TIOStreamTransport.close(TIOStreamTransport.java:110)
at org.apache.thrift.transport.TSocket.close(TSocket.java:194)
at org.apache.thrift.transport.TSaslTransport.close(TSaslTransport.java:381)
at org.apache.thrift.transport.TSaslServerTransport.close(TSaslServerTransport.java:41)
at org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:261)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
2018-04-08 14:59:44,548 INFO [HiveServer2-Background-Pool: Thread-213]: exec.Task (SessionState.java:printInfo(558)) - Hadoop job information for Stage-1: number of mappers: 0; number of reducers: 0
2018-04-08 14:59:44,578 WARN [HiveServer2-Background-Pool: Thread-213]: mapreduce.Counters (AbstractCounters.java:getGroup(234)) - Group org.apache.hadoop.mapred.Task$Counter is deprecated. Use org.apache.hadoop.mapreduce.TaskCounter instead
2018-04-08 14:59:44,579 INFO [HiveServer2-Background-Pool: Thread-213]: exec.Task (SessionState.java:printInfo(558)) - 2018-04-08 14:59:44,578 Stage-1 map = 0%, reduce = 0%
2018-04-08 14:59:44,583 WARN [HiveServer2-Background-Pool: Thread-213]: mapreduce.Counters (AbstractCounters.java:getGroup(234)) - Group org.apache.hadoop.mapred.Task$Counter is deprecated. Use org.apache.hadoop.mapreduce.TaskCounter instead
2018-04-08 14:59:44,587 ERROR [HiveServer2-Background-Pool: Thread-213]: exec.Task (SessionState.java:printError(567)) - Ended Job = job_1523169313235_0006 with errors
2018-04-08 14:59:44,588 ERROR [Thread-199]: exec.Task (SessionState.java:printError(567)) - Error during job, obtaining debugging information...
2018-04-08 14:59:44,595 ERROR [HiveServer2-Background-Pool: Thread-213]: ql.Driver (SessionState.java:printError(567)) - FAILED: Operation cancelled
2018-04-08 14:59:44,595 INFO [HiveServer2-Background-Pool: Thread-213]: ql.Driver (SessionState.java:printInfo(558)) - MapReduce Jobs Launched:
2018-04-08 14:59:44,596 WARN [HiveServer2-Background-Pool: Thread-213]: mapreduce.Counters (AbstractCounters.java:getGroup(234)) - Group FileSystemCounters is deprecated. Use org.apache.hadoop.mapreduce.FileSystemCounter instead
2018-04-08 14:59:44,596 INFO [HiveServer2-Background-Pool: Thread-213]: ql.Driver (SessionState.java:printInfo(558)) - Stage-Stage-1: HDFS Read: 0 HDFS Write: 0 FAIL
2018-04-08 14:59:44,596 INFO [HiveServer2-Background-Pool: Thread-213]: ql.Driver (SessionState.java:printInfo(558)) - Total MapReduce CPU Time Spent: 0 msec
2018-04-08 14:59:44,596 ERROR [HiveServer2-Background-Pool: Thread-213]: operation.Operation (SQLOperation.java:run(203)) - Error running hive query:
org.apache.hive.service.cli.HiveSQLException: Illegal Operation state transition from CLOSED to ERROR
at org.apache.hive.service.cli.OperationState.validateTransition(OperationState.java:91)
at org.apache.hive.service.cli.OperationState.validateTransition(OperationState.java:97)
at org.apache.hive.service.cli.operation.Operation.setState(Operation.java:116)
at org.apache.hive.service.cli.operation.SQLOperation.runInternal(SQLOperation.java:159)
at org.apache.hive.service.cli.operation.SQLOperation.access$000(SQLOperation.java:69)
at org.apache.hive.service.cli.operation.SQLOperation$1$1.run(SQLOperation.java:200)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1642)
at org.apache.hadoop.hive.shims.HadoopShimsSecure.doAs(HadoopShimsSecure.java:502)
at org.apache.hive.service.cli.operation.SQLOperation$1.run(SQLOperation.java:213)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
[/mw_shl_code]
However, the same query runs correctly through both beeline and the hive CLI, which is baffling...
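For comparison, the CLI check was done roughly like this (the exact beeline invocation is my assumption; the URL, user and query are the ones above):
[mw_shl_code=bash,true]beeline -u jdbc:hive2://10.199.201.41:10000/default -n apms \
  -e "select * from kylin_account where account_country='FR' LIMIT 64"
[/mw_shl_code]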