
Failed attempt at testing insert, delete and update with Hive 1.2.1 on Hadoop 2.5.2; any advice from the experts?

linbowei posted on 2016-4-12 10:40:53
I set up Hive 1.2.1 on Hadoop 2.5.2. Without the transaction parameters below configured, Hive works normally.
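
(The exact parameter values aren't preserved in this post. Going by the DbTxnManager in the stack trace further down, they were presumably the standard hive-site.xml settings for enabling Hive transactions, reconstructed here rather than copied verbatim:)

    <!-- hive-site.xml: typical Hive 1.2 ACID/transaction settings (reconstructed, not the poster's exact config) -->
    <property>
      <name>hive.support.concurrency</name>
      <value>true</value>
    </property>
    <property>
      <name>hive.txn.manager</name>
      <value>org.apache.hadoop.hive.ql.lockmgr.DbTxnManager</value>
    </property>
    <property>
      <name>hive.compactor.initiator.on</name>
      <value>true</value>
    </property>
    <property>
      <name>hive.compactor.worker.threads</name>
      <value>1</value>
    </property>
    <property>
      <name>hive.enforce.bucketing</name>
      <value>true</value>
    </property>
    <property>
      <name>hive.exec.dynamic.partition.mode</name>
      <value>nonstrict</value>
    </property>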

After configuring these parameters and entering the hive CLI, the error below appears (it feels as though Hive can no longer reach Hadoop, but I can't find the cause):
[hadoop@hadoop-master bin]$ hive
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/work/hadoop/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/work/spark/lib/spark-assembly-1.5.2-hadoop2.4.0.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/work/hadoop/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/work/spark/lib/spark-assembly-1.5.2-hadoop2.4.0.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
16/04/12 02:39:50 [main]: DEBUG common.LogUtils: Using hive-site.xml found on CLASSPATH at /work/hive/conf/hive-site.xml
16/04/12 02:39:50 [main]: DEBUG session.SessionState: SessionState user: null

Logging initialized using configuration in file:/work/hive/conf/hive-log4j.properties
16/04/12 02:39:50 [main]: INFO SessionState:
Logging initialized using configuration in file:/work/hive/conf/hive-log4j.properties
16/04/12 02:39:50 [main]: DEBUG parse.VariableSubstitution: Substitution is on: hive
16/04/12 02:39:50 [main]: DEBUG lib.MutableMetricsFactory: field org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.UserGroupInformation$UgiMetrics.loginSuccess with annotation @org.apache.hadoop.metrics2.annotation.Metric(value=[Rate of successful kerberos logins and latency (milliseconds)], about=, valueName=Time, type=DEFAULT, always=false, sampleName=Ops)
16/04/12 02:39:50 [main]: DEBUG lib.MutableMetricsFactory: field org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.UserGroupInformation$UgiMetrics.loginFailure with annotation @org.apache.hadoop.metrics2.annotation.Metric(value=[Rate of failed kerberos logins and latency (milliseconds)], about=, valueName=Time, type=DEFAULT, always=false, sampleName=Ops)
16/04/12 02:39:50 [main]: DEBUG lib.MutableMetricsFactory: field org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.UserGroupInformation$UgiMetrics.getGroups with annotation @org.apache.hadoop.metrics2.annotation.Metric(value=[GetGroups], about=, valueName=Time, type=DEFAULT, always=false, sampleName=Ops)
16/04/12 02:39:50 [main]: DEBUG impl.MetricsSystemImpl: UgiMetrics, User and group related metrics
16/04/12 02:39:51 [main]: DEBUG security.Groups:  Creating new Groups object
16/04/12 02:39:51 [main]: DEBUG util.NativeCodeLoader: Trying to load the custom-built native-hadoop library...
16/04/12 02:39:51 [main]: DEBUG util.NativeCodeLoader: Loaded the native-hadoop library
16/04/12 02:39:51 [main]: DEBUG security.JniBasedUnixGroupsMapping: Using JniBasedUnixGroupsMapping for Group resolution
16/04/12 02:39:51 [main]: DEBUG security.JniBasedUnixGroupsMappingWithFallback: Group mapping impl=org.apache.hadoop.security.JniBasedUnixGroupsMapping
16/04/12 02:39:51 [main]: DEBUG security.Groups: Group mapping impl=org.apache.hadoop.security.JniBasedUnixGroupsMappingWithFallback; cacheTimeout=300000; warningDeltaMs=5000
16/04/12 02:39:51 [main]: DEBUG security.UserGroupInformation: hadoop login
16/04/12 02:39:51 [main]: DEBUG security.UserGroupInformation: hadoop login commit
16/04/12 02:39:51 [main]: DEBUG security.UserGroupInformation: using local user:UnixPrincipal: hadoop
16/04/12 02:39:51 [main]: DEBUG security.UserGroupInformation: UGI loginUser:hadoop (auth:SIMPLE)
16/04/12 02:39:51 [main]: INFO hive.metastore: Trying to connect to metastore with URI thrift://192.168.42.128:9083
16/04/12 02:39:51 [main]: DEBUG security.Groups: Returning fetched groups for 'hadoop'
16/04/12 02:39:51 [main]: INFO hive.metastore: Connected to metastore.
16/04/12 02:39:51 [main]: DEBUG : address: hadoop-master/192.168.42.128 isLoopbackAddress: false, with host 192.168.42.128 hadoop-master
16/04/12 02:39:51 [main]: DEBUG hdfs.BlockReaderLocal: dfs.client.use.legacy.blockreader.local = false
16/04/12 02:39:51 [main]: DEBUG hdfs.BlockReaderLocal: dfs.client.read.shortcircuit = false
16/04/12 02:39:51 [main]: DEBUG hdfs.BlockReaderLocal: dfs.client.domain.socket.data.traffic = false
16/04/12 02:39:51 [main]: DEBUG hdfs.BlockReaderLocal: dfs.domain.socket.path =
16/04/12 02:39:51 [main]: DEBUG retry.RetryUtils: multipleLinearRandomRetry = null
16/04/12 02:39:51 [main]: DEBUG ipc.Server: rpcKind=RPC_PROTOCOL_BUFFER, rpcRequestWrapperClass=class org.apache.hadoop.ipc.ProtobufRpcEngine$RpcRequestWrapper, rpcInvoker=org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker@6e51ad67
16/04/12 02:39:51 [main]: DEBUG ipc.Client: getting client out of cache: org.apache.hadoop.ipc.Client@57f1a90a
16/04/12 02:39:51 [Thread-3]: DEBUG unix.DomainSocketWatcher: org.apache.hadoop.net.unix.DomainSocketWatcher$1@7008a429: starting with interruptCheckPeriodMs = 60000
16/04/12 02:39:51 [main]: DEBUG shortcircuit.DomainSocketFactory: Both short-circuit local reads and UNIX domain socket are disabled.
16/04/12 02:39:51 [main]: DEBUG ipc.Client: The ping interval is 60000 ms.
16/04/12 02:39:51 [main]: DEBUG ipc.Client: Connecting to /192.168.42.128:9000
16/04/12 02:39:51 [IPC Client (1528709932) connection to /192.168.42.128:9000 from hadoop]: DEBUG ipc.Client: IPC Client (1528709932) connection to /192.168.42.128:9000 from hadoop: starting, having connections 1
16/04/12 02:39:51 [IPC Parameter Sending Thread #0]: DEBUG ipc.Client: IPC Client (1528709932) connection to /192.168.42.128:9000 from hadoop sending #0
16/04/12 02:39:51 [IPC Client (1528709932) connection to /192.168.42.128:9000 from hadoop]: DEBUG ipc.Client: IPC Client (1528709932) connection to /192.168.42.128:9000 from hadoop got value #0
16/04/12 02:39:51 [main]: DEBUG ipc.ProtobufRpcEngine: Call: getFileInfo took 41ms
16/04/12 02:39:52 [IPC Parameter Sending Thread #0]: DEBUG ipc.Client: IPC Client (1528709932) connection to /192.168.42.128:9000 from hadoop sending #1
16/04/12 02:39:52 [IPC Client (1528709932) connection to /192.168.42.128:9000 from hadoop]: DEBUG ipc.Client: IPC Client (1528709932) connection to /192.168.42.128:9000 from hadoop got value #1
16/04/12 02:39:52 [main]: DEBUG ipc.ProtobufRpcEngine: Call: getFileInfo took 4ms
16/04/12 02:39:52 [main]: DEBUG session.SessionState: HDFS root scratch dir: /tmp/hive with schema null, permission: rwx-wx-wx
16/04/12 02:39:52 [IPC Parameter Sending Thread #0]: DEBUG ipc.Client: IPC Client (1528709932) connection to /192.168.42.128:9000 from hadoop sending #2
16/04/12 02:39:52 [IPC Client (1528709932) connection to /192.168.42.128:9000 from hadoop]: DEBUG ipc.Client: IPC Client (1528709932) connection to /192.168.42.128:9000 from hadoop got value #2
16/04/12 02:39:52 [main]: DEBUG ipc.ProtobufRpcEngine: Call: getFileInfo took 3ms
16/04/12 02:39:52 [main]: DEBUG nativeio.NativeIO: Initialized cache for IDs to User/Group mapping with a  cache timeout of 14400 seconds.
16/04/12 02:39:52 [main]: INFO session.SessionState: Created local directory: /home/hive/iotmp/03900d60-772d-476a-85cb-8193ea0f79b3_resources
16/04/12 02:39:52 [IPC Parameter Sending Thread #0]: DEBUG ipc.Client: IPC Client (1528709932) connection to /192.168.42.128:9000 from hadoop sending #3
16/04/12 02:39:52 [IPC Client (1528709932) connection to /192.168.42.128:9000 from hadoop]: DEBUG ipc.Client: IPC Client (1528709932) connection to /192.168.42.128:9000 from hadoop got value #3
16/04/12 02:39:52 [main]: DEBUG ipc.ProtobufRpcEngine: Call: getFileInfo took 1ms
16/04/12 02:39:52 [main]: DEBUG hdfs.DFSClient: /tmp/hive/hadoop/03900d60-772d-476a-85cb-8193ea0f79b3: masked=rwx------
16/04/12 02:39:52 [IPC Parameter Sending Thread #0]: DEBUG ipc.Client: IPC Client (1528709932) connection to /192.168.42.128:9000 from hadoop sending #4
16/04/12 02:39:52 [IPC Client (1528709932) connection to /192.168.42.128:9000 from hadoop]: DEBUG ipc.Client: IPC Client (1528709932) connection to /192.168.42.128:9000 from hadoop got value #4
16/04/12 02:39:52 [main]: DEBUG ipc.ProtobufRpcEngine: Call: mkdirs took 4ms
16/04/12 02:39:52 [main]: INFO session.SessionState: Created HDFS directory: /tmp/hive/hadoop/03900d60-772d-476a-85cb-8193ea0f79b3
16/04/12 02:39:52 [IPC Parameter Sending Thread #0]: DEBUG ipc.Client: IPC Client (1528709932) connection to /192.168.42.128:9000 from hadoop sending #5
16/04/12 02:39:52 [IPC Client (1528709932) connection to /192.168.42.128:9000 from hadoop]: DEBUG ipc.Client: IPC Client (1528709932) connection to /192.168.42.128:9000 from hadoop got value #5
16/04/12 02:39:52 [main]: DEBUG ipc.ProtobufRpcEngine: Call: getFileInfo took 2ms
16/04/12 02:39:52 [main]: INFO session.SessionState: Created local directory: /home/hive/iotmp/hive/03900d60-772d-476a-85cb-8193ea0f79b3
16/04/12 02:39:52 [IPC Parameter Sending Thread #0]: DEBUG ipc.Client: IPC Client (1528709932) connection to /192.168.42.128:9000 from hadoop sending #6
16/04/12 02:39:52 [IPC Client (1528709932) connection to /192.168.42.128:9000 from hadoop]: DEBUG ipc.Client: IPC Client (1528709932) connection to /192.168.42.128:9000 from hadoop got value #6
16/04/12 02:39:52 [main]: DEBUG ipc.ProtobufRpcEngine: Call: getFileInfo took 2ms
16/04/12 02:39:52 [main]: DEBUG hdfs.DFSClient: /tmp/hive/hadoop/03900d60-772d-476a-85cb-8193ea0f79b3/_tmp_space.db: masked=rwx------
16/04/12 02:39:52 [IPC Parameter Sending Thread #0]: DEBUG ipc.Client: IPC Client (1528709932) connection to /192.168.42.128:9000 from hadoop sending #7
16/04/12 02:39:52 [IPC Client (1528709932) connection to /192.168.42.128:9000 from hadoop]: DEBUG ipc.Client: IPC Client (1528709932) connection to /192.168.42.128:9000 from hadoop got value #7
16/04/12 02:39:52 [main]: DEBUG ipc.ProtobufRpcEngine: Call: mkdirs took 4ms
16/04/12 02:39:52 [main]: INFO session.SessionState: Created HDFS directory: /tmp/hive/hadoop/03900d60-772d-476a-85cb-8193ea0f79b3/_tmp_space.db
16/04/12 02:39:52 [IPC Parameter Sending Thread #0]: DEBUG ipc.Client: IPC Client (1528709932) connection to /192.168.42.128:9000 from hadoop sending #8
16/04/12 02:39:52 [IPC Client (1528709932) connection to /192.168.42.128:9000 from hadoop]: DEBUG ipc.Client: IPC Client (1528709932) connection to /192.168.42.128:9000 from hadoop got value #8
16/04/12 02:39:52 [main]: DEBUG ipc.ProtobufRpcEngine: Call: getFileInfo took 4ms
16/04/12 02:39:52 [main]: DEBUG CliDriver: CliDriver inited with classpath /work/hadoop/etc/hadoop:/work/hadoop/share/hadoop/common/lib/commons-compress-1.4.1.jar:/work/hadoop/share/hadoop/common/lib/commons-lang-2.6.jar:/work/hadoop/share/hadoop/common/lib/commons-math3-3.1.1.jar:/work/hadoop/share/hadoop/common/lib/jasper-runtime-5.5.23.jar:/work/hadoop/share/hadoop/common/lib/commons-beanutils-1.7.0.jar:/work/hadoop/share/hadoop/common/lib/commons-io-2.4.jar:/work/hadoop/share/hadoop/common/lib/commons-configuration-1.6.jar:/work/hadoop/share/hadoop/common/lib/commons-logging-1.1.3.jar:/work/hadoop/share/hadoop/common/lib/xz-1.0.jar:/work/hadoop/share/hadoop/common/lib/jersey-json-1.9.jar:/work/hadoop/share/hadoop/common/lib/paranamer-2.3.jar:/work/hadoop/share/hadoop/common/lib/apacheds-i18n-2.0.0-M15.jar:/work/hadoop/share/hadoop/common/lib/jackson-xc-1.9.13.jar:/work/hadoop/share/hadoop/common/lib/commons-httpclient-3.1.jar:/work/hadoop/share/hadoop/common/lib/asm-3.2.jar:/work/hadoop/share/hadoop/common/lib/jetty-util-6.1.26.jar:/work/hadoop/share/hadoop/common/lib/commons-codec-1.4.jar:/work/hadoop/share/hadoop/common/lib/zookeeper-3.4.6.jar:/work/hadoop/share/hadoop/common/lib/jersey-core-1.9.jar:/work/hadoop/share/hadoop/common/lib/jaxb-api-2.2.2.jar:/work/hadoop/share/hadoop/common/lib/commons-beanutils-core-1.8.0.jar:/work/hadoop/share/hadoop/common/lib/api-asn1-api-1.0.0-M20.jar:/work/hadoop/share/hadoop/common/lib/hadoop-annotations-2.5.2.jar:/work/hadoop/share/hadoop/common/lib/servlet-api-2.5.jar:/work/hadoop/share/hadoop/common/lib/commons-el-1.0.jar:/work/hadoop/share/hadoop/common/lib/commons-collections-3.2.1.jar:/work/hadoop/share/hadoop/common/lib/mockito-all-1.8.5.jar:/work/hadoop/share/hadoop/common/lib/hamcrest-core-1.3.jar:/work/hadoop/share/hadoop/common/lib/jsch-0.1.42.jar:/work/hadoop/share/hadoop/common/lib/activation-1.1.jar:/work/hadoop/share/hadoop/common/lib/commons-net-3.1.jar:/work/hadoop/share/hadoop/common/lib/jettison-1.1.jar:/work/hadoop/share/hadoop/common/lib/httpcore-4.2.5.jar:/work/hadoop/share/hadoop/common/lib/slf4j-api-1.7.5.jar:/work/hadoop/share/hadoop/common/lib/log4j-1.2.17.jar:/work/hadoop/share/hadoop/common/lib/guava-11.0.2.jar:/work/hadoop/share/hadoop/common/lib/jersey-server-1.9.jar:/work/hadoop/share/hadoop/common/lib/jasper-compiler-5.5.23.jar:/work/hadoop/share/hadoop/common/lib/jackson-jaxrs-1.9.13.jar:/work/hadoop/share/hadoop/common/lib/jaxb-impl-2.2.3-1.jar:/work/hadoop/share/hadoop/common/lib/commons-cli-1.2.jar:/work/hadoop/share/hadoop/common/lib/netty-3.6.2.Final.jar:/work/hadoop/share/hadoop/common/lib/jetty-6.1.26.jar:/work/hadoop/share/hadoop/common/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/work/hadoop/share/hadoop/common/lib/jets3t-0.9.0.jar:/work/hadoop/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar:/work/hadoop/share/hadoop/common/lib/jackson-core-asl-1.9.13.jar:/work/hadoop/share/hadoop/common/lib/java-xmlbuilder-0.4.jar:/work/hadoop/share/hadoop/common/lib/api-util-1.0.0-M20.jar:/work/hadoop/share/hadoop/common/lib/snappy-java-1.0.4.1.jar:/work/hadoop/share/hadoop/common/lib/jsp-api-2.1.jar:/work/hadoop/share/hadoop/common/lib/avro-1.7.4.jar:/work/hadoop/share/hadoop/common/lib/jsr305-1.3.9.jar:/work/hadoop/share/hadoop/common/lib/commons-digester-1.8.jar:/work/hadoop/share/hadoop/common/lib/xmlenc-0.52.jar:/work/hadoop/share/hadoop/common/lib/protobuf-java-2.5.0.jar:/work/hadoop/share/hadoop/common/lib/junit-4.11.jar:/work/hadoop/share/hadoop/common/lib/hadoop-auth-2.5.2.jar:/work/hadoop/share/hadoop/common/lib/httpclient-4.2.5.jar:/work/hadoop/share/hadoop/common/lib/stax-api-1.0-2.jar:/work/hadoop/share/hadoop/common/lib/jackson-mapper-asl-1.9.13.jar:/work/hadoop/share/hadoop/common/hadoop-common-2.5.2-tests.jar:/work/hadoop/share/hadoop/common/hadoop-nfs-2.5.2.jar:/work/hadoop/share/hadoop/common/hadoop-common-2.5.2.jar:/work/hadoop/share/hadoop/hdfs:/work/hadoop/share/hadoop/hdfs/lib/commons-lang-2.6.jar:/work/hadoop/share/hadoop/hdfs/lib/jasper-runtime-5.5.23.jar:/work/hadoop/share/hadoop/hdfs/lib/commons-io-2.4.jar:/work/hadoop/share/hadoop/hdfs/lib/commons-logging-1.1.3.jar:/work/hadoop/share/hadoop/hdfs/lib/asm-3.2.jar:/work/hadoop/share/hadoop/hdfs/lib/jetty-util-6.1.26.jar:/work/hadoop/share/hadoop/hdfs/lib/commons-codec-1.4.jar:/work/hadoop/share/hadoop/hdfs/lib/jersey-core-1.9.jar:/work/hadoop/share/hadoop/hdfs/lib/servlet-api-2.5.jar:/work/hadoop/share/hadoop/hdfs/lib/commons-el-1.0.jar:/work/hadoop/share/hadoop/hdfs/lib/log4j-1.2.17.jar:/work/hadoop/share/hadoop/hdfs/lib/guava-11.0.2.jar:/work/hadoop/share/hadoop/hdfs/lib/jersey-server-1.9.jar:/work/hadoop/share/hadoop/hdfs/lib/commons-cli-1.2.jar:/work/hadoop/share/hadoop/hdfs/lib/netty-3.6.2.Final.jar:/work/hadoop/share/hadoop/hdfs/lib/jetty-6.1.26.jar:/work/hadoop/share/hadoop/hdfs/lib/commons-daemon-1.0.13.jar:/work/hadoop/share/hadoop/hdfs/lib/jackson-core-asl-1.9.13.jar:/work/hadoop/share/hadoop/hdfs/lib/jsp-api-2.1.jar:/work/hadoop/share/hadoop/hdfs/lib/jsr305-1.3.9.jar:/work/hadoop/share/hadoop/hdfs/lib/xmlenc-0.52.jar:/work/hadoop/share/hadoop/hdfs/lib/protobuf-java-2.5.0.jar:/work/hadoop/share/hadoop/hdfs/lib/jackson-mapper-asl-1.9.13.jar:/work/hadoop/share/hadoop/hdfs/hadoop-hdfs-2.5.2.jar:/work/hadoop/share/hadoop/hdfs/hadoop-hdfs-nfs-2.5.2.jar:/work/hadoop/share/hadoop/hdfs/hadoop-hdfs-2.5.2-tests.jar:/work/hadoop/share/hadoop/yarn/lib/commons-compress-1.4.1.jar:/work/hadoop/share/hadoop/yarn/lib/commons-lang-2.6.jar:/work/hadoop/share/hadoop/yarn/lib/guice-servlet-3.0.jar:/work/hadoop/share/hadoop/yarn/lib/commons-io-2.4.jar:/work/hadoop/share/hadoop/yarn/lib/commons-logging-1.1.3.jar:/work/hadoop/share/hadoop/yarn/lib/xz-1.0.jar:/work/hadoop/share/hadoop/yarn/lib/jersey-json-1.9.jar:/work/hadoop/share/hadoop/yarn/lib/jackson-xc-1.9.13.jar:/work/hadoop/share/hadoop/yarn/lib/commons-httpclient-3.1.jar:/work/hadoop/share/hadoop/yarn/lib/asm-3.2.jar:/work/hadoop/share/hadoop/yarn/lib/zookeeper-3.4.8.jar:/work/hadoop/share/hadoop/yarn/lib/jetty-util-6.1.26.jar:/work/hadoop/share/hadoop/yarn/lib/jline-2.12.jar:/work/hadoop/share/hadoop/yarn/lib/commons-codec-1.4.jar:/work/hadoop/share/hadoop/yarn/lib/guice-3.0.jar:/work/hadoop/share/hadoop/yarn/lib/jersey-core-1.9.jar:/work/hadoop/share/hadoop/yarn/lib/leveldbjni-all-1.8.jar:/work/hadoop/share/hadoop/yarn/lib/jaxb-api-2.2.2.jar:/work/hadoop/share/hadoop/yarn/lib/aopalliance-1.0.jar:/work/hadoop/share/hadoop/yarn/lib/servlet-api-2.5.jar:/work/hadoop/share/hadoop/yarn/lib/commons-collections-3.2.1.jar:/work/hadoop/share/hadoop/yarn/lib/activation-1.1.jar:/work/hadoop/share/hadoop/yarn/lib/jersey-guice-1.9.jar:/work/hadoop/share/hadoop/yarn/lib/jettison-1.1.jar:/work/hadoop/share/hadoop/yarn/lib/log4j-1.2.17.jar:/work/hadoop/share/hadoop/yarn/lib/guava-11.0.2.jar:/work/hadoop/share/hadoop/yarn/lib/jersey-server-1.9.jar:/work/hadoop/share/hadoop/yarn/lib/jackson-jaxrs-1.9.13.jar:/work/hadoop/share/hadoop/yarn/lib/jaxb-impl-2.2.3-1.jar:/work/hadoop/share/hadoop/yarn/lib/commons-cli-1.2.jar:/work/hadoop/share/hadoop/yarn/lib/netty-3.6.2.Final.jar:/work/hadoop/share/hadoop/yarn/lib/jetty-6.1.26.jar:/work/hadoop/share/hadoop/yarn/lib/jackson-core-asl-1.9.13.jar:/work/hadoop/share/hadoop/yarn/lib/javax.inject-1.jar:/work/hadoop/share/hadoop/yarn/lib/jsr305-1.3.9.jar:/work/hadoop/share/hadoop/yarn/lib/jersey-client-1.9.jar:/work/hadoop/share/hadoop/yarn/lib/protobuf-java-2.5.0.jar:/work/hadoop/share/hadoop/yarn/lib/stax-api-1.0-2.jar:/work/hadoop/share/hadoop/yarn/lib/jackson-mapper-asl-1.9.13.jar:/work/hadoop/share/hadoop/yarn/hadoop-yarn-applications-unmanaged-am-launcher-2.5.2.jar:/work/hadoop/share/hadoop/yarn/hadoop-yarn-server-resourcemanager-2.5.2.jar:/work/hadoop/share/hadoop/yarn/hadoop-yarn-server-applicationhistoryservice-2.5.2.jar:/work/hadoop/share/hadoop/yarn/hadoop-yarn-server-web-proxy-2.5.2.jar:/work/hadoop/share/hadoop/yarn/hadoop-yarn-server-nodemanager-2.5.2.jar:/work/hadoop/share/hadoop/yarn/hadoop-yarn-server-common-2.5.2.jar:/work/hadoop/share/hadoop/yarn/hadoop-yarn-applications-distributedshell-2.5.2.jar:/work/hadoop/share/hadoop/yarn/hadoop-yarn-common-2.5.2.jar:/work/hadoop/share/hadoop/yarn/hadoop-yarn-api-2.5.2.jar:/work/hadoop/share/hadoop/yarn/hadoop-yarn-server-tests-2.5.2.jar:/work/hadoop/share/hadoop/yarn/hadoop-yarn-client-2.5.2.jar:/work/hadoop/share/hadoop/mapreduce/lib/commons-compress-1.4.1.jar:/work/hadoop/share/hadoop/mapreduce/lib/guice-servlet-3.0.jar:/work/hadoop/share/hadoop/mapreduce/lib/commons-io-2.4.jar:/work/hadoop/share/hadoop/mapreduce/lib/xz-1.0.jar:/work/hadoop/share/hadoop/mapreduce/lib/paranamer-2.3.jar:/work/hadoop/share/hadoop/mapreduce/lib/asm-3.2.jar:/work/hadoop/share/hadoop/mapreduce/lib/guice-3.0.jar:/work/hadoop/share/hadoop/mapreduce/lib/jersey-core-1.9.jar:/work/hadoop/share/hadoop/mapreduce/lib/leveldbjni-all-1.8.jar:/work/hadoop/share/hadoop/mapreduce/lib/aopalliance-1.0.jar:/work/hadoop/share/hadoop/mapreduce/lib/hadoop-annotations-2.5.2.jar:/work/hadoop/share/hadoop/mapreduce/lib/hamcrest-core-1.3.jar:/work/hadoop/share/hadoop/mapreduce/lib/jersey-guice-1.9.jar:/work/hadoop/share/hadoop/mapreduce/lib/log4j-1.2.17.jar:/work/hadoop/share/hadoop/mapreduce/lib/jersey-server-1.9.jar:/work/hadoop/share/hadoop/mapreduce/lib/netty-3.6.2.Final.jar:/work/hadoop/share/hadoop/mapreduce/lib/jackson-core-asl-1.9.13.jar:/work/hadoop/share/hadoop/mapreduce/lib/snappy-java-1.0.4.1.jar:/work/hadoop/share/hadoop/mapreduce/lib/javax.inject-1.jar:/work/hadoop/share/hadoop/mapreduce/lib/avro-1.7.4.jar:/work/hadoop/share/hadoop/mapreduce/lib/protobuf-java-2.5.0.jar:/work/hadoop/share/hadoop/mapreduce/lib/junit-4.11.jar:/work/hadoop/share/hadoop/mapreduce/lib/jackson-mapper-asl-1.9.13.jar:/work/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-jobclient-2.5.2-tests.jar:/work/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-common-2.5.2.jar:/work/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-core-2.5.2.jar:/work/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-app-2.5.2.jar:/work/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-jobclient-2.5.2.jar:/work/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-hs-plugins-2.5.2.jar:/work/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-hs-2.5.2.jar:/work/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-examples-2.5.2.jar:/work/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-shuffle-2.5.2.jar:/work/hive/conf:/work/hive/lib/accumulo-core-1.6.0.jar:/work/hive/lib/accumulo-fate-1.6.0.jar:/work/hive/lib/accumulo-start-1.6.0.jar:/work/hive/lib/accumulo-trace-1.6.0.jar:/work/hive/lib/activation-1.1.jar:/work/hive/lib/ant-1.9.1.jar:/work/hive/lib/ant-launcher-1.9.1.jar:/work/hive/lib/antlr-2.7.7.jar:/work/hive/lib/antlr-runtime-3.4.jar:/work/hive/lib/apache-log4j-extras-1.2.17.jar:/work/hive/lib/asm-commons-3.1.jar:/work/hive/lib/asm-tree-3.1.jar:/work/hive/lib/avro-1.7.5.jar:/work/hive/lib/bonecp-0.8.0.RELEASE.jar:/work/hive/lib/calcite-avatica-1.2.0-incubating.jar:/work/hive/lib/calcite-core-1.2.0-incubating.jar:/work/hive/lib/calcite-linq4j-1.2.0-incubating.jar:/work/hive/lib/commons-beanutils-1.7.0.jar:/work/hive/lib/commons-beanutils-core-1.8.0.jar:/work/hive/lib/commons-cli-1.2.jar:/work/hive/lib/commons-codec-1.4.jar:/work/hive/lib/commons-collections-3.2.1.jar:/work/hive/lib/commons-compiler-2.7.6.jar:/work/hive/lib/commons-compress-1.4.1.jar:/work/hive/lib/commons-configuration-1.6.jar:/work/hive/lib/commons-dbcp-1.4.jar:/work/hive/lib/commons-digester-1.8.jar:/work/hive/lib/commons-httpclient-3.0.1.jar:/work/hive/lib/commons-io-2.4.jar:/work/hive/lib/commons-lang-2.6.jar:/work/hive/lib/commons-logging-1.1.3.jar:/work/hive/lib/commons-math-2.1.jar:/work/hive/lib/commons-pool-1.5.4.jar:/work/hive/lib/commons-vfs2-2.0.jar:/work/hive/lib/curator-client-2.6.0.jar:/work/hive/lib/curator-framework-2.6.0.jar:/work/hive/lib/curator-recipes-2.6.0.jar:/work/hive/lib/datanucleus-api-jdo-3.2.6.jar:/work/hive/lib/datanucleus-core-3.2.10.jar:/work/hive/lib/datanucleus-rdbms-3.2.9.jar:/work/hive/lib/derby-10.10.2.0.jar:/work/hive/lib/eigenbase-properties-1.1.5.jar:/work/hive/lib/geronimo-annotation_1.0_spec-1.1.1.jar:/work/hive/lib/geronimo-jaspic_1.0_spec-1.0.jar:/work/hive/lib/geronimo-jta_1.1_spec-1.1.1.jar:/work/hive/lib/groovy-all-2.1.6.jar:/work/hive/lib/guava-14.0.1.jar:/work/hive/lib/hamcrest-core-1.1.jar:/work/hive/lib/hive-accumulo-handler-1.2.1.jar:/work/hive/lib/hive-ant-1.2.1.jar:/work/hive/lib/hive-beeline-1.2.1.jar:/work/hive/lib/hive-cli-1.2.1.jar:/work/hive/lib/hive-common-1.2.1.jar:/work/hive/lib/hive-contrib-1.2.1.jar:/work/hive/lib/hive-exec-1.2.1.jar:/work/hive/lib/hive-hbase-handler-1.2.1.jar:/work/hive/lib/hive-hwi-1.2.1.jar:/work/hive/lib/hive-jdbc-1.2.1.jar:/work/hive/lib/hive-jdbc-1.2.1-standalone.jar:/work/hive/lib/hive-metastore-1.2.1.jar:/work/hive/lib/hive-serde-1.2.1.jar:/work/hive/lib/hive-service-1.2.1.jar:/work/hive/lib/hive-shims-0.20S-1.2.1.jar:/work/hive/lib/hive-shims-0.23-1.2.1.jar:/work/hive/lib/hive-shims-1.2.1.jar:/work/hive/lib/hive-shims-common-1.2.1.jar:/work/hive/lib/hive-shims-scheduler-1.2.1.jar:/work/hive/lib/hive-testutils-1.2.1.jar:/work/hive/lib/httpclient-4.4.jar:/work/hive/lib/httpcore-4.4.jar:/work/hive/lib/ivy-2.4.0.jar:/work/hive/lib/janino-2.7.6.jar:/work/hive/lib/jcommander-1.32.jar:/work/hive/lib/jdo-api-3.0.1.jar:/work/hive/lib/jetty-all-7.6.0.v20120127.jar:/work/hive/lib/jetty-all-server-7.6.0.v20120127.jar:/work/hive/lib/jline-2.12.jar:/work/hive/lib/joda-time-2.5.jar:/work/hive/lib/jpam-1.1.jar:/work/hive/lib/json-20090211.jar:/work/hive/lib/jsr305-3.0.0.jar:/work/hive/lib/jta-1.1.jar:/work/hive/lib/junit-4.11.jar:/work/hive/lib/libfb303-0.9.2.jar:/work/hive/lib/libthrift-0.9.2.jar:/work/hive/lib/log4j-1.2.16.jar:/work/hive/lib/mail-1.4.1.jar:/work/hive/lib/maven-scm-api-1.4.jar:/work/hive/lib/maven-scm-provider-svn-commons-1.4.jar:/work/hive/lib/maven-scm-provider-svnexe-1.4.jar:/work/hive/lib/mysql-connector-java-5.1.21.jar:/work/hive/lib/netty-3.7.0.Final.jar:/work/hive/lib/opencsv-2.3.jar:/work/hive/lib/oro-2.0.8.jar:/work/hive/lib/paranamer-2.3.jar:/work/hive/lib/parquet-hadoop-bundle-1.6.0.jar:/work/hive/lib/pentaho-aggdesigner-algorithm-5.1.5-jhyde.jar:/work/hive/lib/plexus-utils-1.5.6.jar:/work/hive/lib/regexp-1.3.jar:/work/hive/lib/servlet-api-2.5.jar:/work/hive/lib/snappy-java-1.0.5.jar:/work/hive/lib/ST4-4.0.4.jar:/work/hive/lib/stax-api-1.0.1.jar:/work/hive/lib/stringtemplate-3.2.1.jar:/work/hive/lib/super-csv-2.2.0.jar:/work/hive/lib/tempus-fugit-1.1.jar:/work/hive/lib/velocity-1.5.jar:/work/hive/lib/xz-1.0.jar:/work/hive/lib/zookeeper-3.4.8.jar:/work/spark/lib/spark-assembly-1.5.2-hadoop2.4.0.jar::/work/hadoop/contrib/capacity-scheduler/*.jar
hive> 16/04/12 02:40:02 [IPC Client (1528709932) connection to /192.168.42.128:9000 from hadoop]: DEBUG ipc.Client: IPC Client (1528709932) connection to /192.168.42.128:9000 from hadoop: closed
16/04/12 02:40:02 [IPC Client (1528709932) connection to /192.168.42.128:9000 from hadoop]: DEBUG ipc.Client: IPC Client (1528709932) connection to /192.168.42.128:9000 from hadoop: stopped, remaining connections 0

    >
    > show databases;
16/04/12 02:40:15 [main]: INFO log.PerfLogger: <PERFLOG method=Driver.run from=org.apache.hadoop.hive.ql.Driver>
16/04/12 02:40:15 [main]: INFO log.PerfLogger: <PERFLOG method=TimeToSubmit from=org.apache.hadoop.hive.ql.Driver>
16/04/12 02:40:15 [main]: INFO log.PerfLogger: <PERFLOG method=compile from=org.apache.hadoop.hive.ql.Driver>
16/04/12 02:40:16 [main]: DEBUG parse.VariableSubstitution: Substitution is on: show databases
16/04/12 02:40:16 [main]: INFO log.PerfLogger: <PERFLOG method=parse from=org.apache.hadoop.hive.ql.Driver>
16/04/12 02:40:16 [main]: INFO parse.ParseDriver: Parsing command: show databases
16/04/12 02:40:16 [main]: INFO parse.ParseDriver: Parse Completed
16/04/12 02:40:16 [main]: INFO log.PerfLogger: </PERFLOG method=parse start=1460428816022 end=1460428816523 duration=501 from=org.apache.hadoop.hive.ql.Driver>
16/04/12 02:40:16 [main]: WARN metastore.RetryingMetaStoreClient: MetaStoreClient lost connection. Attempting to reconnect.
org.apache.thrift.TApplicationException: Internal error processing get_open_txns
    at org.apache.thrift.TApplicationException.read(TApplicationException.java:111)
    at org.apache.thrift.TServiceClient.receiveBase(TServiceClient.java:71)
    at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.recv_get_open_txns(ThriftHiveMetastore.java:3789)
    at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.get_open_txns(ThriftHiveMetastore.java:3777)
    at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.getValidTxns(HiveMetaStoreClient.java:1823)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.invoke(RetryingMetaStoreClient.java:156)
    at com.sun.proxy.$Proxy5.getValidTxns(Unknown Source)
    at org.apache.hadoop.hive.ql.lockmgr.DbTxnManager.getValidTxns(DbTxnManager.java:314)
    at org.apache.hadoop.hive.ql.Driver.recordValidTxns(Driver.java:936)
    at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:406)
    at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:308)
    at org.apache.hadoop.hive.ql.Driver.compileInternal(Driver.java:1122)
    at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1170)
    at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1059)
    at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1049)
    at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:213)
    at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:165)
    at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:376)
    at org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:736)
    at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:681)
    at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:621)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at org.apache.hadoop.util.RunJar.main(RunJar.java:212)
16/04/12 02:40:17 [main]: INFO hive.metastore: Trying to connect to metastore with URI thrift://192.168.42.128:9083
16/04/12 02:40:17 [main]: DEBUG security.Groups: Returning cached groups for 'hadoop'
16/04/12 02:40:17 [main]: INFO hive.metastore: Connected to metastore.
FAILED: LockException [Error 10280]: Error communicating with the metastore
16/04/12 02:40:17 [main]: ERROR ql.Driver: FAILED: LockException [Error 10280]: Error communicating with the metastore
org.apache.hadoop.hive.ql.lockmgr.LockException: Error communicating with the metastore
    at org.apache.hadoop.hive.ql.lockmgr.DbTxnManager.getValidTxns(DbTxnManager.java:316)
    at org.apache.hadoop.hive.ql.Driver.recordValidTxns(Driver.java:936)
    at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:406)
    at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:308)
    at org.apache.hadoop.hive.ql.Driver.compileInternal(Driver.java:1122)
    at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1170)
    at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1059)
    at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1049)
    at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:213)
    at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:165)
    at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:376)
    at org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:736)
    at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:681)
    at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:621)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at org.apache.hadoop.util.RunJar.main(RunJar.java:212)
Caused by: org.apache.thrift.TApplicationException: Internal error processing get_open_txns
    at org.apache.thrift.TApplicationException.read(TApplicationException.java:111)
    at org.apache.thrift.TServiceClient.receiveBase(TServiceClient.java:71)
    at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.recv_get_open_txns(ThriftHiveMetastore.java:3789)
    at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.get_open_txns(ThriftHiveMetastore.java:3777)
    at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.getValidTxns(HiveMetaStoreClient.java:1823)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.invoke(RetryingMetaStoreClient.java:156)
    at com.sun.proxy.$Proxy5.getValidTxns(Unknown Source)
    at org.apache.hadoop.hive.ql.lockmgr.DbTxnManager.getValidTxns(DbTxnManager.java:314)
    ... 18 more

16/04/12 02:40:17 [main]: INFO log.PerfLogger: </PERFLOG method=compile start=1460428815993 end=1460428817582 duration=1589 from=org.apache.hadoop.hive.ql.Driver>
16/04/12 02:40:17 [main]: INFO metadata.Hive: Dumping metastore api call timing information for : compilation phase
16/04/12 02:40:17 [main]: DEBUG metadata.Hive: Total time spent in each metastore function (ms): {isCompatibleWith_(HiveConf, )=0, getAllDatabases_()=15, getFunctions_(String, String, )=41}
16/04/12 02:40:17 [main]: INFO log.PerfLogger: <PERFLOG method=releaseLocks from=org.apache.hadoop.hive.ql.Driver>
16/04/12 02:40:17 [main]: INFO log.PerfLogger: </PERFLOG method=releaseLocks start=1460428817583 end=1460428817583 duration=0 from=org.apache.hadoop.hive.ql.Driver>
16/04/12 02:40:17 [main]: INFO log.PerfLogger: <PERFLOG method=releaseLocks from=org.apache.hadoop.hive.ql.Driver>
16/04/12 02:40:17 [main]: INFO log.PerfLogger: </PERFLOG method=releaseLocks start=1460428817588 end=1460428817588 duration=0 from=org.apache.hadoop.hive.ql.Driver>
hive>
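
(For reference: in Hive 1.2.1, update and delete only work on bucketed ORC tables created with 'transactional'='true', and the metastore service itself must be running with the same transaction settings; an "Internal error processing get_open_txns" on the server side often points to a metastore whose backing database is missing the transaction tables. A minimal sketch of the kind of test table this needs, with illustrative names:)

    -- illustrative table name; assumes the ACID settings above are active on both the CLI and the metastore service
    CREATE TABLE acid_test (id INT, name STRING)
    CLUSTERED BY (id) INTO 2 BUCKETS
    STORED AS ORC
    TBLPROPERTIES ('transactional'='true');

    INSERT INTO TABLE acid_test VALUES (1, 'a');
    UPDATE acid_test SET name = 'b' WHERE id = 1;
    DELETE FROM acid_test WHERE id = 1;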



Replies (5)

wscl1213 posted on 2016-4-12 11:11:59
These features exist in Hive 0.13, and I believe Hive 0.14 has them too.
As for Hive 1.2.1, I'm not sure. Judging from your error, Hive 1.2.1 may not support them yet.

linbowei posted on 2016-4-12 11:29:57
wscl1213 posted on 2016-4-12 11:11
These features exist in Hive 0.13, and I believe Hive 0.14 has them too.
As for Hive 1.2.1, I'm not sure. Judging from your error, maybe Hive 1 ...

Now I'm confused. Hive 1.2.1 is a higher version than Hive 0.14; surely these features weren't removed just because 1.2.1 is a stable release?

leletuo2012 posted on 2016-4-13 09:18:51
At a glance, it can't connect to the metastore database. The parameters vary from one Hive version to the next, which is a real headache.

linbowei posted on 2016-4-13 15:28:05
leletuo2012 posted on 2016-4-13 09:18
At a glance, it can't connect to the metastore database. The parameters vary from one Hive version to the next, which is a real headache.

It really is hard to get right. If delete and update don't work, having to rerun the data would be a serious pain.

wscl1213 posted on 2016-4-13 20:11:54
linbowei posted on 2016-4-13 15:28
It really is hard to get right. If delete and update don't work, having to rerun the data would be a serious pain.

Hive doesn't really support this; it's a data warehouse.
As a workaround, you could integrate Hive with HBase and do the deletes on the HBase side (see the sketch below).
I haven't tried it myself, though. If your environment allows it, it's worth a try.
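
A rough sketch of that kind of integration, with made-up table and column names for illustration:

    -- Hive table backed by HBase through the HBase storage handler (names are illustrative)
    CREATE TABLE hbase_orders (rowkey STRING, amount DOUBLE)
    STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler'
    WITH SERDEPROPERTIES ('hbase.columns.mapping' = ':key,cf:amount')
    TBLPROPERTIES ('hbase.table.name' = 'orders');

    -- rows are then removed on the HBase side, e.g. from the hbase shell:
    --   deleteall 'orders', 'some_rowkey'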
