
Hadoop MR or Spark jobs against HBase fail with "class com.google.protobuf.HBaseZeroCopyByteString cannot access its superclass com.google.protobuf.LiteralByteString"

bob007 posted on 2015-8-18 19:16:23
This error shows up when a Hadoop MapReduce or Spark job accesses HBase.
It is a known HBase issue, tracked at https://issues.apache.org/jira/browse/HBASE-10304. HBaseZeroCopyByteString is deliberately placed in the com.google.protobuf package inside the hbase-protocol jar so that it can extend the package-private class LiteralByteString from protobuf-java; if hbase-protocol is missing from the job's classpath (or loaded by a different classloader than protobuf), the JVM's package-private access check fails with the IllegalAccessError below.

Error message:
[mw_shl_code=bash,true]15/08/17 19:28:33 ERROR yarn.ApplicationMaster: User class threw exception: org.apache.hadoop.hbase.DoNotRetryIOException: java.lang.IllegalAccessError: class com.google.protobuf.HBaseZeroCopyByteString cannot access its superclass com.google.protobuf.LiteralByteString
org.apache.hadoop.hbase.DoNotRetryIOException: java.lang.IllegalAccessError: class com.google.protobuf.HBaseZeroCopyByteString cannot access its superclass com.google.protobuf.LiteralByteString
        at org.apache.hadoop.hbase.client.RpcRetryingCaller.translateException(RpcRetryingCaller.java:210)
        at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithRetries(RpcRetryingCaller.java:121)
        at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithRetries(RpcRetryingCaller.java:90)
        at org.apache.hadoop.hbase.client.ClientScanner.nextScanner(ClientScanner.java:264)
        at org.apache.hadoop.hbase.client.ClientScanner.initializeScannerInConstruction(ClientScanner.java:169)
        at org.apache.hadoop.hbase.client.ClientScanner.<init>(ClientScanner.java:164)
        at org.apache.hadoop.hbase.client.ClientScanner.<init>(ClientScanner.java:107)
        at org.apache.hadoop.hbase.client.HTable.getScanner(HTable.java:736)
        at org.apache.hadoop.hbase.client.MetaScanner.metaScan(MetaScanner.java:178)
        at org.apache.hadoop.hbase.client.MetaScanner.metaScan(MetaScanner.java:82)
        at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.isTableAvailable(HConnectionManager.java:962)
        at org.apache.hadoop.hbase.client.HBaseAdmin.isTableAvailable(HBaseAdmin.java:1081)
        at org.apache.hadoop.hbase.client.HBaseAdmin.isTableAvailable(HBaseAdmin.java:1089)
        at com.umeng.dp.yuliang.play.HBaseToES$.main(HBaseToES.scala:28)
        at com.umeng.dp.yuliang.play.HBaseToES.main(HBaseToES.scala)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at org.apache.spark.deploy.yarn.ApplicationMaster$$anon$2.run(ApplicationMaster.scala:483)
Caused by: java.lang.IllegalAccessError: class com.google.protobuf.HBaseZeroCopyByteString cannot access its superclass com.google.protobuf.LiteralByteString
        at java.lang.ClassLoader.defineClass1(Native Method)
        at java.lang.ClassLoader.defineClass(ClassLoader.java:800)
        at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
        at java.net.URLClassLoader.defineClass(URLClassLoader.java:449)
        at java.net.URLClassLoader.access$100(URLClassLoader.java:71)
        at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
        at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
        at java.security.AccessController.doPrivileged(Native Method)
        at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
        at org.apache.hadoop.hbase.protobuf.RequestConverter.buildRegionSpecifier(RequestConverter.java:930)
        at org.apache.hadoop.hbase.protobuf.RequestConverter.buildScanRequest(RequestConverter.java:434)
        at org.apache.hadoop.hbase.client.ScannerCallable.openScanner(ScannerCallable.java:297)
        at org.apache.hadoop.hbase.client.ScannerCallable.call(ScannerCallable.java:157)
        at org.apache.hadoop.hbase.client.ScannerCallable.call(ScannerCallable.java:57)
        at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithRetries(RpcRetryingCaller.java:114)
        ... 18 more [/mw_shl_code]
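To confirm on a cluster node that the two classes really do live in different jars, a quick check such as the following sketch can be used; the lib directory below is the example path used later in this thread, so adjust it to your installation:

[mw_shl_code=bash,true]# Show which jar provides each of the two classes from the IllegalAccessError.
# The directory is an example; point it at your actual HBase lib directory.
cd /home/cluster/apps/hbase/lib
for jar in hbase-protocol-*.jar protobuf-java-*.jar; do
  echo "== $jar =="
  unzip -l "$jar" | grep -E 'HBaseZeroCopyByteString|LiteralByteString'
done[/mw_shl_code]

If hbase-protocol is the only jar that contains HBaseZeroCopyByteString, then whatever classpath the job runs with must include that jar, which is exactly what the solutions below do.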






yuwenge replied on 2015-8-18 19:19:31:



Solution for Hadoop on YARN:
1. Pass the jar at job-submission time:
[mw_shl_code=bash,true]$ export HADOOP_CLASSPATH="/home/cluster/apps/hbase/lib/hbase-protocol-0.98.1-cdh5.1.0.jar"
$ ./hadoop-2.2.0/bin/hadoop --config /home/stack/conf_hadoop/ jar ./hbase/hbase-assembly/target/hbase-0.99.0-SNAPSHOT-job.jar  org.apache.hadoop.hbase.mapreduce.RowCounter usertable[/mw_shl_code]
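As an alternative to hard-coding the jar path, recent HBase releases (0.96+) ship a `hbase mapredcp` command that prints the dependency classpath needed by MapReduce jobs, including hbase-protocol. A minimal sketch, assuming the `hbase` launcher script is on the PATH:

[mw_shl_code=bash,true]# Sketch: let HBase compute its own MapReduce dependency classpath.
# Falls back to the hard-coded example path from this thread if `hbase` is not on the PATH.
if command -v hbase >/dev/null 2>&1; then
  export HADOOP_CLASSPATH="$HADOOP_CLASSPATH:$(hbase mapredcp)"
else
  export HADOOP_CLASSPATH="$HADOOP_CLASSPATH:/home/cluster/apps/hbase/lib/hbase-protocol-0.98.1-cdh5.1.0.jar"
fi[/mw_shl_code]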


2. Add HADOOP_CLASSPATH to the Linux environment
Add the following line to ~/.bashrc, ~/.bash_profile, or /etc/profile so that it is set in every shell:
export HADOOP_CLASSPATH=$HADOOP_CLASSPATH:/home/cluster/apps/hbase/lib/hbase-protocol-0.98.1-cdh5.1.0.jar
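To double-check that the jar actually ends up on the classpath the jobs will see, something like the following can be run (`hadoop classpath` expands HADOOP_CLASSPATH together with the stock entries):

[mw_shl_code=bash,true]# Verify that hbase-protocol is visible on the Hadoop classpath.
hadoop classpath | tr ':' '\n' | grep hbase-protocol[/mw_shl_code]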



Solution for Spark:

1. Pass the jar at job-submission time
Add --conf options for both spark.driver.extraClassPath and spark.executor.extraClassPath, since the class must be resolvable on the driver as well as on every executor:
[mw_shl_code=bash,true]spark-submit --class com.umeng.dp.yuliang.play.HBaseToES --master yarn-cluster --conf "spark.driver.extraClassPath=/home/cluster/apps/hbase/lib/hbase-protocol-0.98.1-cdh5.1.0.jar" --conf "spark.executor.extraClassPath=/home/cluster/apps/hbase/lib/hbase-protocol-0.98.1-cdh5.1.0.jar"   --jars /home/cluster/apps/hbase/lib/hbase-protocol-0.98.1-cdh5.1.0.jar ScalaMR-0.0.1-jar-with-dependencies.jar
[/mw_shl_code]

2. Add the following settings to the $SPARK_HOME/conf/spark-defaults.conf file:
[mw_shl_code=bash,true]spark.driver.extraClassPath /home/cluster/apps/hbase/lib/hbase-protocol-0.98.1-cdh5.1.0.jar
spark.executor.extraClassPath /home/cluster/apps/hbase/lib/hbase-protocol-0.98.1-cdh5.1.0.jar[/mw_shl_code]
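With those defaults in place, the per-job --conf flags are no longer needed, and the submission from option 1 reduces to a sketch like this (same example class, jar, and paths as above):

[mw_shl_code=bash,true]# Sketch: spark-defaults.conf now supplies the extraClassPath settings,
# so only the application class, the jar to ship, and the app jar remain.
spark-submit --class com.umeng.dp.yuliang.play.HBaseToES \
  --master yarn-cluster \
  --jars /home/cluster/apps/hbase/lib/hbase-protocol-0.98.1-cdh5.1.0.jar \
  ScalaMR-0.0.1-jar-with-dependencies.jar[/mw_shl_code]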


Reference: stark_summer

English-language discussion:
https://issues.apache.org/jira/browse/HBASE-10304
