
Spark 2.3: java.lang.NoSuchMethodError: io.netty.buffer.PooledByteBufAllocator.metric()

[mw_shl_code=scala,true]import org.apache.spark.sql.SparkSession

object yfgspark01 {

  def main(args: Array[String]): Unit = {
    // Local log file to scan, and the string whose occurrences we want to count
    val logfile = "D:\\1.txt"
    val fstring = "HANDLING MCE MEMORY ERROR"

    // Build a local SparkSession; the error below is thrown inside getOrCreate()
    val spark = SparkSession.builder.master("local").appName("test1").getOrCreate()

    // Count the lines that contain the target string
    val logdata = spark.read.textFile(logfile).cache()
    val errnums = logdata.filter(line => line.contains(fstring)).count()
    println(errnums)

    spark.stop()
  }

}[/mw_shl_code]

Running it fails with:
[mw_shl_code=applescript,true]18/04/28 16:04:44 INFO spark.SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users  with view permissions: Set(admin-07); groups with view permissions: Set(); users  with modify permissions: Set(admin-07); groups with modify permissions: Set()
Exception in thread "main" java.lang.NoSuchMethodError: io.netty.buffer.PooledByteBufAllocator.metric()Lio/netty/buffer/PooledByteBufAllocatorMetric;
        at org.apache.spark.network.util.NettyMemoryMetrics.registerMetrics(NettyMemoryMetrics.java:80)
        at org.apache.spark.network.util.NettyMemoryMetrics.<init>(NettyMemoryMetrics.java:76)
        at org.apache.spark.network.client.TransportClientFactory.<init>(TransportClientFactory.java:109)
        at org.apache.spark.network.TransportContext.createClientFactory(TransportContext.java:99)
        at org.apache.spark.rpc.netty.NettyRpcEnv.<init>(NettyRpcEnv.scala:71)
        at org.apache.spark.rpc.netty.NettyRpcEnvFactory.create(NettyRpcEnv.scala:461)
        at org.apache.spark.rpc.RpcEnv$.create(RpcEnv.scala:57)
        at org.apache.spark.SparkEnv$.create(SparkEnv.scala:249)
        at org.apache.spark.SparkEnv$.createDriverEnv(SparkEnv.scala:175)
        at org.apache.spark.SparkContext.createSparkEnv(SparkContext.scala:256)
        at org.apache.spark.SparkContext.<init>(SparkContext.scala:423)
        at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2486)
        at org.apache.spark.sql.SparkSession$Builder$$anonfun$7.apply(SparkSession.scala:930)
        at org.apache.spark.sql.SparkSession$Builder$$anonfun$7.apply(SparkSession.scala:921)
        at scala.Option.getOrElse(Option.scala:121)
        at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:921)
        at yfgspark01$.main(yfgspark01.scala:14)
        at yfgspark01.main(yfgspark01.scala)

Process finished with exit code 1[/mw_shl_code]


So I added the following dependency to the pom:
[mw_shl_code=xml,true]<dependency>
    <groupId>io.netty</groupId>
    <artifactId>netty-buffer</artifactId>
    <version>4.1.17.Final</version>
</dependency>[/mw_shl_code]

Running it again fails with:
[mw_shl_code=applescript,true]18/04/28 16:07:05 INFO spark.SecurityManager: Changing modify acls groups to:
18/04/28 16:07:05 INFO spark.SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users  with view permissions: Set(admin-07); groups with view permissions: Set(); users  with modify permissions: Set(admin-07); groups with modify permissions: Set()
Exception in thread "main" java.lang.AbstractMethodError: io.netty.util.concurrent.MultithreadEventExecutorGroup.newChild(Ljava/util/concurrent/Executor;[Ljava/lang/Object;)Lio/netty/util/concurrent/EventExecutor;
        at io.netty.util.concurrent.MultithreadEventExecutorGroup.<init>(MultithreadEventExecutorGroup.java:84)
        at io.netty.util.concurrent.MultithreadEventExecutorGroup.<init>(MultithreadEventExecutorGroup.java:58)
        at io.netty.util.concurrent.MultithreadEventExecutorGroup.<init>(MultithreadEventExecutorGroup.java:47)
        at io.netty.channel.MultithreadEventLoopGroup.<init>(MultithreadEventLoopGroup.java:49)
        at io.netty.channel.nio.NioEventLoopGroup.<init>(NioEventLoopGroup.java:61)
        at io.netty.channel.nio.NioEventLoopGroup.<init>(NioEventLoopGroup.java:52)
        at org.apache.spark.network.util.NettyUtils.createEventLoop(NettyUtils.java:51)
        at org.apache.spark.network.client.TransportClientFactory.<init>(TransportClientFactory.java:102)
        at org.apache.spark.network.TransportContext.createClientFactory(TransportContext.java:99)
        at org.apache.spark.rpc.netty.NettyRpcEnv.<init>(NettyRpcEnv.scala:71)
        at org.apache.spark.rpc.netty.NettyRpcEnvFactory.create(NettyRpcEnv.scala:461)
        at org.apache.spark.rpc.RpcEnv$.create(RpcEnv.scala:57)
        at org.apache.spark.SparkEnv$.create(SparkEnv.scala:249)
        at org.apache.spark.SparkEnv$.createDriverEnv(SparkEnv.scala:175)
        at org.apache.spark.SparkContext.createSparkEnv(SparkContext.scala:256)
        at org.apache.spark.SparkContext.<init>(SparkContext.scala:423)
        at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2486)
        at org.apache.spark.sql.SparkSession$Builder$$anonfun$7.apply(SparkSession.scala:930)
        at org.apache.spark.sql.SparkSession$Builder$$anonfun$7.apply(SparkSession.scala:921)
        at scala.Option.getOrElse(Option.scala:121)
        at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:921)
        at yfgspark01$.main(yfgspark01.scala:14)
        at yfgspark01.main(yfgspark01.scala)

Process finished with exit code 1
[/mw_shl_code]


Environment: Windows 10, Spark 2.3.0-cloudera2, IntelliJ IDEA 2018.1.2, JDK 1.8, Scala 2.11.8

I searched around and some posts say the jar version is wrong, but I checked the official site and Spark 2.3 does use netty 4.1.17.


Where is this error coming from? Any help would be appreciated, thanks.
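
A minimal debugging sketch (an addition, not from the original post): print which jar the JVM actually loaded PooledByteBufAllocator from. If the location points at an older io.netty 4.0.x jar pulled in by some other dependency, that would explain the NoSuchMethodError even though Spark 2.3 itself ships netty 4.1.17.

[mw_shl_code=scala,true]object NettyVersionCheck {
  def main(args: Array[String]): Unit = {
    // Which jar supplied PooledByteBufAllocator at runtime?
    val clazz = classOf[io.netty.buffer.PooledByteBufAllocator]
    // getCodeSource is non-null here because the class comes from a jar on the classpath
    println(clazz.getProtectionDomain.getCodeSource.getLocation)
  }
}[/mw_shl_code]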

Replies (2)

qcbb001 replied on 2018-4-28 18:49:32:

Two things to check:
1. The netty version: use 3.9.x; 4.0 probably doesn't match.
2. Look through the rest of your dependencies and make sure you are not pulling in multiple netty versions (see the pom sketch below).
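
A hedged follow-up on point 2 (my addition, not part of the original reply): running `mvn dependency:tree -Dincludes=io.netty` lists every netty artifact Maven puts on the classpath. If an older io.netty 4.0.x copy arrives transitively from another dependency, it can be excluded in the pom. The hadoop-client coordinates below are purely a hypothetical example of where such a conflict could come from; use whatever dependency the tree actually points at.

[mw_shl_code=xml,true]<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-client</artifactId>
    <!-- hypothetical placeholder: match your cluster's Hadoop version -->
    <version>${hadoop.version}</version>
    <exclusions>
        <!-- keep the conflicting netty out so Spark's own 4.1.17 wins -->
        <exclusion>
            <groupId>io.netty</groupId>
            <artifactId>netty-all</artifactId>
        </exclusion>
    </exclusions>
</dependency>[/mw_shl_code]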


咖啡酱 replied on 2018-5-13 16:15:48:
Changing the dependency from netty-buffer to netty-all fixed it for me.
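
For reference, a sketch of what that change looks like in the pom, keeping the 4.1.17.Final version the poster already tried (the exact version is an assumption; it should match the netty release your Spark build expects):

[mw_shl_code=xml,true]<dependency>
    <groupId>io.netty</groupId>
    <artifactId>netty-all</artifactId>
    <version>4.1.17.Final</version>
</dependency>[/mw_shl_code]

netty-all bundles every netty module at one consistent version, which avoids the mixed-module state (for example netty-buffer 4.1.x next to a 4.0.x netty-common) that typically produces the AbstractMethodError shown above.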
