
Problem with spark-submit

rilweic posted on 2016-4-11 19:10:25
Stack trace: ExitCodeException exitCode=10:
        at org.apache.hadoop.util.Shell.runCommand(Shell.java:538)
        at org.apache.hadoop.util.Shell.run(Shell.java:455)
        at org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:715)
        at org.apache.hadoop.yarn.server.nodemanager.LinuxContainerExecutor.launchContainer(LinuxContainerExecutor.java:293)
        at org.apache.hadoop.yarn.server.nodemanager.containermanager.launcher.ContainerLaunch.call(ContainerLaunch.java:302)
        at org.apache.hadoop.yarn.server.nodemanager.containermanager.launcher.ContainerLaunch.call(ContainerLaunch.java:82)
        at java.util.concurrent.FutureTask.run(FutureTask.java:262)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
        at java.lang.Thread.run(Thread.java:745)

Shell output: main : command provided 1
main : user is nobody
main : requested yarn user is root


Container exited with a non-zero exit code 10
Failing this attempt. Failing the application.
         ApplicationMaster host: N/A
         ApplicationMaster RPC port: -1
         queue: root.root
         start time: 1460372569052
         final status: FAILED
         tracking URL: http://c1n2:8088/cluster/app/application_1460352776638_0038
         user: root
Exception in thread "main" org.apache.spark.SparkException: Application finished with failed status
        at org.apache.spark.deploy.yarn.Client.run(Client.scala:656)
        at org.apache.spark.deploy.yarn.Client$.main(Client.scala:681)
        at org.apache.spark.deploy.yarn.Client.main(Client.scala)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:569)
        at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:166)
        at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:189)
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:110)
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Running the SparkPi example works without any problem, but my own program fails.
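
For context, the submission to YARN would have looked something like the following (the flags and jar name are my reconstruction, not from the original post; the main class is the one posted later in this thread):

spark-submit --master yarn-cluster \
    --class KafKa2SparkStreaming2Hbase \
    kafka2hbase.jar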

Replies (5)

rilweic posted on 2016-4-11 19:10:55
Can anyone help?

bioger_hit posted on 2016-4-11 20:18:53
Replying to rilweic (2016-4-11 19:10, "Can anyone help?"):

It's a problem with your code.

bioger_hit posted on 2016-4-11 20:55:26
Post your code so we can take a look.

rilweic posted on 2016-4-12 08:26:57
import com.chanct.idap.ssm.common.{HbaseUtils, PlatformConfig}
import org.apache.hadoop.hbase.TableName
import org.apache.hadoop.hbase.client.Put
import org.apache.hadoop.hbase.util.Bytes
import org.apache.spark.SparkConf
import org.apache.spark.streaming.kafka.KafkaUtils
import org.apache.spark.streaming.{Seconds, StreamingContext}

/**
 * Reads messages from Kafka, computes word counts over 5-second batches,
 * and writes the counts to the HBase table "WCTest".
 * Created by lichao on 16-4-4.
 */
object KafKa2SparkStreaming2Hbase {
  def main(args: Array[String]) {
    val zkQuorum = "c1d8:2181,c1d9:2181,c1d10:2181"
    val group = "1"
    val topics = "scala_api_topic"
    val numThreads = 2

    // Note: setMaster("local[2]") hard-codes local mode and overrides whatever
    // --master is passed to spark-submit; use the commented variant on YARN.
    val sparkConf = new SparkConf().setAppName("KafkaWordCount2Hbase").setMaster("local[2]").set("spark.eventLog.overwrite", "true")
    // val sparkConf = new SparkConf().setAppName("KafkaWordCount2Hbase").set("spark.eventLog.overwrite", "true")
    val ssc = new StreamingContext(sparkConf, Seconds(5))
    ssc.checkpoint("checkpoint")

    // Map of topic name -> number of consumer threads.
    val topicMap = topics.split(",").map((_, numThreads)).toMap
    val lines = KafkaUtils.createStream(ssc, zkQuorum, group, topicMap).map(_._2)
    val words = lines.flatMap(_.split(" "))
    val pairs = words.map(word => (word, 1))
    val wordCounts = pairs.reduceByKey(_ + _)

    wordCounts.print()

    // Open one HBase connection per partition rather than one per record.
    wordCounts.foreachRDD { rdd =>
      rdd.foreachPartition { partition =>
        val conf = PlatformConfig.loadHbaseConf()
        val conn = HbaseUtils.getConnection(conf)
        val table = conn.getTable(TableName.valueOf("WCTest"))
        partition.foreach { case (word, count) =>
          try {
            // Caution: a millisecond timestamp as the row key can collide when
            // several words are written within the same millisecond.
            val put = new Put(Bytes.toBytes(System.currentTimeMillis().toString))
            put.addColumn(Bytes.toBytes("A"), Bytes.toBytes("value"), Bytes.toBytes(word))
            put.addColumn(Bytes.toBytes("A"), Bytes.toBytes("count"), Bytes.toBytes(count.toString))
            table.put(put)
          } catch {
            case e: Exception => println(s"failed to write row: $e")
          }
        }
        table.close()
        conn.close()
      }
    }

    ssc.start()
    ssc.awaitTermination()
  }
}

rilweic posted on 2016-4-12 08:35:39
Solved. It was a problem with how the jar was packaged: deleting the .SF files under the META-INF directory fixed it. Now I'm hitting a NullPointerException instead, which I need to look into.
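
For anyone hitting the same packaging issue: the .SF/.DSA/.RSA files are jar-signing leftovers pulled in from signed dependencies, and the merged fat jar no longer matches those signatures. Rather than deleting them from the built jar by hand, one can discard them at build time. A minimal sketch, assuming the fat jar is built with sbt-assembly:

// build.sbt -- discard jar-signing files while keeping the rest of META-INF.
// (Equivalently, strip an already-built jar with:
//   zip -d app.jar 'META-INF/*.SF' 'META-INF/*.DSA' 'META-INF/*.RSA')
assemblyMergeStrategy in assembly := {
  // Signature files sit directly under META-INF in the dependency jars.
  case PathList("META-INF", file)
      if file.endsWith(".SF") || file.endsWith(".DSA") || file.endsWith(".RSA") =>
    MergeStrategy.discard
  case x =>
    // Fall back to the default strategy for everything else.
    val oldStrategy = (assemblyMergeStrategy in assembly).value
    oldStrategy(x)
}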
