
Asking for help with a Spark problem: wordcount

Wyy_Ck posted on 2017-9-25 23:40:01
Code:
[mw_shl_code=scala,true]import org.apache.spark.SparkConf
import org.apache.spark.SparkContext

object wordCount {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("wordcount").setMaster("spark://master:7077")
    val sc = new SparkContext(conf)
    val file = sc.textFile("hdfs://master:9000/data")
    val count = file.flatMap(line => line.split(" ")).map(word => (word, 1)).reduceByKey(_ + _)

    count.collect()
  }
}[/mw_shl_code]
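As a side note, the transformation chain above (split lines into words, pair each word with 1, sum per key) can be checked on a plain Scala collection with no Spark involved; this is a minimal sketch of the same logic using `groupBy` in place of `reduceByKey`:

```scala
// Pure-Scala sketch of the word-count logic, no cluster needed.
val lines = Seq("a b a", "b c")
val counts = lines
  .flatMap(_.split(" "))                       // tokenize each line into words
  .groupBy(identity)                           // group equal words together
  .map { case (word, ws) => (word, ws.size) }  // count each group
// counts == Map("a" -> 2, "b" -> 2, "c" -> 1)
```

Running this locally first makes it easy to separate logic bugs from cluster/deployment problems like the one below.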


The error is as follows:
Exception in thread "main" org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 0.0 failed 4 times, most recent failure: Lost task 0.3 in stage 0.0 (TID 4, 192.168.86.132, executor 1): java.lang.ClassCastException: cannot assign instance of scala.collection.immutable.List$SerializationProxy to field org.apache.spark.rdd.RDD.org$apache$spark$rdd$RDD$$dependencies_ of type scala.collection.Seq in instance of org.apache.spark.rdd.MapPartitionsRDD

The error occurs at count.collect().
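This particular ClassCastException typically shows up when the driver is launched straight from an IDE with setMaster pointing at a standalone cluster: the application jar is never shipped to the executors, so they deserialize the RDD lineage with mismatched classes. Submitting via spark-submit (as in the reply below) avoids it; if you do want to launch from the IDE, one workaround is to hand the jar to the executors explicitly. A hedged sketch, where the jar path is an assumption to be replaced with your own build output:

```scala
import org.apache.spark.SparkConf

// Sketch: ship the application jar to the executors when launching
// the driver from an IDE against a standalone master.
val conf = new SparkConf()
  .setAppName("wordcount")
  .setMaster("spark://master:7077")
  .setJars(Seq("/path/to/firstspark.jar")) // hypothetical path: the jar containing this class
```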


nextuser posted on 2017-9-26 07:32:06
First, the program needs the following addition (the sc.stop() call, shown in red in the original post).
[mw_shl_code=scala,true]import org.apache.spark.SparkConf
import org.apache.spark.SparkContext

object wordCount {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("wordcount").setMaster("spark://master:7077")
    val sc = new SparkContext(conf)
    val file = sc.textFile("hdfs://master:9000/data")
    val count = file.flatMap(line => line.split(" ")).map(word => (word, 1)).reduceByKey(_ + _)

    count.collect()
    sc.stop()  // added: release the SparkContext when the job is done
  }
}[/mw_shl_code]
I don't know what submission method the OP is using; you could try changing it.

Submit in standalone mode:

./spark-submit --class wordCount --master spark://pmaster:7077 /usr/mywork/project/scala/FirstSpark/out/artifacts/firstspark_jar/firstspark.jar

The details are as follows.
Submitting with spark-submit:

① In IDEA, click File - Project Structure - Artifacts - Jar - From modules with dependencies, and select the corresponding module and main class.


② Set VM options = -Dspark.master=spark://master:7077

!!! Note: in this mode, sc.textFile(path) actually reads hdfs://master:9000/path
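In other words, a relative path handed to sc.textFile is resolved against the cluster's default filesystem. The resolution itself is ordinary URI resolution, which can be illustrated without Spark; the namenode address below is taken from the code above:

```scala
import java.net.URI

// Illustration of how a relative path resolves against a default
// filesystem URI (mirroring what Hadoop does with fs.defaultFS).
val defaultFs = new URI("hdfs://master:9000/")
val resolved  = defaultFs.resolve("data")
// resolved.toString == "hdfs://master:9000/data"
```

Passing the full hdfs:// URI explicitly, as the code in this thread already does, sidesteps any ambiguity about the default filesystem.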

③ Click Build - Build Artifacts to generate the jar (it is written to the path you specified in IDEA; in this post, under the project's /out/.. directory).

④zip -d /usr/mywork/project/scala/FirstSpark/out/artifacts/firstspark_jar/firstspark.jar META-INF/*.RSA META-INF/*.DSA META-INF/*.SF

( /usr/mywork/project/scala/FirstSpark/out/artifacts/firstspark_jar/firstspark.jar is the path of the generated jar )

(See http://blog.csdn.net/dai451954706/article/details/50086295. If you skip this step, you get the error: Exception in thread "main" java.lang.SecurityException: Invalid signature file digest for Manifest main attributes)



⑤ Submit in standalone mode:

./spark-submit --class wordCount --master spark://pmaster:7077 /usr/mywork/project/scala/FirstSpark/out/artifacts/firstspark_jar/firstspark.jar

Notes:
( wordCount is the entry class of the job being run;
spark://pmaster:7077 is the master machine of the cluster being submitted to;
/usr/mywork/project/scala/FirstSpark/out/artifacts/firstspark_jar/firstspark.jar is the jar being submitted )




Reference:
http://blog.csdn.net/ronaldo4511/article/details/53035494
