<properties>
<project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
<spark.version>2.1.0</spark.version>
<scala.version>2.10</scala.version>
<hadoop.version>2.6.0</hadoop.version>
</properties>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-core_${scala.version}</artifactId>
<version>${spark.version}</version>
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-streaming_${scala.version}</artifactId>
<version>${spark.version}</version>
</dependency>
Above are the dependency versions in my project, but the following statement fails to compile:
[mw_shl_code=scala,true]val ssc = new StreamingContext(args(0), "HdfsWordCount", Seconds(args(2).toInt),
System.getenv("SPARK_HOME"), StreamingContext.jarOfClass(this.getClass))[/mw_shl_code]
This is how every example online creates the context, but compilation fails with:

Error:(20, 63) type mismatch;
 found   : Option[String]
 required: Seq[String]
Error occurred in an application involving default arguments.
System.getenv("SPARK_HOME"), StreamingContext.jarOfClass(this.getClass))
The type is clearly wrong, so I tried the following variant, but it still fails to compile:
[mw_shl_code=scala,true]val ssc = new StreamingContext(args(0), "HdfsWordCount", Seconds(args(2).toInt),
System.getenv("SPARK_HOME"), Seq(StreamingContext.jarOfClass(this.getClass).getOrElse(StreamingContext.getClass.getName)))[/mw_shl_code]
Any pointers from the experts would be much appreciated.
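For what it's worth, the compiler message points at the root cause: `StreamingContext.jarOfClass` returns an `Option[String]` (the jar may not be found), while the `jars` parameter of that `StreamingContext` constructor expects a `Seq[String]`. `Option.toSeq` bridges exactly this gap, producing a one-element `Seq` for `Some(...)` and an empty `Seq` for `None`. A minimal pure-Scala sketch of the conversion (the jar path is a made-up placeholder, no Spark required):

```scala
object OptionToSeqDemo {
  def main(args: Array[String]): Unit = {
    // An Option models "zero or one value" -- the same shape jarOfClass returns.
    val found: Option[String]   = Some("/path/to/app.jar") // hypothetical path
    val missing: Option[String] = None

    // toSeq turns Some(x) into Seq(x) and None into an empty Seq,
    // which is what a `jars: Seq[String]` parameter can accept directly.
    println(found.toSeq.length)   // one element
    println(missing.toSeq.length) // zero elements
  }
}
```

Applied to the constructor call in question, that would mean passing `StreamingContext.jarOfClass(this.getClass).toSeq` as the `jars` argument instead of the bare `Option`. Note also that on Spark 2.x the documented pattern is to build a `SparkConf` and use `new StreamingContext(conf, Seconds(n))` rather than the legacy master/appName/sparkHome/jars constructor.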