My pom.xml references the following dependencies: [mw_shl_code=xml,true]<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.10</artifactId>
    <version>1.6.1</version>
</dependency>
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-sql_2.10</artifactId>
    <version>1.6.1</version>
</dependency>
<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-client</artifactId>
    <version>2.7.1</version>
</dependency>[/mw_shl_code]
Then I wrote the following Java code:
[mw_shl_code=java,true]import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.sql.hive.HiveContext;

public static void main(String[] args) {
    String appName = "Spark Test";
    // Spark 1.6 expects "yarn-client" or "yarn-cluster" rather than plain "yarn"
    String master = "yarn-client";
    SparkConf conf = new SparkConf().setAppName(appName).setMaster(master);
    JavaSparkContext sc = new JavaSparkContext(conf);
    // sc.sc() returns the underlying Scala SparkContext that HiveContext needs
    HiveContext sqlContext = new HiveContext(sc.sc());
}[/mw_shl_code]
But there is no class file for org.apache.spark.sql.hive.HiveContext on my classpath. Which jar should I be looking for?
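My guess is that HiveContext lives in the separate spark-hive module rather than in spark-core or spark-sql, so perhaps I need an extra dependency along these lines (versions assumed to match my other Spark artifacts):

[mw_shl_code=xml,true]<!-- Presumably the missing piece: spark-hive ships org.apache.spark.sql.hive.HiveContext -->
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-hive_2.10</artifactId>
    <version>1.6.1</version>
</dependency>[/mw_shl_code]

Is that the right artifact?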
Also, is there a Java demo of executing Hive SQL from within Spark?
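To make the question concrete, this is roughly the kind of demo I have in mind (just my sketch, not verified: the table name my_table is made up, and it assumes the spark-hive dependency above is on the classpath and hive-site.xml is visible to the application):

[mw_shl_code=java,true]import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.sql.DataFrame;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.hive.HiveContext;

public class HiveSqlDemo {
    public static void main(String[] args) {
        // yarn-client: driver runs locally, executors on YARN (Spark 1.6 syntax)
        SparkConf conf = new SparkConf().setAppName("Hive SQL Demo").setMaster("yarn-client");
        JavaSparkContext sc = new JavaSparkContext(conf);
        HiveContext sqlContext = new HiveContext(sc.sc());

        // Run a Hive SQL query; "my_table" is a placeholder table name
        DataFrame df = sqlContext.sql("SELECT * FROM my_table LIMIT 10");
        for (Row row : df.collectAsList()) {
            System.out.println(row);
        }

        sc.stop();
    }
}[/mw_shl_code]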
Thanks in advance, gurus~!