In spark-shell, run the following code:
val sqlContext=new org.apache.spark.sql.SQLContext(sc)
import sqlContext._
case class Person(name:String,age:Int)
val people=sc.textFile("hdfs://ns1/data/people.txt").map(_.split(",")).map(p=>Person(p(0),p(1).trim.toInt))
people.registerTempTable("people")
This fails with:
<console>:31: error: value registerTempTable is not a member of org.apache.spark.rdd.RDD[Person]
people.registerTempTable("people")
Trying the explicit conversion instead also fails:

scala> val peopleSchema = sqlContext.createSchemaRDD(people)
<console>:30: error: value createSchemaRDD is not a member of org.apache.spark.sql.SQLContext
val peopleSchema = sqlContext.createSchemaRDD(people)
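Both errors point at the same cause: the code follows the pre-1.3 Spark SQL API. In Spark 1.3, SchemaRDD was replaced by DataFrame, the `createSchemaRDD` implicit was removed from SQLContext, and `registerTempTable` became a method on DataFrame rather than on a plain RDD. A minimal sketch of the updated code, assuming Spark 1.3–1.6 in spark-shell (where `sc` is predefined) and that the HDFS path from above exists:

```scala
val sqlContext = new org.apache.spark.sql.SQLContext(sc)
// Brings in the rddToDataFrameHolder implicit, which provides toDF()
import sqlContext.implicits._

case class Person(name: String, age: Int)

val people = sc.textFile("hdfs://ns1/data/people.txt")
  .map(_.split(","))
  .map(p => Person(p(0), p(1).trim.toInt))

// Convert RDD[Person] to a DataFrame first, then register it;
// registerTempTable exists on DataFrame, not on RDD.
val peopleDF = people.toDF()
peopleDF.registerTempTable("people")

// Example query against the registered table (column names and the
// age filter here are illustrative):
sqlContext.sql("SELECT name, age FROM people WHERE age >= 13").show()
```

On Spark 2.x and later the entry point is SparkSession instead of SQLContext, and `registerTempTable` is deprecated in favor of `createOrReplaceTempView`, but the `toDF()` conversion works the same way.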