This is a full offline install of CDH 5.7. Running spark-shell fails with an error. The commands are:
val rdd1 = sc.textFile("hdfs://master:8020/test/sparkData/SogouQ.sample")
rdd1.cache()
val rdd2 = rdd1.flatMap(_.split(" ")).map(x => (x, 1)).reduceByKey(_ + _)
rdd2.take(10)
It is a simple word count, but it fails with the following warning:
16/06/01 19:41:02 WARN cluster.YarnScheduler: Initial job has not accepted any resources; check your cluster UI to ensure that workers are registered and have sufficient resources
How can I fix this?
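For reference, the snippet above is just a word count. The flatMap/map/reduceByKey pipeline can be reproduced in plain Python (no Spark; the sample lines below are hypothetical stand-ins for SogouQ.sample), which suggests the logic itself is fine and the warning is really about cluster resources:

```python
from collections import Counter

# Hypothetical sample lines standing in for SogouQ.sample
lines = ["a b a", "b c"]

# flatMap(_.split(" ")): split every line into words and flatten
words = [w for line in lines for w in line.split(" ")]

# map(x => (x, 1)) followed by reduceByKey(_ + _): count occurrences per word
counts = Counter(words)

print(sorted(counts.items()))  # → [('a', 2), ('b', 2), ('c', 1)]
```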
yarn application -list shows no applications currently running:
[root@master ~]# yarn application -list
16/06/02 16:16:53 INFO client.RMProxy: Connecting to ResourceManager at master/192.168.0.211:8032
Total number of applications (application-types: [] and states: [SUBMITTED, ACCEPTED, RUNNING]):0
Application-Id Application-Name Application-Type User Queue State Final-State Progress Tracking-URL
I then tried re-running the Spark job, and it failed with the same error.