As shown in the screenshot, I followed the official CDH documentation, but running spark-shell fails with the following error:

    16/06/02 21:01:27 ERROR spark.SparkContext: Error initializing SparkContext.
    java.lang.IllegalArgumentException: Required executor memory (1024+384 MB) is above the max threshold (1024 MB) of this cluster! Please check the values of 'yarn.scheduler.maximum-allocation-mb' and/or 'yarn.nodemanager.resource.memory-mb'.
Any help would be appreciated.
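
For context, the "1024+384 MB" in the message is the default executor memory (1 GB) plus the YARN memory overhead (384 MB by default), so each executor container asks YARN for 1408 MB while the per-container cap (yarn.scheduler.maximum-allocation-mb) is only 1024 MB. A minimal sketch of the two usual ways out, assuming yarn-site.xml is editable (or the equivalent settings in Cloudera Manager) and that 2048 MB is just an illustrative value to be checked against the NodeManager host's actual memory:

    <!-- yarn-site.xml: raise the per-container cap and the NodeManager's total memory -->
    <property>
      <name>yarn.scheduler.maximum-allocation-mb</name>
      <value>2048</value>
    </property>
    <property>
      <name>yarn.nodemanager.resource.memory-mb</name>
      <value>2048</value>
    </property>

Alternatively, shrink the request so that executor memory plus overhead fits under the existing 1024 MB cap, for example:

    spark-shell --executor-memory 512m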