I want to set up a Spark client on a machine outside the Spark cluster, so that all applications are submitted from that client machine. I copied the Spark tarball from the cluster to the client machine, extracted it, and added the following to spark-env.sh:
export HADOOP_HOME=/home/bigdata/deploy/hadoop-2.5.0-cdh5.3.8
export HADOOP_CONF_DIR=$HADOOP_HOME/etc/hadoop
export SPARK_CLASSPATH=$HADOOP_HOME/share/hadoop/common/lib/*
Then I submitted my application from this client machine, but got the following error:
17/04/13 18:22:50 ERROR yarn.ApplicationMaster: Failed to connect to driver at 10.20.216.136:53779, retrying ...
17/04/13 18:22:50 ERROR yarn.ApplicationMaster: Failed to connect to driver at 10.20.216.136:53779, retrying ...
17/04/13 18:22:51 ERROR yarn.ApplicationMaster: Failed to connect to driver at 10.20.216.136:53779, retrying ...
17/04/13 18:22:51 ERROR yarn.ApplicationMaster: Failed to connect to driver at 10.20.216.136:53779, retrying ...
17/04/13 18:22:51 ERROR yarn.ApplicationMaster: Failed to connect to driver at 10.20.216.136:53779, retrying ...
17/04/13 18:22:51 ERROR yarn.ApplicationMaster: Failed to connect to driver at 10.20.216.136:53779, retrying ...
17/04/13 18:22:52 ERROR yarn.ApplicationMaster: Failed to connect to driver at 10.20.216.136:53779, retrying ...
17/04/13 18:22:53 ERROR yarn.ApplicationMaster: Failed to connect to driver at 10.20.216.136:53779, retrying ...
17/04/13 18:22:54 ERROR yarn.ApplicationMaster: Failed to connect to driver at 10.20.216.136:53779, retrying ...
17/04/13 18:22:55 ERROR yarn.ApplicationMaster: Failed to connect to driver at 10.20.216.136:53779, retrying ...

My question is: how do I correctly configure a Spark client machine?
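For context, this error means the ApplicationMaster running inside the YARN cluster cannot open a connection back to the driver at 10.20.216.136:53779, i.e. to the client machine itself, which is what happens in yarn-client mode when the cluster nodes cannot reach the client's address/port. A minimal sketch of a submit from the client machine is below, assuming yarn-cluster mode so the driver runs inside the cluster and no connection back to the client is needed (the main class and jar path are placeholders, not from the original post; on Spark 1.x as shipped with CDH 5.3.8 the flag is `--master yarn-cluster`, while newer versions spell it `--master yarn --deploy-mode cluster`):

```shell
# Point the client at the cluster's Hadoop/YARN configuration,
# matching the spark-env.sh settings above.
export HADOOP_CONF_DIR=/home/bigdata/deploy/hadoop-2.5.0-cdh5.3.8/etc/hadoop

# yarn-cluster mode: the driver is launched inside the cluster, so
# NodeManagers never need to connect back to this client machine.
./bin/spark-submit \
  --master yarn-cluster \
  --class com.example.MyApp \
  --driver-memory 1g \
  --executor-memory 2g \
  /path/to/my-app.jar
```

If yarn-client mode is required (e.g. for spark-shell), the client machine's IP and the driver's ephemeral ports must instead be reachable from every NodeManager, which usually means checking firewalls and that the client resolves to a routable address.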