
Compiling and Installing Spark 1.4.1


Guiding questions

1. How do you download the Spark source code from the official site?
2. How do you compile the source?
3. What problems came up during build and deployment, and how were they solved?






1. Download

Download page:
http://spark.apache.org/downloads.html

Choose the source code package.

(screenshot: 1.jpg)



2. Building from source

1) Extract:

tar -zxvf spark-1.4.1.tgz

2) Compile

Change into the extracted root directory and build with make-distribution.sh:

cd spark-1.4.1
sudo ./make-distribution.sh --tgz --skip-java-test -Pyarn -Phadoop-2.2 -Dhadoop.version=2.2.0 -Phive -Phive-thriftserver -DskipTests clean package

If the build fails partway through, just re-run it; it usually succeeds after a few attempts, since failures are often transient dependency-download errors.
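Since the re-run advice above is mechanical, it can be automated. Below is a small sketch of a retry wrapper; the `retry` helper and the attempt count are my own, not part of the original post.

```shell
#!/bin/sh
# Hypothetical helper: re-run a command up to N attempts until it succeeds.
retry() {
    attempts=$1
    shift
    n=1
    until "$@"; do
        if [ "$n" -ge "$attempts" ]; then
            echo "giving up after $n attempts" >&2
            return 1
        fi
        n=$((n + 1))
        echo "attempt $n of $attempts..." >&2
    done
}

# Example (run from the spark-1.4.1 source directory):
# retry 3 ./make-distribution.sh --tgz --skip-java-test -Pyarn \
#     -Phadoop-2.2 -Dhadoop.version=2.2.0 -Phive -Phive-thriftserver \
#     -DskipTests clean package
```

The wrapper only repeats the command; it does not clean up a half-finished build, so a persistent error still needs manual inspection.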

Once the build succeeds, the distribution tarball is placed in the root directory:

spark-1.4.1-bin-2.2.0.tgz



3. Installation

Omitted here; the procedure is the same as for previous versions.
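For readers who do not have the earlier write-up, here is a minimal sketch of a standalone deployment using the tarball built above. The install path, JDK path, and worker host list are placeholders, not values from the original post.

```shell
# Minimal standalone-deployment sketch (all paths/hosts are placeholders).
tar -zxvf spark-1.4.1-bin-2.2.0.tgz -C /home/lib/
cd /home/lib/spark-1.4.1-bin-2.2.0

# Point the startup scripts at a real JDK (see Problem 2 below).
cp conf/spark-env.sh.template conf/spark-env.sh
echo 'export JAVA_HOME=/usr/java/default' >> conf/spark-env.sh  # placeholder path

# One worker host per line; localhost suffices for a single-machine test.
echo 'localhost' > conf/slaves

# Start the master and all workers listed in conf/slaves.
sbin/start-all.sh
```

Repeat the extraction and `conf/` setup on every node of a multi-machine cluster; `start-all.sh` reaches the workers over SSH.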


4. Errors encountered when starting the cluster

1) Problem 1: the worker node fails to start

localhost: starting org.apache.spark.deploy.worker.Worker, logging to /home/lib/spark-1.4.1/sbin/../logs/org.apache.spark.deploy.worker.Worker-1-is xxxx.out
localhost: failed to launch org.apache.spark.deploy.worker.Worker:
localhost:      at org.apache.spark.launcher.SparkClassCommandBuilder.buildCommand(SparkClassCommandBuilder.java:98)
localhost:      at org.apache.spark.launcher.Main.main(Main.java:74)
localhost: full log in /home/lib/spark-1.4.1/sbin/../logs/org.apache.spark.deploy.worker.Worker-1-is xxxx.out
localhost: Connection to localhost closed.
The cause is the GCJ-based Java that ships with the system. Check with:

rpm -qa | grep java
gcc-java-4.4.7-4.el6.x86_64
java_cup-0.10k-5.el6.x86_64
java-1.5.0-gcj-1.5.0.0-29.1.el6.x86_64
Uninstall them:

rpm -e --nodeps java_cup-0.10k-5.el6.x86_64
rpm -e --nodeps java-1.5.0-gcj-1.5.0.0-29.1.el6.x86_64
2) Problem 2: JAVA_HOME is not set

localhost: starting org.apache.spark.deploy.worker.Worker, logging to /home/lib/spark-1.4.1/sbin/../logs/spark-org.apache.spark.deploy.worker.Worker-1-is xxxx.out
localhost: failed to launch org.apache.spark.deploy.worker.Worker:
localhost:   JAVA_HOME is not set
localhost: full log in /lib/spark-1.4.1/sbin/../logs/org.apache.spark.deploy.worker.Worker-1-isxxxx.out
localhost: Connection to localhost closed.

The fix is to add export JAVA_HOME=... to conf/spark-env.sh, which the startup scripts source on each node.
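A minimal conf/spark-env.sh illustrating the fix; the JDK path below is a placeholder for wherever your JDK is actually installed.

```shell
# conf/spark-env.sh -- sourced by the Spark start scripts on each node.
# JAVA_HOME must point at a real JDK install (placeholder path below).
export JAVA_HOME=/usr/java/jdk1.7.0_79
```

The file must be named exactly spark-env.sh (copy it from conf/spark-env.sh.template if it does not exist) and must be present on every node.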

The web UI after a successful start:

(screenshot: 2.png)


