
Spark startup problem

fengfengda posted on 2017-11-3 15:03:04
A Spark Streaming program submitted with cluster deploy mode on a standalone cluster shuts itself down about one second after starting. Checking stderr shows the following:

17/11/03 14:34:08 INFO executor.CoarseGrainedExecutorBackend: Started daemon with process name: 5390@hadoop-slave7
17/11/03 14:34:08 INFO util.SignalUtils: Registered signal handler for TERM
17/11/03 14:34:08 INFO util.SignalUtils: Registered signal handler for HUP
17/11/03 14:34:08 INFO util.SignalUtils: Registered signal handler for INT
17/11/03 14:34:08 ERROR executor.CoarseGrainedExecutorBackend: RECEIVED SIG

The last line is the error. My cluster configuration is:
export SPARK_WORKER_MEMORY=6g
export SPARK_WORKER_CORES=6
and the program is launched with --deploy-mode cluster --total-executor-cores 4. How can I fix this? Any help appreciated!
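For reference, the full submission would look something like this. The master URL, class name, and jar path below are assumed placeholders, not taken from the post:

```shell
# Sketch of the submission described above; master URL, class name and
# jar path are assumed placeholders, not from the original post.
# --executor-memory is worth setting explicitly: the standalone default
# is only 1g per executor, no matter what SPARK_WORKER_MEMORY is set to.
spark-submit \
  --master spark://hadoop-master:7077 \
  --deploy-mode cluster \
  --total-executor-cores 4 \
  --executor-memory 2g \
  --class com.example.StreamingApp \
  /path/to/streaming-app.jar
```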

7 replies

fengfengda posted on 2017-11-3 15:47:46
How do I resolve an error like ERROR executor.CoarseGrainedExecutorBackend: RECEIVED SIG? What are the concrete steps? I'm running standalone, not YARN.

nextuser posted on 2017-11-3 16:14:49
Replying to fengfengda (2017-11-3 15:47):

It's probably a memory problem.
Try reducing --total-executor-cores to 2 or 1.
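To make the suggestion concrete, these are the knobs that shrink each executor's footprint on a standalone cluster. Values and the jar path are illustrative only, not from the thread:

```shell
# Illustrative only: lower the total cores claimed across the cluster,
# cap cores per executor, and reduce per-executor memory.
spark-submit \
  --deploy-mode cluster \
  --total-executor-cores 2 \
  --conf spark.executor.cores=1 \
  --executor-memory 1g \
  /path/to/streaming-app.jar
```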


fengfengda posted on 2017-11-3 17:01:51
Replying to nextuser (2017-11-3 16:14):

Same result with 2 and 1.

w517424787 posted on 2017-11-3 22:30:52
Please state the versions of Spark and the related components in your cluster; otherwise others can't really analyze this.

fengfengda posted on 2017-11-6 10:42:51
Spark version: 2.2.0
Scala version: 2.12.2

The stderr output on the cluster is:
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/home/dev/app/spark/jars/slf4j-log4j12-1.7.16.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/home/dev/app/hadoop/share/hadoop/common/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
17/11/06 10:28:35 INFO executor.CoarseGrainedExecutorBackend: Started daemon with process name: 20682@hadoop-slave1
17/11/06 10:28:35 INFO util.SignalUtils: Registered signal handler for TERM
17/11/06 10:28:35 INFO util.SignalUtils: Registered signal handler for HUP
17/11/06 10:28:35 INFO util.SignalUtils: Registered signal handler for INT
17/11/06 10:28:36 INFO spark.SecurityManager: Changing view acls to: dev
17/11/06 10:28:36 INFO spark.SecurityManager: Changing modify acls to: dev
17/11/06 10:28:36 INFO spark.SecurityManager: Changing view acls groups to:
17/11/06 10:28:36 INFO spark.SecurityManager: Changing modify acls groups to:
17/11/06 10:28:36 INFO spark.SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users  with view permissions: Set(dev); groups with view permissions: Set(); users  with modify permissions: Set(dev); groups with modify permissions: Set()
17/11/06 10:28:36 INFO client.TransportClientFactory: Successfully created connection to /----:46042 after 61 ms (0 ms spent in bootstraps)
17/11/06 10:28:37 INFO spark.SecurityManager: Changing view acls to: dev
17/11/06 10:28:37 INFO spark.SecurityManager: Changing modify acls to: dev
17/11/06 10:28:37 INFO spark.SecurityManager: Changing view acls groups to:
17/11/06 10:28:37 INFO spark.SecurityManager: Changing modify acls groups to:
17/11/06 10:28:37 INFO spark.SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users  with view permissions: Set(dev); groups with view permissions: Set(); users  with modify permissions: Set(dev); groups with modify permissions: Set()
17/11/06 10:28:37 INFO client.TransportClientFactory: Successfully created connection to /----:46042 after 1 ms (0 ms spent in bootstraps)
17/11/06 10:28:37 INFO storage.DiskBlockManager: Created local directory at /tmp/spark-7cc8cbfd-22e6-4b5b-8df3-5a208f7965cd/executor-e185d79a-4e7e-4aa1-83f2-55a927babb0e/blockmgr-c57cf74e-eefc-4272-8b1d-e7a626d28a6a
17/11/06 10:28:37 INFO memory.MemoryStore: MemoryStore started with capacity 366.3 MB
17/11/06 10:28:37 ERROR executor.CoarseGrainedExecutorBackend: RECEIVED SIGNAL TERM
(I redacted the IPs.)

I'm really confused and can't find the cause.

langke93 posted on 2017-11-6 12:40:06
Replying to fengfengda (2017-11-6 10:42):

Which versions of Spark and Hadoop, exactly?

fengfengda posted on 2017-11-6 16:19:17
Problem solved: it was a version mismatch.
In the test environment, Spark was 2.0.2 and Scala was 2.11.8, and the sbt settings for the Spark Streaming code were:
scalaVersion := "2.11.8"
libraryDependencies += "org.apache.spark" % "spark-streaming_2.11" % "2.0.2" % "provided"
That combination worked fine (both in the test environment and in a production environment with different versions).
------------------------
The production environment runs:
Spark 2.2.0
Scala 2.12.2
So I changed the sbt dependency to:
scalaVersion := "2.12.2"
libraryDependencies += "org.apache.spark" % "spark-streaming_2.11" % "2.2.0" % "provided"
and then all kinds of problems appeared at runtime.

This took me forever to sort out, and I still don't really understand what caused it. Can anyone explain?
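A likely explanation, offered as a hypothesis: Spark 2.2.0 is published for Scala 2.11 only, so compiling the application with scalaVersion := "2.12.2" while linking against the _2.11 artifact mixes two binary-incompatible Scala versions. The executor JVM then fails right after startup, which is consistent with the RECEIVED SIGNAL TERM in the log. A build consistent with Spark 2.2.0 would look like:

```scala
// build.sbt sketch, assuming Spark 2.2.0 on the cluster.
// Spark 2.2.0 artifacts exist only for Scala 2.11, so the application
// must also be compiled with a 2.11.x compiler.
scalaVersion := "2.11.11"

// %% appends the Scala binary suffix (_2.11) automatically, so the
// artifact suffix can never drift out of sync with scalaVersion.
libraryDependencies += "org.apache.spark" %% "spark-streaming" % "2.2.0" % "provided"
```

Using %% instead of a hard-coded _2.11 suffix makes sbt fail fast (unresolved dependency) on a mismatch, instead of producing a jar that only breaks at runtime.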
