
[Help] Spark bundled with CDH 5.4.7: application logs not visible in the web UI

Cherise posted on 2015-11-10 09:43:55
Last edited by Cherise on 2015-11-10 11:57

I recently followed Cloudera's official documentation to upgrade the cluster from CDH 5.1.2 to 5.4.7, which brought Spark from 1.0.0 to 1.3.0.
After the upgrade Spark runs fine, but I hit a strange problem: on the Spark web UI I can no longer see the executors' stderr logs. All that shows up is the following:
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/opt/cloudera/parcels/CDH-5.4.7-1.cdh5.4.7.p0.3/jars/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/opt/cloudera/parcels/CDH-5.4.7-1.cdh5.4.7.p0.3/jars/avro-tools-1.7.6-cdh5.4.7.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
The warning points to conflicting SLF4J bindings, so I renamed avro-tools-1.7.6-cdh5.4.7.jar under /opt/cloudera/parcels/CDH-5.4.7-1.cdh5.4.7.p0.3/jars/ to avro-tools-1.7.6-cdh5.4.7.jar.bak. After that the warning disappeared, but stderr now contains no log output at all; naturally, the log files under /var/run/spark/work show the same thing.
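To see exactly which jars are contributing bindings, SLF4J's resolution can be reproduced by enumerating the StaticLoggerBinder resource on the classpath. A minimal Scala diagnostic (my own sketch, not part of CDH; run it with the same classpath the executors use):

    import scala.collection.JavaConverters._

    // Prints every copy of StaticLoggerBinder visible on the classpath.
    // More than one URL means SLF4J will warn about multiple bindings
    // and pick one of them essentially arbitrarily.
    object FindSlf4jBindings {
      def main(args: Array[String]): Unit = {
        getClass.getClassLoader
          .getResources("org/slf4j/impl/StaticLoggerBinder.class")
          .asScala
          .foreach(println)
      }
    }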
However, /var/log/spark/spark-worker-cdhdatanode1.log contains the following:
2015-11-09 16:22:15,165 INFO org.apache.spark.executor.CoarseGrainedExecutorBackend: Registered signal handlers for [TERM, HUP, INT]
2015-11-09 16:22:16,873 INFO org.apache.spark.SecurityManager: Changing view acls to: spark,root
2015-11-09 16:22:16,874 INFO org.apache.spark.SecurityManager: Changing modify acls to: spark,root
2015-11-09 16:22:16,875 INFO org.apache.spark.SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(spark, root); users with modify permissions: Set(spark, root)
2015-11-09 16:22:17,308 INFO akka.event.slf4j.Slf4jLogger: Slf4jLogger started
2015-11-09 16:22:17,425 INFO Remoting: Starting remoting
2015-11-09 16:22:17,677 INFO Remoting: Remoting started; listening on addresses :[akka.tcp://driverPropsFetcher@cdhdatanode1:44604]
2015-11-09 16:22:17,680 INFO Remoting: Remoting now listens on addresses: [akka.tcp://driverPropsFetcher@cdhdatanode1:44604]
2015-11-09 16:22:17,693 INFO org.apache.spark.util.Utils: Successfully started service 'driverPropsFetcher' on port 44604.
2015-11-09 16:22:17,951 INFO akka.remote.RemoteActorRefProvider$RemotingTerminator: Shutting down remote daemon.
2015-11-09 16:22:17,952 INFO akka.remote.RemoteActorRefProvider$RemotingTerminator: Remote daemon shut down; proceeding with flushing remote transports.
2015-11-09 16:22:17,964 INFO org.apache.spark.SecurityManager: Changing view acls to: spark,root
2015-11-09 16:22:17,964 INFO org.apache.spark.SecurityManager: Changing modify acls to: spark,root
2015-11-09 16:22:17,964 INFO org.apache.spark.SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(spark, root); users with modify permissions: Set(spark, root)
2015-11-09 16:22:18,023 INFO Remoting: Remoting shut down
2015-11-09 16:22:18,025 INFO akka.remote.RemoteActorRefProvider$RemotingTerminator: Remoting shut down.
2015-11-09 16:22:18,034 INFO akka.event.slf4j.Slf4jLogger: Slf4jLogger started
2015-11-09 16:22:18,045 INFO Remoting: Starting remoting
2015-11-09 16:22:18,060 INFO org.apache.spark.util.Utils: Successfully started service 'sparkExecutor' on port 40365.
2015-11-09 16:22:18,061 INFO Remoting: Remoting started; listening on addresses :[akka.tcp://sparkExecutor@cdhdatanode1:40365]
2015-11-09 16:22:18,061 INFO Remoting: Remoting now listens on addresses: [akka.tcp://sparkExecutor@cdhdatanode1:40365]
2015-11-09 16:22:18,069 INFO org.apache.spark.util.AkkaUtils: Connecting to MapOutputTracker: akka.tcp://sparkDriver@cdhnamenode:59211/user/MapOutputTracker
2015-11-09 16:22:18,334 INFO org.apache.spark.util.AkkaUtils: Connecting to BlockManagerMaster: akka.tcp://sparkDriver@cdhnamenode:59211/user/BlockManagerMaster
2015-11-09 16:22:18,485 INFO org.apache.spark.storage.DiskBlockManager: Created local directory at /tmp/spark-18da685b-6197-444c-9478-d250f6b59f4a/spark-8bc0a086-ea65-4c4e-94ec-c143539591d0/spark-17299c8e-86b5-47fa-b621-f21261677efc/blockmgr-5c385183-0196-4bf6-8f69-31ea24bdfa0c
2015-11-09 16:22:18,514 INFO org.apache.spark.storage.MemoryStore: MemoryStore started with capacity 267.3 MB
2015-11-09 16:22:18,825 INFO org.apache.spark.util.AkkaUtils: Connecting to OutputCommitCoordinator: akka.tcp://sparkDriver@cdhnamenode:59211/user/OutputCommitCoordinator
2015-11-09 16:22:19,010 INFO org.apache.spark.executor.CoarseGrainedExecutorBackend: Connecting to driver: akka.tcp://sparkDriver@cdhnamenode:59211/user/CoarseGrainedScheduler

It looks as though the executor's log output is being written into the worker's log instead?? (See the sketch at the end of this post for how the stderr files normally come to exist.)
Has anyone run into this? Or did something go wrong in my upgrade? Working without logs is really inconvenient. I've been puzzling over this for days; any advice would be much appreciated, thanks!
PS: the executors' stdout logs are written normally, and removing the Spark component and re-adding the Spark (standalone) service didn't help either.
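For context on where those stderr files come from: in standalone mode the worker launches each executor as a child JVM and redirects that process's stdout and stderr into files under the application's work directory, which the web UI then serves. The sketch below only illustrates that mechanism, it is not Spark's actual code; the demo directory and the use of "java -version" (which writes to stderr) are stand-ins:

    import java.io.File
    import java.lang.ProcessBuilder.Redirect

    // Illustration of how a standalone worker captures executor output:
    // the child process's streams are redirected into stdout/stderr files
    // under the work dir. If the executor instead logs through the worker's
    // own log4j configuration, its lines land in the worker log.
    object ExecutorLogRedirectSketch {
      def main(args: Array[String]): Unit = {
        val workDir = new File("/tmp/spark-work-demo") // stand-in for /var/run/spark/work/<app>/<exec>
        workDir.mkdirs()
        val pb = new ProcessBuilder("java", "-version") // "java -version" prints to stderr
        pb.redirectOutput(Redirect.appendTo(new File(workDir, "stdout")))
        pb.redirectError(Redirect.appendTo(new File(workDir, "stderr")))
        pb.start().waitFor()
        println(s"captured output under $workDir")
      }
    }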

3 replies

bob007 posted on 2015-11-10 10:41:00
Since it's a binding conflict, why did you modify that particular jar? I'd suggest trying to remove the other one instead, keeping the original contents unchanged.

Cherise posted on 2015-11-10 11:30:10
bob007 posted on 2015-11-10 10:41:
Since it's a binding conflict, why did you modify that particular jar? I'd suggest trying to remove the other one instead, keeping the original contents unchanged.

I removed slf4j-log4j12-1.7.5.jar; the conflict warning is gone, but now there is no log output in stderr at all.
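In case it helps narrow this down (my own guess, not confirmed): if after removing jars SLF4J is left without a usable binding, it falls back to its NOP logger and silently drops every message, which would look exactly like an empty stderr. A quick check of which factory actually won, compiled against slf4j-api:

    import org.slf4j.LoggerFactory

    // Prints the concrete logger factory in use:
    //   org.slf4j.impl.Log4jLoggerFactory  -> the log4j binding is active
    //   org.slf4j.helpers.NOPLoggerFactory -> no binding found; all output is dropped
    object WhichSlf4jBinding {
      def main(args: Array[String]): Unit =
        println(LoggerFactory.getILoggerFactory.getClass.getName)
    }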

Cherise posted on 2015-11-17 18:09:59
Does anyone know how to solve this? I installed strictly by the official guide and still ended up here. Is this a problem with this CDH release, or with my installation???
