
Running Spark's bundled example shows no effect in the web UI

xw2016 posted on 2016-5-26 23:21:29
After changing into the /spark/bin directory, I ran: ./run-example org.apache.spark.examples.SparkPi

Then I opened the web page: http://192.168.56.11:8080/

(screenshot: 运行效果.png)


There are no entries under either Running Applications or Completed Applications. Can anyone explain why?

13 replies

xw2016 posted on 2016-5-26 23:22:42
To add: the run did not report any errors. The log is as follows:
[hadoop@yun01-nn-02 bin]$ ./run-example org.apache.spark.examples.SparkPi
16/05/26 01:06:51 INFO spark.SparkContext: Running Spark version 1.5.2
16/05/26 01:06:52 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
16/05/26 01:06:52 INFO spark.SecurityManager: Changing view acls to: hadoop
16/05/26 01:06:52 INFO spark.SecurityManager: Changing modify acls to: hadoop
16/05/26 01:06:52 INFO spark.SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(hadoop); users with modify permissions: Set(hadoop)
16/05/26 01:06:54 INFO slf4j.Slf4jLogger: Slf4jLogger started
16/05/26 01:06:54 INFO Remoting: Starting remoting
16/05/26 01:06:55 INFO Remoting: Remoting started; listening on addresses :[akka.tcp://sparkDriver@192.168.56.12:46442]
16/05/26 01:06:55 INFO util.Utils: Successfully started service 'sparkDriver' on port 46442.
16/05/26 01:06:55 INFO spark.SparkEnv: Registering MapOutputTracker
16/05/26 01:06:55 INFO spark.SparkEnv: Registering BlockManagerMaster
16/05/26 01:06:55 INFO storage.DiskBlockManager: Created local directory at /tmp/blockmgr-9c21cd3a-e94d-499a-b6dd-839e5b067a0d
16/05/26 01:06:55 INFO storage.MemoryStore: MemoryStore started with capacity 534.5 MB
16/05/26 01:06:55 INFO spark.HttpFileServer: HTTP File server directory is /tmp/spark-3da00b2e-96b3-4de5-b54b-cb9eabcf9ca7/httpd-9e5c7079-1f66-4e08-9687-c0c3141e36f4
16/05/26 01:06:55 INFO spark.HttpServer: Starting HTTP Server
16/05/26 01:06:56 INFO server.Server: jetty-8.y.z-SNAPSHOT
16/05/26 01:06:56 INFO server.AbstractConnector: Started SocketConnector@0.0.0.0:41493
16/05/26 01:06:56 INFO util.Utils: Successfully started service 'HTTP file server' on port 41493.
16/05/26 01:06:56 INFO spark.SparkEnv: Registering OutputCommitCoordinator
16/05/26 01:06:57 INFO server.Server: jetty-8.y.z-SNAPSHOT
16/05/26 01:06:57 INFO server.AbstractConnector: Started SelectChannelConnector@0.0.0.0:4040
16/05/26 01:06:57 INFO util.Utils: Successfully started service 'SparkUI' on port 4040.
16/05/26 01:06:57 INFO ui.SparkUI: Started SparkUI at http://192.168.56.12:4040
16/05/26 01:06:59 INFO spark.SparkContext: Added JAR file:/application/hadoop/spark/lib/spark-examples-1.5.2-hadoop2.6.0.jar at http://192.168.56.12:41493/jars/ ... 5.2-hadoop2.6.0.jar with timestamp 1464196019678
16/05/26 01:07:00 WARN metrics.MetricsSystem: Using default name DAGScheduler for source because spark.app.id is not set.
16/05/26 01:07:00 INFO executor.Executor: Starting executor ID driver on host localhost
16/05/26 01:07:01 INFO util.Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 37608.
16/05/26 01:07:01 INFO netty.NettyBlockTransferService: Server created on 37608
16/05/26 01:07:01 INFO storage.BlockManagerMaster: Trying to register BlockManager
16/05/26 01:07:01 INFO storage.BlockManagerMasterEndpoint: Registering block manager localhost:37608 with 534.5 MB RAM, BlockManagerId(driver, localhost, 37608)
16/05/26 01:07:01 INFO storage.BlockManagerMaster: Registered BlockManager
16/05/26 01:07:02 INFO spark.SparkContext: Starting job: reduce at SparkPi.scala:36
16/05/26 01:07:02 INFO scheduler.DAGScheduler: Got job 0 (reduce at SparkPi.scala:36) with 2 output partitions
16/05/26 01:07:02 INFO scheduler.DAGScheduler: Final stage: ResultStage 0(reduce at SparkPi.scala:36)
16/05/26 01:07:02 INFO scheduler.DAGScheduler: Parents of final stage: List()
16/05/26 01:07:02 INFO scheduler.DAGScheduler: Missing parents: List()
16/05/26 01:07:02 INFO scheduler.DAGScheduler: Submitting ResultStage 0 (MapPartitionsRDD[1] at map at SparkPi.scala:32), which has no missing parents
16/05/26 01:07:03 INFO storage.MemoryStore: ensureFreeSpace(1888) called with curMem=0, maxMem=560497950
16/05/26 01:07:03 INFO storage.MemoryStore: Block broadcast_0 stored as values in memory (estimated size 1888.0 B, free 534.5 MB)
16/05/26 01:07:03 INFO storage.MemoryStore: ensureFreeSpace(1202) called with curMem=1888, maxMem=560497950
16/05/26 01:07:03 INFO storage.MemoryStore: Block broadcast_0_piece0 stored as bytes in memory (estimated size 1202.0 B, free 534.5 MB)
16/05/26 01:07:03 INFO storage.BlockManagerInfo: Added broadcast_0_piece0 in memory on localhost:37608 (size: 1202.0 B, free: 534.5 MB)
16/05/26 01:07:03 INFO spark.SparkContext: Created broadcast 0 from broadcast at DAGScheduler.scala:861
16/05/26 01:07:03 INFO scheduler.DAGScheduler: Submitting 2 missing tasks from ResultStage 0 (MapPartitionsRDD[1] at map at SparkPi.scala:32)
16/05/26 01:07:03 INFO scheduler.TaskSchedulerImpl: Adding task set 0.0 with 2 tasks
16/05/26 01:07:03 INFO scheduler.TaskSetManager: Starting task 0.0 in stage 0.0 (TID 0, localhost, PROCESS_LOCAL, 2163 bytes)
16/05/26 01:07:03 INFO executor.Executor: Running task 0.0 in stage 0.0 (TID 0)
16/05/26 01:07:03 INFO executor.Executor: Fetching http://192.168.56.12:41493/jars/ ... 5.2-hadoop2.6.0.jar with timestamp 1464196019678
16/05/26 01:07:04 INFO util.Utils: Fetching http://192.168.56.12:41493/jars/ ... 5.2-hadoop2.6.0.jar to /tmp/spark-3da00b2e-96b3-4de5-b54b-cb9eabcf9ca7/userFiles-b5874c5a-b47d-4e8d-83c4-637515ebbf99/fetchFileTemp5321192880486314517.tmp
16/05/26 01:07:07 INFO executor.Executor: Adding file:/tmp/spark-3da00b2e-96b3-4de5-b54b-cb9eabcf9ca7/userFiles-b5874c5a-b47d-4e8d-83c4-637515ebbf99/spark-examples-1.5.2-hadoop2.6.0.jar to class loader
16/05/26 01:07:07 INFO executor.Executor: Finished task 0.0 in stage 0.0 (TID 0). 1031 bytes result sent to driver
16/05/26 01:07:07 INFO scheduler.TaskSetManager: Starting task 1.0 in stage 0.0 (TID 1, localhost, PROCESS_LOCAL, 2163 bytes)
16/05/26 01:07:07 INFO executor.Executor: Running task 1.0 in stage 0.0 (TID 1)
16/05/26 01:07:07 INFO scheduler.TaskSetManager: Finished task 0.0 in stage 0.0 (TID 0) in 4109 ms on localhost (1/2)
16/05/26 01:07:07 INFO executor.Executor: Finished task 1.0 in stage 0.0 (TID 1). 1031 bytes result sent to driver
16/05/26 01:07:07 INFO scheduler.TaskSetManager: Finished task 1.0 in stage 0.0 (TID 1) in 86 ms on localhost (2/2)
16/05/26 01:07:07 INFO scheduler.DAGScheduler: ResultStage 0 (reduce at SparkPi.scala:36) finished in 4.201 s
16/05/26 01:07:07 INFO scheduler.TaskSchedulerImpl: Removed TaskSet 0.0, whose tasks have all completed, from pool
16/05/26 01:07:07 INFO scheduler.DAGScheduler: Job 0 finished: reduce at SparkPi.scala:36, took 5.280692 s
Pi is roughly 3.14424
16/05/26 01:07:08 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/metrics/json,null}
16/05/26 01:07:08 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/stage/kill,null}
16/05/26 01:07:08 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/api,null}
16/05/26 01:07:08 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/,null}
16/05/26 01:07:08 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/static,null}
16/05/26 01:07:08 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/executors/threadDump/json,null}
16/05/26 01:07:08 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/executors/threadDump,null}
16/05/26 01:07:08 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/executors/json,null}
16/05/26 01:07:08 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/executors,null}
16/05/26 01:07:08 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/environment/json,null}
16/05/26 01:07:08 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/environment,null}
16/05/26 01:07:08 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/storage/rdd/json,null}
16/05/26 01:07:08 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/storage/rdd,null}
16/05/26 01:07:08 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/storage/json,null}
16/05/26 01:07:08 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/storage,null}
16/05/26 01:07:08 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/pool/json,null}
16/05/26 01:07:08 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/pool,null}
16/05/26 01:07:08 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/stage/json,null}
16/05/26 01:07:08 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/stage,null}
16/05/26 01:07:08 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/json,null}
16/05/26 01:07:08 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages,null}
16/05/26 01:07:08 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/jobs/job/json,null}
16/05/26 01:07:08 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/jobs/job,null}
16/05/26 01:07:08 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/jobs/json,null}
16/05/26 01:07:08 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/jobs,null}
16/05/26 01:07:08 INFO ui.SparkUI: Stopped Spark web UI at http://192.168.56.12:4040
16/05/26 01:07:08 INFO scheduler.DAGScheduler: Stopping DAGScheduler
16/05/26 01:07:08 INFO spark.MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
16/05/26 01:07:08 INFO storage.MemoryStore: MemoryStore cleared
16/05/26 01:07:08 INFO storage.BlockManager: BlockManager stopped
16/05/26 01:07:08 INFO storage.BlockManagerMaster: BlockManagerMaster stopped
16/05/26 01:07:08 INFO scheduler.OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
16/05/26 01:07:08 INFO spark.SparkContext: Successfully stopped SparkContext
16/05/26 01:07:08 INFO remote.RemoteActorRefProvider$RemotingTerminator: Shutting down remote daemon.
16/05/26 01:07:08 INFO remote.RemoteActorRefProvider$RemotingTerminator: Remote daemon shut down; proceeding with flushing remote transports.
16/05/26 01:07:08 INFO remote.RemoteActorRefProvider$RemotingTerminator: Remoting shut down.
16/05/26 01:07:08 INFO util.ShutdownHookManager: Shutdown hook called
16/05/26 01:07:08 INFO util.ShutdownHookManager: Deleting directory /tmp/spark-3da00b2e-96b3-4de5-b54b-cb9eabcf9ca7
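One detail worth noting in the log above: the line "Starting executor ID driver on host localhost" shows the example ran in local mode, not against the standalone cluster, so the Master UI on port 8080 would have nothing to list. A minimal sketch of submitting to the master instead (the master URL below is an assumption based on the SPARK_MASTER_IP that appears later in this thread, with 7077 as the standalone default port):

```shell
# Run the example against the standalone master instead of local mode.
# spark://yun01-nn-01:7077 is an assumed master URL; substitute your own.
cd /application/hadoop/spark/bin
MASTER=spark://yun01-nn-01:7077 ./run-example org.apache.spark.examples.SparkPi

# Equivalent spark-submit form (jar path taken from the log above):
./spark-submit --master spark://yun01-nn-01:7077 \
  --class org.apache.spark.examples.SparkPi \
  /application/hadoop/spark/lib/spark-examples-1.5.2-hadoop2.6.0.jar
```

Once the application is actually submitted to the master, it should appear under Running Applications while it executes and move to Completed Applications when it finishes.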

langke93 posted on 2016-5-27 07:33:39
(quoting xw2016, 2016-5-26 23:22: "To add: the run did not report any errors. The log is as follows: [hadoop@yun01-nn-02 bin]$ ./run-example org.apache.spark.exa ...")

Try refreshing the page or restarting.
You have confirmed the job was submitted successfully and ran.
The memory doesn't seem to have been used, though.
(screenshot: 1.png)

xw2016 posted on 2016-5-27 08:41:04
It just didn't report an error; it seems the run wasn't actually successful. I'm not sure how to check.

nextuser posted on 2016-5-27 09:53:26
(quoting xw2016, 2016-5-27 08:41: "It just didn't report an error; it seems the run wasn't actually successful. I'm not sure how to check.")

(screenshot: 1.png)

The run did succeed; it has this marker.

xw2016 posted on 2016-5-27 11:39:27
OK, I'll check the logs.

xw2016 posted on 2016-5-27 21:54:00
(quoting nextuser, 2016-5-27 09:53: "The run did succeed; it has this marker.")

(screenshot: 运行日志1.png)

I do have that marker, but the web page still shows nothing:
(screenshot: 8080效果1.png)

nextuser posted on 2016-5-27 22:00:04
(quoting xw2016, 2016-5-27 21:54: "I do have that marker, but the web page still shows nothing:")

Then that means the run succeeded. As for why it isn't displayed:
1. First check whether the web-related components and services are running normally.
2. Is it running on YARN, or somewhere else?
3. Check the job history.
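On point 3: if the job history is empty, event logging may simply not be enabled, and a finished standalone app also drops out of the 8080 Running list once the driver exits. A minimal spark-defaults.conf sketch for enabling the history server in Spark 1.5 (the log directory path is an assumed example; use a directory that exists and is writable in your cluster):

```
# conf/spark-defaults.conf
# Record event logs so completed applications can be inspected later.
# hdfs:///spark-event-logs is an assumed path, not from this thread.
spark.eventLog.enabled           true
spark.eventLog.dir               hdfs:///spark-event-logs
spark.history.fs.logDirectory    hdfs:///spark-event-logs
```

After that, start the history server with sbin/start-history-server.sh and browse its UI on port 18080.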

xw2016 posted on 2016-5-27 22:58:04
(quoting nextuser, 2016-5-27 22:00: "Then that means the run succeeded. As for why it isn't displayed: 1. First check whether the web-related components and services are running normally. 2. Is it running on YARN, or ...")

There is a startup problem: after start-all.sh, jps shows a new Master, but no Worker.

xw2016 posted on 2016-5-27 23:16:01
(quoting nextuser, 2016-5-27 22:00: "Then that means the run succeeded. As for why it isn't displayed: 1. First check whether the web-related components and services are running normally. 2. Is it running on YARN, or ...")

(screenshot: spark_start1.png)

As the screenshot shows, there is no Worker process after startup. What could be the reason?
spark-env.sh is configured as follows:
export JAVA_HOME=/application/hadoop/jdk
export SCALA_HOME=/application/hadoop/scala
export SPARK_WORKER_MEMORY=1g
export SPARK_MASTER_IP=yun01-nn-01
export HADOOP_CONF_DIR=/application/hadoop/hadoop/etc/hadoop
export HADOOP_OPTS="-Djava.library.path=$HADOOP_HOME/lib:$HADOOP_HOME/lib/native"
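With the config above, a Master but no Worker after start-all.sh usually points at the conf/slaves file or a worker-side startup failure. A rough checklist sketch (paths and hostnames are taken from this thread; adjust for your cluster):

```shell
# start-all.sh only starts Workers on the hosts listed in conf/slaves,
# one hostname per line; check that the worker hosts are actually listed.
cat /application/hadoop/spark/conf/slaves

# The real error is usually in the worker-side log on each worker host
# (wrong SPARK_MASTER_IP, port conflict, missing JAVA_HOME, SSH issues):
tail -n 50 /application/hadoop/spark/logs/spark-*-org.apache.spark.deploy.worker.Worker-*.out

# A Worker can also be started by hand on one node to surface errors directly:
/application/hadoop/spark/sbin/start-slave.sh spark://yun01-nn-01:7077
```

If the Worker then shows up in jps and registers with the Master, the 8080 page should list it under Workers.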
