To add: the run itself completed without any errors. The log is as follows:
[hadoop@yun01-nn-02 bin]$ ./run-example org.apache.spark.examples.SparkPi
16/05/26 01:06:51 INFO spark.SparkContext: Running Spark version 1.5.2
16/05/26 01:06:52 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
16/05/26 01:06:52 INFO spark.SecurityManager: Changing view acls to: hadoop
16/05/26 01:06:52 INFO spark.SecurityManager: Changing modify acls to: hadoop
16/05/26 01:06:52 INFO spark.SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(hadoop); users with modify permissions: Set(hadoop)
16/05/26 01:06:54 INFO slf4j.Slf4jLogger: Slf4jLogger started
16/05/26 01:06:54 INFO Remoting: Starting remoting
16/05/26 01:06:55 INFO Remoting: Remoting started; listening on addresses :[akka.tcp://sparkDriver@192.168.56.12:46442]
16/05/26 01:06:55 INFO util.Utils: Successfully started service 'sparkDriver' on port 46442.
16/05/26 01:06:55 INFO spark.SparkEnv: Registering MapOutputTracker
16/05/26 01:06:55 INFO spark.SparkEnv: Registering BlockManagerMaster
16/05/26 01:06:55 INFO storage.DiskBlockManager: Created local directory at /tmp/blockmgr-9c21cd3a-e94d-499a-b6dd-839e5b067a0d
16/05/26 01:06:55 INFO storage.MemoryStore: MemoryStore started with capacity 534.5 MB
16/05/26 01:06:55 INFO spark.HttpFileServer: HTTP File server directory is /tmp/spark-3da00b2e-96b3-4de5-b54b-cb9eabcf9ca7/httpd-9e5c7079-1f66-4e08-9687-c0c3141e36f4
16/05/26 01:06:55 INFO spark.HttpServer: Starting HTTP Server
16/05/26 01:06:56 INFO server.Server: jetty-8.y.z-SNAPSHOT
16/05/26 01:06:56 INFO server.AbstractConnector: Started SocketConnector@0.0.0.0:41493
16/05/26 01:06:56 INFO util.Utils: Successfully started service 'HTTP file server' on port 41493.
16/05/26 01:06:56 INFO spark.SparkEnv: Registering OutputCommitCoordinator
16/05/26 01:06:57 INFO server.Server: jetty-8.y.z-SNAPSHOT
16/05/26 01:06:57 INFO server.AbstractConnector: Started SelectChannelConnector@0.0.0.0:4040
16/05/26 01:06:57 INFO util.Utils: Successfully started service 'SparkUI' on port 4040.
16/05/26 01:06:57 INFO ui.SparkUI: Started SparkUI at http://192.168.56.12:4040
16/05/26 01:06:59 INFO spark.SparkContext: Added JAR file:/application/hadoop/spark/lib/spark-examples-1.5.2-hadoop2.6.0.jar at http://192.168.56.12:41493/jars/ ... 5.2-hadoop2.6.0.jar with timestamp 1464196019678
16/05/26 01:07:00 WARN metrics.MetricsSystem: Using default name DAGScheduler for source because spark.app.id is not set.
16/05/26 01:07:00 INFO executor.Executor: Starting executor ID driver on host localhost
16/05/26 01:07:01 INFO util.Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 37608.
16/05/26 01:07:01 INFO netty.NettyBlockTransferService: Server created on 37608
16/05/26 01:07:01 INFO storage.BlockManagerMaster: Trying to register BlockManager
16/05/26 01:07:01 INFO storage.BlockManagerMasterEndpoint: Registering block manager localhost:37608 with 534.5 MB RAM, BlockManagerId(driver, localhost, 37608)
16/05/26 01:07:01 INFO storage.BlockManagerMaster: Registered BlockManager
16/05/26 01:07:02 INFO spark.SparkContext: Starting job: reduce at SparkPi.scala:36
16/05/26 01:07:02 INFO scheduler.DAGScheduler: Got job 0 (reduce at SparkPi.scala:36) with 2 output partitions
16/05/26 01:07:02 INFO scheduler.DAGScheduler: Final stage: ResultStage 0(reduce at SparkPi.scala:36)
16/05/26 01:07:02 INFO scheduler.DAGScheduler: Parents of final stage: List()
16/05/26 01:07:02 INFO scheduler.DAGScheduler: Missing parents: List()
16/05/26 01:07:02 INFO scheduler.DAGScheduler: Submitting ResultStage 0 (MapPartitionsRDD[1] at map at SparkPi.scala:32), which has no missing parents
16/05/26 01:07:03 INFO storage.MemoryStore: ensureFreeSpace(1888) called with curMem=0, maxMem=560497950
16/05/26 01:07:03 INFO storage.MemoryStore: Block broadcast_0 stored as values in memory (estimated size 1888.0 B, free 534.5 MB)
16/05/26 01:07:03 INFO storage.MemoryStore: ensureFreeSpace(1202) called with curMem=1888, maxMem=560497950
16/05/26 01:07:03 INFO storage.MemoryStore: Block broadcast_0_piece0 stored as bytes in memory (estimated size 1202.0 B, free 534.5 MB)
16/05/26 01:07:03 INFO storage.BlockManagerInfo: Added broadcast_0_piece0 in memory on localhost:37608 (size: 1202.0 B, free: 534.5 MB)
16/05/26 01:07:03 INFO spark.SparkContext: Created broadcast 0 from broadcast at DAGScheduler.scala:861
16/05/26 01:07:03 INFO scheduler.DAGScheduler: Submitting 2 missing tasks from ResultStage 0 (MapPartitionsRDD[1] at map at SparkPi.scala:32)
16/05/26 01:07:03 INFO scheduler.TaskSchedulerImpl: Adding task set 0.0 with 2 tasks
16/05/26 01:07:03 INFO scheduler.TaskSetManager: Starting task 0.0 in stage 0.0 (TID 0, localhost, PROCESS_LOCAL, 2163 bytes)
16/05/26 01:07:03 INFO executor.Executor: Running task 0.0 in stage 0.0 (TID 0)
16/05/26 01:07:03 INFO executor.Executor: Fetching http://192.168.56.12:41493/jars/ ... 5.2-hadoop2.6.0.jar with timestamp 1464196019678
16/05/26 01:07:04 INFO util.Utils: Fetching http://192.168.56.12:41493/jars/ ... 5.2-hadoop2.6.0.jar to /tmp/spark-3da00b2e-96b3-4de5-b54b-cb9eabcf9ca7/userFiles-b5874c5a-b47d-4e8d-83c4-637515ebbf99/fetchFileTemp5321192880486314517.tmp
16/05/26 01:07:07 INFO executor.Executor: Adding file:/tmp/spark-3da00b2e-96b3-4de5-b54b-cb9eabcf9ca7/userFiles-b5874c5a-b47d-4e8d-83c4-637515ebbf99/spark-examples-1.5.2-hadoop2.6.0.jar to class loader
16/05/26 01:07:07 INFO executor.Executor: Finished task 0.0 in stage 0.0 (TID 0). 1031 bytes result sent to driver
16/05/26 01:07:07 INFO scheduler.TaskSetManager: Starting task 1.0 in stage 0.0 (TID 1, localhost, PROCESS_LOCAL, 2163 bytes)
16/05/26 01:07:07 INFO executor.Executor: Running task 1.0 in stage 0.0 (TID 1)
16/05/26 01:07:07 INFO scheduler.TaskSetManager: Finished task 0.0 in stage 0.0 (TID 0) in 4109 ms on localhost (1/2)
16/05/26 01:07:07 INFO executor.Executor: Finished task 1.0 in stage 0.0 (TID 1). 1031 bytes result sent to driver
16/05/26 01:07:07 INFO scheduler.TaskSetManager: Finished task 1.0 in stage 0.0 (TID 1) in 86 ms on localhost (2/2)
16/05/26 01:07:07 INFO scheduler.DAGScheduler: ResultStage 0 (reduce at SparkPi.scala:36) finished in 4.201 s
16/05/26 01:07:07 INFO scheduler.TaskSchedulerImpl: Removed TaskSet 0.0, whose tasks have all completed, from pool
16/05/26 01:07:07 INFO scheduler.DAGScheduler: Job 0 finished: reduce at SparkPi.scala:36, took 5.280692 s
Pi is roughly 3.14424
16/05/26 01:07:08 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/metrics/json,null}
16/05/26 01:07:08 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/stage/kill,null}
16/05/26 01:07:08 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/api,null}
16/05/26 01:07:08 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/,null}
16/05/26 01:07:08 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/static,null}
16/05/26 01:07:08 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/executors/threadDump/json,null}
16/05/26 01:07:08 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/executors/threadDump,null}
16/05/26 01:07:08 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/executors/json,null}
16/05/26 01:07:08 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/executors,null}
16/05/26 01:07:08 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/environment/json,null}
16/05/26 01:07:08 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/environment,null}
16/05/26 01:07:08 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/storage/rdd/json,null}
16/05/26 01:07:08 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/storage/rdd,null}
16/05/26 01:07:08 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/storage/json,null}
16/05/26 01:07:08 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/storage,null}
16/05/26 01:07:08 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/pool/json,null}
16/05/26 01:07:08 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/pool,null}
16/05/26 01:07:08 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/stage/json,null}
16/05/26 01:07:08 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/stage,null}
16/05/26 01:07:08 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/json,null}
16/05/26 01:07:08 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages,null}
16/05/26 01:07:08 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/jobs/job/json,null}
16/05/26 01:07:08 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/jobs/job,null}
16/05/26 01:07:08 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/jobs/json,null}
16/05/26 01:07:08 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/jobs,null}
16/05/26 01:07:08 INFO ui.SparkUI: Stopped Spark web UI at http://192.168.56.12:4040
16/05/26 01:07:08 INFO scheduler.DAGScheduler: Stopping DAGScheduler
16/05/26 01:07:08 INFO spark.MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
16/05/26 01:07:08 INFO storage.MemoryStore: MemoryStore cleared
16/05/26 01:07:08 INFO storage.BlockManager: BlockManager stopped
16/05/26 01:07:08 INFO storage.BlockManagerMaster: BlockManagerMaster stopped
16/05/26 01:07:08 INFO scheduler.OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
16/05/26 01:07:08 INFO spark.SparkContext: Successfully stopped SparkContext
16/05/26 01:07:08 INFO remote.RemoteActorRefProvider$RemotingTerminator: Shutting down remote daemon.
16/05/26 01:07:08 INFO remote.RemoteActorRefProvider$RemotingTerminator: Remote daemon shut down; proceeding with flushing remote transports.
16/05/26 01:07:08 INFO remote.RemoteActorRefProvider$RemotingTerminator: Remoting shut down.
16/05/26 01:07:08 INFO util.ShutdownHookManager: Shutdown hook called
16/05/26 01:07:08 INFO util.ShutdownHookManager: Deleting directory /tmp/spark-3da00b2e-96b3-4de5-b54b-cb9eabcf9ca7
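
For reference on what this job actually did: SparkPi estimates π by Monte Carlo sampling, which is why the log shows a single stage with a map (SparkPi.scala:32) followed by a reduce (SparkPi.scala:36), and "2 output partitions" because the example defaults to 2 slices when no argument is passed. Roughly, the 1.5.x example does something like the sketch below (reconstructed from memory, not the exact shipped source; the object name SparkPiSketch is mine):

import scala.math.random
import org.apache.spark.{SparkConf, SparkContext}

object SparkPiSketch {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("Spark Pi")
    val sc = new SparkContext(conf)
    // No argument was passed above, so slices = 2 -- matching the
    // "Got job 0 ... with 2 output partitions" line in the log.
    val slices = if (args.length > 0) args(0).toInt else 2
    val n = 100000 * slices
    // Sample n random points in the [-1, 1] square and count how many
    // fall inside the unit circle; the ratio approximates pi / 4.
    val count = sc.parallelize(1 until n, slices).map { _ =>
      val x = random * 2 - 1
      val y = random * 2 - 1
      if (x * x + y * y < 1) 1 else 0
    }.reduce(_ + _)   // the "reduce at SparkPi.scala:36" seen in the log
    println("Pi is roughly " + 4.0 * count / n)
    sc.stop()
  }
}

So the run is consistent end to end: two tasks (one per slice) sample about 200,000 points in total, the reduce sums the hit counts on the driver, "Pi is roughly 3.14424" is printed, and the SparkContext shuts down cleanly. The NativeCodeLoader warning near the top only means no native Hadoop library was found for the platform and pure-Java classes were used instead; it does not affect the result.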