
Help! Submitting a Spark job to the cluster fails with an error

George-zqq posted on 2016-12-20 09:17:56 · 2 replies · 7844 views
spark-submit --class main.scala.UserPlatformCount.PlatformInfoCounter --master spark://192.168.54.11:7077 --executor-memory 5G --total-executor-cores 2 /data/sparksql-train.jar /data/platform.txt
16/12/20 10:11:43 INFO spark.SparkContext: Running Spark version 1.6.0
16/12/20 10:11:45 INFO spark.SecurityManager: Changing view acls to: root
16/12/20 10:11:45 INFO spark.SecurityManager: Changing modify acls to: root
16/12/20 10:11:45 INFO spark.SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(root); users with modify permissions: Set(root)
16/12/20 10:11:45 INFO util.Utils: Successfully started service 'sparkDriver' on port 53728.
16/12/20 10:11:46 INFO slf4j.Slf4jLogger: Slf4jLogger started
16/12/20 10:11:46 INFO Remoting: Starting remoting
16/12/20 10:11:46 INFO Remoting: Remoting started; listening on addresses :[akka.tcp://sparkDriverActorSystem@192.168.54.11:49379]
16/12/20 10:11:46 INFO Remoting: Remoting now listens on addresses: [akka.tcp://sparkDriverActorSystem@192.168.54.11:49379]
16/12/20 10:11:46 INFO util.Utils: Successfully started service 'sparkDriverActorSystem' on port 49379.
16/12/20 10:11:47 INFO spark.SparkEnv: Registering MapOutputTracker
16/12/20 10:11:47 INFO spark.SparkEnv: Registering BlockManagerMaster
16/12/20 10:11:47 INFO storage.DiskBlockManager: Created local directory at /tmp/blockmgr-ee2d5da5-f905-4fac-9085-3dbefb30c49a
16/12/20 10:11:47 INFO storage.MemoryStore: MemoryStore started with capacity 530.3 MB
16/12/20 10:11:47 INFO spark.SparkEnv: Registering OutputCommitCoordinator
16/12/20 10:11:47 INFO util.Utils: Successfully started service 'SparkUI' on port 4040.
16/12/20 10:11:47 INFO ui.SparkUI: Started SparkUI at http://192.168.54.11:4040
16/12/20 10:11:47 INFO spark.SparkContext: Added JAR file:/data/sparksql-train.jar at spark://192.168.54.11:53728/jars/sparksql-train.jar with timestamp 1482199907764
16/12/20 10:11:48 INFO executor.Executor: Starting executor ID driver on host localhost
16/12/20 10:11:48 INFO util.Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 55461.
16/12/20 10:11:48 INFO netty.NettyBlockTransferService: Server created on 55461
16/12/20 10:11:48 INFO storage.BlockManager: external shuffle service port = 7337
16/12/20 10:11:48 INFO storage.BlockManagerMaster: Trying to register BlockManager
16/12/20 10:11:48 INFO storage.BlockManagerMasterEndpoint: Registering block manager localhost:55461 with 530.3 MB RAM, BlockManagerId(driver, localhost, 55461)
16/12/20 10:11:48 INFO storage.BlockManagerMaster: Registered BlockManager
16/12/20 10:11:49 INFO scheduler.EventLoggingListener: Logging events to hdfs://sp1.hadoop.com:8020/user/spark/applicationHistory/local-1482199907941
16/12/20 10:11:51 INFO storage.MemoryStore: Block broadcast_0 stored as values in memory (estimated size 201.5 KB, free 201.5 KB)
16/12/20 10:11:52 INFO storage.MemoryStore: Block broadcast_0_piece0 stored as bytes in memory (estimated size 24.0 KB, free 225.5 KB)
16/12/20 10:11:52 INFO storage.BlockManagerInfo: Added broadcast_0_piece0 in memory on localhost:55461 (size: 24.0 KB, free: 530.3 MB)
16/12/20 10:11:52 INFO spark.SparkContext: Created broadcast 0 from textFile at PlatformInfoCounter.scala:30
Exception in thread "main" java.lang.NoSuchMethodError: org.apache.spark.sql.SQLContext.createDataFrame(Lorg/apache/spark/rdd/RDD;Lorg/apache/spark/sql/types/StructType;)Lorg/apache/spark/sql/Dataset;
        at main.scala.UserPlatformCount.PlatformInfoCounter$.<init>(PlatformInfoCounter.scala:34)
        at main.scala.UserPlatformCount.PlatformInfoCounter$.<clinit>(PlatformInfoCounter.scala)
        at main.scala.UserPlatformCount.PlatformInfoCounter.main(PlatformInfoCounter.scala)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:731)
        at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
        at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
16/12/20 10:11:53 INFO spark.SparkContext: Invoking stop() from shutdown hook
16/12/20 10:11:53 INFO ui.SparkUI: Stopped Spark web UI at http://192.168.54.11:4040
16/12/20 10:11:53 INFO spark.MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
16/12/20 10:11:53 INFO storage.MemoryStore: MemoryStore cleared
16/12/20 10:11:53 INFO storage.BlockManager: BlockManager stopped
16/12/20 10:11:53 INFO storage.BlockManagerMaster: BlockManagerMaster stopped
16/12/20 10:11:53 INFO scheduler.OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
16/12/20 10:11:53 INFO spark.SparkContext: Successfully stopped SparkContext
16/12/20 10:11:53 INFO util.ShutdownHookManager: Shutdown hook called
16/12/20 10:11:53 INFO util.ShutdownHookManager: Deleting directory /tmp/spark-52247639-b32f-4ae5-b056-3e741da5f395
16/12/20 10:11:53 INFO remote.RemoteActorRefProvider$RemotingTerminator: Shutting down remote daemon.
16/12/20 10:11:53 INFO remote.RemoteActorRefProvider$RemotingTerminator: Remote daemon shut down; proceeding with flushing remote transports.
16/12/20 10:11:53 INFO Remoting: Remoting shut down


What is going on here?


2 replies
nextuser replied on 2016-12-20 15:31:48
import org.apache.spark.sql.SQLContext
The package above has not been imported.

First, make sure the source compiles and that it includes the package above.
Second, check the cluster environment and its environment-variable configuration, so that the class can be resolved at submit time.
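As a quick way to check the second point, one can compare the Spark version the cluster-side spark-submit will use against what the application jar was built with. A diagnostic sketch (the jar path is the one from the original command; this assumes spark-submit and unzip are on the PATH):

```shell
# Print the Spark version that spark-submit will run the job with
spark-submit --version

# Check whether the application jar bundles its own Spark classes;
# bundled Spark classes can conflict with the cluster's own libraries
unzip -l /data/sparksql-train.jar | grep -i 'org/apache/spark' | head
```

If the second command prints anything, the jar is shipping Spark classes of its own, which is a common source of exactly this kind of NoSuchMethodError.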


George-zqq replied on 2016-12-20 17:55:10
Quoting nextuser (2016-12-20 15:31):
import org.apache.spark.sql.SQLContext
The package above has not been imported.


Thanks, it's solved now; it turned out to be a version conflict.
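For reference, the mismatch is visible in the stack trace itself: a `SQLContext.createDataFrame` that returns `org.apache.spark.sql.Dataset` is the Spark 2.x signature, while the log shows the cluster running Spark 1.6.0, whose createDataFrame returns the separate DataFrame class. Aligning the build dependency with the cluster version, and marking it "provided" so the jar does not bundle its own Spark, avoids the conflict. A minimal sketch, assuming an sbt build (the Scala version is an assumption based on the prebuilt Spark 1.6 distribution):

```scala
// build.sbt -- align spark-sql with the cluster's Spark (1.6.0 per the log)
scalaVersion := "2.10.5"  // Scala line shipped with prebuilt Spark 1.6 (assumption)

// "provided" keeps Spark classes out of the assembled jar;
// spark-submit supplies the cluster's own Spark jars at runtime
libraryDependencies += "org.apache.spark" %% "spark-sql" % "1.6.0" % "provided"
```

Rebuilding the jar against the same Spark version the cluster runs makes the compiled method signatures match the classes loaded at runtime.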


