
spark-shell reports an error at startup

zyr posted on 2018-3-20 11:22:05
[spark@h101 newspark]$ ./bin/spark-shell
18/03/19 20:20:14 INFO spark.SecurityManager: Changing view acls to: spark
18/03/19 20:20:14 INFO spark.SecurityManager: Changing modify acls to: spark
18/03/19 20:20:14 INFO spark.SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(spark); users with modify permissions: Set(spark)
18/03/19 20:20:14 INFO spark.HttpServer: Starting HTTP Server
18/03/19 20:20:14 INFO server.Server: jetty-8.y.z-SNAPSHOT
18/03/19 20:20:14 INFO server.AbstractConnector: Started SocketConnector@0.0.0.0:53511
18/03/19 20:20:14 INFO util.Utils: Successfully started service 'HTTP class server' on port 53511.
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /___/ .__/\_,_/_/ /_/\_\   version 1.3.0-SNAPSHOT
      /_/

Using Scala version 2.10.4 (Java HotSpot(TM) 64-Bit Server VM, Java 1.8.0_112)
Type in expressions to have them evaluated.
Type :help for more information.
18/03/19 20:20:17 INFO spark.SecurityManager: Changing view acls to: spark
18/03/19 20:20:17 INFO spark.SecurityManager: Changing modify acls to: spark
18/03/19 20:20:17 INFO spark.SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(spark); users with modify permissions: Set(spark)
18/03/19 20:20:18 INFO slf4j.Slf4jLogger: Slf4jLogger started
18/03/19 20:20:18 INFO Remoting: Starting remoting
18/03/19 20:20:18 INFO Remoting: Remoting started; listening on addresses :[akka.tcp://sparkDriver@h101:47845]
18/03/19 20:20:18 INFO util.Utils: Successfully started service 'sparkDriver' on port 47845.
18/03/19 20:20:18 INFO spark.SparkEnv: Registering MapOutputTracker
18/03/19 20:20:18 INFO spark.SparkEnv: Registering BlockManagerMaster
18/03/19 20:20:18 INFO storage.DiskBlockManager: Created local directory at /tmp/spark-local-20180319202018-609c
18/03/19 20:20:18 INFO storage.MemoryStore: MemoryStore started with capacity 265.1 MB
18/03/19 20:20:18 INFO spark.HttpFileServer: HTTP File server directory is /tmp/spark-a233b036-a90c-4391-adfa-5317e5116c59
18/03/19 20:20:18 INFO spark.HttpServer: Starting HTTP Server
18/03/19 20:20:18 INFO server.Server: jetty-8.y.z-SNAPSHOT
18/03/19 20:20:18 INFO server.AbstractConnector: Started SocketConnector@0.0.0.0:34647
18/03/19 20:20:18 INFO util.Utils: Successfully started service 'HTTP file server' on port 34647.
18/03/19 20:20:18 INFO server.Server: jetty-8.y.z-SNAPSHOT
18/03/19 20:20:18 INFO server.AbstractConnector: Started SelectChannelConnector@0.0.0.0:4040
18/03/19 20:20:18 INFO util.Utils: Successfully started service 'SparkUI' on port 4040.
18/03/19 20:20:18 INFO ui.SparkUI: Started SparkUI at http://h101:4040
java.lang.IllegalArgumentException: /home/spark/zyrSparkSql/newspark cannot be a directory.
        at org.apache.spark.HttpFileServer.addFileToDir(HttpFileServer.scala:70)
        at org.apache.spark.HttpFileServer.addJar(HttpFileServer.scala:60)
        at org.apache.spark.SparkContext.addJar(SparkContext.scala:1170)
        at org.apache.spark.SparkContext$$anonfun$11.apply(SparkContext.scala:276)
        at org.apache.spark.SparkContext$$anonfun$11.apply(SparkContext.scala:276)
        at scala.collection.immutable.List.foreach(List.scala:318)
        at org.apache.spark.SparkContext.<init>(SparkContext.scala:276)
        at org.apache.spark.repl.SparkILoop.createSparkContext(SparkILoop.scala:986)
        at $iwC$$iwC.<init>(<console>:9)
        at $iwC.<init>(<console>:18)
        at <init>(<console>:20)
        at .<init>(<console>:24)
        at .<clinit>(<console>)
        at .<init>(<console>:7)
        at .<clinit>(<console>)
        at $print(<console>)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:852)
        at org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1125)
        at org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:674)
        at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:705)
        at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:669)
        at org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:828)
        at org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:873)
        at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:785)
        at org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark$1.apply(SparkILoopInit.scala:123)
        at org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark$1.apply(SparkILoopInit.scala:122)
        at org.apache.spark.repl.SparkIMain.beQuietDuring(SparkIMain.scala:270)
        at org.apache.spark.repl.SparkILoopInit$class.initializeSpark(SparkILoopInit.scala:122)
        at org.apache.spark.repl.SparkILoop.initializeSpark(SparkILoop.scala:60)
        at org.apache.spark.repl.SparkILoop$$anonfun$process$1$$anonfun$apply$mcZ$sp$5.apply$mcV$sp(SparkILoop.scala:945)
        at org.apache.spark.repl.SparkILoopInit$class.runThunks(SparkILoopInit.scala:147)
        at org.apache.spark.repl.SparkILoop.runThunks(SparkILoop.scala:60)
        at org.apache.spark.repl.SparkILoopInit$class.postInitialization(SparkILoopInit.scala:106)
        at org.apache.spark.repl.SparkILoop.postInitialization(SparkILoop.scala:60)
        at org.apache.spark.repl.SparkILoop$$anonfun$process$1.apply$mcZ$sp(SparkILoop.scala:962)
        at org.apache.spark.repl.SparkILoop$$anonfun$process$1.apply(SparkILoop.scala:916)
        at org.apache.spark.repl.SparkILoop$$anonfun$process$1.apply(SparkILoop.scala:916)
        at scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)
        at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:916)
        at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:1011)
        at org.apache.spark.repl.Main$.main(Main.scala:31)
        at org.apache.spark.repl.Main.main(Main.scala)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at org.apache.spark.deploy.SparkSubmit$.launch(SparkSubmit.scala:365)
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:75)
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)


Help! When I start spark-shell I get the error shown above. How can I fix it?


Replies (1)

qcbb001 replied on 2018-3-20 12:23:40
Please post your configuration, especially any part that references the path /home/spark/zyrSparkSql/newspark.
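For reference, the stack trace shows the path failing inside HttpFileServer.addJar, called from SparkContext.addJar, which only accepts regular jar files — a directory in the jar list (e.g. via the --jars option, the spark.jars property, or the ADD_JARS environment variable) triggers exactly this IllegalArgumentException. A minimal sketch of a pre-launch sanity check follows; the check_jar function name and the /tmp demo paths are illustrative, not from the thread, and the real entry to inspect would be the /home/spark/zyrSparkSql/newspark path from the trace:

```shell
#!/bin/sh
# Sketch: verify that each path intended for spark-shell's jar list is a
# regular file, not a directory, before launching. HttpFileServer.addJar
# rejects directories with the IllegalArgumentException shown in the log.
check_jar() {
  if [ -d "$1" ]; then
    echo "BAD: $1 is a directory; remove it from the jar list or point it at a .jar file"
    return 1
  fi
  if [ -f "$1" ]; then
    echo "OK:  $1"
    return 0
  fi
  echo "BAD: $1 does not exist"
  return 1
}

# Demonstration with hypothetical paths (substitute the entries from your
# own configuration, e.g. /home/spark/zyrSparkSql/newspark):
mkdir -p /tmp/demo_dir
touch /tmp/demo_lib.jar
check_jar /tmp/demo_dir || echo "a directory entry would reproduce the error above"
check_jar /tmp/demo_lib.jar
```

If the directory really does need to be on the classpath, one workaround is to package its contents into a jar (for example with `jar cf`) and pass that file instead.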