Last edited by sstutu on 2017-4-10 10:27

The log4j version should be fine. The likely cause:

1. A packaging problem.

Solution 1: delete the META-INF/*.RSA, META-INF/*.DSA, and META-INF/*.SF files from the packaged jar, then submit the job to the cluster again.

Solution 2: in pom.xml, exclude META-INF/*.SF, META-INF/*.DSA, and META-INF/*.RSA under the packaging plugin's configuration:

```xml
<excludes>
    <exclude>META-INF/*.SF</exclude>
    <exclude>META-INF/*.DSA</exclude>
    <exclude>META-INF/*.RSA</exclude>
    <exclude>junit:junit</exclude>
    <exclude>org.apache.maven:lib:tests</exclude>
</excludes>
```

2. Download all dependencies through Maven; manually downloaded jars can cause this problem.
sinv2015 posted on 2017-4-9 21:21
Please tell us your Spark, Hadoop, and log4j versions.
langke93 posted on 2017-4-9 20:18
How could the log4j version be the problem?
sinv2015 posted on 2017-4-9 19:19
There is a problem with the log4j version.
xuanxufeng posted on 2017-4-9 18:21
I restarted the virtual machine and reran the program; the log is as follows (truncated to 4096 bytes):

der.class]
SLF4J: Found binding in [jar:file:/usr/local/hadoop/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
17/04/09 04:16:03 INFO yarn.ApplicationMaster: Registered signal handlers for [TERM, HUP, INT]
17/04/09 04:16:05 INFO yarn.ApplicationMaster: ApplicationAttemptId: appattempt_1491734373178_0004_000002
17/04/09 04:16:06 INFO spark.SecurityManager: Changing view acls to: root
17/04/09 04:16:06 INFO spark.SecurityManager: Changing modify acls to: root
17/04/09 04:16:06 INFO spark.SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(root); users with modify permissions: Set(root)
17/04/09 04:16:06 INFO yarn.ApplicationMaster: Starting the user application in a separate Thread
17/04/09 04:16:06 ERROR yarn.ApplicationMaster: Uncaught exception:
java.lang.SecurityException: Invalid signature file digest for Manifest main attributes
    at sun.security.util.SignatureFileVerifier.processImpl(SignatureFileVerifier.java:287)
    at sun.security.util.SignatureFileVerifier.process(SignatureFileVerifier.java:240)
    at java.util.jar.JarVerifier.processEntry(JarVerifier.java:317)
    at java.util.jar.JarVerifier.update(JarVerifier.java:228)
    at java.util.jar.JarFile.initializeVerifier(JarFile.java:348)
    at java.util.jar.JarFile.getInputStream(JarFile.java:415)
    at sun.misc.JarIndex.getJarIndex(JarIndex.java:137)
    at sun.misc.URLClassPath$JarLoader$1.run(URLClassPath.java:674)
    at sun.misc.URLClassPath$JarLoader$1.run(URLClassPath.java:666)
    at java.security.AccessController.doPrivileged(Native Method)
    at sun.misc.URLClassPath$JarLoader.ensureOpen(URLClassPath.java:665)
    at sun.misc.URLClassPath$JarLoader.<init>(URLClassPath.java:638)
    at sun.misc.URLClassPath$3.run(URLClassPath.java:366)
    at sun.misc.URLClassPath$3.run(URLClassPath.java:356)
    at java.security.AccessController.doPrivileged(Native Method)
    at sun.misc.URLClassPath.getLoader(URLClassPath.java:355)
    at sun.misc.URLClassPath.getLoader(URLClassPath.java:332)
    at sun.misc.URLClassPath.getResource(URLClassPath.java:198)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:358)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
    at org.apache.spark.deploy.yarn.ApplicationMaster.startUserApplication(ApplicationMaster.scala:472)
    at org.apache.spark.deploy.yarn.ApplicationMaster.runDriver(ApplicationMaster.scala:259)
    at org.apache.spark.deploy.yarn.ApplicationMaster.run(ApplicationMaster.scala:144)
    at org.apache.spark.deploy.yarn.ApplicationMaster$$anonfun$main$1.apply$mcV$sp(ApplicationMaster.scala:575)
    at org.apache.spark.deploy.SparkHadoopUtil$$anon$1.run(SparkHadoopUtil.scala:60)
    at org.apache.spark.deploy.SparkHadoopUtil$$anon$1.run(SparkHadoopUtil.scala:59)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:415)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1548)
    at org.apache.spark.deploy.SparkHadoopUtil.runAsSparkUser(SparkHadoopUtil.scala:59)
    at org.apache.spark.deploy.yarn.ApplicationMaster$.main(ApplicationMaster.scala:573)
    at org.apache.spark.deploy.yarn.ApplicationMaster.main(ApplicationMaster.scala)
17/04/09 04:16:06 INFO yarn.ApplicationMaster: Final app status: FAILED, exitCode: 10, (reason: Uncaught exception: Invalid signature file digest for Manifest main attributes)
17/04/09 04:16:06 INFO yarn.ApplicationMaster: Unregistering ApplicationMaster with FAILED (diag message: Uncaught exception: Invalid signature file digest for Manifest main attributes)
17/04/09 04:16:06 INFO yarn.ApplicationMaster: Deleting staging directory .sparkStaging/application_1491734373178_0004
sinv2015 posted on 2017-4-9 18:06
Check whether the spark-hadoop-** package in your source matches the one named in the error message. Also look at the details for this application at: http://Master:8088/cluster/app/application_1491721404028_0016
Not much can be determined from the log above. The ApplicationMaster manages the containers, and here a container failed. Why would it fail? The most likely cause is a memory problem. So please check whether there is enough memory and, if there is, whether the configuration is correct.
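One way to rule memory in or out is to raise the driver and executor memory explicitly at submit time and see whether the failure changes. A sketch of such an invocation on YARN; the class name, jar name, and sizes below are illustrative assumptions, not values taken from this thread:

```
spark-submit \
  --master yarn \
  --deploy-mode cluster \
  --driver-memory 2g \
  --executor-memory 2g \
  --num-executors 2 \
  --class com.example.Main \
  app.jar
```

If the job still dies with the same `Invalid signature file digest` message regardless of memory settings, the failure is happening while the ApplicationMaster loads the user jar, which points back at the packaging/signature problem rather than at memory.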