Today I reinstalled hadoop 2.7.2 on my own machine. ZooKeeper and Hadoop both start up fine, and Scala 2.11 is OK as well.
But starting Spark throws the following error. Spark version: spark-1.6.0-bin-without-hadoop.tgz
[mw_shl_code=xml,true]Spark Command: /usr/java/latest/bin/java -cp /home/hadoop/spark-1.6.0-bin-without-hadoop/conf/:/home/hadoop/spark-1.6.0-bin-without-hadoop/lib/spark-assembly-1.6.0-hadoop2.2.0.jar:/home/hadoop/hadoop-2.7.2/etc/hadoop/ -Xms1g -Xmx1g org.apache.spark.deploy.master.Master --ip 192.168.122.11 --port 7077 --webui-port 8080
========================================
Error: A JNI error has occurred, please check your installation and try again
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/hadoop/conf/Configuration
at java.lang.Class.getDeclaredMethods0(Native Method)
at java.lang.Class.privateGetDeclaredMethods(Class.java:2701)
at java.lang.Class.privateGetMethodRecursive(Class.java:3048)
at java.lang.Class.getMethod0(Class.java:3018)
at java.lang.Class.getMethod(Class.java:1784)
at sun.launcher.LauncherHelper.validateMainClass(LauncherHelper.java:544)
at sun.launcher.LauncherHelper.checkAndLoadMain(LauncherHelper.java:526)
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.conf.Configuration
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
... 7 more[/mw_shl_code]
#####################################################################################################################
spark-env.sh configuration:
[mw_shl_code=bash,true][root@test1 logs]# cat ../conf/spark-env.sh | grep -v ^# | grep -v ^$
export JAVA_HOME=/usr/java/latest
export SCALA_HOME=/usr/local/scala
export SPARK_MASTER_IP=192.168.122.11
export SPARK_WORKER_CORES=1
export SPARK_WORKER_MEMORY=512M
export HADOOP_CONF_DIR=/home/hadoop/hadoop-2.7.2/etc/hadoop[/mw_shl_code]
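A likely cause (my reading of the log, not confirmed): the "-without-hadoop" Spark builds ship no Hadoop classes, and the Spark Command line above shows only the Hadoop conf directory on the classpath, not the Hadoop jars, so org.apache.hadoop.conf.Configuration cannot be loaded. HADOOP_CONF_DIR alone only adds config files. For these builds the documented approach is to export SPARK_DIST_CLASSPATH from `hadoop classpath` in spark-env.sh, roughly like this (paths assume the layout shown above):

[mw_shl_code=bash,true]# Append to conf/spark-env.sh so the "without hadoop" Spark build
# picks up the jars of the locally installed Hadoop 2.7.2.
export HADOOP_HOME=/home/hadoop/hadoop-2.7.2
# `hadoop classpath` prints the full Hadoop jar classpath;
# Spark prepends SPARK_DIST_CLASSPATH to its own classpath at launch.
export SPARK_DIST_CLASSPATH=$(${HADOOP_HOME}/bin/hadoop classpath)[/mw_shl_code]

After adding this, restart the master with sbin/start-master.sh and check the "Spark Command:" line in the master log to verify the Hadoop jars now appear on the JVM classpath.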