spanno posted on 2013-10-16 13:40:29

Error when importing into HDFS with Sqoop: Error: java.lang.ClassNotFoundException: org.apache.hadoop.mapre

Environment: Win7 + Cygwin + Hadoop 0.20.2 + Sqoop 1.2.0-CDH3B4
The error output is:
$ bin/sqoop import --connect jdbc:mysql://localhost:3306/users --username root --password 111111 --table admin
13/04/14 16:49:37 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
13/04/14 16:49:37 INFO tool.CodeGenTool: Beginning code generation
13/04/14 16:49:37 INFO manager.MySQLManager: Executing SQL statement: SELECT t.* FROM `admin` AS t LIMIT 1
13/04/14 16:49:37 INFO manager.MySQLManager: Executing SQL statement: SELECT t.* FROM `admin` AS t LIMIT 1
13/04/14 16:49:37 INFO orm.CompilationManager: HADOOP_HOME is G:\hadoop-0.20.2
13/04/14 16:49:37 INFO orm.CompilationManager: Found hadoop core jar at: G:\hadoop-0.20.2\hadoop-0.20.2-core.jar
13/04/14 16:49:38 ERROR orm.CompilationManager: Could not rename \tmp\sqoop-Administrator\compile\f68371a8ef51decd0bc92a8360e48a5b\admin.java to G:\sqoop-1.2.0-CDH3B4\.\admin.java
13/04/14 16:49:38 INFO orm.CompilationManager: Writing jar file: \tmp\sqoop-Administrator\compile\f68371a8ef51decd0bc92a8360e48a5b\admin.jar
13/04/14 16:49:38 WARN manager.MySQLManager: It looks like you are importing from mysql.
13/04/14 16:49:38 WARN manager.MySQLManager: This transfer can be faster! Use the --direct
13/04/14 16:49:38 WARN manager.MySQLManager: option to exercise a MySQL-specific fast path.
13/04/14 16:49:38 INFO manager.MySQLManager: Setting zero DATETIME behavior to convertToNull (mysql)
13/04/14 16:49:38 INFO mapreduce.ImportJobBase: Beginning import of admin
13/04/14 16:49:38 INFO manager.MySQLManager: Executing SQL statement: SELECT t.* FROM `admin` AS t LIMIT 1
13/04/14 16:49:39 INFO mapred.JobClient: Running job: job_201304141545_0007
13/04/14 16:49:40 INFO mapred.JobClient:  map 0% reduce 0%
13/04/14 16:49:53 INFO mapred.JobClient: Task Id : attempt_201304141545_0007_m_000000_0, Status : FAILED
Error: java.lang.ClassNotFoundException: org.apache.hadoop.mapreduce.lib.db.DBWritable
      at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
      at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
      at java.security.AccessController.doPrivileged(Native Method)
      at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
      at java.lang.ClassLoader.loadClass(ClassLoader.java:423)
      at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
      at java.lang.ClassLoader.loadClass(ClassLoader.java:356)
      at java.lang.ClassLoader.defineClass1(Native Method)
      at java.lang.ClassLoader.defineClass(ClassLoader.java:791)
      at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
      at java.net.URLClassLoader.defineClass(URLClassLoader.java:449)
      at java.net.URLClassLoader.access$100(URLClassLoader.java:71)
      at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
      at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
      at java.security.AccessController.doPrivileged(Native Method)
      at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
      at java.lang.ClassLoader.loadClass(ClassLoader.java:423)
      at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
      at java.lang.ClassLoader.loadClass(ClassLoader.java:356)
      at com.cloudera.sqoop.mapreduce.db.DBConfiguration.getInputClass(DBConfiguration.java:230)
      at com.cloudera.sqoop.mapreduce.db.DataDrivenDBInputFormat.createDBRecordReader(DataDrivenDBInputFormat.java:285)
      at com.cloudera.sqoop.mapreduce.db.DBInputFormat.createRecordReader(DBInputFormat.java:232)
      at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:588)
      at org.apache.hadoop.mapred.MapTask.run(MapTask.java:305)
      at org.apache.hadoop.mapred.Child.main(Child.java:170)
Error: java.lang.ClassNotFoundException: org.apache.hadoop.mapreduce.lib.db.DBWritable
It says the class org.apache.hadoop.mapreduce.lib.db.DBWritable cannot be found,
but hadoop-core-0.20.2-CDH3B4.jar, which contains that class, is already in the sqoop/lib directory.
PS: The exact same setup runs fine in a Linux VM, so I have no idea where the problem is.
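A quick way to check whether a jar under sqoop/lib really contains the missing class is to scan each jar's entry listing. This is a hedged sketch, not part of the original thread: `find_class` is an illustrative helper name, the commented invocation uses paths assumed from this thread, and it relies on the `unzip` tool being installed.

```shell
# Print every jar in a directory whose archive listing contains a given
# class entry.  Usage: find_class JAR_DIR CLASS_ENTRY
find_class() {
  local dir=$1 entry=$2
  for jar in "$dir"/*.jar; do
    # unzip -l lists the archive contents; grep -q tests for the entry
    if unzip -l "$jar" 2>/dev/null | grep -q "$entry"; then
      echo "$jar"
    fi
  done
}

# Against the setup in this thread (paths are assumptions, adjust to yours):
# find_class "$SQOOP_HOME/lib" 'org/apache/hadoop/mapreduce/lib/db/DBWritable.class'
```

If no jar is printed, the class genuinely is not on the Sqoop side of the classpath; if one is, the problem is how that classpath gets passed to the map tasks rather than a missing jar.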
            
               
               

tntzbzc posted on 2013-10-16 13:41:07


I haven't used Cloudera's Hadoop distribution myself,
but from the error this looks like a Java classpath problem. Post your Sqoop configuration file so we can take a look.
      

spanno posted on 2013-10-16 13:41:44


Quoting reply #1 by tntzbzc: I haven't used Cloudera's Hadoop distribution myself,
but from the error this looks like a Java classpath problem. Post your Sqoop configuration file so we can take a look.
#!/bin/bash
#
# Licensed to Cloudera, Inc. under one or more
# contributor license agreements. See the NOTICE file distributed with
# this work for additional information regarding copyright ownership.
# Cloudera, Inc. licenses this file to You under the Apache License, Version 2.0
# (the "License"); you may not use this file except in compliance with
# the License. You may obtain a copy of the License at
#
#   http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.

# This is sourced in by bin/sqoop to set environment variables prior to
# invoking Hadoop.

bin="$1"
if [ -z "${bin}" ]; then
  bin=`dirname $0`
  bin=`cd ${bin} && pwd`
fi

if [ -z "$SQOOP_HOME" ]; then
  export SQOOP_HOME=${bin}/..
fi

# Find paths to our dependency systems. If they are unset, use CDH defaults.
if [ -z "${HADOOP_HOME}" ]; then
  HADOOP_HOME=/usr/lib/hadoop
fi
if [ -z "${HBASE_HOME}" ]; then
  HBASE_HOME=/usr/lib/hbase
fi
if [ -z "${ZOOKEEPER_HOME}" ]; then
  ZOOKEEPER_HOME=/usr/lib/zookeeper
fi

# Check: If we can't find our dependencies, give up here.
if [ ! -d "${HADOOP_HOME}" ]; then
  echo "Error: $HADOOP_HOME does not exist!"
  echo 'Please set $HADOOP_HOME to the root of your Hadoop installation.'
  exit 1
fi
#if [ ! -d "${HBASE_HOME}" ]; then
#  echo "Error: $HBASE_HOME does not exist!"
#  echo 'Please set $HBASE_HOME to the root of your HBase installation.'
#  exit 1
#fi
#if [ ! -d "${ZOOKEEPER_HOME}" ]; then
#  echo "Error: $ZOOKEEPER_HOME does not exist!"
#  echo 'Please set $ZOOKEEPER_HOME to the root of your ZooKeeper installation.'
#  exit 1
#fi

# Where to find the main Sqoop jar
SQOOP_JAR_DIR=$SQOOP_HOME

# If there's a "build" subdir, override with this, so we use
# the newly-compiled copy.
if [ -d "$SQOOP_JAR_DIR/build" ]; then
  SQOOP_JAR_DIR="${SQOOP_JAR_DIR}/build"
fi

function add_to_classpath() {
  dir=$1
  for f in $dir/*.jar; do
    SQOOP_CLASSPATH=${SQOOP_CLASSPATH}:$f;
  done
  export SQOOP_CLASSPATH
}

# Add sqoop dependencies to classpath.
SQOOP_CLASSPATH=""
if [ -d "$SQOOP_HOME/lib" ]; then
  add_to_classpath $SQOOP_HOME/lib
fi

# Add HBase to dependency list
add_to_classpath $HBASE_HOME
add_to_classpath $HBASE_HOME/lib

HBASE_CONF_DIR=${HBASE_CONF_DIR:-${HBASE_HOME}/conf}
SQOOP_CLASSPATH=${HBASE_CONF_DIR}:${SQOOP_CLASSPATH}

add_to_classpath $ZOOKEEPER_HOME
add_to_classpath $ZOOKEEPER_HOME/lib

SQOOP_CONF_DIR=${SQOOP_CONF_DIR:-${SQOOP_HOME}/conf}
SQOOP_CLASSPATH=${SQOOP_CONF_DIR}:${SQOOP_CLASSPATH}

# If there's a build subdir, use Ivy-retrieved dependencies too.
if [ -d "$SQOOP_HOME/build/ivy/lib/sqoop" ]; then
  for f in $SQOOP_HOME/build/ivy/lib/sqoop/*/*.jar; do
    SQOOP_CLASSPATH=${SQOOP_CLASSPATH}:$f;
  done
fi

add_to_classpath ${SQOOP_JAR_DIR}

export SQOOP_CLASSPATH
export SQOOP_CONF_DIR
export SQOOP_JAR_DIR
export HADOOP_CLASSPATH="${SQOOP_CLASSPATH}:${HADOOP_CLASSPATH}"
export HADOOP_HOME
export HBASE_HOME
That's the Sqoop configuration file. I only commented out the HBase and ZooKeeper checks; nothing else was changed.
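To see exactly what the `add_to_classpath` helper in the script above produces, it can be exercised in isolation. A minimal sketch, using empty stand-in jar files in a temporary directory (the demo paths are not from the thread):

```shell
#!/bin/bash
# Copy of the add_to_classpath helper from the sqoop-env script above:
# it appends every *.jar found in a directory to SQOOP_CLASSPATH.
add_to_classpath() {
  dir=$1
  for f in $dir/*.jar; do
    SQOOP_CLASSPATH=${SQOOP_CLASSPATH}:$f
  done
  export SQOOP_CLASSPATH
}

# Demo: two empty stand-in jars in a temp directory
tmp=$(mktemp -d)
touch "$tmp/a.jar" "$tmp/b.jar"
SQOOP_CLASSPATH=""
add_to_classpath "$tmp"
echo "$SQOOP_CLASSPATH"   # → :<tmp>/a.jar:<tmp>/b.jar
rm -rf "$tmp"
```

Note that every entry is a POSIX-style path; under Cygwin these are later handed to a native Windows JVM, which is one place where a classpath that works on Linux can silently break.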
      

tntzbzc posted on 2013-10-16 13:42:35


Looking at your configuration, I don't see any real changes.
Since the jars on HADOOP_HOME's classpath are loaded automatically when Sqoop starts, hadoop-core-0.20.2-CDH3B4.jar doesn't actually need to be in sqoop/lib.
ERROR orm.CompilationManager: Could not rename \tmp\sqoop-Administrator\compile\f68371a8ef51decd0bc92a8360e48a5b\admin.java to G:\sqoop-1.2.0-CDH3B4\.\admin.java
Have you noticed this error? It looks like a permissions problem, or a Cygwin path-translation problem.
My advice: don't run Hadoop under Cygwin.
Install VMware Workstation and run it on a Linux guest instead, e.g. CentOS.
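The rename failure tntzbzc points at can be checked in isolation: if the user running Sqoop cannot create and rename files in the Sqoop install directory, code generation fails this way regardless of the classpath. A hedged sketch of that check, using a temp directory as a stand-in for the real install directory (substitute your own path, e.g. the G:\sqoop-1.2.0-CDH3B4 from the log):

```shell
# Can the current user create a file in the target directory and then
# rename it?  This mirrors what Sqoop's CompilationManager does with the
# generated admin.java file.
dir=$(mktemp -d)   # stand-in for the Sqoop install directory
if touch "$dir/admin.java.tmp" && mv "$dir/admin.java.tmp" "$dir/admin.java"; then
  echo "rename ok"
else
  echo "rename failed: check permissions and (on Cygwin) path translation"
fi
rm -rf "$dir"
```

If this fails against the real install directory on Windows, the fix is a permissions or path issue, not a Sqoop configuration issue.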