
Sqoop2: starting a job creates a large number of symlink files

redliquid posted 2016-8-19 15:20:37
In Sqoop2, as soon as I start a job with start job -j X, a large number of symlink files appear in the hadoop user's home directory:
lrwxrwxrwx   1 hadoop hadoop        77 Aug 20 15:17 mysql-connector-java-5.1.23.jar -> /tmp/hadoop-hadoop/mapred/local/1471677471843/mysql-connector-java-5.1.23.jar
lrwxrwxrwx   1 hadoop hadoop        63 Aug 20 15:17 joda-time-2.4.jar -> /tmp/hadoop-hadoop/mapred/local/1471677471844/joda-time-2.4.jar
lrwxrwxrwx   1 hadoop hadoop        70 Aug 20 15:17 connector-sdk-1.99.6.jar -> /tmp/hadoop-hadoop/mapred/local/1471677471845/connector-sdk-1.99.6.jar
lrwxrwxrwx   1 hadoop hadoop        82 Aug 20 15:17 sqoop-execution-mapreduce-1.99.6.jar -> /tmp/hadoop-hadoop/mapred/local/1471677471846/sqoop-execution-mapreduce-1.99.6.jar
lrwxrwxrwx   1 hadoop hadoop        62 Aug 20 15:17 guava-11.0.2.jar -> /tmp/hadoop-hadoop/mapred/local/1471677471847/guava-11.0.2.jar
lrwxrwxrwx   1 hadoop hadoop        69 Aug 20 15:17 sqoop-common-1.99.6.jar -> /tmp/hadoop-hadoop/mapred/local/1471677471848/sqoop-common-1.99.6.jar
lrwxrwxrwx   1 hadoop hadoop        65 Aug 20 15:17 json-simple-1.1.jar -> /tmp/hadoop-hadoop/mapred/local/1471677471849/json-simple-1.1.jar
lrwxrwxrwx   1 hadoop hadoop        77 Aug 20 15:17 sqoop-connector-hdfs-1.99.6.jar -> /tmp/hadoop-hadoop/mapred/local/1471677471850/sqoop-connector-hdfs-1.99.6.jar
lrwxrwxrwx   1 hadoop hadoop        85 Aug 20 15:17 sqoop-connector-generic-jdbc-1.99.6.jar -> /tmp/hadoop-hadoop/mapred/local/1471677471851/sqoop-connector-generic-jdbc-1.99.6.jar
lrwxrwxrwx   1 hadoop hadoop        67 Aug 20 15:17 sqoop-core-1.99.6.jar -> /tmp/hadoop-hadoop/mapred/local/1471677471852/sqoop-core-1.99.6.jar
[hadoop@hadoop0 /home/hadoop]$


Where are these paths configured?

3 replies

NEOGX posted 2016-8-19 15:32:21
Haven't seen this before. How did you set things up?

redliquid posted 2016-8-19 15:59:14
My hadoop user's profile:
export PS1="[$(id -un)@$(hostname -s) \$PWD]$"
set -o vi
alias ls='ls'
alias cp='cp'
export JAVA_HOME=/jdk1.8.0
export PATH=.:$JAVA_HOME/bin:$PATH
export CLASSPATH=.:/jdk1.8.0/jre/lib
export PATH=$PATH:/home/hadoop/hadoop-2.7.2/bin:/home/hadoop/hadoop-2.7.2/sbin
export HADOOP_HOME=/home/hadoop/hadoop-2.7.2
export HADOOP_CONF=/home/hadoop/hadoop-2.7.2/etc/hadoop
export SQOOP_HOME=/home/hadoop/sqoop-1.99.6
export PATH=$SQOOP_HOME/bin:$PATH
export CATALINA_HOME=$SQOOP_HOME/server
export CATALINA_BASE=$SQOOP_HOME/server
export LOGDIR=/tmp/sqoop_logs
export HBASE_HOME=/home/hadoop/hbase-1.2.2
export HBASE_CONF=/home/hadoop/hbase-1.2.2/conf
export PATH=${HBASE_HOME}/bin:${PATH}
export HBASE_LIB=/home/hadoop/hbase-1.2.2/lib
export PS1="[$(id -un)@$(hostname -s) \$PWD]$"
export HIVE_HOME=/home/hadoop/hive-2.1.0
export PATH=$HIVE_HOME/bin:$PATH
export HIVE_CONF=/home/hadoop/hive-2.1.0/conf
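A side note on the profile above: Hadoop 2.x's launcher scripts consult HADOOP_CONF_DIR rather than HADOOP_CONF (and likewise HBase and Hive look at HBASE_CONF_DIR and HIVE_CONF_DIR), so the exports above are not what hadoop-config.sh reads. When HADOOP_CONF_DIR is unset, it falls back to etc/hadoop under the installation root. A minimal sketch of that fallback, using the paths from this profile:

```shell
# hadoop-config.sh resolves the conf dir from HADOOP_CONF_DIR; when it is
# unset, the directory defaults to etc/hadoop under the installation root.
unset HADOOP_CONF_DIR          # mirror the profile above, which only sets HADOOP_CONF
HADOOP_HOME=/home/hadoop/hadoop-2.7.2
CONF_DIR="${HADOOP_CONF_DIR:-$HADOOP_HOME/etc/hadoop}"
echo "$CONF_DIR"               # the directory Hadoop actually reads
```

In this setup the fallback happens to land on the right directory anyway, so the nonstandard variable name is cosmetic rather than the cause of the symlinks.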

vi $SQOOP_HOME/server/conf/sqoop.properties
org.apache.sqoop.submission.engine.mapreduce.configuration.directory=/home/hadoop/hadoop-2.7.2
org.apache.sqoop.log4j.appender.file.File=/tmp/sqoop_logs/sqoop.log
org.apache.sqoop.auditlogger.default.file=/tmp/sqoop_logs/default.audit
org.apache.sqoop.repository.sysprop.derby.stream.error.file=/tmp/sqoop_logs/derbyrepo.log
org.apache.sqoop.repository.jdbc.url=jdbc:derby:/home/hadoop/sqoop-1.99.6/repository/db;create=true
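One possible culprit in the sqoop.properties above, assuming Sqoop 1.99.x behaves as its install docs describe: the mapreduce.configuration.directory property is documented to point at the Hadoop configuration directory (the shipped default is /etc/hadoop/conf), not the installation root. If Sqoop cannot find mapred-site.xml there, the MapReduce client falls back to mapreduce.framework.name=local, i.e. the LocalJobRunner, whose distributed-cache localization creates exactly this kind of symlink in the submitting user's working directory. A hypothetical correction for this layout:

```properties
# Point at the Hadoop conf dir ($HADOOP_HOME/etc/hadoop), not the install root,
# so Sqoop picks up mapred-site.xml and submits to YARN instead of local mode:
org.apache.sqoop.submission.engine.mapreduce.configuration.directory=/home/hadoop/hadoop-2.7.2/etc/hadoop
```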

vi $SQOOP_HOME/server/conf/catalina.properties
common.loader=${catalina.base}/lib,${catalina.base}/lib/*.jar,${catalina.home}/lib,${catalina.home}/lib/*.jar,${catalina.home}/../lib/*.jar,/home/hadoop/hadoop-2.7.2/share/hadoop/common/*.jar,/home/hadoop/hadoop-2.7.2/share/hadoop/common/lib/*.jar,/home/hadoop/hadoop-2.7.2/share/hadoop/hdfs/*.jar,/home/hadoop/hadoop-2.7.2/share/hadoop/hdfs/lib/*.jar,/home/hadoop/hadoop-2.7.2/share/hadoop/mapreduce/*.jar,/home/hadoop/hadoop-2.7.2/share/hadoop/mapreduce/lib/*.jar,/home/hadoop/hadoop-2.7.2/share/hadoop/yarn/*.jar,/home/hadoop/hadoop-2.7.2/share/hadoop/yarn/lib/*.jar,/home/hadoop/hive-2.1.0/lib/*.jar,/home/hadoop/hadoop-2.7.2/share/hadoop/tools/lib/*.jar

starrycheng posted 2016-8-19 19:14:08
Replying to redliquid (2016-8-19 15:59, "My hadoop user's profile: ..."):

It looks like the relevant files are being copied into those paths and then symlinked, e.g. json-simple-1.1.jar -> /tmp/hadoop-hadoop/mapred/local/1471677471849/json-simple-1.1.jar
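If that reading is right, these are the LocalJobRunner's localized distributed-cache jars: local-mode MapReduce copies each job jar under .../mapred/local/&lt;timestamp&gt;/ and symlinks it into the submitting process's working directory, here the hadoop user's $HOME. A sketch that reproduces the layout in a scratch directory (the path and timestamp directory are illustrative) and cleans up only the symlinks that point into mapred/local:

```shell
# Simulate the local-mode distributed cache: jar localized under
# mapred/local/<timestamp>/, with a symlink dropped in the "home" directory.
WORK=$(mktemp -d)
CACHE="$WORK/mapred/local/1471677471849"
mkdir -p "$CACHE"
touch "$CACHE/json-simple-1.1.jar"
ln -s "$CACHE/json-simple-1.1.jar" "$WORK/json-simple-1.1.jar"

# Once /tmp is cleaned, the targets vanish and the links dangle. Removing
# only top-level symlinks whose target is under mapred/local is a safe cleanup:
find "$WORK" -maxdepth 1 -type l -lname '*/mapred/local/*' -delete
```

If the jobs were expected to run on the cluster rather than locally, checking what the client resolves for the framework (e.g. hdfs getconf -confKey mapreduce.framework.name) would confirm whether local mode is the real cause; on YARN the localization happens in NodeManager directories, not in the submitter's home.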
