Oozie-invoked Sqoop hive-import fails in CDH: looking for troubleshooting ideas.

koalaa posted on 2016-4-27 09:09:32

CDH 5.6 + Oozie + Sqoop 1.4.6, with Kerberos enabled.
Running sqoop import from the command line works fine. When the same job is invoked through Oozie from Hue, the MapReduce stage runs normally, but the job dies with an error at the hive-import step.
Has anyone run into something similar? Any troubleshooting ideas would be appreciated.



5 replies so far

bioger_hit posted on 2016-4-27 10:24:55
When you hit an exception, check the logs. Even if someone else has seen the same symptom, the chance that the root cause is identical is small.

koalaa posted on 2016-4-27 11:30:39
Posts online say that Oozie needs oozie.hive.defaults, hive.metastore.uris and hive.metastore.local configured; I'm not sure whether that is related.
I also noticed that when sqoop initializes Hive from the command line, the output shows:
Logging initialized using configuration in jar:file:/yarn/nm/filecache/98/hive-common.jar!/hive-log4j.properties
Intercepting System.exit(1)
I don't really understand the mechanism here; pointers would be appreciated.

5866 [uber-SubtaskRunner] WARN  org.apache.sqoop.tool.SqoopTool  - $SQOOP_CONF_DIR has not been set in the environment. Cannot check for additional configuration.
5897 [uber-SubtaskRunner] INFO  org.apache.sqoop.Sqoop  - Running Sqoop version: 1.4.6-cdh5.6.0
5915 [uber-SubtaskRunner] WARN  org.apache.sqoop.tool.BaseSqoopTool  - Setting your password on the command-line is insecure. Consider using -P instead.
5915 [uber-SubtaskRunner] INFO  org.apache.sqoop.tool.BaseSqoopTool  - Using Hive-specific delimiters for output. You can override
5915 [uber-SubtaskRunner] INFO  org.apache.sqoop.tool.BaseSqoopTool  - delimiters with --fields-terminated-by, etc.
5930 [uber-SubtaskRunner] WARN  org.apache.sqoop.ConnFactory  - $SQOOP_CONF_DIR has not been set in the environment. Cannot check for additional configuration.
5996 [uber-SubtaskRunner] INFO  org.apache.sqoop.manager.oracle.OraOopManagerFactory  - Data Connector for Oracle and Hadoop is disabled.
6013 [uber-SubtaskRunner] INFO  org.apache.sqoop.manager.SqlManager  - Using default fetchSize of 1000
6017 [uber-SubtaskRunner] INFO  org.apache.sqoop.tool.CodeGenTool  - Beginning code generation
6588 [uber-SubtaskRunner] INFO  org.apache.sqoop.manager.OracleManager  - Time zone has been set to GMT
6692 [uber-SubtaskRunner] INFO  org.apache.sqoop.manager.SqlManager  - Executing SQL statement: SELECT t.* FROM TEST_TABLE t WHERE 1=0
6727 [uber-SubtaskRunner] INFO  org.apache.sqoop.orm.CompilationManager  - HADOOP_MAPRED_HOME is /opt/cloudera/parcels/CDH-5.6.0-1.cdh5.6.0.p0.45/lib/hadoop-mapreduce
8748 [uber-SubtaskRunner] INFO  org.apache.sqoop.orm.CompilationManager  - Writing jar file: /tmp/sqoop-hue/compile/7f1f459928379d923057541cf4294f36/TEST_TABLE.jar
8764 [uber-SubtaskRunner] INFO  org.apache.sqoop.manager.OracleManager  - Time zone has been set to GMT
8789 [uber-SubtaskRunner] INFO  org.apache.sqoop.manager.OracleManager  - Time zone has been set to GMT
8805 [uber-SubtaskRunner] INFO  org.apache.sqoop.mapreduce.ImportJobBase  - Beginning import of TEST_TABLE
8829 [uber-SubtaskRunner] INFO  org.apache.sqoop.manager.OracleManager  - Time zone has been set to GMT
8850 [uber-SubtaskRunner] WARN  org.apache.sqoop.mapreduce.JobBase  - SQOOP_HOME is unset. May not be able to find all job dependencies.
10115 [uber-SubtaskRunner] INFO  org.apache.sqoop.mapreduce.db.DBInputFormat  - Using read commited transaction isolation
33664 [uber-SubtaskRunner] INFO  org.apache.sqoop.mapreduce.ImportJobBase  - Transferred 269.0625 KB in 24.8054 seconds (10.8469 KB/sec)
33669 [uber-SubtaskRunner] INFO  org.apache.sqoop.mapreduce.ImportJobBase  - Retrieved 2590 records.
33706 [uber-SubtaskRunner] INFO  org.apache.sqoop.manager.OracleManager  - Time zone has been set to GMT
33706 [uber-SubtaskRunner] INFO  org.apache.sqoop.manager.SqlManager  - Executing SQL statement: SELECT t.* FROM TEST_TABLE t WHERE 1=0
33763 [uber-SubtaskRunner] WARN  org.apache.sqoop.hive.TableDefWriter  - Column DB_TID had to be cast to a less precise type in Hive
33763 [uber-SubtaskRunner] WARN  org.apache.sqoop.hive.TableDefWriter  - Column PROD_TID had to be cast to a less precise type in Hive
33763 [uber-SubtaskRunner] WARN  org.apache.sqoop.hive.TableDefWriter  - Column TM_TID had to be cast to a less precise type in Hive
33763 [uber-SubtaskRunner] WARN  org.apache.sqoop.hive.TableDefWriter  - Column STATE had to be cast to a less precise type in Hive
33773 [uber-SubtaskRunner] INFO  org.apache.sqoop.hive.HiveImport  - Loading uploaded data into Hive
Heart beat
Intercepting System.exit(1)


<<< Invocation of Main class completed <<<

Failing Oozie Launcher, Main class [org.apache.oozie.action.hadoop.SqoopMain], exit code [1]

Oozie Launcher failed, finishing Hadoop job gracefully

Oozie Launcher, uploading action data to HDFS sequence file: hdfs://master:8020/user/hue/oozie-oozi/0000003-160426232305860-oozie-oozi-W/Sqoop-copy--sqoop/action-data.seq

Oozie Launcher ends
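For reference, a fix commonly reported for this class of failure (hive-import dying only when Sqoop runs inside an Oozie launcher) is to ship the Hive client configuration with the Sqoop action, so the embedded Hive used by --hive-import can locate the metastore. A minimal sketch, assuming a standard Sqoop action; the workflow name, HDFS path, and connection string are placeholders, not values from this post:

```xml
<!-- Hypothetical workflow.xml fragment. The <file> element copies
     hive-site.xml into the action's working directory, so the Hive
     client embedded in Sqoop can read hive.metastore.uris there.
     Under Kerberos, an Oozie credential for the metastore may also
     be required, depending on the cluster setup. -->
<action name="sqoop-import">
  <sqoop xmlns="uri:oozie:sqoop-action:0.2">
    <job-tracker>${jobTracker}</job-tracker>
    <name-node>${nameNode}</name-node>
    <command>import --connect jdbc:oracle:thin:@//dbhost:1521/ORCL --table TEST_TABLE --hive-import</command>
    <file>/user/hue/oozie/workspaces/sqoop-wf/hive-site.xml#hive-site.xml</file>
  </sqoop>
  <ok to="end"/>
  <error to="fail"/>
</action>
```

This would also explain why the command-line run succeeds: there, Sqoop picks up the local Hive configuration, which the launcher container does not have.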



kingba posted on 2018-2-5 10:27:56
I'm hitting the same problem. Did you ever solve it? I'd be grateful if you could share the solution.

kingba posted on 2018-2-5 10:28:59
Has this been solved? I'm seeing the same issue; any pointers would be appreciated, thanks.

einhep posted on 2018-2-5 14:49:03 (last edited by einhep on 2018-2-5 14:55)
In reply to kingba (2018-2-5 10:28): "Has this been solved? I'm seeing the same issue; any pointers would be appreciated, thanks."

33763 [uber-SubtaskRunner] WARN  org.apache.sqoop.hive.TableDefWriter  - Column DB_TID had to be cast to a less precise type in Hive
33763 [uber-SubtaskRunner] WARN  org.apache.sqoop.hive.TableDefWriter  - Column PROD_TID had to be cast to a less precise type in Hive
33763 [uber-SubtaskRunner] WARN  org.apache.sqoop.hive.TableDefWriter  - Column TM_TID had to be cast to a less precise type in Hive
33763 [uber-SubtaskRunner] WARN  org.apache.sqoop.hive.TableDefWriter  - Column STATE had to be cast to a less precise type in Hive
These warnings appear because the source column type and the target Hive column type don't match, so the column has to be cast explicitly.
For example, a time column may need to be converted to a timestamp on the Hive side.


sqoop import --connect "jdbc:sqlserver://xxxx:1433;DatabaseName=test" --username xxx --password 123456 --query 'SELECT ... CAST(DB_TID AS hivetype), ... FROM AU_User WHERE $CONDITIONS'
That is, cast DB_TID to the matching Hive data type.
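As an alternative to casting inside the query, Sqoop's --map-column-hive option overrides the default Hive type per column, which addresses the "less precise type" warnings for the columns named in the log above. A sketch only: the connection string, credentials, and the chosen Hive types are placeholders, not values confirmed by this thread:

```
# Hypothetical invocation; host, credentials, and target types are
# placeholders. --map-column-hive overrides Sqoop's default
# source-to-Hive type mapping for the listed columns, instead of
# casting them in the SELECT.
sqoop import \
  --connect "jdbc:oracle:thin:@//dbhost:1521/ORCL" \
  --username scott -P \
  --table TEST_TABLE \
  --hive-import \
  --map-column-hive "DB_TID=BIGINT,PROD_TID=BIGINT,TM_TID=BIGINT,STATE=INT"
```

Note that this changes the declared Hive type rather than the data itself, so the chosen types must still be wide enough to hold the source values.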
