Sqoop import through Oozie fails with an error

credit posted on 2015-12-4 14:42:08
Hi all, I hit an error while running a Sqoop import through Oozie. Has anyone run into this, and does anyone have a good suggestion? Log Length: 5997
log4j:ERROR Could not find value for key log4j.appender.CLA
log4j:ERROR Could not instantiate appender named "CLA".
log4j:WARN No appenders could be found for logger (org.apache.hadoop.yarn.client.RMProxy).
log4j:WARN Please initialize the log4j system properly.
log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
Note: /tmp/sqoop-yarn/compile/9f3c81b0062cec6973184f1f95c215f9/JC_AJXX.java uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
org.kitesdk.data.DatasetNotFoundException: Unknown dataset URI pattern: dataset:hive:/test/JC_AJXX
all_scheme are [URIPattern{pattern=file:/*path/:namespace/:dataset?absolute=true}, URIPattern{pattern=file:*path/:namespace/:dataset}, URIPattern{pattern=hdfs:/*path/:namespace/:dataset?absolute=true}, URIPattern{pattern=hdfs:*path/:namespace/:dataset}, URIPattern{pattern=webhdfs:/*path/:namespace/:dataset?absolute=true}]
Check that JARs for hive datasets are on the classpath
    at org.kitesdk.data.spi.Registration.lookupDatasetUri(Registration.java:108)
    at org.kitesdk.data.Datasets.create(Datasets.java:228)
    at org.kitesdk.data.Datasets.create(Datasets.java:307)
    at org.apache.sqoop.mapreduce.ParquetJob.createDataset(ParquetJob.java:107)
    at org.apache.sqoop.mapreduce.ParquetJob.configureImportJob(ParquetJob.java:89)
    at org.apache.sqoop.mapreduce.DataDrivenImportJob.configureMapper(DataDrivenImportJob.java:106)
    at org.apache.sqoop.mapreduce.ImportJobBase.runImport(ImportJobBase.java:260)
    at org.apache.sqoop.manager.SqlManager.importTable(SqlManager.java:668)
    at org.apache.sqoop.manager.OracleManager.importTable(OracleManager.java:444)
    at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:497)
    at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:605)
    at org.apache.sqoop.Sqoop.run(Sqoop.java:143)
    at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
    at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:179)
    at org.apache.sqoop.Sqoop.runTool(Sqoop.java:218)
    at org.apache.sqoop.Sqoop.runTool(Sqoop.java:227)
    at org.apache.sqoop.Sqoop.main(Sqoop.java:236)
    at org.apache.oozie.action.hadoop.SqoopMain.runSqoopJob(SqoopMain.java:196)
    at org.apache.oozie.action.hadoop.SqoopMain.run(SqoopMain.java:176)
    at org.apache.oozie.action.hadoop.LauncherMain.run(LauncherMain.java:46)
    at org.apache.oozie.action.hadoop.SqoopMain.main(SqoopMain.java:46)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at org.apache.oozie.action.hadoop.LauncherMapper.map(LauncherMapper.java:228)
    at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:54)
    at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:453)
    at org.apache.hadoop.mapred.MapTask.run(MapTask.java:343)
    at org.apache.hadoop.mapred.LocalContainerLauncher$EventHandler.runSubtask(LocalContainerLauncher.java:370)
    at org.apache.hadoop.mapred.LocalContainerLauncher$EventHandler.runTask(LocalContainerLauncher.java:295)
    at org.apache.hadoop.mapred.LocalContainerLauncher$EventHandler.access$200(LocalContainerLauncher.java:181)
    at org.apache.hadoop.mapred.LocalContainerLauncher$EventHandler$1.run(LocalContainerLauncher.java:224)
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
    at java.util.concurrent.FutureTask.run(FutureTask.java:262)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
    at java.lang.Thread.run(Thread.java:745)
Intercepting System.exit(1)
Failing Oozie Launcher, Main class [org.apache.oozie.action.hadoop.SqoopMain], exit code [1]
Dec 04, 2015 2:35:06 PM com.google.inject.servlet.InternalServletModule$BackwardsCompatibleServletContextProvider get
WARNING: You are attempting to use a deprecated API (specifically, attempting to @Inject ServletContext inside an eagerly created singleton. While we allow this for backwards compatibility, be warned that this MAY have unexpected behavior if you have more than one injector (with ServletModule) running in the same JVM. Please consult the Guice documentation at http://code.google.com/p/google-guice/wiki/Servlets for more information.
Dec 04, 2015 2:35:07 PM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory register
INFO: Registering org.apache.hadoop.mapreduce.v2.app.webapp.JAXBContextResolver as a provider class
Dec 04, 2015 2:35:07 PM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory register
INFO: Registering org.apache.hadoop.yarn.webapp.GenericExceptionHandler as a provider class
Dec 04, 2015 2:35:07 PM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory register
INFO: Registering org.apache.hadoop.mapreduce.v2.app.webapp.AMWebServices as a root resource class
Dec 04, 2015 2:35:07 PM com.sun.jersey.server.impl.application.WebApplicationImpl _initiate
INFO: Initiating Jersey application, version 'Jersey: 1.9 09/02/2011 11:17 AM'
Dec 04, 2015 2:35:07 PM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory getComponentProvider
INFO: Binding org.apache.hadoop.mapreduce.v2.app.webapp.JAXBContextResolver to GuiceManagedComponentProvider with the scope "Singleton"
Dec 04, 2015 2:35:07 PM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory getComponentProvider
INFO: Binding org.apache.hadoop.yarn.webapp.GenericExceptionHandler to GuiceManagedComponentProvider with the scope "Singleton"
Dec 04, 2015 2:35:07 PM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory getComponentProvider
INFO: Binding org.apache.hadoop.mapreduce.v2.app.webapp.AMWebServices to GuiceManagedComponentProvider with the scope "PerRequest"
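The key line in the trace is the Kite exception: only file:, hdfs: and webhdfs: URI patterns are registered, and the log itself says "Check that JARs for hive datasets are on the classpath". That suggests the hive: scheme handler (kite-data-hive) was not on the Oozie launcher's classpath when Sqoop attempted the Parquet import into Hive. A minimal sketch of one common remedy follows; the HDFS paths, JAR names, and Oozie URL are assumptions to adapt to your cluster:

# Ship the Kite hive-dataset JARs with the workflow, e.g. in its lib/ dir on HDFS
# (paths and JAR versions are assumptions; use the ones bundled with your Sqoop)
hdfs dfs -put kite-data-core-*.jar kite-data-hive-*.jar /user/oozie/apps/sqoop-import/lib/
# Or, after adding them to the sqoop sharelib, tell Oozie to reload the sharelib
oozie admin -oozie http://oozie-host:11000/oozie -sharelibupdate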

5 replies

nextuser posted on 2015-12-4 18:58:45
In your hadoop-env file: if you are on Windows, check hadoop-env.cmd for

HADOOP_IDENT_STRING=%username%

If you are on Linux, verify the following in hadoop-env.sh:

HADOOP_IDENT_STRING=$USER
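A quick way to confirm the value on a Linux node (a minimal sketch; the config path below is an assumption, adjust it to your install):

# Show the HADOOP_IDENT_STRING line in the active hadoop-env.sh (path is an assumption)
grep -n HADOOP_IDENT_STRING /etc/hadoop/conf/hadoop-env.sh
# $USER is what that setting expands to in the current shell
echo $USER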


credit posted on 2015-12-4 19:02:25
nextuser posted on 2015-12-4 18:58:
In your hadoop-env file: if you are on Windows, check hadoop-env.cmd for

HADOOP_IDENT_STRING=%username%

Which path is hadoop-env.sh under?

arsenduan posted on 2015-12-4 21:17:05
credit posted on 2015-12-4 19:02:
Which path is hadoop-env.sh under?

hadoop-env.sh lives in the conf configuration directory.
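The exact location depends on the Hadoop version and install layout (a hedged sketch; $HADOOP_HOME assumes a tarball install):

# Hadoop 1.x tarball layout:
ls $HADOOP_HOME/conf/hadoop-env.sh
# Hadoop 2.x tarball layout:
ls $HADOOP_HOME/etc/hadoop/hadoop-env.sh
# If neither exists, search the filesystem for it:
find / -name hadoop-env.sh 2>/dev/null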


credit posted on 2015-12-7 13:13:50
arsenduan posted on 2015-12-4 21:17:
hadoop-env.sh lives in the conf configuration directory.

Sorry, but what is the full path?

credit posted on 2015-12-9 08:25:41
credit posted on 2015-12-7 13:13:
Sorry, but what is the full path?

Hi, we installed CDH directly here, so I don't know where that configuration file actually lives.
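On a CDH node the active client configuration is normally reached through /etc/hadoop/conf, an alternatives-managed symlink; if the cluster runs under Cloudera Manager, hadoop-env.sh is generated rather than hand-edited. A minimal sketch for locating it (paths assume a standard CDH layout):

# Usual CDH client-config entry point (an assumption; verify on your node)
ls -l /etc/hadoop/conf/hadoop-env.sh
# Resolve the alternatives symlink to the active config directory
readlink -f /etc/hadoop/conf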
