Using the kettle big-data-plugin
After setting the KETTLE_PLUGIN_BASE_FOLDERS property to the plugins folder, the plugin is still reported as missing (the job copies local files to HDFS and uses the Hadoop Copy Files entry):
System.setProperty("KETTLE_PLUGIN_BASE_FOLDERS", "D:\\env\\pdi-ce-8.2.0.0-342\\plugins\\");
Hadoop Copy Files - ERROR (version 8.2.0.0-342, build 8.2.0.0-342 from 2020-03-26 01.04.49 by wjw) : Can't run job due to plugin missing
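For context, the job is launched from embedded Java code roughly like the sketch below; the class name and the .kjb path are placeholders, and kettle-core/kettle-engine are assumed to be on the classpath.
import org.pentaho.di.core.KettleEnvironment;
import org.pentaho.di.job.Job;
import org.pentaho.di.job.JobMeta;

public class RunHdfsCopyJob {
    public static void main(String[] args) throws Exception {
        // Point the plugin registry at the PDI plugins folder before the environment is initialized.
        System.setProperty("KETTLE_PLUGIN_BASE_FOLDERS", "D:\\env\\pdi-ce-8.2.0.0-342\\plugins\\");

        // Scans the plugin base folders and registers job entries and steps.
        KettleEnvironment.init();

        // Load the .kjb that contains the Hadoop Copy Files entry (placeholder path).
        JobMeta jobMeta = new JobMeta("D:\\jobs\\copy_to_hdfs.kjb", null);

        Job job = new Job(null, jobMeta);
        job.start();
        job.waitUntilFinished();

        if (job.getErrors() > 0) {
            System.err.println("Job finished with " + job.getErrors() + " error(s)");
        }
    }
}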
After copying pentaho-big-data-impl-shim-hdfs and pentaho-big-data-kettle-plugins-hdfs into the plugins directory, the entry fails to instantiate:
Unable to instantiate class
org.pentaho.big.data.kettle.plugins.hdfs.job.JobEntryHadoopCopyFiles
at org.pentaho.di.core.plugins.PluginRegistry.loadClass(PluginRegistry.java:496)
at org.pentaho.di.job.entry.JobEntryCopy.<init>(JobEntryCopy.java:145)
... 28 common frames omitted
Caused by: java.lang.InstantiationException: org.pentaho.big.data.kettle.plugins.hdfs.job.JobEntryHadoopCopyFiles
at java.lang.Class.newInstance(Class.java:427)
at org.pentaho.di.core.plugins.PluginRegistry.loadClass(PluginRegistry.java:491)
... 29 common frames omitted
Caused by: java.lang.NoSuchMethodException: org.pentaho.big.data.kettle.plugins.hdfs.job.JobEntryHadoopCopyFiles.<init>()
at java.lang.Class.getConstructor0(Class.java:3082)
at java.lang.Class.newInstance(Class.java:412)
... 30 common frames omitted
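Reading the trace: PluginRegistry.loadClass falls back to Class.newInstance(), which only works when the class declares a no-argument constructor, and the nested NoSuchMethodException on JobEntryHadoopCopyFiles.<init>() shows that none exists. A minimal illustration of that failure mode, with the class name taken from the trace and the copied plugin jars assumed to be on the classpath:
public class NewInstanceCheck {
    public static void main(String[] args) throws Exception {
        // Roughly what PluginRegistry.loadClass does at PluginRegistry.java:491.
        Class<?> clazz = Class.forName(
            "org.pentaho.big.data.kettle.plugins.hdfs.job.JobEntryHadoopCopyFiles");
        // Throws InstantiationException (caused by NoSuchMethodException on <init>())
        // because the class does not declare a no-argument constructor.
        Object entry = clazz.newInstance();
        System.out.println(entry);
    }
}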
This post was last edited by 阿飞 on 2020-3-30 12:18.
System.setProperty("KETTLE_PLUGIN_BASE_FOLDERS", "D:\\env\\pdi-ce-8.2.0.0-342\\plugins\\");
Change D:\\env\\pdi-ce-8.2.0.0-342\\plugins\\ to D:\\env\\pdi-ce-8.2.0.0-342\\plugins, i.e. remove the trailing \\, and see whether that helps.
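Concretely, the suggested call, with only the trailing separator removed, would be:
System.setProperty("KETTLE_PLUGIN_BASE_FOLDERS", "D:\\env\\pdi-ce-8.2.0.0-342\\plugins");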
As a developer or ops engineer, you have to pay attention to details like this.
阿飞 posted on 2020-3-30 12:16:
System.setProperty("KETTLE_PLUGIN_BASE_FOLDERS", "D:\\env\\pdi-ce-8.2.0.0-342\\plugins\\");
D:\\env ...
Removing the \\ makes no difference. Has anyone actually gotten this to run? Could you post a screenshot of your code and of what sits under D:\env\pdi-ce-8.2.0.0-342\plugins for reference, or walk me through it over QQ (383049035)?