Running wordcount works fine now. I wrote a MapReduce job that reads data from HBase, does some computation, and writes the output to HDFS, but after submitting it, the job fails because it cannot find my custom mapper class, even though I clearly specified it. Origin_MR is the package name and Origin_Mapper is the class name.
2017-05-15 20:12:19,366 INFO [main] mapreduce.Job (Job.java:monitorAndPrintJob(1334)) - Running job: job_1494761713465_0045
2017-05-15 20:13:29,772 INFO [main] mapreduce.Job (Job.java:monitorAndPrintJob(1355)) - Job job_1494761713465_0045 running in uber mode : false
2017-05-15 20:13:29,772 INFO [main] mapreduce.Job (Job.java:monitorAndPrintJob(1362)) - map 0% reduce 0%
2017-05-15 20:14:38,654 INFO [main] mapreduce.Job (Job.java:printTaskEvents(1441)) - Task Id : attempt_1494761713465_0045_m_000000_0, Status : FAILED
Error: java.lang.RuntimeException: java.lang.ClassNotFoundException: Class Origin_MR.Origin_Mapper not found
at org.apache.hadoop.conf.Configuration.getClass(Configuration.java:1905)
at org.apache.hadoop.mapreduce.task.JobContextImpl.getMapperClass(JobContextImpl.java:186)
at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:722)
at org.apache.hadoop.mapred.MapTask.run(MapTask.java:340)
at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:168)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:415)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1614)
at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:163)
Caused by: java.lang.ClassNotFoundException: Class Origin_MR.Origin_Mapper not found
at org.apache.hadoop.conf.Configuration.getClassByName(Configuration.java:1811)
at org.apache.hadoop.conf.Configuration.getClass(Configuration.java:1903)
... 8 more
Container killed by the ApplicationMaster.
Container killed on request. Exit code is 143
Container exited with a non-zero exit code 143
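
The stack trace shows the class is missing inside the map task JVM (YarnChild), not on the client: the job submits and starts fine, but the container running the mapper cannot load Origin_MR.Origin_Mapper from its classpath. Exit code 143 just means the ApplicationMaster killed the failed container afterwards. This pattern usually means the jar containing the mapper was never shipped with the job, which is common when submitting from an IDE, because setJarByClass can only locate a jar if the class was actually loaded from one. A minimal sketch of attaching an exported jar explicitly (the path D:\\Origin.jar mirrors the commented-out line in the code below and is only an assumption):

// Sketch: explicitly attach the job jar when submitting from an IDE.
// "mapreduce.job.jar" is the Hadoop 2 name for the deprecated "mapred.jar" key;
// the jar path is hypothetical and must point at a jar that actually
// contains Origin_MR.Origin_Mapper and Origin_MR.Origin_Reducer.
conf.set("mapreduce.job.jar", "D:\\Origin.jar");
// Equivalent, once the Job instance exists:
// job.setJar("D:\\Origin.jar");
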
The code of the main method is as follows:
public static void main(String[] args) throws ClassNotFoundException, InterruptedException {
    long starttime = System.currentTimeMillis();
    String inputtable = "GJJY95020150613";
    // HBase client plus HDFS HA (nameservice "shuke") configuration
    Configuration conf = HBaseConfiguration.create();
    conf.set("hbase.zookeeper.quorum", "node1");
    conf.set("fs.defaultFS", "hdfs://shuke");
    conf.set("dfs.nameservices", "shuke");
    conf.set("dfs.ha.namenodes.shuke", "nn1,nn2");
    conf.set("dfs.namenode.rpc-address.shuke.nn1", "192.168.8.118:8020");
    conf.set("dfs.namenode.rpc-address.shuke.nn2", "192.168.8.121:8020");
    conf.set("dfs.client.failover.proxy.provider.shuke",
            "org.apache.hadoop.hdfs.server.namenode.ha.ConfiguredFailoverProxyProvider");
    // conf.set("mapred.jar", "D:\\Origin.jar");
    try {
        Job job = Job.getInstance(conf);
        job.setJobName("Origin");
        job.setJarByClass(Origin_job.class);
        Scan scan = new Scan();
        scan.setCaching(500);
        scan.setCacheBlocks(false); // recommended off for MapReduce scans
        TableMapReduceUtil.initTableMapperJob(inputtable, scan,
                Origin_Mapper.class, Text.class, Text.class, job);
        job.setReducerClass(Origin_Reducer.class);
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(Text.class);
        FileOutputFormat.setOutputPath(job, new Path("/usr/output/sky"));
        boolean f = job.waitForCompletion(true);
        if (f) {
            System.out.println("Job completed successfully!");
        } else {
            System.out.println("Job failed!");
        }
        System.out.println(System.currentTimeMillis() - starttime + " ms");
    } catch (IOException e) {
        System.out.println("Failed to create the job instance!");
        e.printStackTrace();
    }
}
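
For completeness, TableMapReduceUtil.initTableMapperJob also has an overload with an explicit addDependencyJars flag, which controls whether the HBase and job dependency jars are shipped through the distributed cache. A sketch using the same class names as above; note it does not replace the job jar itself, which still has to contain the mapper:

// Sketch: initTableMapperJob overload with an explicit addDependencyJars flag.
TableMapReduceUtil.initTableMapperJob(
        inputtable,           // source HBase table
        scan,                 // Scan configured with caching/cacheBlocks as above
        Origin_Mapper.class,  // the mapper class the task JVM failed to load
        Text.class,           // map output key class
        Text.class,           // map output value class
        job,
        true);                // addDependencyJars: ship dependency jars via distributed cache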