Hadoop programming: fixing "in the type Job is not applicable for the arguments"

Posted by pig2 on 2014-3-4 17:18:12
Last edited by pig2 on 2014-3-5 02:49
Keep the following question in mind while reading:
1. What causes the error "The method setInputFormatClass(Class<? extends InputFormat>) in the type Job is not applicable for the arguments (Class<TextInputFormat>)", and how is it resolved?


The error "The method setInputFormatClass(Class<? extends InputFormat>) in the type Job is not applicable for the arguments (Class<TextInputFormat>)" comes up often. There are two common causes:
1. A missing import.
2. A mismatch between the method's parameter type and the argument passed to it.


1. Missing import
The code below fails with this error simply because TextInputFormat is never imported; adding import org.apache.hadoop.mapreduce.lib.input.TextInputFormat; resolves it. Take care to import the new-API class: importing org.apache.hadoop.mapred.TextInputFormat (the old API) instead produces the same error, because Job.setInputFormatClass expects a subclass of org.apache.hadoop.mapreduce.InputFormat.

package mapreduce;

import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
// missing: import org.apache.hadoop.mapreduce.lib.input.TextInputFormat;
// without it, the setInputFormatClass call below does not compile

public class mapreduce {

    static final String INPUT_PATH = "hdfs://aboutyun:9000";

    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        final Job job = new Job(conf, mapreduce.class.getSimpleName());
        // locate the input
        FileInputFormat.setInputPaths(job, INPUT_PATH);
        // class that parses the input data
        job.setInputFormatClass(TextInputFormat.class);
    }

    static class MyMapper extends
            Mapper<LongWritable, Text, Text, LongWritable> {
        @Override
        protected void map(LongWritable key, Text value, Context context)
                throws IOException, InterruptedException {
            final String[] splited = value.toString().split("\t");
            for (String word : splited) {
                context.write(new Text(word), new LongWritable(1L));
            }
        }
    }

    static class MyReduce extends
            Reducer<Text, LongWritable, Text, LongWritable> {
        @Override
        protected void reduce(Text k2, Iterable<LongWritable> v2s,
                Context ctx) throws IOException, InterruptedException {
            long times = 0L;
            for (LongWritable count : v2s) {
                times += count.get();
            }
            // write once per key, after all counts are summed
            ctx.write(k2, new LongWritable(times));
        }
    }
}
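For comparison, here is a minimal sketch of the driver once the import is added (the class name FixedDriver is illustrative, not from the original code); setInputFormatClass now resolves TextInputFormat against the new API and compiles:

package mapreduce;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.input.TextInputFormat; // the previously missing import

public class FixedDriver {

    static final String INPUT_PATH = "hdfs://aboutyun:9000";

    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        Job job = new Job(conf, FixedDriver.class.getSimpleName());
        // locate the input
        FileInputFormat.setInputPaths(job, INPUT_PATH);
        // compiles now: this TextInputFormat extends the new-API InputFormat
        job.setInputFormatClass(TextInputFormat.class);
    }
}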


2. Mismatched arguments

job.setOutputFormatClass(PartitionByCountryMTOF.class);

reports: The method setOutputFormatClass(Class<? extends OutputFormat>) in the type Job is not applicable for the arguments (Class<MultiFile.PartitionByCountryMTOF>)

The root cause is that PartitionByCountryMTOF extends MultipleTextOutputFormat, which belongs to the old org.apache.hadoop.mapred API, so it is not a subclass of the new-API org.apache.hadoop.mapreduce.OutputFormat that Job.setOutputFormatClass expects. Changing PartitionByCountryMTOF.class to TextOutputFormat.class makes the code compile; a new-API way to keep the per-country split is sketched after the code.


import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.conf.Configured;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.*;
import org.apache.hadoop.mapred.lib.MultipleTextOutputFormat; // old-API class
import org.apache.hadoop.mapreduce.*;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.input.TextInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;
import org.apache.hadoop.mapreduce.lib.output.TextOutputFormat;

public class MultiFile extends Configured {

    public static class MapClass
            extends Mapper<LongWritable, Text, NullWritable, Text> {
        // new-API map signature; an old-API (OutputCollector, Reporter)
        // signature would never override Mapper.map and would be dead code
        @Override
        public void map(LongWritable key, Text value, Context context)
                throws IOException, InterruptedException {
            context.write(NullWritable.get(), value);
        }
    }

    // Extends the OLD-API MultipleTextOutputFormat, so it is not a subclass
    // of the new-API OutputFormat that Job.setOutputFormatClass expects.
    public static class PartitionByCountryMTOF
            extends MultipleTextOutputFormat<NullWritable, Text> {
        @Override
        protected String generateFileNameForKeyValue(NullWritable key,
                                                     Text value,
                                                     String filename) {
            String[] arr = value.toString().split(",", -1);
            String country = arr[4].substring(1, 3);
            return country + "/" + filename;
        }
    }

    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        Job job = new Job(conf, "MultiFile");
        job.setJarByClass(MultiFile.class);
        Path in = new Path(args[0]);
        Path out = new Path(args[1]);
        FileInputFormat.setInputPaths(job, in);    // input path
        FileOutputFormat.setOutputPath(job, out);  // output path
        job.setJobName("MultiFile");               // job name
        job.setMapperClass(MapClass.class);
        job.setInputFormatClass(TextInputFormat.class); // input format
        // compile error here; the fix is TextOutputFormat.class
        job.setOutputFormatClass(PartitionByCountryMTOF.class);
        job.setMapOutputValueClass(Text.class);    // mapper emits Text values
        job.setOutputKeyClass(NullWritable.class); // output key type
        job.setOutputValueClass(Text.class);       // output value type
        job.setNumReduceTasks(0);                  // map-only job
        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}
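Note that the TextOutputFormat fix makes the code compile but drops the per-country file split that PartitionByCountryMTOF provided. If that behavior is still needed with the new API, org.apache.hadoop.mapreduce.lib.output.MultipleOutputs can reproduce it. Below is a minimal sketch under that assumption; the class name CountryMapper and the "part" base name are illustrative:

import java.io.IOException;

import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.NullWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.lib.output.MultipleOutputs;

// Illustrative replacement: writes each input line into a
// country-named subdirectory of the job output directory.
public class CountryMapper
        extends Mapper<LongWritable, Text, NullWritable, Text> {

    private MultipleOutputs<NullWritable, Text> mos;

    @Override
    protected void setup(Context context) {
        mos = new MultipleOutputs<NullWritable, Text>(context);
    }

    @Override
    protected void map(LongWritable key, Text value, Context context)
            throws IOException, InterruptedException {
        String[] arr = value.toString().split(",", -1);
        String country = arr[4].substring(1, 3);
        // writes to e.g. <output>/US/part-m-00000 using the job's output format
        mos.write(NullWritable.get(), value, country + "/part");
    }

    @Override
    protected void cleanup(Context context)
            throws IOException, InterruptedException {
        mos.close(); // flush and close all per-country writers
    }
}

In the driver, calling LazyOutputFormat.setOutputFormatClass(job, TextOutputFormat.class) instead of job.setOutputFormatClass avoids creating empty default part files alongside the per-country output.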

