I've just started learning HBase. I wrote a simple program by following the documentation and got the output below; could someone point out where I went wrong?
14/10/09 10:19:21 INFO zookeeper.ZooKeeper: Client environment:zookeeper.version=3.4.3-1240972, built on 02/06/2012 10:48 GMT
14/10/09 10:19:21 INFO zookeeper.ZooKeeper: Client environment:host.name=2013-1016-1614
14/10/09 10:19:21 INFO zookeeper.ZooKeeper: Client environment:java.version=1.6.0_45
14/10/09 10:19:21 INFO zookeeper.ZooKeeper: Client environment:java.vendor=Sun Microsystems Inc.
14/10/09 10:19:21 INFO zookeeper.ZooKeeper: Client environment:java.home=C:\Program Files\Java\jre6
14/10/09 10:19:21 INFO zookeeper.ZooKeeper: Client environment:java.class.path=F:\软件\HadoopWorkPlat\workplace\.metadata\.plugins\org.apache.hadoop.eclipse\hadoop-conf-7063833016097855588;F:\软件\HadoopWorkPlat\workplace\HBASE_pro_01\bin;F:\软件\HadoopWorkPlat\workplace\HBASE_pro_01\lib\hadoop-core-1.0.0.jar;F:\软件\HadoopWorkPlat\workplace\HBASE_pro_01\lib\hbase-0.92.1.jar;F:\软件\HadoopWorkPlat\workplace\HBASE_pro_01\lib\log4j-1.2.15.jar;F:\软件\HadoopWorkPlat\workplace\HBASE_pro_01\lib\log4j-1.2.16.jar;F:\软件\HadoopWorkPlat\workplace\HBASE_pro_01\lib\slf4j-api-1.5.8.jar;F:\软件\HadoopWorkPlat\workplace\HBASE_pro_01\lib\slf4j-log4j12-1.5.8.jar;F:\软件\HadoopWorkPlat\workplace\HBASE_pro_01\lib\zookeeper-3.4.3.jar;F:\软件\HadoopWorkPlat\workplace\HBASE_pro_01\lib\commons-codec-1.4.jar;F:\软件\HadoopWorkPlat\workplace\HBASE_pro_01\lib\commons-configuration-1.6.jar;F:\软件\HadoopWorkPlat\workplace\HBASE_pro_01\lib\commons-lang-2.5.jar;F:\软件\HadoopWorkPlat\workplace\HBASE_pro_01\lib\commons-logging-1.1.1.jar
14/10/09 10:19:21 INFO zookeeper.ZooKeeper: Client environment:java.library.path=C:\Program Files\Java\jre6\bin;C:\Windows\Sun\Java\bin;C:\Windows\system32;C:\Windows;C:/Program Files/Java/jre1.8.0_20/bin/client;C:/Program Files/Java/jre1.8.0_20/bin;C:/Program Files/Java/jre1.8.0_20/lib/i386;C:\ProgramData\Oracle\Java\javapath;C:\Windows\system32;C:\Windows;C:\Windows\System32\Wbem;C:\Windows\System32\WindowsPowerShell\v1.0\;C:\Program Files\Microsoft SQL Server\90\Tools\binn\;C:\Program Files\Microsoft SQL Server\100\Tools\Binn\;C:\Program Files\Microsoft SQL Server\100\DTS\Binn\;F:\软件\HadoopWorkPlat\eclipse;;.
14/10/09 10:19:21 INFO zookeeper.ZooKeeper: Client environment:java.io.tmpdir=C:\Users\ADMINI~1\AppData\Local\Temp\
14/10/09 10:19:21 INFO zookeeper.ZooKeeper: Client environment:java.compiler=<NA>
14/10/09 10:19:21 INFO zookeeper.ZooKeeper: Client environment:os.name=Windows 7
14/10/09 10:19:21 INFO zookeeper.ZooKeeper: Client environment:os.arch=x86
14/10/09 10:19:21 INFO zookeeper.ZooKeeper: Client environment:os.version=6.1
14/10/09 10:19:21 INFO zookeeper.ZooKeeper: Client environment:user.name=root
14/10/09 10:19:21 INFO zookeeper.ZooKeeper: Client environment:user.home=C:\Users\Administrator
14/10/09 10:19:21 INFO zookeeper.ZooKeeper: Client environment:user.dir=F:\软件\HadoopWorkPlat\workplace\HBASE_pro_01
14/10/09 10:19:21 INFO zookeeper.ZooKeeper: Initiating client connection, connectString=192.168.93.128:2181 sessionTimeout=180000 watcher=hconnection
14/10/09 10:19:22 INFO zookeeper.ClientCnxn: Opening socket connection to server /192.168.93.128:2181
14/10/09 10:19:22 INFO zookeeper.RecoverableZooKeeper: The identifier of this process is 12136@2013-1016-1614
14/10/09 10:19:31 WARN client.ZooKeeperSaslClient: SecurityException: java.lang.SecurityException: 无法定位登录配置 (Unable to locate a login configuration) occurred when trying to find JAAS configuration.
14/10/09 10:19:31 INFO client.ZooKeeperSaslClient: Client will not SASL-authenticate because the default JAAS configuration section 'Client' could not be found. If you are not using SASL, you may ignore this. On the other hand, if you expected SASL to work, please fix your JAAS configuration.
14/10/09 10:19:31 INFO zookeeper.ClientCnxn: Socket connection established to 192.168.93.128/192.168.93.128:2181, initiating session
14/10/09 10:19:31 INFO zookeeper.ClientCnxn: Session establishment complete on server 192.168.93.128/192.168.93.128:2181, sessionid = 0x148f29bbe720004, negotiated timeout = 180000
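For context, the HBase client setup I was following from the docs looks roughly like the sketch below. This is only my understanding, not my exact project code: the table name "test", column family "cf", and the row/qualifier/value strings are placeholders I made up; the ZooKeeper address is the one from the log above.

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.client.HTable;
import org.apache.hadoop.hbase.client.Put;
import org.apache.hadoop.hbase.util.Bytes;

public class HBaseConnectSketch {
    public static void main(String[] args) throws Exception {
        // Client-side HBase configuration; the quorum address matches the log above.
        Configuration conf = HBaseConfiguration.create();
        conf.set("hbase.zookeeper.quorum", "192.168.93.128");
        conf.set("hbase.zookeeper.property.clientPort", "2181");

        // Write a single cell into a placeholder table "test", column family "cf".
        HTable table = new HTable(conf, "test");
        Put put = new Put(Bytes.toBytes("row1"));
        put.add(Bytes.toBytes("cf"), Bytes.toBytes("q1"), Bytes.toBytes("value1"));
        table.put(put);
        table.close();
    }
}
```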
Code:
package org.apache.hadoop.examples;

import java.io.IOException;
import java.util.StringTokenizer;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;
import org.apache.hadoop.util.GenericOptionsParser;

public class WordCount {

    // Mapper: splits each input line into tokens and emits (word, 1).
    public static class TokenizerMapper
            extends Mapper<Object, Text, Text, IntWritable> {

        private final static IntWritable one = new IntWritable(1);
        private Text word = new Text();

        public void map(Object key, Text value, Context context)
                throws IOException, InterruptedException {
            StringTokenizer itr = new StringTokenizer(value.toString());
            while (itr.hasMoreTokens()) {
                word.set(itr.nextToken());
                context.write(word, one);
            }
        }
    }

    // Reducer (also used as combiner): sums the counts for each word.
    public static class IntSumReducer
            extends Reducer<Text, IntWritable, Text, IntWritable> {

        private IntWritable result = new IntWritable();

        public void reduce(Text key, Iterable<IntWritable> values, Context context)
                throws IOException, InterruptedException {
            int sum = 0;
            for (IntWritable val : values) {
                sum += val.get();
            }
            result.set(sum);
            context.write(key, result);
        }
    }

    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        conf.set("mapred.job.tracker", "192.168.93.128:9001");
        String[] ars = new String[]{" /user/root/testdir", "/user/root/newout"};
        String[] otherArgs = new GenericOptionsParser(conf, ars).getRemainingArgs();
        if (otherArgs.length != 2) {
            System.err.println("Usage: wordcount");
            System.exit(2);
        }
        Job job = new Job(conf, "word count");
        job.setJarByClass(WordCount.class);
        job.setMapperClass(TokenizerMapper.class);
        job.setCombinerClass(IntSumReducer.class);
        job.setReducerClass(IntSumReducer.class);
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(IntWritable.class);
        FileInputFormat.addInputPath(job, new Path(otherArgs[0]));
        FileOutputFormat.setOutputPath(job, new Path(otherArgs[1]));
        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}
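I'm not sure if it matters, but in other examples I've seen, a client submitting from Eclipse on Windows to a remote Hadoop 1.x cluster usually sets the NameNode address as well as the JobTracker. A sketch of what I mean is below; the class name RemoteClusterConf and the port 9000 are just my guesses, not values from my actual project, and the real port would have to match fs.default.name in the cluster's core-site.xml.

```java
import org.apache.hadoop.conf.Configuration;

// Hypothetical helper: builds the client-side configuration for a remote Hadoop 1.x cluster.
public class RemoteClusterConf {
    public static Configuration create() {
        Configuration conf = new Configuration();
        // JobTracker address, same value the WordCount code above uses.
        conf.set("mapred.job.tracker", "192.168.93.128:9001");
        // NameNode address so input/output Paths resolve against the cluster's HDFS;
        // port 9000 is an assumption and must match the cluster's core-site.xml.
        conf.set("fs.default.name", "hdfs://192.168.93.128:9000");
        return conf;
    }
}
```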