import java.io.BufferedInputStream;
import java.io.ByteArrayInputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;
import java.nio.charset.StandardCharsets;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IOUtils;

public static void append(String content, String path, FileSystem fs) {
    InputStream in = null;
    OutputStream out = null;
    try {
        Path hdfsPath = new Path(path);
        // Note: dfs.support.append must already be enabled in the
        // configuration the passed-in FileSystem was created with;
        // setting it on a fresh Configuration here does not affect fs.
        Configuration conf = new Configuration();
        conf.setBoolean("dfs.support.append", true);
        fs.setReplication(hdfsPath, (short) 1);
        in = new BufferedInputStream(new ByteArrayInputStream(
                content.getBytes(StandardCharsets.UTF_8)));
        out = fs.append(hdfsPath);
        IOUtils.copyBytes(in, out, conf);
    } catch (IOException e) {
        logger.error("Append to " + path + " failed", e);
    } finally {
        // IOUtils.closeStream() null-checks and swallows IOException,
        // so no extra try/catch or explicit close() calls are needed.
        IOUtils.closeStream(in);
        IOUtils.closeStream(out);
    }
}
I have this problem too. A scheduled task runs repeatedly: the first run completes fine, but the second run hits the same bug as the OP. Even when the task has been running fine, if I stop it manually and start it again, the same bug appears. Once the bug occurs, it only goes away after waiting quite a long time or restarting Hadoop. I suspected the file wasn't being closed properly, but in fact everything is closed.
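The symptom described above (append works once, then fails on the next run until a long wait or a Hadoop restart) is commonly caused by the NameNode still holding the previous writer's lease on the file, even though the client streams were closed; the lease is only released after the soft-limit period expires or the NameNode restarts. A minimal sketch of a common workaround, assuming an HDFS `DistributedFileSystem` and a hypothetical helper name `waitForLease`: explicitly ask the NameNode to recover the lease and poll until it is free before calling `fs.append()`.

```java
import java.io.IOException;

import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.hdfs.DistributedFileSystem;

public class LeaseHelper {
    /**
     * Hypothetical helper: try to reclaim the lease on 'path' before appending.
     * recoverLease() returns true once the NameNode has released the old lease;
     * recovery is asynchronous, so we poll until it succeeds or we time out.
     */
    public static boolean waitForLease(FileSystem fs, Path path, long timeoutMs)
            throws IOException, InterruptedException {
        if (!(fs instanceof DistributedFileSystem)) {
            return true; // non-HDFS file systems have no lease to recover
        }
        DistributedFileSystem dfs = (DistributedFileSystem) fs;
        long deadline = System.currentTimeMillis() + timeoutMs;
        while (System.currentTimeMillis() < deadline) {
            if (dfs.recoverLease(path)) {
                return true;
            }
            Thread.sleep(1000); // give the NameNode time to finish recovery
        }
        return false;
    }
}
```

Calling this before each `fs.append()` (and reusing a single cached `FileSystem` instance for the whole job instead of opening and closing one per run) should avoid waiting for the lease soft limit. This is a sketch, not a drop-in fix; it needs a live HDFS cluster to run.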