HDFS file upload permission problem
I deployed a single-node setup myself. Here is my code:

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileStatus;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class CopyFile {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        conf.addResource(new Path("home/zff/hadoop/conf/hadoop-default.xml"));
        conf.addResource(new Path("home/zff/hadoop/conf/hadoop-site.xml"));
        conf.addResource(new Path("home/zff/hadoop/conf/core-site.xml"));
        FileSystem hdfs = FileSystem.get(conf);
        Path src = new Path("/home/1.txt");
        System.out.println(src);
        Path dst = new Path("//");
        hdfs.copyFromLocalFile(src, dst);
        System.out.println("Upload to " + conf.get("fs.default.name"));
        FileStatus[] files = hdfs.listStatus(dst);
        for (FileStatus file : files) {
            System.out.println(file.getPath());
        }
    }
}
The result of running it:
Exception in thread "main" java.io.FileNotFoundException: /1.txt (Permission denied)
at java.io.FileOutputStream.open(Native Method)
at java.io.FileOutputStream.<init>(FileOutputStream.java:179)
at org.apache.hadoop.fs.RawLocalFileSystem$LocalFSFileOutputStream.<init>(RawLocalFileSystem.java:188)
at org.apache.hadoop.fs.RawLocalFileSystem$LocalFSFileOutputStream.<init>(RawLocalFileSystem.java:184)
at org.apache.hadoop.fs.RawLocalFileSystem.create(RawLocalFileSystem.java:255)
at org.apache.hadoop.fs.RawLocalFileSystem.create(RawLocalFileSystem.java:236)
at org.apache.hadoop.fs.ChecksumFileSystem$ChecksumFSOutputSummer.<init>(ChecksumFileSystem.java:335)
at org.apache.hadoop.fs.ChecksumFileSystem.create(ChecksumFileSystem.java:381)
at org.apache.hadoop.fs.ChecksumFileSystem.create(ChecksumFileSystem.java:364)
at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:555)
at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:536)
at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:443)
at org.apache.hadoop.fs.FileUtil.copy(FileUtil.java:229)
at org.apache.hadoop.fs.FileUtil.copy(FileUtil.java:163)
at org.apache.hadoop.fs.LocalFileSystem.copyFromLocalFile(LocalFileSystem.java:67)
at org.apache.hadoop.fs.FileSystem.copyFromLocalFile(FileSystem.java:1143)
at com.zff.CopyFile.main(CopyFile.java:18)

The file at Path src = new Path("/home/1.txt"); has no access permission; running a chmod on it fixes the problem. If you are developing with Eclipse on Windows, you can change it to Path src = new Path("D:/1.txt"); the file must of course exist on the D: drive.
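For reference, a minimal shell sketch of that chmod fix (my example, not from the thread; pick whatever mode your setup actually needs):

$ chmod 644 /home/1.txt    # assumption: the local source file really is /home/1.txt
$ ls -l /home/1.txt        # verify the new permissions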
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IOUtils;
import org.apache.hadoop.util.Progressable;

import java.io.BufferedInputStream;
import java.io.FileInputStream;
import java.io.InputStream;
import java.io.OutputStream;
import java.net.URI;

public class FileCopyWithProgress {
    public static void main(String[] args) throws Exception {
        String localSrc = "D:/core-site.xml";
        String dst = "hdfs://121.1.253.251:9000/out/core-site.xml";

        InputStream in = new BufferedInputStream(new FileInputStream(localSrc));

        Configuration conf = new Configuration();
        FileSystem fs = FileSystem.get(URI.create(dst), conf);
        // Print a dot for every progress callback while the file is written.
        OutputStream out = fs.create(new Path(dst), new Progressable() {
            @Override
            public void progress() {
                System.out.print(".");
            }
        });

        // The final 'true' closes both streams when the copy finishes.
        IOUtils.copyBytes(in, out, 4096, true);
    }
}

You can refer to the code above to copy the file.
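If you want to confirm the upload actually landed, here is a minimal self-contained sketch (my addition, not from the thread) that lists the destination directory, reusing the NameNode URI from the code above:

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileStatus;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

import java.net.URI;

public class ListUploaded {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        // Same NameNode address as in FileCopyWithProgress above.
        FileSystem fs = FileSystem.get(URI.create("hdfs://121.1.253.251:9000"), conf);
        for (FileStatus file : fs.listStatus(new Path("/out"))) {
            System.out.println(file.getPath());
        }
        fs.close();
    }
}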
Reply to #5 北冰洋的鱼儿:
I am on Ubuntu, and chmod still does not work.

The format needs to look something like this:
$ hadoop fs -chmod 777 /home/1.txt

If that still does not work, go to the server and edit Hadoop's configuration file conf/hdfs-site.xml: find the dfs.permissions property and change its value to false:
<property>
    <name>dfs.permissions</name>
    <value>false</value>
</property>
Remember to restart Hadoop after making the change.
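On a single-node deployment of this vintage, restarting usually means rerunning the stock scripts. A minimal sketch, assuming the standard scripts under $HADOOP_HOME/bin:

$ bin/stop-all.sh     # stop the NameNode, DataNode, JobTracker and TaskTracker
$ bin/start-all.sh    # start them again so the new hdfs-site.xml is picked up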
Another approach is to set the property in the job's configuration in code when you submit it:

Configuration conf = new Configuration();
conf.set("dfs.permissions", "false");
You can try each of the approaches I listed above.