
Summary of common errors when programming against HDFS with Hadoop

admin · posted 2014-02-24 15:42:10




Keep the following questions in mind while reading:

1. Under what circumstances does the client fail to connect to the cluster and report a connection (port) error?
2. Where is HDFS permission checking configured? How, and through which parameter?
3. Why can the error persist even after the configuration file has been changed?



When operating on HDFS programmatically, we may run into the following errors.
Error 1: permission denied
Exception in thread "main" org.apache.hadoop.security.AccessControlException: org.apache.hadoop.security.AccessControlException: Permission denied: user=, access=WRITE, inode="":root:supergroup:rwxr-xr-x
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:525)
    at org.apache.hadoop.ipc.RemoteException.instantiateException(RemoteException.java:95)
    at org.apache.hadoop.ipc.RemoteException.unwrapRemoteException(RemoteException.java:57)
    at org.apache.hadoop.hdfs.DFSClient.mkdirs(DFSClient.java:1216)
    at org.apache.hadoop.hdfs.DistributedFileSystem.mkdirs(DistributedFileSystem.java:321)
    at org.apache.hadoop.fs.FileSystem.mkdirs(FileSystem.java:1126)
    at hdfs.hdoopapi.main(hdoopapi.java:19)
Caused by: org.apache.hadoop.ipc.RemoteException: org.apache.hadoop.security.AccessControlException: Permission denied: user=hyj, access=WRITE, inode="":root:supergroup:rwxr-xr-x
    at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:199)
    at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:180)
    at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:128)
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkPermission(FSNamesystem.java:5214)
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkAncestorAccess(FSNamesystem.java:5188)
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.mkdirsInternal(FSNamesystem.java:2060)
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.mkdirs(FSNamesystem.java:2029)
    at org.apache.hadoop.hdfs.server.namenode.NameNode.mkdirs(NameNode.java:817)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:563)
    at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1388)
    at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1384)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:415)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1121)
    at org.apache.hadoop.ipc.Server$Handler.run(Server.java:1382)
    at org.apache.hadoop.ipc.Client.call(Client.java:1070)
    at org.apache.hadoop.ipc.RPC$Invoker.invoke(RPC.java:225)
    at sun.proxy.$Proxy1.mkdirs(Unknown Source)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:601)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:82)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:59)
    at sun.proxy.$Proxy1.mkdirs(Unknown Source)
    at org.apache.hadoop.hdfs.DFSClient.mkdirs(DFSClient.java:1214)
    ... 3 more
Answer: this error occurs because HDFS permission checking is enabled. The check is controlled by the dfs.permissions property in hdfs-site.xml; setting its value to false disables it.
Friendly reminder: after changing the configuration file, restart the HDFS daemons (stop-all.sh then start-all.sh), or reboot, so the change takes effect.
For example, an hdfs-site.xml like the following, with dfs.permissions set to true, produces the error above:
<configuration>
  <property>
    <name>dfs.replication</name>
    <value>1</value>
  </property>
  <property>
    <name>hadoop.tmp.dir</name>
    <value>/home/work/hadoop_tmp</value>
  </property>
  <property>
    <name>dfs.permissions</name>
    <value>true</value>
  </property>
</configuration>
Another situation also produces this error: a configuration like the one below, which omits dfs.permissions entirely. The property defaults to true, so permission checking is still enabled.
<configuration>
  <property>
    <name>dfs.replication</name>
    <value>1</value>
  </property>
  <property>
    <name>hadoop.tmp.dir</name>
    <value>/home/work/hadoop_tmp</value>
  </property>
</configuration>
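With permission checking disabled, the same hdfs-site.xml would read as follows (replication and hadoop.tmp.dir kept as in the examples above). Note that disabling the check is a convenience for development and test clusters, not a production fix:

```
<configuration>
  <property>
    <name>dfs.replication</name>
    <value>1</value>
  </property>
  <property>
    <name>hadoop.tmp.dir</name>
    <value>/home/work/hadoop_tmp</value>
  </property>
  <property>
    <!-- disable HDFS permission checking (Hadoop 1.x name; Hadoop 2.x renames this to dfs.permissions.enabled) -->
    <name>dfs.permissions</name>
    <value>false</value>
  </property>
</configuration>
```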
Error 2: connection refused (port problem)
14/02/24 14:10:45 INFO ipc.Client: Retrying connect to server: /192.168.159.10:9000. Already tried 0 time(s).
14/02/24 14:10:47 INFO ipc.Client: Retrying connect to server: /192.168.159.10:9000. Already tried 1 time(s).
14/02/24 14:10:49 INFO ipc.Client: Retrying connect to server: /192.168.159.10:9000. Already tried 2 time(s).
14/02/24 14:10:51 INFO ipc.Client: Retrying connect to server: /192.168.159.10:9000. Already tried 3 time(s).
14/02/24 14:10:53 INFO ipc.Client: Retrying connect to server: /192.168.159.10:9000. Already tried 4 time(s).
14/02/24 14:10:55 INFO ipc.Client: Retrying connect to server: /192.168.159.10:9000. Already tried 5 time(s).
14/02/24 14:10:57 INFO ipc.Client: Retrying connect to server: /192.168.159.10:9000. Already tried 6 time(s).
14/02/24 14:10:59 INFO ipc.Client: Retrying connect to server: /192.168.159.10:9000. Already tried 7 time(s).
14/02/24 14:11:01 INFO ipc.Client: Retrying connect to server: /192.168.159.10:9000. Already tried 8 time(s).
14/02/24 14:11:03 INFO ipc.Client: Retrying connect to server: /192.168.159.10:9000. Already tried 9 time(s).
Exception in thread "main" java.net.ConnectException: Call to aboutyun/192.168.159.10:9000 failed on connection exception: java.net.ConnectException: Connection refused: no further information
    at org.apache.hadoop.ipc.Client.wrapException(Client.java:1099)
    at org.apache.hadoop.ipc.Client.call(Client.java:1075)
    at org.apache.hadoop.ipc.RPC$Invoker.invoke(RPC.java:225)
    at sun.proxy.$Proxy1.getProtocolVersion(Unknown Source)
    at org.apache.hadoop.ipc.RPC.getProxy(RPC.java:396)
    at org.apache.hadoop.ipc.RPC.getProxy(RPC.java:379)
    at org.apache.hadoop.hdfs.DFSClient.createRPCNamenode(DFSClient.java:119)
    at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:238)
    at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:203)
    at org.apache.hadoop.hdfs.DistributedFileSystem.initialize(DistributedFileSystem.java:89)
    at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:1386)
    at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:66)
    at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:1404)
    at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:254)
    at hdfs.hdoopapi.getFileSystem(hdoopapi.java:29)
    at hdfs.hdoopapi.main(hdoopapi.java:17)
Caused by: java.net.ConnectException: Connection refused: no further information
    at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
    at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:692)
    at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
    at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:489)
    at org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:434)
    at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:560)
    at org.apache.hadoop.ipc.Client$Connection.access$2000(Client.java:184)
    at org.apache.hadoop.ipc.Client.getConnection(Client.java:1206)
    at org.apache.hadoop.ipc.Client.call(Client.java:1050)
    ... 14 more
The log above also indicates a failure. Many answers suggest editing configuration files, but this error usually has one of two causes:
1. There is a genuine network problem between the client and the cluster (wrong address or port in fs.default.name, or a firewall blocking the connection).
2. The cluster has not been started at all (this is the most common case, and the most important one to check).
Try running start-all.sh; that alone may solve the problem.
Friendly reminder again: after changing a configuration file, restart the daemons (or reboot) so the change takes effect.
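Before blaming the configuration, it helps to confirm whether anything is actually listening on the NameNode's RPC port. A minimal JDK-only sketch (the address and port here are taken from the log above; adjust them to your own fs.default.name value):

```java
import java.io.IOException;
import java.net.InetSocketAddress;
import java.net.Socket;

public class PortCheck {
    /** Returns true if a TCP connection to host:port succeeds within timeoutMs. */
    static boolean isReachable(String host, int port, int timeoutMs) {
        try (Socket s = new Socket()) {
            s.connect(new InetSocketAddress(host, port), timeoutMs);
            return true;  // something is listening on that port
        } catch (IOException e) {
            return false; // refused or timed out -- the NameNode is likely not running
        }
    }

    public static void main(String[] args) {
        // Replace with your NameNode address and port (must match fs.default.name).
        System.out.println(isReachable("192.168.159.10", 9000, 2000)
                ? "NameNode port is open"
                : "connection refused/timed out -- run start-all.sh and retry");
    }
}
```

If this reports the port closed while the cluster is supposedly running, check with jps on the NameNode host that the NameNode process is actually up.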






From group: Hadoop Technology Group
