
HDFS test errors

xukunddp posted on 2013-10-25 10:45:56

Cluster: 1 namenode + 1 secondary namenode + 2 datanodes
// Upload an input stream to HDFS. hdfsUser, hdfsSn, hdfsUrl,
// hdfsFileDir, fileName and itemStream are defined elsewhere in the class.
Configuration conf = new Configuration();
conf.set("hadoop.job.ugi", hdfsUser + "," + hdfsSn);

String hdfsFile = hdfsUrl + hdfsFileDir + fileName;
FileSystem fs = FileSystem.get(URI.create(hdfsFile), conf);

log.debug("upload file to dfs ==================" + hdfsFile);
BufferedInputStream in = new BufferedInputStream(itemStream);
FSDataOutputStream out = fs.create(new Path(hdfsFile));
try {
    byte[] buffer = new byte[400];
    int length;
    while ((length = in.read(buffer)) > 0) {
        out.write(buffer, 0, length);
    }
} finally {
    // Close both streams even if the copy fails, so handles are not leaked.
    in.close();
    out.close();
}
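The copy loop above is the standard buffered-copy pattern. As a sanity check of that logic in isolation, here is a self-contained, stdlib-only sketch (no Hadoop dependency; the `copyStream` helper and the 4 KB buffer size are my own choices, not from the original post):

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;
import java.util.Arrays;

public class StreamCopy {
    // Copy all bytes from in to out with a fixed-size buffer,
    // mirroring the upload loop in the post (buffer enlarged to 4 KB).
    static long copyStream(InputStream in, OutputStream out) throws IOException {
        byte[] buffer = new byte[4096];
        long total = 0;
        int length;
        while ((length = in.read(buffer)) > 0) {
            out.write(buffer, 0, length);
            total += length;
        }
        out.flush();
        return total;
    }

    public static void main(String[] args) throws IOException {
        byte[] data = "hello hdfs".getBytes("UTF-8");
        ByteArrayOutputStream sink = new ByteArrayOutputStream();
        long copied = copyStream(new ByteArrayInputStream(data), sink);
        // Verify every byte arrived intact.
        System.out.println(copied == data.length
                && Arrays.equals(sink.toByteArray(), data));
    }
}
```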
Test scenarios:
1. On a datanode host, the datanode process is killed while the tasktracker keeps running.
2. On a datanode host, both the datanode and tasktracker processes are killed.
3. The namenode is down.
The first case should not fail: the client should fail over to the second datanode. Yet the backend reports errors in all three cases.
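For what it's worth, how long the client retries before the ConnectException below surfaces is governed by the IPC client's connection-retry setting toward the namenode. A hedged sketch of the relevant client-side core-site.xml entry (the value shown is the usual default, not a recommendation):

```xml
<!-- core-site.xml (client side): how many times the IPC client retries
     the initial connection to the namenode before giving up and throwing
     the ConnectException seen in the trace below. 10 is the usual default. -->
<property>
  <name>ipc.client.connect.max.retries</name>
  <value>10</value>
</property>
```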
DEBUG [http-8080-7] (HadoopFileUtil.java:93) - upload file to dfs ==================hdfs://dev6.dev.ulechina..com/user/dfs/file1/1273459348316_nssdbm3.chk
java.net.ConnectException: Call to dev6.dev.ulechina.xxx.com/192.168.112.20:8020 failed on connection exception: java.net.ConnectException: Connection refused: no further information
    at org.apache.hadoop.ipc.Client.wrapException(Client.java:767)
    at org.apache.hadoop.ipc.Client.call(Client.java:743)
    at org.apache.hadoop.ipc.RPC$Invoker.invoke(RPC.java:220)
    at $Proxy5.delete(Unknown Source)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    at java.lang.reflect.Method.invoke(Method.java:597)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:82)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:59)
    at $Proxy5.delete(Unknown Source)
    at org.apache.hadoop.hdfs.DFSClient.delete(DFSClient.java:582)
    at org.apache.hadoop.hdfs.DistributedFileSystem.delete(DistributedFileSystem.java:227)
    at org.claros.intouch.webmail.services.DeleteAllDFSAttachmentsService.DeleteDFSFile(DeleteAllDFSAttachmentsService.java:121)
    at org.claros.intouch.webmail.services.DeleteAllDFSAttachmentsService.deleteAll(DeleteAllDFSAttachmentsService.java:100)
    at org.claros.intouch.webmail.services.DeleteAllDFSAttachmentsService.doGet(DeleteAllDFSAttachmentsService.java:63)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:617)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:717)
    at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:290)
    at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
    at org.claros.intouch.common.filters.GZIPFilter.doFilter(GZIPFilter.java:26)
    at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:235)
    at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
    at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:233)
    at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:191)
    at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:128)
    at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:102)
    at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:109)
    at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:293)
    at org.apache.coyote.http11.Http11Processor.process(Http11Processor.java:849)
    at org.apache.coyote.http11.Http11Protocol$Http11ConnectionHandler.process(Http11Protocol.java:583)
    at org.apache.tomcat.util.net.JIoEndpoint$Worker.run(JIoEndpoint.java:454)
    at java.lang.Thread.run(Thread.java:619)
Caused by: java.net.ConnectException: Connection refused: no further information
    at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
    at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:574)
    at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
    at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:404)
    at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:304)
    at org.apache.hadoop.ipc.Client$Connection.access$1700(Client.java:176)
    at org.apache.hadoop.ipc.Client.getConnection(Client.java:860)
    at org.apache.hadoop.ipc.Client.call(Client.java:720)
    ... 31 more

xiaolongwu1987 replied on 2013-10-25 10:45:56
Reply to #1 hhh2100

Port 8020 is the namenode's default RPC port, so it is the namenode that is refusing your connection; there is no error here about failing to reach a datanode.
Check whether your namenode is down.
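One quick way to confirm the diagnosis is a plain TCP connect test against the namenode's RPC port from the client host. A minimal sketch (the `isReachable` helper is mine; in the poster's setup the target would be `dev6.dev.ulechina.xxx.com:8020`, while the demo below deliberately probes a port that is almost certainly closed):

```java
import java.io.IOException;
import java.net.InetSocketAddress;
import java.net.Socket;

public class PortCheck {
    // Returns true if a TCP connection to host:port succeeds within timeoutMs.
    static boolean isReachable(String host, int port, int timeoutMs) {
        try (Socket s = new Socket()) {
            s.connect(new InetSocketAddress(host, port), timeoutMs);
            return true;
        } catch (IOException e) {
            // Connection refused or timed out: same symptom as the stack trace.
            return false;
        }
    }

    public static void main(String[] args) {
        // Port 1 on localhost is almost certainly closed, so this prints false;
        // point it at your namenode host and port 8020 for the real check.
        System.out.println(isReachable("localhost", 1, 500));
    }
}
```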