Hadoop DFSClient getFileInfo(): An existing connection was forcibly closed by the remote host

I am trying to copy a file from my local machine to HDFS on an HDP (Hortonworks Data Platform) cluster, but the call fails with "An existing connection was forcibly closed by the remote host" at org.apache.hadoop.hdfs.DFSClient.getFileInfo(DFSClient.java:2116). Do I need to change any Hadoop configuration on the cluster?

Sample program:

    String hdfsUrl = "hdfs://ambari-agent-1:8020";
    Configuration configuration = new Configuration();
    configuration.set("dfs.client.use.datanode.hostname", "true");
    FileSystem fs = FileSystem.get(new URI(hdfsUrl), configuration);
    Path srcPath = new Path("C:\\HadoopPlugin\\SampleData\\salesrecord.csv");
    Path destPath = new Path("hdfs://ambari-agent-1:8020/user/hadoop/testResult/");
    fs.copyFromLocalFile(srcPath, destPath);
    System.out.println("Copied file successfully");

Exception stack trace:

Exception in thread "main" java.io.IOException: Failed on local exception: java.io.IOException: An existing connection was forcibly closed by the remote host; Host Details : local host is: "DT01070442/192.168.44.7"; destination host is: "ambari-agent-1":8020; 
at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:776)
at org.apache.hadoop.ipc.Client.call(Client.java:1480)
at org.apache.hadoop.ipc.Client.call(Client.java:1407)
at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229)
at com.sun.proxy.$Proxy9.getFileInfo(Unknown Source)
at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getFileInfo(ClientNamenodeProtocolTranslatorPB.java:771)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:187)
at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
at com.sun.proxy.$Proxy10.getFileInfo(Unknown Source)
at org.apache.hadoop.hdfs.DFSClient.getFileInfo(DFSClient.java:2116)
at org.apache.hadoop.hdfs.DistributedFileSystem$22.doCall(DistributedFileSystem.java:1305)
at org.apache.hadoop.hdfs.DistributedFileSystem$22.doCall(DistributedFileSystem.java:1301)
at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1317)
at org.apache.hadoop.fs.FileSystem.exists(FileSystem.java:1424)
at org.apache.hadoop.fs.FileUtil.checkDest(FileUtil.java:496)
at org.apache.hadoop.fs.FileUtil.copy(FileUtil.java:348)
at org.apache.hadoop.fs.FileUtil.copy(FileUtil.java:338)
at org.apache.hadoop.fs.FileSystem.copyFromLocalFile(FileSystem.java:1965)
at org.apache.hadoop.fs.FileSystem.copyFromLocalFile(FileSystem.java:1933)
at org.apache.hadoop.fs.FileSystem.copyFromLocalFile(FileSystem.java:1898)
at practiceHadoop.CopyFile.main(CopyFile.java:19)
Caused by: java.io.IOException: An existing connection was forcibly closed by the remote host
at sun.nio.ch.SocketDispatcher.read0(Native Method)
at sun.nio.ch.SocketDispatcher.read(SocketDispatcher.java:43)
at sun.nio.ch.IOUtil.readIntoNativeBuffer(IOUtil.java:223)
at sun.nio.ch.IOUtil.read(IOUtil.java:197)
at sun.nio.ch.SocketChannelImpl.read(SocketChannelImpl.java:380)
at org.apache.hadoop.net.SocketInputStream$Reader.performIO(SocketInputStream.java:57)
at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:142)
at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:161)
at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:131)
at java.io.FilterInputStream.read(FilterInputStream.java:133)
at java.io.FilterInputStream.read(FilterInputStream.java:133)
at org.apache.hadoop.ipc.Client$Connection$PingInputStream.read(Client.java:515)
at java.io.BufferedInputStream.fill(BufferedInputStream.java:246)
at java.io.BufferedInputStream.read(BufferedInputStream.java:265)
at java.io.DataInputStream.readInt(DataInputStream.java:387)
at org.apache.hadoop.ipc.Client$Connection.receiveRpcResponse(Client.java:1079)
at org.apache.hadoop.ipc.Client$Connection.run(Client.java:974)

1 Answer

Answered by Soungno Kim

This generally means that the remote side closed the connection (usually by sending a TCP RST packet).

The likely causes are:

- Network problems: routers, firewalls, or the network link itself
- Server (host) problems: lack of resources, OS issues
- Service problems: the NameNode, JournalNode, or DataNode service
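
To narrow this down, a quick first check is whether the NameNode RPC port is even reachable from the client machine. Below is a minimal sketch of such a probe; the hostname ambari-agent-1 and port 8020 are taken from the question, so adjust them for your cluster.

    import java.io.IOException;
    import java.net.InetSocketAddress;
    import java.net.Socket;

    public class PortCheck {
        public static void main(String[] args) {
            // Host and port assumed from the question; replace with your NameNode
            String host = "ambari-agent-1";
            int port = 8020;
            try (Socket socket = new Socket()) {
                // Plain TCP connect with a 5-second timeout: if this fails, the problem
                // is the network/firewall or the NameNode process is not listening
                socket.connect(new InetSocketAddress(host, port), 5000);
                System.out.println("TCP connection to " + host + ":" + port + " succeeded");
            } catch (IOException e) {
                System.out.println("TCP connection to " + host + ":" + port + " failed: " + e);
            }
        }
    }

If the plain TCP connect fails, look at the network or firewall first; if it succeeds but the copy still fails, focus on the services.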

To find the cause of the problem, you should check the logs at each step.

First, check the Hadoop service logs.

The Hadoop NameNode log can be found at /var/hadoop/hadoop-hdfs-namenode-xxx.log (the location and file name may differ on your cluster).
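
If the NameNode log looks clean, a single metadata call from the client can confirm whether the client-to-NameNode RPC path works at all, independent of any DataNodes. This is only a sketch that reuses the hdfs://ambari-agent-1:8020 URI from the question:

    import java.net.URI;

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileStatus;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;

    public class NameNodeCheck {
        public static void main(String[] args) throws Exception {
            Configuration conf = new Configuration();
            // URI taken from the question; adjust for your cluster
            try (FileSystem fs = FileSystem.get(new URI("hdfs://ambari-agent-1:8020"), conf)) {
                // listStatus("/") is a single NameNode RPC. If it fails with the same
                // "connection was forcibly closed" error, the problem is between the
                // client and the NameNode, not in the copy itself
                for (FileStatus status : fs.listStatus(new Path("/"))) {
                    System.out.println(status.getPath());
                }
            }
        }
    }

If this listing fails with the same "forcibly closed" error, the problem is between the client and the NameNode (network, firewall, or the NameNode process itself); if it succeeds, look further at the copy path and the DataNodes.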

Good luck!