My code looks like this:
from pyhdfs import HdfsClient

hdfs_client = HdfsClient(hosts='10.1.103.49:50070')
root_files = hdfs_client.listdir('/')
print(root_files)
hdfs_client.copy_from_local('./Readme.md', '/image/Readme.md')
It prints root_files successfully, but copy_from_local fails. The error is:
HTTPConnectionPool(host='ubuntu-ms-7d25', port=50075): Max retries exceeded with url: /webhdfs/v1/image/Readme.md?op=CREATE&user.name=zzz&namenoderpcaddress=10.1.103.49:9000&overwrite=false&user.name=zzz (Caused by NameResolutionError("<urllib3.connection.HTTPConnection object at 0x7fe3d8c157b0>: Failed to resolve 'ubuntu-ms-7d25' ([Errno -2] Name or service not known)"))
I don't even know where "ubuntu-ms-7d25" comes from; my Hadoop config files don't contain this hostname.
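From what I can tell, WebHDFS handles CREATE in two steps: the namenode first answers with a 307 redirect pointing at a datanode, and the client then uploads the file data to that datanode. Here is a minimal sketch I think would confirm where the hostname comes from (assuming the requests library is installed; the address, path, and user.name=zzz are taken from the code and error above):

# Ask the namenode for the CREATE redirect, but don't follow it,
# so we can inspect which datanode hostname it hands back.
import requests

resp = requests.put(
    'http://10.1.103.49:50070/webhdfs/v1/image/Readme.md'
    '?op=CREATE&user.name=zzz&overwrite=false',
    allow_redirects=False,  # stop at the 307 instead of following it
)
print(resp.status_code)              # expected: 307
print(resp.headers.get('Location'))  # the datanode address, e.g. ubuntu-ms-7d25:50075

If the Location header shows ubuntu-ms-7d25:50075, then the hostname is being advertised by the datanode itself, not by my client or config files.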
How can I solve this problem?