I am trying to save a file to Hadoop with Python 2.7. I searched on the internet and found some code to save a file to Hadoop, but it saves the entire folder (all files in the folder are saved to Hadoop). I need to save only a specific file.
Here is the link to the code that saves a folder to Hadoop: http://www.hadoopy.com/en/latest/tutorial.html#putting-data-on-hdfs
What I need is to save a particular file to Hadoop, e.g. abc.txt.
Here is my code:
import hadoopy

hdfs_path = 'hdfs://192.168.x.xxx:xxxx/video/py5'

def main():
    local_path = open('abc.txt').read()
    hadoopy.writetb(hdfs_path, local_path)

if __name__ == '__main__':
    main()
Here I am getting the error: need more than one value to unpack.
Any help would be appreciated.
The hadoopy.writetb function seems to expect an iterable of (key, value) pairs as its second argument, whereas your code passes it the raw file contents as a single string. Try: