After installing Cloudera HDC on Fedora 25, I can create folders, but not files, nor can I copy data from my local file system to HDFS.
This is the command I use:
sudo -u hdfs hadoop fs -copyFromLocal /home/mohammed/Documents/bbc.txt /kareem/corpora/
and this is what I get from the terminal:
SLF4J: Failed to load class "org.slf4j.impl.StaticLoggerBinder".
SLF4J: Defaulting to no-operation (NOP) logger implementation
SLF4J: See http://www.slf4j.org/codes.html#StaticLoggerBinder for further details.
copyFromLocal: '/home/mohammed/Documents/bbc.txt': No such file or directory
How can I overcome this problem?
Your kind help is highly appreciated!
The problem is that your local path `/home/mohammed` is not accessible to the `hdfs` user you're sudo-ing as to run the whole command. Since the local Linux user `hdfs` cannot enter `/home/mohammed`, the command throws a `No such file or directory` error and exits because it is unable to locate or read the provided file.

In most packaged HDFS installations, the `hdfs` user is typically the super-user of the distributed filesystem, and administrative commands are typically run as that user. However, work over data can and should be done as regular users, after using the `hdfs` user to provision permissions and ownership for those regular users.

For your case, you can do the following as your `mohammed` user, if this account also has sudo privileges:
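A sketch of that provisioning step, assuming the target directory is `/kareem/corpora` and your local account is `mohammed` (adjust both to match your setup):

```shell
# Create the target directory as the HDFS super-user, if it does not exist yet
sudo -u hdfs hadoop fs -mkdir -p /kareem/corpora

# Hand ownership of the tree to your regular user so it can write there
sudo -u hdfs hadoop fs -chown -R mohammed /kareem

# Now upload as yourself: your own account can read /home/mohammed,
# and you now own the HDFS destination
hadoop fs -copyFromLocal /home/mohammed/Documents/bbc.txt /kareem/corpora/
```

After this one-time setup, day-to-day uploads into `/kareem` need no `sudo -u hdfs` at all.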