Newbie here. I'm trying to run the DFS datastore code from Nathan Marz's book Big Data, which uses Pail, and connect to HDFS running on a VM. I've also tried replacing hdfs with file in the path, with no luck. What am I doing wrong? Any help appreciated.
import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.junit.Before;
import org.junit.Test;

import com.backtype.hadoop.pail.Pail;
import com.backtype.hadoop.pail.Pail.TypedRecordOutputStream;

public class AppTest {
    private App app = new App();
    private String path = "hdfs:////192.168.0.101:8080/mypail";

    @Before
    public void init() throws IllegalArgumentException, IOException {
        FileSystem fs = FileSystem.get(new Configuration());
        fs.delete(new Path(path), true);
    }

    @Test
    public void testAppAccess() throws IOException {
        Pail pail = Pail.create(path);
        TypedRecordOutputStream os = pail.openWrite();
        os.writeObject(new byte[] {1, 2, 3});
        os.writeObject(new byte[] {1, 2, 3, 4});
        os.writeObject(new byte[] {1, 2, 3, 4, 5});
        os.close();
    }
}
I get this error:
java.lang.IllegalArgumentException: Wrong FS: hdfs:/192.168.0.101:8080/mypail, expected: file:///
at org.apache.hadoop.fs.FileSystem.checkPath(FileSystem.java:645)
at org.apache.hadoop.fs.RawLocalFileSystem.pathToFile(RawLocalFileSystem.java:80)
at org.apache.hadoop.fs.RawLocalFileSystem.deprecatedGetFileStatus(RawLocalFileSystem.java:529)
at org.apache.hadoop.fs.RawLocalFileSystem.getFileLinkStatusInternal(RawLocalFileSystem.java:747)
When I replace hdfs with file (i.e. file:///), I get:
java.io.IOException: Mkdirs failed to create file:/192.168.0.101:8080/mypail (exists=false, cwd=file:/Users/joshi/git/projectcsr/projectcsr)
at org.apache.hadoop.fs.ChecksumFileSystem.create(ChecksumFileSystem.java:442)
at
I came across the same problem and solved it! You need to add your core-site.xml to the Hadoop Configuration object, so that FileSystem.get() knows the path belongs to HDFS rather than the local filesystem. Alternatively, you can achieve the same thing programmatically by setting the fs.defaultFS property on the Configuration object.

Source: http://opensourceconnections.com/blog/2013/03/24/hdfs-debugging-wrong-fs-expected-file-exception/