What interfaces should a file system provide to be supported by Spark?


I have developed a distributed file system that provides the same interfaces as a standard Linux file system. Now I want it to be supported by Spark, meaning Spark can read files from it and save files to it, just as it does with HDFS. Since I am not familiar with Spark, my question is: what interfaces should I provide to Spark, or what requirements must the file system meet, for Spark to operate on it successfully?
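For background (this is the standard integration path, not something stated in the question): Spark performs storage I/O through Hadoop's `org.apache.hadoop.fs.FileSystem` abstraction, so a custom file system is usually exposed by subclassing that class and registering the implementation under a URI scheme via the Hadoop configuration key `fs.<scheme>.impl`. The sketch below is a self-contained stand-in that only mirrors the core method set such a subclass must implement; it uses simplified placeholder types (`String` paths, plain streams) so it compiles without Hadoop on the classpath — the real signatures use `org.apache.hadoop.fs.Path`, `FSDataInputStream`, and so on.

```java
// Stand-alone mirror of the core abstract methods of
// org.apache.hadoop.fs.FileSystem -- the class a custom file system
// subclasses so that Spark (which delegates storage I/O to Hadoop's
// FileSystem abstraction) can read from and write to it.
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;
import java.net.URI;
import java.util.Arrays;

interface MinimalSparkFileSystem {
    URI getUri();                                 // scheme + authority, e.g. myfs://host ("myfs" is a hypothetical scheme)
    InputStream open(String path, int bufferSize) throws IOException;        // read a file
    OutputStream create(String path, boolean overwrite) throws IOException;  // write a new file
    boolean rename(String src, String dst) throws IOException;  // job/task output commit relies on atomic-ish rename
    boolean delete(String path, boolean recursive) throws IOException;
    String[] listStatus(String dir) throws IOException;         // directory listings drive input splits
    boolean mkdirs(String dir) throws IOException;
    long getFileStatus(String path) throws IOException;         // placeholder: file length/metadata feed partitioning
    void setWorkingDirectory(String dir);
    String getWorkingDirectory();
}

public class Main {
    public static void main(String[] args) {
        // Print the method set so it is easy to see what must be implemented.
        Arrays.stream(MinimalSparkFileSystem.class.getMethods())
              .map(m -> m.getName())
              .sorted()
              .forEach(System.out::println);
    }
}
```

Once such a subclass is registered (e.g. `spark.hadoop.fs.myfs.impl=com.example.MyFs`, where `myfs` and `com.example.MyFs` are hypothetical names), paths like `myfs://host/dir` can be passed unchanged to `sc.textFile` or to DataFrame readers and writers. Separately, since the file system already exposes standard Linux interfaces, if it can be POSIX-mounted at the same path on every node, Spark can also use it through plain `file://` paths with no code at all.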


There are 0 answers