The right way to use BufferedOutputStream to write a large number of different small files


I need to write a large number of different small files (about 30 KB per file). Below is my Java code:

 for(every file){
    File destFile = new File(fileName);
    try {
        FileOutputStream fos = new FileOutputStream(destFile);
        BufferedOutputStream bos = new BufferedOutputStream(fos);
        bos.write(result.getBytes("UTF-8"));

    } catch (FileNotFoundException e) {
        // TODO Auto-generated catch block
        e.printStackTrace();
    } catch (UnsupportedEncodingException e) {
        // TODO Auto-generated catch block
        e.printStackTrace();
    } catch (IOException e) {
        // TODO Auto-generated catch block
        e.printStackTrace();
    }
}

Of course, for every file I have to create a new File object. But is it necessary to create a new FileOutputStream and BufferedOutputStream for every file? Is there a more efficient way to write a large number of small files?


There are 2 answers

elbuild answered:

You can't share the FileOutputStream, as Sotirios says. The problem with your code is that you open a new FileOutputStream on every iteration without ever closing it, so your program may eventually hit the limit on open file descriptors.

On Linux, use:

cat /proc/sys/fs/file-max

to read the system-wide maximum number of open file descriptors,

and:

ulimit -n unlimited

to raise the per-process limit (be careful and investigate the other ulimit options).
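For reference, here is a minimal sketch of the loop with try-with-resources, so every stream is closed even when write() throws. SmallFileWriter, fileNames, and contents are hypothetical names, not taken from the question:

    import java.io.BufferedOutputStream;
    import java.io.FileOutputStream;
    import java.io.IOException;
    import java.nio.charset.StandardCharsets;
    import java.util.List;

    public class SmallFileWriter {
        // Writes contents.get(i) to the file named fileNames.get(i).
        // try-with-resources closes bos (and the wrapped fos) after each file,
        // so the process never accumulates open file descriptors.
        static void writeAll(List<String> fileNames, List<String> contents) {
            for (int i = 0; i < fileNames.size(); i++) {
                try (FileOutputStream fos = new FileOutputStream(fileNames.get(i));
                     BufferedOutputStream bos = new BufferedOutputStream(fos)) {
                    bos.write(contents.get(i).getBytes(StandardCharsets.UTF_8));
                } catch (IOException e) {
                    e.printStackTrace();
                }
            }
        }
    }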

t_ozawa answered:

No. If you wrap a FileOutputStream in a BufferedOutputStream, you must create both instances for every file.

But if you are on Java 7 or later, you can use Files#write(Path path, byte[] bytes, OpenOption... options) without creating either stream yourself. This method was introduced in Java 7.
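
A short usage sketch, assuming fileName and result are the same variables as in the question:

    import java.io.IOException;
    import java.nio.charset.StandardCharsets;
    import java.nio.file.Files;
    import java.nio.file.Paths;

    // Files.write opens the file, writes the bytes, and closes it in one call.
    try {
        Files.write(Paths.get(fileName), result.getBytes(StandardCharsets.UTF_8));
    } catch (IOException e) {
        e.printStackTrace();
    }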