I have successfully read items from the database using Spring Batch.
Now I have to write them to an XML file, but here is the catch: the XML file size cannot be more than 100mb. If it is, the remaining content should be written to another file.
Is there any way to limit the file size when configuring the writer step, or will I have to implement a custom writer? Any suggestions?
I don't believe there is a way of doing this using Spring Batch as it is.
The solution you should implement depends on where the 100mb max file size requirement comes from.
Is it a system infrastructure requirement, or a requirement of the file-receiving system?
You have 3 options:
As you stated, you can create a custom writer that splits the file when it reaches the max size. This is quite nasty, as you will essentially have to write a completely new writer which will then have to be maintained by you, and the actual implementation would not be very nice either (you have to write to the file, check the size, and then, if it exceeds 100mb, remove the content from the file again).
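For illustration, a rough skeleton of what such a custom writer might involve. This is only a sketch: it assumes Spring Batch 4's List-based ItemWriter contract, that items arrive already serialized as XML fragment strings, and a hypothetical output-N.xml naming scheme. It rolls to a new file before writing rather than writing and removing again, and it still ignores root-element handling and restartability, which is part of why this option is unattractive:

```java
import java.io.BufferedWriter;
import java.io.IOException;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.util.List;

import org.springframework.batch.item.ItemWriter;

// Illustrative sketch only: switches to a new file when the next fragment
// would push the current file past the limit.
public class SizeLimitingXmlWriter implements ItemWriter<String> {

    private static final long FILE_SIZE_LIMIT = 100L * 1024 * 1024; // 100 MB

    private int fileIndex = 0;
    private long bytesWritten = 0;
    private BufferedWriter writer;

    @Override
    public void write(List<? extends String> items) throws Exception {
        for (String xmlFragment : items) {
            byte[] bytes = xmlFragment.getBytes(StandardCharsets.UTF_8);
            if (writer == null || bytesWritten + bytes.length > FILE_SIZE_LIMIT) {
                rollToNextFile();
            }
            writer.write(xmlFragment);
            bytesWritten += bytes.length;
        }
        if (writer != null) {
            writer.flush();
        }
    }

    private void rollToNextFile() throws IOException {
        if (writer != null) {
            writer.close();
        }
        // Hypothetical naming scheme: output-0.xml, output-1.xml, ...
        Path path = Paths.get("output-" + fileIndex++ + ".xml");
        writer = Files.newBufferedWriter(path, StandardCharsets.UTF_8);
        bytesWritten = 0;
    }
}
```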
Guess how many records fit in a file based on how big each record is. For example, if you assume that one record is roughly 1mb in size, then each file can hold a maximum of 100 records. This is quite a risky solution as it does not guarantee that each file stays under 100mb, but I believe it can be achieved using standard Spring Batch.
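For example, with standard Spring Batch you could wrap a StaxEventItemWriter in a MultiResourceItemWriter and cap the number of items per file. In this sketch the Record item type, the output path and the 100-records-per-file figure are placeholders based on the 1mb-per-record guess; also note the limit is only checked at chunk boundaries, so keep the chunk size small relative to it:

```java
import org.springframework.batch.item.file.MultiResourceItemWriter;
import org.springframework.batch.item.xml.StaxEventItemWriter;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.core.io.FileSystemResource;
import org.springframework.oxm.xstream.XStreamMarshaller;

@Configuration
public class SplitWriterConfig {

    // Delegate that knows how to write one XML file.
    // Record is the item type produced by the reader (placeholder).
    @Bean
    public StaxEventItemWriter<Record> xmlWriter() {
        StaxEventItemWriter<Record> writer = new StaxEventItemWriter<>();
        writer.setRootTagName("records");
        writer.setMarshaller(new XStreamMarshaller());
        return writer;
    }

    // Wrapper that rolls over to records.1, records.2, ... after every
    // 100 items, based on the guess that one record is roughly 1mb.
    @Bean
    public MultiResourceItemWriter<Record> multiFileWriter(StaxEventItemWriter<Record> xmlWriter) {
        MultiResourceItemWriter<Record> writer = new MultiResourceItemWriter<>();
        writer.setResource(new FileSystemResource("output/records"));
        writer.setDelegate(xmlWriter);
        writer.setItemCountLimitPerResource(100);
        return writer;
    }
}
```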
Create an additional step to split the file up after the large file has been written. This is my preferred solution: you create a single large "temporary" file with all the contents (say 450mb), and then a secondary process (step) splits this file into 5 separate files (4x100mb and 1x50mb). This can all be done within Spring Batch as well, and within the same batch job if required.
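A minimal sketch of what that secondary splitting step could look like as a Tasklet. The file paths, part naming and the raw byte-boundary split are all assumptions; a real split of an XML file would want to cut on record boundaries and repeat the root element so each part stays well-formed:

```java
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;

import org.springframework.batch.core.StepContribution;
import org.springframework.batch.core.scope.context.ChunkContext;
import org.springframework.batch.core.step.tasklet.Tasklet;
import org.springframework.batch.repeat.RepeatStatus;

// Splits the temporary file produced by the first step into parts of at most 100mb.
public class FileSplittingTasklet implements Tasklet {

    private static final long MAX_BYTES = 100L * 1024 * 1024; // 100 MB per part
    private final Path source = Paths.get("output/records-temp.xml"); // placeholder path

    @Override
    public RepeatStatus execute(StepContribution contribution, ChunkContext chunkContext) throws IOException {
        byte[] buffer = new byte[8192];
        int part = 0;
        try (InputStream in = Files.newInputStream(source)) {
            long writtenInPart = 0;
            OutputStream out = newPart(part++);
            int read;
            while ((read = in.read(buffer)) != -1) {
                // Roll to the next part before the current one would exceed the limit.
                if (writtenInPart + read > MAX_BYTES) {
                    out.close();
                    out = newPart(part++);
                    writtenInPart = 0;
                }
                out.write(buffer, 0, read);
                writtenInPart += read;
            }
            out.close();
        }
        return RepeatStatus.FINISHED;
    }

    private OutputStream newPart(int index) throws IOException {
        return Files.newOutputStream(Paths.get("output/records-part-" + index + ".xml"));
    }
}
```

This tasklet would simply be registered as a second step in the same job, after the step that writes the large temporary file.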