Spring Batch + MongoItemReader, ItemProcessor, MongoItemWriter + not reading all records


I'm using Spring Batch with Mongo database.

I need to fetch documents by status (status=PENDING), write them to a Kafka topic, and then update each document's status field to a new value (status=FILLED).

So I used a MongoItemReader and a CompositeItemWriter (a KafkaItemWriter followed by a MongoItemWriter) to write to the Kafka topic and update the documents. But when I run the job, some documents are skipped, and the number of skipped documents equals the chunk size.

For example, my collection has 15 documents and the chunk size is 5: the MongoItemReader reads documents 1,2,3,4,5, then documents 11,12,13,14,15, skipping documents 6,7,8,9,10.
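The skip pattern can be reproduced without Spring Batch at all. Below is a minimal stdlib-only simulation (the class and method names are mine, purely for illustration) of a paged reader whose query filters on the very field the writer updates: each written chunk drops out of the status=PENDING result set, yet the reader still advances its page number, so every second page of the shrinking result set is never seen.

```java
import java.util.ArrayList;
import java.util.List;

public class PagingSkipDemo {
    // Simulates paged reads over a query ("status=PENDING") whose result set
    // shrinks as each chunk is written back with status=FILLED.
    static List<Integer> readAll(int docCount, int chunkSize) {
        List<Integer> pending = new ArrayList<>();
        for (int i = 1; i <= docCount; i++) pending.add(i);

        List<Integer> read = new ArrayList<>();
        int page = 0;
        while (page * chunkSize < pending.size()) {
            int from = page * chunkSize;
            int to = Math.min(from + chunkSize, pending.size());
            List<Integer> chunk = new ArrayList<>(pending.subList(from, to));
            read.addAll(chunk);       // read + process + write the chunk
            pending.removeAll(chunk); // writer sets status=FILLED: docs leave the query
            page++;                   // the paging reader still advances the page
        }
        return read;
    }

    public static void main(String[] args) {
        // With 15 documents and chunk size 5, documents 6-10 are never read.
        System.out.println(PagingSkipDemo.readAll(15, 5));
    }
}
```

With 15 documents and a chunk size of 5, this prints exactly the pattern described above: 1-5, then 11-15.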

My ItemProcessor modifies the POJO entities. Since the chunk is written out between page reads, the entities are updated (their status becomes FILLED) and no longer match the query, but the reader's page number is still incremented, as can be seen in the log: documents 6, 7, 8, 9 and 10 were skipped. I have seen there is a workaround for relational databases using SqlPagingQueryProviderFactoryBean, but it does not apply to a NoSQL database. I have tried everything I could find, but nothing helped.
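A workaround often suggested for this pattern (a sketch, not an official Spring Batch recipe): because every written document drops out of the status=PENDING query, the next unread batch is always on the first page, so the reader can simply keep requesting page 0 until the query returns nothing. With MongoItemReader this is typically done by subclassing it and resetting the inherited protected `page` field before each page read (an assumption about AbstractPaginatedDataItemReader internals in recent Spring Batch versions; verify against your version). The effect can be shown in a stdlib-only simulation (names are mine, for illustration):

```java
import java.util.ArrayList;
import java.util.List;

public class PageZeroFixDemo {
    // Same shrinking-query simulation, but the reader always re-reads page 0:
    // documents already written no longer match status=PENDING, so the next
    // unprocessed batch is always at the front of the result set.
    static List<Integer> readAll(int docCount, int chunkSize) {
        List<Integer> pending = new ArrayList<>();
        for (int i = 1; i <= docCount; i++) pending.add(i);

        List<Integer> read = new ArrayList<>();
        while (!pending.isEmpty()) {
            int to = Math.min(chunkSize, pending.size());
            List<Integer> chunk = new ArrayList<>(pending.subList(0, to)); // always page 0
            read.addAll(chunk);
            pending.removeAll(chunk); // writer flips status to FILLED
        }
        return read;
    }

    public static void main(String[] args) {
        // All 15 documents are read, in order.
        System.out.println(PageZeroFixDemo.readAll(15, 5));
    }
}
```

Note this only terminates because the writer reliably changes the status within the same chunk transaction; if a write fails and the document stays PENDING, the reader would loop on it, so a skip/retry policy is still needed.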

So, how can I fix this issue?
