I'm inserting a large volume of data, approximately 1 million records, into a PostgreSQL database using Spring Boot. The data arrives in batches during API calls, and I save each batch as a whole in the database. The problem I'm facing right now is that only about half of the data, roughly 500k records, is inserted instead of the full 1 million, and I don't know where I'm going wrong. Can you please provide some suggestions? The batches come from a desktop application that has already divided the data into groups of 50,000 records. Below is the source code for both the Spring Boot application and the desktop application.
Service class

@Service
public class OrderProcessingService {

    @Autowired
    private OrderRepository orderRepository;

    @Async
    public void process_data(List<Order> order_list) {
        orderRepository.saveAll(order_list);
    }
}
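In case it helps, Order is a plain JPA entity and OrderRepository is a standard Spring Data repository. Here is a trimmed-down sketch of what they look like; the field names are placeholders rather than my real columns, and the imports assume Spring Boot 3 / Jakarta persistence.

// Order.java -- trimmed-down sketch; field names are placeholders
import java.math.BigDecimal;

import jakarta.persistence.Entity;
import jakarta.persistence.Id;
import jakarta.persistence.Table;

@Entity
@Table(name = "orders")
public class Order {

    @Id
    private Long id;            // placeholder primary key

    private String hfrcode;     // placeholder column
    private BigDecimal amount;  // placeholder column

    // getters and setters omitted for brevity
}

// OrderRepository.java -- standard Spring Data repository
import org.springframework.data.jpa.repository.JpaRepository;

public interface OrderRepository extends JpaRepository<Order, Long> {
}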
REST API

@RestController
@RequestMapping("/api")
public class DataUploading {

    @Autowired
    private OrderProcessingService order_processing_service;

    @PostMapping("/orders/{hfrcode}")
    public void get_orders(@PathVariable String hfrcode, @RequestBody List<Order> order_list) {
        order_processing_service.process_data(order_list);
    }
}
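For completeness, here is a minimal sketch of the application class, assuming async support is switched on with @EnableAsync so the @Async method actually runs in the background (the class and package names are placeholders):

// DemoApplication.java -- minimal sketch; class name is a placeholder
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.scheduling.annotation.EnableAsync;

@SpringBootApplication
@EnableAsync   // required for @Async methods to run asynchronously
public class DemoApplication {

    public static void main(String[] args) {
        SpringApplication.run(DemoApplication.class, args);
    }
}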
Desktop App data submission service
int batchSize = 40000;

for (int i = 0; i < order_list.Count; i += batchSize) {
    // take the next slice of up to batchSize orders
    var batch = order_list.Skip(i).Take(batchSize);

    // POST the current batch to the Spring Boot API
    httpClient.PostAsJsonAsync("api/orders/" + hfrcode, batch);
}