I'm writing a tool that copies data into SQL Server using C# SqlBulkCopy. For reliability, I load each batch into a DataTable (so I can retry the same batch if the insert fails), and users can specify the batch size. The problem is that if the user sets the batch size to a very large value, the DataTable's memory consumption becomes huge and can even cause an OutOfMemoryException.
One solution I can think of is to sum up the estimated data size as I add rows to the DataTable and flush the batch once it exceeds a byte threshold, rather than a fixed row count. Are there any better approaches?
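For reference, here is a minimal sketch of the byte-threshold idea I have in mind. `EstimateRowBytes` and `maxBatchBytes` are hypothetical names of my own; the estimator is a rough upper bound (2 bytes per string char, 8 bytes per fixed-size value), not an exact measure, and the flush point is where the real `SqlBulkCopy.WriteToServer(table)` call plus retry logic would go:

```csharp
using System;
using System.Data;

class BatchSizeDemo
{
    // Rough, illustrative per-row size estimate; not an exact memory measurement.
    static long EstimateRowBytes(object[] values)
    {
        long bytes = 0;
        foreach (var v in values)
        {
            if (v is string s) bytes += s.Length * 2;   // UTF-16 chars
            else if (v is byte[] b) bytes += b.Length;  // raw binary payload
            else bytes += 8;                            // fixed-size value, rough upper bound
        }
        return bytes;
    }

    static void Main()
    {
        const long maxBatchBytes = 1024;  // illustrative cap; tune for the target environment
        var table = new DataTable();
        table.Columns.Add("Id", typeof(int));
        table.Columns.Add("Payload", typeof(string));

        long batchBytes = 0;
        int flushes = 0;

        for (int i = 0; i < 100; i++)
        {
            var row = new object[] { i, new string('x', 50) };
            table.Rows.Add(row);
            batchBytes += EstimateRowBytes(row);

            if (batchBytes >= maxBatchBytes)
            {
                // In the real tool: SqlBulkCopy.WriteToServer(table), with
                // retry on failure before clearing the cached batch.
                flushes++;
                table.Clear();
                batchBytes = 0;
            }
        }

        Console.WriteLine(flushes);  // prints 10: each row estimates to 108 bytes, so 10 rows trip the 1024-byte cap
    }
}
```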
Thank you.