I have a U2/UniVerse database, and need to copy the data in one table into a SQL Server table. The table in question has approx 600,000 rows and just under 200 columns. I didn't create the table, and can't change it.
For other tables, I'm looping through a UniDataSet one record at a time, adding each record to a DataTable, and then using SqlBulkCopy to copy the records to SQL Server. This works fine, but with the large table I seem to be running out of memory while building the DataTable.
DataTable dt = new DataTable("myTempTable");
dt.Columns.Add("FirstColumn", typeof(string));
dt.Columns.Add("SecondColumn", typeof(string));
... //adding a bunch more columns here
dt.Columns.Add("LastColumn", typeof(string));
U2Connection con = GetU2Con();
UniSession us1 = con.UniSession;
UniSelectList s1 = us1.CreateUniSelectList(0);
UniFile f1 = us1.CreateUniFile("MyU2TableName");
s1.Select(f1);
UniDataSet uSet = f1.ReadRecords(s1.ReadListAsStringArray());
foreach (UniRecord uItem in uSet)
{
//"þ" (char 254) is the UniVerse field mark that delimits the record's fields
List<String> record = new List<String>(uItem.Record.ToString().Split(new string[] { "þ" }, StringSplitOptions.None));
DataRow row = dt.NewRow();
row[0] = uItem.RecordID;
row[1] = record[0];
row[2] = record[1];
... //add the rest of the record
row[50] = record[49];
dt.Rows.Add(row);
}
con.Close();
So that copies the records from the UniDataSet into a DataTable. Then I use SqlBulkCopy to load the DataTable into a SQL table:
string SQLcon = GetSQLCon();
using (SqlBulkCopy sbc = new SqlBulkCopy(SQLcon))
{
sbc.DestinationTableName = "dbo.MySQLTableName";
sbc.BulkCopyTimeout = 0;
sbc.BatchSize = 1000; //I've tried anywhere from 50 to 50000
try
{
sbc.WriteToServer(dt);
}
catch (Exception ex)
{
Console.WriteLine(ex.Message);
}
}
This works just fine for my U2 tables that have 50,000 or so rows, but it basically crashes the debugger (VS Express 2012) when the table has 500,000 rows. The PC I'm doing this on is Windows 7 x64 with 4GB ram. The VS process looks like it uses up to 3.5GB RAM before it crashes.
I'm hoping there's a way to write the UniDataSet right to SQL using SqlBulkCopy, but I'm not too familiar with the U2 .Net toolkit.
The problem I face is the UniDataSet records are multivalue, and I need to pick them apart before I can write them to SQL.
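For what it's worth, I know SqlBulkCopy.WriteToServer can also take an IDataReader, which streams rows instead of buffering them all first. Below is an untested sketch of a minimal reader that splits each record as SqlBulkCopy pulls it. UniRecordReader is my own name, not a U2 toolkit class, and it assumes UniDataSet implements IEnumerable (which the foreach above suggests); it needs System, System.Collections, and System.Data, plus the U2 toolkit namespace:
//Untested sketch: a minimal IDataReader over the UniDataSet, so SqlBulkCopy
//can stream rows instead of materializing one huge DataTable
class UniRecordReader : IDataReader
{
    private readonly IEnumerator _records; //assumes UniDataSet implements IEnumerable
    private readonly int _fieldCount;
    private string[] _current;

    public UniRecordReader(UniDataSet uSet, int fieldCount)
    {
        _records = ((IEnumerable)uSet).GetEnumerator();
        _fieldCount = fieldCount;
    }

    public int FieldCount { get { return _fieldCount; } }

    //Split one record at a time, as SqlBulkCopy asks for it
    public bool Read()
    {
        if (!_records.MoveNext()) return false;
        UniRecord uItem = (UniRecord)_records.Current;
        string[] fields = uItem.Record.ToString().Split(new string[] { "þ" }, StringSplitOptions.None);
        _current = new string[_fieldCount];
        _current[0] = Convert.ToString(uItem.RecordID);
        Array.Copy(fields, 0, _current, 1, Math.Min(fields.Length, _fieldCount - 1));
        return true;
    }

    public object GetValue(int i) { return (object)_current[i] ?? DBNull.Value; }
    public bool IsDBNull(int i) { return _current[i] == null; }
    public string GetString(int i) { return _current[i]; }
    public Type GetFieldType(int i) { return typeof(string); }
    public object this[int i] { get { return GetValue(i); } }

    //SqlBulkCopy shouldn't touch the rest when mappings are ordinal
    public int GetOrdinal(string name) { throw new NotSupportedException(); }
    public string GetName(int i) { throw new NotSupportedException(); }
    public object this[string name] { get { throw new NotSupportedException(); } }
    public bool NextResult() { return false; }
    public int Depth { get { return 0; } }
    public bool IsClosed { get { return false; } }
    public int RecordsAffected { get { return -1; } }
    public void Close() { }
    public void Dispose() { }
    public DataTable GetSchemaTable() { throw new NotSupportedException(); }
    public bool GetBoolean(int i) { throw new NotSupportedException(); }
    public byte GetByte(int i) { throw new NotSupportedException(); }
    public long GetBytes(int i, long fo, byte[] b, int bo, int len) { throw new NotSupportedException(); }
    public char GetChar(int i) { throw new NotSupportedException(); }
    public long GetChars(int i, long fo, char[] b, int bo, int len) { throw new NotSupportedException(); }
    public IDataReader GetData(int i) { throw new NotSupportedException(); }
    public string GetDataTypeName(int i) { throw new NotSupportedException(); }
    public DateTime GetDateTime(int i) { throw new NotSupportedException(); }
    public decimal GetDecimal(int i) { throw new NotSupportedException(); }
    public double GetDouble(int i) { throw new NotSupportedException(); }
    public float GetFloat(int i) { throw new NotSupportedException(); }
    public Guid GetGuid(int i) { throw new NotSupportedException(); }
    public short GetInt16(int i) { throw new NotSupportedException(); }
    public int GetInt32(int i) { throw new NotSupportedException(); }
    public long GetInt64(int i) { throw new NotSupportedException(); }
    public int GetValues(object[] values) { throw new NotSupportedException(); }
}
//Usage: stream straight from U2 to SQL Server, no big DataTable
using (SqlBulkCopy sbc = new SqlBulkCopy(GetSQLCon()))
using (UniRecordReader reader = new UniRecordReader(uSet, 200)) //200 = column count of dbo.MySQLTableName
{
    sbc.DestinationTableName = "dbo.MySQLTableName";
    sbc.BulkCopyTimeout = 0;
    sbc.WriteToServer(reader);
}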
Thanks!
The DataTable grows too large in memory before you ever insert it into the database. Why don't you split the bulk insert operation? For example, read the first 50,000 records and insert them into the SQL Server database, clear the DataTable's rows to free that memory, and then start again with the next 50,000 rows.
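Here's a rough sketch of what I mean, reusing your column setup and field-mark splitting; the 50,000 threshold is only an example, not a tuned value:
const int ChunkSize = 50000;
using (SqlBulkCopy sbc = new SqlBulkCopy(GetSQLCon()))
{
    sbc.DestinationTableName = "dbo.MySQLTableName";
    sbc.BulkCopyTimeout = 0;

    foreach (UniRecord uItem in uSet)
    {
        List<String> record = new List<String>(uItem.Record.ToString().Split(new string[] { "þ" }, StringSplitOptions.None));
        DataRow row = dt.NewRow();
        row[0] = uItem.RecordID;
        //copy the split fields, guarding against short or long records
        for (int i = 0; i < record.Count && i + 1 < dt.Columns.Count; i++)
            row[i + 1] = record[i];
        dt.Rows.Add(row);

        //Flush the current chunk to SQL Server and release the rows
        if (dt.Rows.Count >= ChunkSize)
        {
            sbc.WriteToServer(dt);
            dt.Rows.Clear();
        }
    }

    //Write whatever is left in the final partial chunk
    if (dt.Rows.Count > 0)
        sbc.WriteToServer(dt);
}
This keeps the DataTable's footprint bounded to one chunk, so memory use stays flat no matter how many rows the U2 table has.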