I have a 6M-row table that needs to be sent to another server each day. It never seems to finish when I test it with:
```sql
insert into serverA.dbName.dbo.table1
select * from serverB.dbName.dbo.table1
```
Is there such a thing as "do 10,000 rows each time"? Something like:
```sql
insert into serverA.dbName.dbo.table1
select * from serverB.dbName.dbo.table1
by numOfRowEachTime=10000
```
Otherwise I will write a loop.
6 million rows in a single transaction is a pretty sizable chunk to begin with, and DML statements across linked servers aren't great either. But two methods to consider (that have worked for me) are:
If there is an easy way to copy the rows in order (such as by an ID field), then you can write a proc on ServerB that does a loop similar to the following: pull the next batch of rows in ID order, hand them to ServerA (for example as an XML variable), and on ServerA run `INSERT INTO Table1 SELECT ... FROM (either @XMLvariable.Nodes or OPENXML)`. A `[ProcessStatus]` table is just to keep track of the current position in case the process fails, and hence it can start where it left off; you can even set a time limit or a counter of X iterations and schedule the proc to run from a SQL Agent job every N minutes or so (which is how I have always done it).
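A minimal sketch of such a loop, assuming an ever-increasing `ID` column; the `dbo.ProcessStatus` table and its `LastID` column are hypothetical names, and for brevity this version pushes each range straight across the linked server instead of handing off XML:

```sql
-- Hypothetical one-row tracking table:
-- CREATE TABLE dbo.ProcessStatus (LastID INT NOT NULL);
-- INSERT INTO dbo.ProcessStatus (LastID) VALUES (0);

DECLARE @LastID INT,
        @NextID INT,
        @BatchSize INT = 10000;

SELECT @LastID = LastID FROM dbo.ProcessStatus; -- resume point

WHILE (1 = 1)
BEGIN
    -- Find the upper bound of the next batch, in ID order
    SELECT @NextID = MAX(batch.ID)
    FROM (SELECT TOP (@BatchSize) ID
          FROM dbo.table1
          WHERE ID > @LastID
          ORDER BY ID) AS batch;

    IF (@NextID IS NULL) BREAK; -- nothing left to copy

    -- Push one batch across the linked server; in autocommit mode each
    -- INSERT is its own transaction, so a failure loses only this batch
    INSERT INTO serverA.dbName.dbo.table1
    SELECT *
    FROM dbo.table1
    WHERE ID > @LastID
    AND   ID <= @NextID;

    -- Record progress so a restart picks up where it left off
    UPDATE dbo.ProcessStatus SET LastID = @NextID;
    SET @LastID = @NextID;
END;
```

Wrapped in a proc, the `WHILE` condition can instead check a time limit or iteration counter so that a SQL Agent job can call it every few minutes.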
Use SQLCLR to make use of the `SqlBulkCopy` class, which is designed for this type of task. It is essentially the same API that is used by BCP, `OPENROWSET(BULK...)`, and `BULK INSERT`. You can write your own CLR-based stored procedure, or there is one already done and available for free in the SQL# library (of which I am the author, but again, it is free): the stored procedure is called `DB_BulkCopy` and it allows for setting a batch size.
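Not the SQLCLR route itself, but the batch-size control looks the same through the sibling interfaces named above. A sketch using `BULK INSERT` on ServerA, assuming the table was first exported from ServerB to a file (the path and a native-format `bcp` export are assumptions for illustration):

```sql
-- Import the exported file, committing every 10,000 rows so a failure
-- only rolls back the current batch (path and format are illustrative):
BULK INSERT dbName.dbo.table1
FROM 'C:\temp\table1.dat'
WITH (
    DATAFILETYPE = 'native', -- matches a "bcp ... out ... -n" export
    BATCHSIZE    = 10000,    -- one transaction per 10,000 rows
    TABLOCK                  -- enables minimal logging where eligible
);
```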