I have a Netezza DB with one table of 4 billion records. I am working on a migration project to move the data to SQL Server. I created a simple SSIS ETL, but it runs for a very long time and then stops due to a buffer memory issue. What is an efficient, quicker way of transferring such a huge amount of data?
You can try splitting the source data into batches, for example 1,000,000 rows per batch (depending on your available memory), and loading each batch into the SQL Server table rather than moving all 4 billion rows in a single pass. Committing per batch keeps the buffers and the transaction log from growing out of control.
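For illustration, here is a minimal Python sketch of that batching idea using pyodbc. The DSN names, table name, and column list are placeholders you would replace with your own, and the batch size is just a starting point to tune against the memory you actually have; it is not meant as the definitive implementation (in practice you might also consider BCP or the SQL Server bulk-load APIs).

```python
import pyodbc

BATCH_SIZE = 100_000  # tune up or down depending on available memory

# Hypothetical DSNs -- replace with your own Netezza and SQL Server connections.
src = pyodbc.connect("DSN=NETEZZA_SRC", autocommit=True)
dst = pyodbc.connect("DSN=SQLSERVER_DST", autocommit=False)

src_cur = src.cursor()
dst_cur = dst.cursor()
dst_cur.fast_executemany = True  # faster parameter binding on the SQL Server side

# Stream the source table; fetchmany keeps only one batch in memory at a time.
src_cur.execute("SELECT col1, col2, col3 FROM big_table")

insert_sql = "INSERT INTO dbo.big_table (col1, col2, col3) VALUES (?, ?, ?)"

while True:
    rows = src_cur.fetchmany(BATCH_SIZE)
    if not rows:
        break
    dst_cur.executemany(insert_sql, rows)
    dst.commit()  # commit per batch so the transaction log stays small

src.close()
dst.close()
```

The same per-batch idea applies inside SSIS itself: reducing the data flow buffer size and setting a commit/batch size on the destination keeps each chunk small enough to avoid the buffer memory errors.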