I am trying to insert 5.6M rows into a SQL Server database from a Ruby script and am looking for the fastest way. My first approach was to build a raw SQL INSERT statement and interpolate the values, like:
"INSERT INTO fct_coupons_distributed (coupon_campaign_id, coupon_unique_code) VALUES #{batchData}"
but since I am using TinyTDS against SQL Server, the limit is 1,000 rows per INSERT (SQL Server's cap on the VALUES clause). Please suggest a workaround if there is one; I would like 35k rows at a time in one SQL statement.
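If raw SQL is acceptable, one way to stay under the 1,000-row cap is to chunk the rows with each_slice and issue one INSERT per slice. A minimal sketch, assuming the fct_coupons_distributed table from above; the actual TinyTDS execution line is commented out, since the statement builder itself is plain Ruby:

```ruby
# Build one INSERT statement per slice of at most 1,000 rows,
# staying under SQL Server's limit on the VALUES clause.
def build_insert_batches(rows, batch_size = 1_000)
  rows.each_slice(batch_size).map do |slice|
    values = slice.map { |camp_id, code| "(#{camp_id}, '#{code}')" }.join(", ")
    "INSERT INTO fct_coupons_distributed " \
      "(coupon_campaign_id, coupon_unique_code) VALUES #{values}"
  end
end

# 2,500 sample rows split into three statements (1000 + 1000 + 500 rows).
rows = 2_500.times.map { |i| [483, format("CODE%08d", i)] }
statements = build_insert_batches(rows)
# statements.each { |sql| client.execute(sql).do }  # client = TinyTds::Client.new(...)
```

The string interpolation is only safe here because the coupon codes are generated in-process; values coming from outside should be escaped or parameterized.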
The second way I tried is the activerecord-import gem, which I am having problems with. My data is built in a loop, and after the first batch insert it raises:

ArgumentError: struct size differs
Ruby script:
def create_coupons
  campaign = [483, 482]
  start_at = Time.now
  puts start_at
  column = [:coupon_campaign_id, :coupon_unique_code]
  batchData = []
  campaign.each_with_index do |camp_id, index|
    puts "-----------------------------#{index + 1}/80-----------------------------------------"
    70000.times do
      coupon_code = ([*('A'..'Z'), *('0'..'9')] - %w(0 1 I O)).sample(12).join
      batchData << [camp_id, coupon_code]
    end
    FctCoupon.import column, batchData
  end
  puts "start_at: #{start_at}"
  puts "end_at: #{Time.now}"
end
So the problem is that it inserts all 70k rows on the first loop iteration, but when the loop runs for the second campaign id it raises ArgumentError: struct size differs.

The same happens if I pass a batch size (FctCoupon.import column, batchData, batch_size: 35000): after inserting 35k rows it raises the same error.
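One thing worth noting, independent of the error itself: batchData is never cleared between campaigns, so the second import re-sends the first campaign's 70k rows on top of the new ones. Whether that accumulation is related to the struct size differs error is an open question, but the loop shape with the buffer reset per iteration would look like this (FctCoupon.import is stubbed with a plain method here so the sketch runs standalone):

```ruby
# Stub standing in for FctCoupon.import, recording each batch's size.
BATCH_SIZES = []
def import_stub(columns, rows)
  BATCH_SIZES << rows.size
end

CHARSET = [*('A'..'Z'), *('0'..'9')] - %w(0 1 I O)

[483, 482].each do |camp_id|
  batch_data = []  # reset per campaign instead of accumulating across iterations
  70_000.times do
    batch_data << [camp_id, CHARSET.sample(12).join]
  end
  import_stub([:coupon_campaign_id, :coupon_unique_code], batch_data)
end
# BATCH_SIZES => [70000, 70000] rather than [70000, 140000]
```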