Use Perl to parse a big file and insert data into MongoDB: index before or after dumping the data into the DB?


I am using MongoDB to persist a very large file (90 GB), which has nearly 40,000,000 items.

I read and parse this file and insert all items into MongoDB (my programming language is Perl, I use batch_insert instead of insert, and I map one item to one MongoDB document).

Before inserting, I pre-created the indexes (about 10 index keys).
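For reference, a minimal sketch of that kind of load loop, assuming the legacy MongoDB Perl driver (the one that still provides batch_insert and ensure_index); the host, database, collection and field names, the tab-separated line format, and the batch size of 1000 are all placeholders, not part of the original question:

    #!/usr/bin/env perl
    use strict;
    use warnings;
    use MongoDB;

    # Connect and pick the target collection (names are placeholders).
    my $client = MongoDB::MongoClient->new(host => 'mongodb://localhost:27017');
    my $coll   = $client->get_database('mydb')->get_collection('items');

    # Pre-create the indexes before loading, as described above
    # (only two of the ~10 keys are shown; the field names are made up).
    $coll->ensure_index({ field1 => 1 });
    $coll->ensure_index({ field2 => 1 });

    # Hypothetical parser: one tab-separated line becomes one document.
    sub parse_line {
        my ($line) = @_;
        my @f = split /\t/, $line;
        return { field1 => $f[0], field2 => $f[1], payload => $f[2] };
    }

    open my $fh, '<', 'big_file.txt' or die "cannot open file: $!";

    my @batch;
    while (my $line = <$fh>) {
        chomp $line;
        push @batch, parse_line($line);
        if (@batch >= 1000) {                # flush every 1000 documents
            $coll->batch_insert(\@batch);
            @batch = ();
        }
    }
    $coll->batch_insert(\@batch) if @batch;  # flush the final partial batch
    close $fh;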

I find that the insert speed cannot meet my needs (200 to 400 items per second).

I know that having many index keys will slow down my inserts, especially as the collection grows large.

So I wonder if I can build the indexes after I have dumped all the data into the DB. Can anyone tell me whether this approach is possible, and whether it will actually save me time?


1 Answer

alex

You can try dropping the indexes before the big insert and then creating them again afterwards. It can be dramatically faster.
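A minimal sketch of that order of operations, again assuming the legacy Perl driver the question already uses (the batch_insert/ensure_index era); the field names are placeholders, and a 1.x+ driver would use $coll->indexes->drop_all, insert_many, and $coll->indexes->create_one instead:

    use strict;
    use warnings;
    use MongoDB;

    my $client = MongoDB::MongoClient->new(host => 'mongodb://localhost:27017');
    my $coll   = $client->get_database('mydb')->get_collection('items');

    # 1. Drop the secondary indexes before the bulk load (the _id index stays).
    $coll->drop_indexes;

    # 2. Run the bulk load exactly as before
    #    (the batch_insert loop from the question goes here).

    # 3. Re-create the ~10 indexes only after all documents are in place.
    #    background => 1 keeps the collection usable while an index builds;
    #    a foreground build is usually faster if the collection can be offline.
    $coll->ensure_index({ field1 => 1 }, { background => 1 });
    $coll->ensure_index({ field2 => 1 }, { background => 1 });
    # ...repeat for the remaining index keys

The reason this tends to help: each batch_insert then only appends documents instead of also updating ~10 index entries per document, and the index builds at the end can process the keys in bulk rather than one document at a time.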