How to bulk-copy a field into the first object of an array and update the documents in MongoDB?


I want to copy the price field of each of my documents into a prices[] array:

var entitiesCol = db.getCollection('entities');
entitiesCol.find({"type": "item"}).forEach(function(item){
    item.prices = [ {
        "value": item.price
    }];
    entitiesCol.save(item);
});

It takes too long, and some documents are not updated.

I am using Mongoose on the server side, so I could also use it for this.

What can I do for that?

1 Answer

Answered by chridam:

In the mongo shell, you can use the bulkWrite() method to carry out the updates quickly and efficiently. Consider the following example:

var entitiesCol = db.getCollection('entities'),
    counter = 0,
    ops = [];

// Only select items that do not already have a prices entry,
// so the script can safely be re-run if it is interrupted.
entitiesCol.find({
    "type": "item",
    "prices.0": { "$exists": false }
}).snapshot().forEach(function(item){ // snapshot() was removed in MongoDB 4.0+; drop it on newer servers
    ops.push({
        "updateOne": {
            "filter": { "_id": item._id },
            "update": { 
                "$push": {
                    "prices": { "value": item.price }
                }
            }
        }
    });
    counter++;

    // Send the queued operations to the server every 500 documents.
    if (counter % 500 === 0) {
        entitiesCol.bulkWrite(ops);
        ops = [];
    }
});

// Flush any remaining operations.
if (ops.length > 0)
    entitiesCol.bulkWrite(ops);

The counter variable above manages the bulk updates when your collection is large. It batches the update operations and sends the writes to the server in groups of 500, which gives you better performance because you make one round trip to the server per 500 operations instead of one per document.

For bulk operations, MongoDB imposes a default internal limit of 1000 operations per batch, so choosing 500 documents is good in the sense that you keep some control over the batch size rather than letting MongoDB split the batch for you, which it would do for larger operations of more than 1000 documents.
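Since the question mentions Mongoose, the same batching idea can be sketched server-side with Model.bulkWrite(). This is a minimal sketch, assuming a Mongoose model named `Entity` (the model name is an assumption, not from the original code); the op-building logic is kept as a pure helper so it is easy to test on its own:

```javascript
// Pure helper: given plain item objects, build the bulkWrite operations
// that push each item's price into its prices array.
function buildPriceOps(items) {
  return items.map(function (item) {
    return {
      updateOne: {
        filter: { _id: item._id },
        update: { $push: { prices: { value: item.price } } }
      }
    };
  });
}

// Hypothetical usage with a Mongoose model named Entity (assumption):
//
// const docs = await Entity.find(
//   { type: "item", "prices.0": { $exists: false } }
// ).lean();
//
// for (let i = 0; i < docs.length; i += 500) {
//   await Entity.bulkWrite(buildPriceOps(docs.slice(i, i + 500)));
// }
```

Separating the op construction from the database round trip keeps the batching loop trivial and lets you verify the generated operations without a live connection.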