MongoDB bulk write on a collection does not update the DB


I have a question about inserting a new field with MongoDB. In a collection, each record contains an array of objects, and I want to introduce a new field in every element of that array. The array in a record typically has at least 20 elements. With 1000 records in the collection and at least 20 array elements per record, updating one element at a time means roughly 20,000 update operations.
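For context, here is a simplified sketch of the shape of one record, built as a Java Document (illustrative values; the field names match the code below):

import org.bson.Document;
import java.util.Arrays;

public class BigMeasureShape {
    public static void main(String[] args) {
        // Illustrative record: the goal is to add an "isCorrectMeasure"
        // boolean to every element of the "info" array.
        Document sample = new Document("name", "example-model")
                .append("info", Arrays.asList(
                        new Document("varName", "temp"),
                        new Document("varName", "pressure")));
        System.out.println(sample.toJson());
    }
}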

I have written a Java program that inserts the field with a true/false value.

try (MongoClient mongoClient = MongoClients.create(connectionString)) {

    // Static imports assumed: com.mongodb.client.model.Filters.* and com.mongodb.client.model.Updates.set

    // Parse the HTTP response body and collect the known measure names from data.measures.
    String result = EntityUtils.toString(entity);
    ObjectMapper objectMapper = new ObjectMapper();
    JsonNode jsonNode = objectMapper.readTree(result);
    ArrayNode jsonArray = (ArrayNode) jsonNode.get("data").get("measures");
    List<String> shortNames = new ArrayList<>();
    jsonArray.forEach(ele -> {
        String values = ele.get("values").toString();
        shortNames.add(values.substring(1, values.length() - 1)); // strip the surrounding quotes
    });
    Collections.sort(shortNames); // sort once, outside the document loop

    MongoDatabase model = mongoClient.getDatabase("model");
    MongoCollection<Document> bigMeasureCollection = model.getCollection("BigMeasure");

    List<Document> iterDoc = bigMeasureCollection.find().into(new ArrayList<>());

    for (Document document : iterDoc) {
        System.out.println("Name: " + document.getString("name"));
        List<Document> info = (List<Document>) document.get("info");

        for (Document document1 : info) {
            Object obj = document1.get("varName");
            if (obj == null) {
                continue;
            }
            String varName = obj.toString();
            System.out.println("varName: " + varName);

            // Flag the array element depending on whether its varName is a known measure.
            boolean isCorrectMeasure = shortNames.stream()
                    .anyMatch(x -> x.equalsIgnoreCase(varName.trim()));
            document1.put("isCorrectMeasure", isCorrectMeasure);

            // Replace the whole "info" array of the owning document for every element.
            Bson dMatch = Filters.eq("varName", varName);
            Bson filter = and(eq("_id", document.get("_id")), Filters.elemMatch("info", dMatch));
            UpdateOptions options = new UpdateOptions().upsert(true);
            Bson update = set("info", info);
            bigMeasureCollection.updateOne(filter, update, options);
        }
    }
}
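Note that set("info", info) replaces the entire "info" array, so the per-element updateOne inside the inner loop rewrites the same array once per element. A minimal sketch of doing one update per document instead, after the inner loop has flagged every element in memory (same driver, hypothetical helper method):

import com.mongodb.client.MongoCollection;
import com.mongodb.client.model.Filters;
import com.mongodb.client.model.Updates;
import org.bson.Document;
import org.bson.conversions.Bson;
import java.util.List;

class PerDocumentUpdate {
    // Sketch: called once per document after the in-memory "info" list has
    // "isCorrectMeasure" set on every element; replaces the array in one write.
    static void replaceInfo(MongoCollection<Document> bigMeasureCollection,
                            Document document, List<Document> info) {
        Bson filter = Filters.eq("_id", document.get("_id"));
        bigMeasureCollection.updateOne(filter, Updates.set("info", info));
    }
}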

So I modified the per-element updateOne above to queue the operations and submit them in a single bulkWrite:

// Declared once, before the document loop:
List<WriteModel<Document>> bulkUpdates = new ArrayList<>();

// Inside the inner loop: queue the update instead of executing it immediately.
Bson dMatch = Filters.eq("varName", document1.getString("varName"));
Bson filter = and(eq("_id", document.get("_id")), Filters.elemMatch("info", dMatch));
bulkUpdates.add(new UpdateOneModel<>(filter, set("info", info)));

// After the loops: submit everything in one unordered bulk write.
if (!bulkUpdates.isEmpty()) {
    BulkWriteOptions options = new BulkWriteOptions().ordered(false);
    bigMeasureCollection.bulkWrite(bulkUpdates, options);
}

The bulk version runs fine locally. However, it fails against the QA database, which has many more records: no exception is thrown and no documents are updated.
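A minimal sketch of how the BulkWriteResult could be logged to check whether the filters match anything in QA (assuming the same collection and list of queued updates as above):

import com.mongodb.bulk.BulkWriteResult;
import com.mongodb.client.MongoCollection;
import com.mongodb.client.model.BulkWriteOptions;
import com.mongodb.client.model.WriteModel;
import org.bson.Document;
import java.util.List;

class BulkWriteDebug {
    // Sketch: log the acknowledged counts so "no exception, no update" can be
    // narrowed to "filters matched nothing" vs. "writes never reached the DB".
    static void runAndLog(MongoCollection<Document> bigMeasureCollection,
                          List<WriteModel<Document>> bulkUpdates) {
        if (bulkUpdates.isEmpty()) {
            System.out.println("nothing queued for bulk write");
            return;
        }
        BulkWriteResult result = bigMeasureCollection.bulkWrite(
                bulkUpdates, new BulkWriteOptions().ordered(false));
        System.out.println("acknowledged=" + result.wasAcknowledged()
                + " matched=" + result.getMatchedCount()
                + " modified=" + result.getModifiedCount());
    }
}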

Can anyone throw light on this?
