I have a C# console application (.NET Framework 4.7.2) that has been running fine for years. It iterates over a MongoDB collection, which was recently moved into DocumentDB in AWS. Since the move, iterating over the collection takes forever. I'm using version 2.23.1.0 of the C# MongoDB driver.

The collection has around 300,000 documents, and I'm reading it with a batch size of 5,000. Each batch takes less than a second until I'm around 60,000 documents in, at which point a batch starts taking a minute or longer, and every batch after that gets slower; by around 70,000 documents a single batch takes over 4 minutes. The collection is indexed on the _id field and, for test purposes, that's the only field being projected.
Below is some of the code I've tried. What am I missing?
using System.Collections.Generic;
using System.Linq;
using MongoDB.Bson;
using MongoDB.Driver;

// Project only _id; it's the indexed field and the only one I need for this test.
var projectionBuilder = Builders<BsonDocument>.Projection;
List<string> proj = new List<string>() { "_id" };
var projection = projectionBuilder.Combine(proj.Select(field => projectionBuilder.Include(field)));

var findOptions = new FindOptions
{
    BatchSize = 5000
};

// Full collection scan, fetched from the server in batches of 5,000.
var cursor = collection.Find(new BsonDocument(), findOptions)
    .Project(projection)
    .ToCursor();
//var cursor = collection.Find(new BsonDocument())
//    .Sort("{ \"_id\" : 1.0}")
//    .Skip(loadedCount)
//    .Limit(5000)
//    .Project(projection)
//    .ToCursor();

//List<BsonDocument> documents = collection.Aggregate<BsonDocument>()
//    .Skip(loadedCount)
//    .Limit(5000).ToList();
while (cursor.MoveNext())
{
    foreach (var doc in cursor.Current)
    {
        // intentionally empty: just enumerating the batch to measure fetch speed
    }
}
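For context, the per-batch timings I quoted above can be captured by timing each MoveNext() call, since that is where the driver pulls the next batch from the server. A minimal sketch (the Stopwatch wrapping, counter name, and console logging here are just illustrative, not my exact code):

// Requires: using System.Diagnostics; (and System.Linq for Count()).
var sw = new Stopwatch();
int loadedSoFar = 0;

sw.Start();
while (cursor.MoveNext())    // fetches the next batch of up to 5,000 documents
{
    sw.Stop();
    loadedSoFar += cursor.Current.Count();
    Console.WriteLine($"{loadedSoFar:N0} documents loaded; last batch took {sw.Elapsed.TotalSeconds:F1}s");
    sw.Restart();
}

This is how I see the pattern described above: under a second per batch until roughly the 60,000-document mark, then a minute or more per batch after that.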