MongoDB large time series decimation / query improvements


I have a time series collection for sensor data, with documents like:

{"date":{"$date":"2018-01-01T00:00:05.045Z"},"metaField":{"assets":{"city":"Meylan","couleur":"BlougeCeyBienBlouge","country":"France","machine_name":"totomachine","monitoring_id":666},"input_name":"inputname1","results":{"head":{"dimension":1,"id":"HEAD","is_complex":0,"magnitude_key":"Acceleration","measure_status":2,"name":"FFT 1: FFT1: OvPwre [1]-Input 1","overall_level_status":0,"physical_quantity":"","result_type":1,"size":9,"unit_label":null,"unit_name":null,"version":5,"weighted_ol_status":0,"weighting_window":0},"info":{"global_level":0,"id":"INFO","info_mask":0,"tacho":0,"version":1,"weighted_global_level":0},"module_id":10,"process_id":21,"scal":{"id":"SCAL","imag":0,"version":2}}},"_id":{"$oid":"623b3c53f492b4b87a47f0d6"},"value":51}

{"date":{"$date":"2018-01-01T00:00:10.122Z"},"metaField":{"assets":{"city":"Meylan","couleur":"BlougeCeyBienBlouge","country":"France","machine_name":"totomachine","monitoring_id":666},"input_name":"inputname1","results":{"head":{"dimension":1,"id":"HEAD","is_complex":0,"magnitude_key":"Acceleration","measure_status":2,"name":"FFT 1: FFT1: OvPwre [1]-Input 1","overall_level_status":0,"physical_quantity":"","result_type":1,"size":9,"unit_label":null,"unit_name":null,"version":5,"weighted_ol_status":0,"weighting_window":0},"info":{"global_level":0,"id":"INFO","info_mask":0,"tacho":0,"version":1,"weighted_global_level":0},"module_id":10,"process_id":21,"scal":{"id":"SCAL","imag":0,"version":2}}},"_id":{"$oid":"623b3c53f492b4b87a47f0dc"},"value":77}

{"date":{"$date":"2018-01-01T00:00:15.165Z"},"metaField":{"assets":{"city":"Meylan","couleur":"BlougeCeyBienBlouge","country":"France","machine_name":"totomachine","monitoring_id":666},"input_name":"inputname1","results":{"head":{"dimension":1,"id":"HEAD","is_complex":0,"magnitude_key":"Acceleration","measure_status":2,"name":"FFT 1: FFT1: OvPwre [1]-Input 1","overall_level_status":0,"physical_quantity":"","result_type":1,"size":9,"unit_label":null,"unit_name":null,"version":5,"weighted_ol_status":0,"weighting_window":0},"info":{"global_level":0,"id":"INFO","info_mask":0,"tacho":0,"version":1,"weighted_global_level":0},"module_id":10,"process_id":21,"scal":{"id":"SCAL","imag":0,"version":2}}},"_id":{"$oid":"623b3c53f492b4b87a47f0e2"},"value":100}

Indexes are: date: -1, input_name: 1, metaField.results.module_id: 1 and metaField.results.process_id: 1.
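As a sketch, this is how they are set up in mongosh (assuming four separate single-field indexes on a collection named sensors, and that input_name refers to the metaField.input_name path used in the query below):

db.sensors.createIndex({ "date": -1 })
db.sensors.createIndex({ "metaField.input_name": 1 })
db.sensors.createIndex({ "metaField.results.module_id": 1 })
db.sensors.createIndex({ "metaField.results.process_id": 1 })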

There could be hundreds of sensors, each pushing a sample every 10 s. I want to query an array of values for a given input_name, module_id and process_id. Decimation is needed when the date range is large.

I first tried to aggregate one value per year/month/day/hour bucket (adding minute when the time range is not too large):

{"$match": {"metaField.input_name": "inputname1", "metaField.results.module_id": 10, "metaField.results.process_id": 21, "date:"{"$gte": new ISODate("2018-01-01T01:01:01Z"), "$lt": new ISODate("2020-01-01T01:01:01Z")}},
"$group": {"_id": {
            "input_name": "$metaField.input_name",
            "hour": {"$hour": "$date"},
            "day": {"$dayOfMonth": "$date"},
            "month": {"$month": "$date"},
            "year": {"$year": "$date"}},
            "date": {"$first": "$date"},
            "value": {"$first": "$value"}},
"$sort": {"date": 1},
"$group": {"_id": "$_id.input_name",
                                        "data": {"$push": "$value"},
                                        "date": {"$push": "$date"}}}

But performance is not acceptable for a large date range (around 30 s for a 3-year range).
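For diagnosis, the plan can be inspected with executionStats to see whether the $match actually uses one of the indexes (pipeline stands for the stages above):

db.sensors.explain("executionStats").aggregate(pipeline)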

Do you have any better ideas, or a way to perform decimation that is not based on date, e.g. keeping only 1 value out of 10?
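For example, something like $sampleRate (a $match expression available since MongoDB 4.4.2) is the kind of thing I have in mind, although it keeps each document independently with the given probability rather than strictly every 10th sample:

db.sensors.aggregate([
    {"$match": {"metaField.input_name": "inputname1",
                "metaField.results.module_id": 10,
                "metaField.results.process_id": 21,
                "date": {"$gte": ISODate("2018-01-01T01:01:01Z"),
                         "$lt": ISODate("2020-01-01T01:01:01Z")}}},
    {"$match": {"$sampleRate": 0.1}},  // keep ~1 document in 10, chosen at random
    {"$sort": {"date": 1}},
    {"$group": {"_id": "$metaField.input_name",
                "data": {"$push": "$value"},
                "date": {"$push": "$date"}}}
])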
