Fault detection on a time sequence where a variable changes (trends) slowly over time


I am pretty new to anomaly detection on time sequences, so my question may be obvious to some of you. Today I use LSTM and clustering techniques to detect anomalies in time sequences, but those methods cannot identify anomalies that get worse slowly over time (I think this is called trending), e.g. the temperature of a machine increasing slowly over a month (the LSTM will learn this trend and predict the increase without any notable error). Is there a method to detect this kind of fault?

1 Answer

Answer by Has QUIT--Anony-Mousse (accepted):

With time series, that is usually what you want: learning gradual change, detecting abrupt change. Otherwise, time plays little role.

You can try, for example, the SigniTrend model with a very slow learning rate (a long half-life time, or whatever they call it). Ignore all the token handling, hashing and scalability parts of that paper; only take the EWMA+EWMVar part, which I really like, and use it on your time series.

If you set the learning rate really low, the threshold should move slowly enough that your "gradual" change may still be able to trigger it.
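
For illustration, a minimal sketch of just the EWMA+EWMVar part (not the full SigniTrend pipeline), assuming one univariate reading arrives at a time; the class name, the half-life parameterisation and the warm-up handling are my own illustrative choices, not from the paper:

```python
import math


class EwmaDetector:
    """Streaming EWMA + EWMVar threshold (the part of SigniTrend referenced above).

    halflife (in samples) sets the learning rate: the longer the half-life,
    the slower mean and variance adapt, so a slow drift can still cross the
    threshold eventually.
    """

    def __init__(self, halflife=720.0, k=3.0, warmup=30):
        self.alpha = 1.0 - 0.5 ** (1.0 / halflife)  # learning rate derived from half-life
        self.k = k              # width of the threshold in standard deviations
        self.warmup = warmup    # number of samples before alarms are raised
        self.n = 0
        self.mean = 0.0
        self.var = 0.0

    def update(self, x):
        """Score x against the current statistics, then fold it in. Returns True on an alarm."""
        if self.n == 0:
            self.mean = x
        anomalous = (
            self.n >= self.warmup
            and abs(x - self.mean) > self.k * math.sqrt(self.var)
        )
        # Welford-style exponentially weighted updates (EWMA for the mean, EWMVar for the variance)
        diff = x - self.mean
        incr = self.alpha * diff
        self.mean += incr
        self.var = (1.0 - self.alpha) * (self.var + diff * incr)
        self.n += 1
        return anomalous


# Hypothetical usage: hourly temperature readings with roughly a one-month half-life
detector = EwmaDetector(halflife=24 * 30, k=3.0)
readings = [20.0, 20.1, 19.9, 20.2, 35.0]  # made-up data
alarms = [i for i, temp in enumerate(readings) if detector.update(temp)]
```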

Or you ignore time completely. Split your data into a training set (which must not contain anomalies), and learn the mean and variance on it to derive thresholds. Then classify any point outside those thresholds as abnormal (i.e. temperature > mean + 3 * standard deviation). Because this super naive approach does not learn, it will not follow a drift either. But then time does not play any further role.
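
A minimal sketch of that static-threshold approach, assuming NumPy and using a synthetic drifting series as a stand-in for your temperature data; the split point and the factor 3 are placeholders:

```python
import numpy as np


def fit_threshold(train, k=3.0):
    """Learn a fixed band from a training window that must not contain anomalies."""
    mean, std = train.mean(), train.std()
    return mean - k * std, mean + k * std


def classify(values, lo, hi):
    """Flag every value outside the fixed band; the band never adapts, so drift is not followed."""
    return (values < lo) | (values > hi)


# Hypothetical usage: synthetic hourly temperatures with a slow upward drift.
# The first 720 readings (about one month) are treated as clean training data.
rng = np.random.default_rng(0)
temperatures = 20.0 + rng.normal(0.0, 0.5, 2000) + np.linspace(0.0, 5.0, 2000)
lo, hi = fit_threshold(temperatures[:720])
anomalies = classify(temperatures[720:], lo, hi)   # the drifted tail ends up flagged
```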