Speaker
Description
Anomaly detection refers to the identification of rare events that deviate significantly from the normal behavior observed in the data distribution. When the number of variables to analyze is large, a detected anomaly can be difficult to understand without an explanation. In this work, we present the prototype of an explainable, real-time anomaly detection system that operates on measurements from a multivariate data stream, which can be viewed as an infinite multivariate time series. The system combines a set of anomaly detection methods based on deep neural networks and decision trees with a model-agnostic explainability method. In an unsupervised learning setting, we also show how explainability provides insights that help validate the system.
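To make the described architecture concrete, here is a minimal sketch, not the presented system: it scores each point of a multivariate stream with two stand-in detectors, a neural reconstruction model (in place of the deep network) and an Isolation Forest (in place of the decision-tree methods), and attributes the score to individual features via per-feature reconstruction error as a crude proxy for a model-agnostic explainer such as SHAP. All model choices, window sizes, and the combination rule below are assumptions for illustration only.

```python
# Illustrative sketch only; the actual system's detectors and explainer differ.
import numpy as np
from sklearn.ensemble import IsolationForest
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)

# Assumed setup: a window of "normal" history from the stream to fit the detectors.
n_features = 8
history = rng.normal(size=(2000, n_features))

scaler = StandardScaler().fit(history)
X = scaler.transform(history)

# Neural detector: an MLP trained to reconstruct its input (autoencoder-like).
reconstructor = MLPRegressor(hidden_layer_sizes=(4,), max_iter=500,
                             random_state=0).fit(X, X)

# Tree-based detector: an Isolation Forest fitted on the same window.
forest = IsolationForest(random_state=0).fit(X)


def score_and_explain(x_new):
    """Return a combined anomaly score and per-feature contributions."""
    x = scaler.transform(x_new.reshape(1, -1))
    per_feature_err = (reconstructor.predict(x) - x) ** 2   # shape (1, d)
    reco_score = per_feature_err.sum()
    forest_score = -forest.score_samples(x)[0]               # higher = more anomalous
    combined = reco_score + forest_score
    # Crude explanation: which features drive the reconstruction error.
    return combined, per_feature_err.ravel()


# Simulate one anomalous observation arriving on the stream: feature 3 is shifted.
point = rng.normal(size=n_features)
point[3] += 6.0
score, contrib = score_and_explain(point)
print(f"anomaly score: {score:.2f}")
print("top contributing feature:", int(np.argmax(contrib)))
```

In such a setup, the per-feature contributions serve the same role the abstract assigns to explainability: pointing an analyst to the variables responsible for the alarm, which is also how an unsupervised detector can be sanity-checked without labels.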