Mamba Adaptive Anomaly Transformer with association discrepancy for time series
Abdellah Zakaria Sellam, Ilyes Benaissa, Abdelmalik Taleb-Ahmed, Luigi Patrono, Cosimo Distante

Abstract
Anomaly detection in time series is essential for industrial monitoring and environmental sensing, yet distinguishing anomalies from complex temporal patterns remains challenging. Existing methods such as the Anomaly Transformer and DCdetector have made progress, but they remain sensitive to short-term contexts and inefficient in noisy, non-stationary environments. To overcome these issues, we introduce MAAT, an improved architecture that enhances association discrepancy modeling and reconstruction quality. MAAT features Sparse Attention, which efficiently captures long-range dependencies by focusing on relevant time steps, reducing computational redundancy. Additionally, a Mamba Selective State Space Model is incorporated into the reconstruction module, using a skip connection and Gated Attention to improve anomaly localization and detection performance. Extensive experiments show that MAAT significantly outperforms previous methods, achieving better anomaly distinguishability and generalization across diverse time series applications and setting a new standard for unsupervised time series anomaly detection in real-world scenarios.
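
As a rough illustration of the architecture described above, the sketch below combines a sparse self-attention branch with a state-space-style branch and fuses them through a gated skip connection. This is a minimal sketch, not the authors' implementation: the top-k sparsification scheme, the depthwise-convolution-plus-gating stand-in for the Mamba selective SSM, and the module names (`TopKSparseAttention`, `GatedSSMBranch`, `MAATBlock`) are assumptions for illustration only.

```python
# Minimal MAAT-style block sketch (assumptions: top-k sparse attention,
# conv+gating stand-in for the Mamba selective SSM, gated skip fusion).
import torch
import torch.nn as nn


class TopKSparseAttention(nn.Module):
    """Self-attention that keeps only the k largest scores per query (assumed scheme)."""

    def __init__(self, d_model: int, k: int = 16):
        super().__init__()
        self.qkv = nn.Linear(d_model, 3 * d_model)
        self.out = nn.Linear(d_model, d_model)
        self.k = k
        self.scale = d_model ** -0.5

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, d_model)
        q, k, v = self.qkv(x).chunk(3, dim=-1)
        scores = torch.matmul(q, k.transpose(-2, -1)) * self.scale
        topk = min(self.k, scores.size(-1))
        thresh = scores.topk(topk, dim=-1).values[..., -1:]   # k-th largest score per query
        scores = scores.masked_fill(scores < thresh, float("-inf"))
        attn = scores.softmax(dim=-1)
        return self.out(torch.matmul(attn, v))


class GatedSSMBranch(nn.Module):
    """Stand-in for the Mamba selective SSM branch: depthwise conv + input-dependent gate."""

    def __init__(self, d_model: int, kernel_size: int = 4):
        super().__init__()
        self.conv = nn.Conv1d(d_model, d_model, kernel_size,
                              padding=kernel_size - 1, groups=d_model)
        self.gate = nn.Linear(d_model, d_model)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        seq_len = x.size(1)
        h = self.conv(x.transpose(1, 2))[..., :seq_len].transpose(1, 2)  # causal-style conv
        return h * torch.sigmoid(self.gate(x))                           # selective gating


class MAATBlock(nn.Module):
    """Sparse attention and SSM branches fused through a gated skip connection."""

    def __init__(self, d_model: int):
        super().__init__()
        self.attn = TopKSparseAttention(d_model)
        self.ssm = GatedSSMBranch(d_model)
        self.fuse_gate = nn.Linear(2 * d_model, d_model)
        self.norm = nn.LayerNorm(d_model)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        a = self.attn(x)
        s = self.ssm(x)
        g = torch.sigmoid(self.fuse_gate(torch.cat([a, s], dim=-1)))
        return self.norm(x + g * a + (1 - g) * s)  # gated skip connection around the input


if __name__ == "__main__":
    block = MAATBlock(d_model=64)
    series = torch.randn(2, 100, 64)        # (batch, time steps, features)
    print(block(series).shape)              # torch.Size([2, 100, 64])
```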
Benchmarks
| Benchmark | Methodology | F1-score (%) | Precision (%) | Recall (%) |
|---|---|---|---|---|
| anomaly-detection-on-smd | MAAT | 92.3 | 89.03 | 95.82 |