
PRformer: Pyramidal Recurrent Transformer for Multivariate Time Series Forecasting

Yongbo Yu; Weizhong Yu; Feiping Nie; Xuelong Li

Abstract

The self-attention mechanism in the Transformer architecture, being invariant to sequence order, necessitates positional embeddings to encode temporal order in time series prediction. We argue that this reliance on positional embeddings restricts the Transformer's ability to effectively represent temporal sequences, particularly when employing longer lookback windows. To address this, we introduce an approach that combines Pyramid RNN Embeddings (PRE) for univariate time series with the Transformer's capability to model multivariate dependencies. PRE uses pyramidal one-dimensional convolutional layers to construct multiscale convolutional features that preserve temporal order; RNNs layered atop these features then learn multiscale time series representations sensitive to sequence order. Integrating these embeddings into attention-based Transformer models yields significant performance gains. We present the PRformer, a model combining PRE with a standard Transformer encoder, and demonstrate state-of-the-art performance on various real-world datasets. These results highlight the effectiveness of leveraging longer lookback windows and underscore the critical role of robust temporal representations in maximizing the Transformer's potential for prediction tasks. Code is available at https://github.com/usualheart/PRformer.
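As a rough illustration of the architecture the abstract describes, the following is a minimal PyTorch sketch: a pyramid of stride-2 one-dimensional convolutions builds multiscale features for each univariate series, a GRU summarizes each scale in temporal order, and a standard Transformer encoder attends across variables. The choice of GRU, stride-2 downsampling, layer sizes, and the channel-wise token layout are all assumptions made for illustration, not the authors' implementation; consult the linked repository for the official code.

```python
# Illustrative sketch of the PRformer idea (assumptions noted above),
# not the authors' official implementation.
import torch
import torch.nn as nn


class PyramidRNNEmbedding(nn.Module):
    """Sketch of Pyramid RNN Embeddings (PRE): conv pyramid + per-scale RNNs."""

    def __init__(self, d_model: int, levels: int = 3, channels: int = 32):
        super().__init__()
        self.convs = nn.ModuleList()
        in_ch = 1
        for _ in range(levels):
            # Stride-2 convolutions halve the temporal resolution at each
            # level, building multiscale features that preserve temporal order.
            self.convs.append(
                nn.Conv1d(in_ch, channels, kernel_size=3, stride=2, padding=1)
            )
            in_ch = channels
        # One RNN per pyramid level; its final hidden state is an
        # order-sensitive summary of that scale.
        self.rnns = nn.ModuleList(
            nn.GRU(channels, d_model, batch_first=True) for _ in range(levels)
        )
        self.proj = nn.Linear(levels * d_model, d_model)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch * n_vars, lookback) -- one univariate series per row.
        feats, h = [], x.unsqueeze(1)                 # (B*, 1, L)
        for conv, rnn in zip(self.convs, self.rnns):
            h = torch.relu(conv(h))                   # (B*, C, L / 2^k)
            _, last = rnn(h.transpose(1, 2))          # last: (1, B*, d_model)
            feats.append(last.squeeze(0))
        return self.proj(torch.cat(feats, dim=-1))    # (B*, d_model)


class PRformerSketch(nn.Module):
    def __init__(self, n_vars: int, horizon: int, d_model: int = 128):
        super().__init__()
        self.pre = PyramidRNNEmbedding(d_model)
        layer = nn.TransformerEncoderLayer(d_model, nhead=8, batch_first=True)
        # Attention runs across variables (one token per channel), so no
        # positional embedding is needed: temporal order is already encoded
        # by the PRE summaries.
        self.encoder = nn.TransformerEncoder(layer, num_layers=2)
        self.head = nn.Linear(d_model, horizon)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, lookback, n_vars)
        b, l, n = x.shape
        tokens = self.pre(x.permute(0, 2, 1).reshape(b * n, l)).view(b, n, -1)
        out = self.encoder(tokens)                    # attend over variables
        return self.head(out).permute(0, 2, 1)        # (batch, horizon, n_vars)


if __name__ == "__main__":
    model = PRformerSketch(n_vars=7, horizon=192)
    y = model(torch.randn(4, 96, 7))  # e.g. lookback 96 on a 7-variable series
    print(y.shape)  # torch.Size([4, 192, 7])
```

Because the RNN hidden states, rather than positional embeddings, carry temporal order, the lookback window can in principle be lengthened without changing the attention layout, which is the property the paper attributes its gains to.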

Code Repositories

usualheart/prformer (Official, PyTorch): https://github.com/usualheart/PRformer

Benchmarks

Benchmark | Methodology | Metrics
multivariate-time-series-forecasting-on-etth1 | PRformer | MAE: 0.383, MSE: 0.354
multivariate-time-series-forecasting-on-etth1-1 | PRformer | MAE: 0.410, MSE: 0.397
time-series-forecasting-on-electricity-96 | PRformer | MSE: 0.127
time-series-forecasting-on-electricity-192 | PRformer | MSE: 0.148
time-series-forecasting-on-electricity-336 | PRformer | MSE: 0.161
time-series-forecasting-on-electricity-720 | PRformer | MSE: 0.185
time-series-forecasting-on-etth1-96-1 | PRformer | MSE: 0.354
time-series-forecasting-on-etth1-192-1 | PRformer | MSE: 0.397
time-series-forecasting-on-etth1-336-1 | PRformer | MSE: 0.427
time-series-forecasting-on-etth1-720-1 | PRformer | MSE: 0.489
time-series-forecasting-on-etth2-96-1 | PRformer | MSE: 0.268
time-series-forecasting-on-etth2-192-1 | PRformer | MSE: 0.332
time-series-forecasting-on-etth2-336-1 | PRformer | MSE: 0.361
time-series-forecasting-on-etth2-720-1 | PRformer | MSE: 0.396
time-series-forecasting-on-ettm1-96-1 | PRformer | MSE: 0.278
time-series-forecasting-on-ettm1-192-1 | PRformer | MSE: 0.324
time-series-forecasting-on-ettm1-336-1 | PRformer | MSE: 0.362
time-series-forecasting-on-ettm1-720-1 | PRformer | MSE: 0.426
time-series-forecasting-on-ettm2-96-1 | PRformer | MSE: 0.162
time-series-forecasting-on-ettm2-192-1 | PRformer | MSE: 0.219
time-series-forecasting-on-ettm2-336-1 | PRformer | MSE: 0.272
time-series-forecasting-on-ettm2-720-1 | PRformer | MSE: 0.359
time-series-forecasting-on-traffic-96 | PRformer | MSE: 0.353
time-series-forecasting-on-traffic-192 | PRformer | MSE: 0.372
time-series-forecasting-on-traffic-336 | PRformer | MSE: 0.385
time-series-forecasting-on-traffic-720 | PRformer | MSE: 0.421
time-series-forecasting-on-weather-96 | PRformer | MSE: 0.144
time-series-forecasting-on-weather-192 | PRformer | MSE: 0.188
time-series-forecasting-on-weather-336 | PRformer | MSE: 0.241
time-series-forecasting-on-weather-720 | PRformer | MSE: 0.326
