Time Series Forecasting on ETTh2 (96-1)
Evaluation metric: MSE

Performance of each model on this benchmark:

| Model | MSE | Paper Title |
| --- | --- | --- |
| MoLE-RLinear | 0.273 | Mixture-of-Linear-Experts for Long-term Time Series Forecasting |
| DiPE-Linear | 0.275 | Disentangled Interpretable Representation for Efficient Long-term Time Series Forecasting |
| PRformer | 0.268 | PRformer: Pyramidal Recurrent Transformer for Multivariate Time Series Forecasting |
| NLinear | 0.277 | Are Transformers Effective for Time Series Forecasting? |
| MoLE-DLinear | 0.287 | Mixture-of-Linear-Experts for Long-term Time Series Forecasting |
| LTBoost (drop_last=false) | 0.263 | LTBoost: Boosted Hybrids of Ensemble Linear and Gradient Algorithms for the Long-term Time Series Forecasting |
| PatchMixer | 0.225 | PatchMixer: A Patch-Mixing Architecture for Long-Term Time Series Forecasting |
| SegRNN | 0.263 | SegRNN: Segment Recurrent Neural Network for Long-Term Time Series Forecasting |
| xPatch | 0.226 | xPatch: Dual-Stream Time Series Forecasting with Exponential Seasonal-Trend Decomposition |
| TiDE | 0.27 | Long-term Forecasting with TiDE: Time-series Dense Encoder |
| FiLM | 0.284 | FiLM: Frequency improved Legendre Memory Model for Long-term Time Series Forecasting |
| TSMixer | 0.276 | TSMixer: Lightweight MLP-Mixer Model for Multivariate Time Series Forecasting |
| RLinear | 0.262 | Revisiting Long-term Time Series Forecasting: An Investigation on Linear Mapping |
| TEFN | 0.288 | Time Evidence Fusion Network: Multi-source View in Long-Term Time Series Forecasting |
| DLinear | 0.289 | Are Transformers Effective for Time Series Forecasting? |
| PatchTST/64 | 0.274 | A Time Series is Worth 64 Words: Long-term Forecasting with Transformers |
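The MSE values above are mean squared errors between forecasts and ground truth on the ETTh2 test split. A minimal sketch of the computation, assuming forecasts and targets are NumPy arrays of shape [windows, horizon, variables], a 96-step horizon (as the benchmark name suggests), and already-standardized values (shapes and preprocessing are assumptions, not stated on this page):

```python
import numpy as np

def mse(preds: np.ndarray, targets: np.ndarray) -> float:
    """Mean squared error averaged over all windows, horizon steps, and variables.

    Assumes preds and targets share the same shape, e.g.
    [num_windows, prediction_length, num_variables], and are already in
    the (typically standardized) scale used by the benchmark.
    """
    return float(np.mean((preds - targets) ** 2))

# Hypothetical usage: 100 test windows, 96-step horizon, 7 ETTh2 variables.
preds = np.random.randn(100, 96, 7)
targets = np.random.randn(100, 96, 7)
print(f"MSE: {mse(preds, targets):.3f}")
```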