TEDformer: Temporal Feature Enhanced Decomposed Transformer for Long-Term Series Forecasting
| Main Authors: | , , |
|---|---|
| Format: | Article |
| Language: | English |
| Published: | IEEE, 2025-01-01 |
| Series: | IEEE Access |
| Subjects: | |
| Online Access: | https://ieeexplore.ieee.org/document/10156810/ |
| Summary: | In recent years, Transformer-based models have achieved strong results in time series analysis and its applications. In particular, the introduction of Autoformer further improved long-term sequence forecasting performance. However, Transformer-based models such as Autoformer do not fully exploit the local temporal features of a sequence, nor do they address the impact of anomalies on the decomposition or the handling of the trend component. To address these issues, we combine the strong performance of the temporal convolutional network (TCN) on time series data with the advantages of STL's inner-outer loop decomposition to design TEDformer, a Transformer forecasting model enhanced with global and local temporal features. The model decomposes the time series into trend and periodic components using STL and extracts temporal features from each. We conducted experiments on six real-world datasets; the results show that our model outperforms state-of-the-art models by 10.8% on multivariate datasets and 15.7% on univariate datasets. |
| ISSN: | 2169-3536 |
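The summary's key preprocessing step is STL's inner-outer loop decomposition, whose robust outer loop down-weights anomalies so they do not distort the trend and periodic components. Below is a minimal sketch of that decomposition step using statsmodels; the synthetic series, variable names, and the monthly period are illustrative assumptions, not TEDformer's actual implementation.

```python
# Minimal sketch of the STL decomposition step described in the summary.
# Assumptions: statsmodels' STL, a synthetic monthly series, and period=12
# are illustrative; TEDformer's own decomposition code is not shown here.
import numpy as np
import pandas as pd
from statsmodels.tsa.seasonal import STL

# Synthetic series: trend + seasonality + noise, with two injected anomalies.
rng = np.random.default_rng(0)
idx = pd.date_range("2015-01-01", periods=120, freq="MS")
series = pd.Series(
    0.05 * np.arange(120)                      # slow upward trend
    + np.sin(2 * np.pi * np.arange(120) / 12)  # yearly periodic term
    + rng.normal(scale=0.1, size=120),
    index=idx,
)
series.iloc[[30, 75]] += 3.0  # anomalies that would distort a naive decomposition

# robust=True enables STL's outer loop, which down-weights anomalous points
# so they do not leak into the trend and seasonal estimates.
result = STL(series, period=12, robust=True).fit()

trend = result.trend        # trend component, for trend-term processing
seasonal = result.seasonal  # periodic component, for feature extraction
residual = result.resid     # remainder, where the anomalies end up
print(residual.abs().nlargest(2))  # the injected anomalies dominate the residual
```

With `robust=True`, the outer loop's robustness weights keep the two injected spikes in the residual rather than letting them bend the trend estimate, which is the behavior the abstract credits to the inner-outer loop design.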