Should We Reconsider RNNs for Time-Series Forecasting?
(1) Background: In recent years, Transformer-based models have dominated the time-series forecasting domain, overshadowing recurrent neural networks (RNNs) such as Long Short-Term Memory (LSTM) and Gated Recurrent Unit (GRU). While Transformers demonstrate superior performance, their high computational cost limits their practical application in resource-constrained settings. (2) Methods: In this paper, we reconsider RNNs, specifically the GRU architecture, as an efficient alternative for time-series forecasting, leveraging this architecture's sequential representation capability to capture cross-channel dependencies effectively. Our model also applies a feed-forward layer right after the GRU module to represent temporal dependencies, and aggregates its output with that of the GRU layers to predict the future values of a given time series. (3) Results and conclusions: Our extensive experiments on different real-world datasets show that our inverted GRU (iGRU) model achieves promising results in terms of error metrics and memory efficiency, challenging or surpassing state-of-the-art models on various benchmarks.
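The abstract describes the architecture only in prose; the following is a minimal PyTorch sketch of how such an "inverted" GRU could look. It is not the authors' implementation: the class name iGRUSketch, the additive aggregation of the GRU and feed-forward branches, and all layer sizes are assumptions made for illustration.

```python
import torch
import torch.nn as nn

class iGRUSketch(nn.Module):
    """Hypothetical sketch of the iGRU idea from the abstract: a GRU is run
    across channels (each channel's lookback window is one step), a
    feed-forward layer is applied right after the GRU, and its output is
    aggregated with the GRU output before the forecasting head."""

    def __init__(self, lookback: int, horizon: int, hidden: int = 128):
        super().__init__()
        # "Inverted" GRU: the sequence axis is the channel axis, so the
        # recurrence captures cross-channel dependencies.
        self.gru = nn.GRU(input_size=lookback, hidden_size=hidden, batch_first=True)
        # Feed-forward layer placed right after the GRU module, intended to
        # represent temporal dependencies (exact design is an assumption).
        self.ff = nn.Sequential(nn.Linear(hidden, hidden), nn.GELU())
        self.head = nn.Linear(hidden, horizon)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, lookback, channels) -> invert to (batch, channels, lookback)
        x = x.transpose(1, 2)
        gru_out, _ = self.gru(x)             # (batch, channels, hidden)
        fused = gru_out + self.ff(gru_out)   # additive aggregation (assumed)
        y = self.head(fused)                 # (batch, channels, horizon)
        return y.transpose(1, 2)             # (batch, horizon, channels)

# Toy usage: 96-step lookback, 24-step horizon, 7 channels.
model = iGRUSketch(lookback=96, horizon=24)
print(model(torch.randn(8, 96, 7)).shape)  # torch.Size([8, 24, 7])
```

Transposing the time and channel axes makes each channel's whole lookback window one GRU input step, which is presumably what the "inverted" in iGRU refers to; the paper itself should be consulted for the actual layer arrangement.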
Saved in:
| Main Authors: | Vahid Naghashi, Mounir Boukadoum, Abdoulaye Banire Diallo |
|---|---|
| Format: | Article |
| Language: | English |
| Published: | MDPI AG, 2025-04-01 |
| Series: | AI |
| Subjects: | time-series; gated recurrent units; temporal dependencies; cross-channel correlations |
| Online Access: | https://www.mdpi.com/2673-2688/6/5/90 |
Record details:
| author | Vahid Naghashi; Mounir Boukadoum; Abdoulaye Banire Diallo |
|---|---|
| author_sort | Vahid Naghashi |
| collection | DOAJ |
| format | Article |
| id | doaj-art-592b0b08a46348b099ba3c21167f4b6f |
| institution | DOAJ |
| issn | 2673-2688 |
| language | English |
| publishDate | 2025-04-01 |
| publisher | MDPI AG |
| record_format | Article |
| series | AI |
| spelling | Indexed 2025-08-20T03:14:38Z; AI, vol. 6, no. 5, article 90 (2025-04-01); DOI: 10.3390/ai6050090; affiliation (all three authors): Department of Computer Science, Université du Québec à Montréal, Montreal, QC H2L 2C4, Canada |
| title | Should We Reconsider RNNs for Time-Series Forecasting? |
| topic | time-series; gated recurrent units; temporal dependencies; cross-channel correlations |