A prediction approach to COVID-19 time series with LSTM integrated attention mechanism and transfer learning

Abstract
Background: The prediction of coronavirus disease 2019 (COVID-19) has been widely researched for broad regions, but for specific areas such as a single urban area, predictive models have rarely been studied. Applying a predictive model built for a broad region directly to a small area may be inaccurate. This paper builds a prediction approach for a small-sample COVID-19 time series from a single city.
Methods: Numbers of daily confirmed COVID-19 cases in Xuzhou, China were collected from November 1, 2022 to November 16, 2023. Classical deep learning models, including the recurrent neural network (RNN), long short-term memory (LSTM), gated recurrent unit (GRU) and temporal convolutional network (TCN), are trained first; RNN, LSTM and GRU are then integrated with a new attention mechanism and transfer learning to improve performance. Ablation experiments are repeated ten times to demonstrate the robustness of the prediction performance. Model performance is compared by mean absolute error (MAE), root mean square error (RMSE) and the coefficient of determination (R²).
Results: LSTM outperforms the other classical models, and TCN has the worst generalization ability. LSTM is therefore integrated with the new attention mechanism to construct an LSTMATT model, which improves performance. LSTMATT is trained on a time series curve smoothed by frequency-domain convolution augmentation, and transfer learning then carries the learned features back to the original time series, yielding a TLLA model that further improves performance. RNN and GRU are also integrated with the attention mechanism and transfer learning, and their performance improves as well, but TLLA still performs best.
Conclusions: The TLLA model gives the best prediction performance for the time series of daily confirmed COVID-19 cases; the new attention mechanism and transfer learning improve prediction in the flat part and the jagged part of the curve, respectively.
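
The record contains no code, so the following is only a minimal sketch, assuming PyTorch, of the kind of pipeline the abstract describes: an LSTM whose hidden states are pooled by a simple additive attention layer, pretrained on a smoothed copy of a case series and then fine-tuned on the original series, and scored with MAE, RMSE and R². All class names, hyperparameters, the placeholder data and the moving-average smoothing are illustrative stand-ins, not the authors' implementation; in particular, the paper's "new attention mechanism" and frequency-domain convolution augmentation are not reproduced here.

```python
# Hypothetical sketch of an LSTM + attention + transfer-learning pipeline
# (illustrative only; not the authors' code).
import numpy as np
import torch
import torch.nn as nn

class LSTMAttention(nn.Module):
    """LSTM whose hidden states are pooled by a learned additive attention."""
    def __init__(self, input_size=1, hidden_size=32):
        super().__init__()
        self.lstm = nn.LSTM(input_size, hidden_size, batch_first=True)
        self.score = nn.Linear(hidden_size, 1)   # attention scoring layer
        self.head = nn.Linear(hidden_size, 1)    # one-step-ahead forecast

    def forward(self, x):                          # x: (batch, seq_len, 1)
        h, _ = self.lstm(x)                        # (batch, seq_len, hidden)
        w = torch.softmax(self.score(h), dim=1)    # attention weights over time
        context = (w * h).sum(dim=1)               # weighted sum of hidden states
        return self.head(context).squeeze(-1)      # (batch,)

def make_windows(series, lag=7):
    """Turn a 1-D series into (lagged window, next value) training pairs."""
    X = np.stack([series[i:i + lag] for i in range(len(series) - lag)])
    y = series[lag:]
    X_t = torch.tensor(X, dtype=torch.float32).unsqueeze(-1)
    y_t = torch.tensor(y, dtype=torch.float32)
    return X_t, y_t

def fit(model, X, y, epochs=200, lr=1e-3):
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    loss_fn = nn.MSELoss()
    for _ in range(epochs):
        opt.zero_grad()
        loss = loss_fn(model(X), y)
        loss.backward()
        opt.step()
    return model

# Transfer-learning style training: pretrain on a smoothed copy of the series
# (a simple moving average stands in for the paper's frequency-domain
# smoothing), then fine-tune the same weights on the original series.
cases = np.random.poisson(20, size=381).astype(np.float32)   # placeholder data
smoothed = np.convolve(cases, np.ones(7) / 7, mode="same").astype(np.float32)

model = LSTMAttention()
fit(model, *make_windows(smoothed))                   # source task: smoothed curve
fit(model, *make_windows(cases), epochs=50, lr=1e-4)  # target task: original curve

# Evaluation with the metrics named in the abstract.
X, y = make_windows(cases)
with torch.no_grad():
    pred = model(X).numpy()
truth = y.numpy()
mae = np.mean(np.abs(pred - truth))
rmse = np.sqrt(np.mean((pred - truth) ** 2))
r2 = 1 - np.sum((truth - pred) ** 2) / np.sum((truth - truth.mean()) ** 2)
print(f"MAE={mae:.2f}  RMSE={rmse:.2f}  R2={r2:.3f}")
```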

Bibliographic Details
Main Authors: Bin Hu, Yaohui Han, Wenhui Zhang, Qingyang Zhang, Wen Gu, Jun Bi, Bi Chen, Lishun Xiao
Format: Article
Language: English
Published: BMC, 2024-12-01
Series: BMC Medical Research Methodology
ISSN: 1471-2288
Subjects: COVID-19; Time series; Deep learning; LSTM; Transfer learning
Online Access: https://doi.org/10.1186/s12874-024-02433-w

Author Affiliations
Bin Hu: School of Public Health, Xuzhou Medical University
Yaohui Han: School of Public Health, Xuzhou Medical University
Wenhui Zhang: Department of Pulmonary and Critical Care Medicine, Affiliated Hospital of Xuzhou Medical University
Qingyang Zhang: University of Nottingham, University Blv
Wen Gu: School of Public Health, Xuzhou Medical University
Jun Bi: Xuzhou Center for Disease Control and Prevention
Bi Chen: Department of Pulmonary and Critical Care Medicine, Affiliated Hospital of Xuzhou Medical University
Lishun Xiao: School of Public Health, Xuzhou Medical University