DPSTCN: Dynamic Pattern-Aware Spatio-Temporal Convolutional Networks for Traffic Flow Forecasting


Bibliographic Details
Main Authors: Zeping Dou, Danhuai Guo
Format: Article
Language: English
Published: MDPI AG 2024-12-01
Series: ISPRS International Journal of Geo-Information
Online Access: https://www.mdpi.com/2220-9964/14/1/10
Description
Summary: Accurate forecasting of multivariate traffic flow poses formidable challenges, primarily due to ever-evolving spatio-temporal dynamics and intricate spatial heterogeneity, where heterogeneity means that correlations among locations are not determined by distance alone. Few existing models are designed to integrate all of these features fully and effectively. To address these complexities, this paper introduces Dynamic Pattern-aware Spatio-Temporal Convolutional Networks (DPSTCN). Temporally, the model introduces a novel module that pairs a temporal convolutional network (TCN) with an enhanced pattern-aware self-attention mechanism, adept at capturing temporal patterns including local/global dependencies, dynamics, and periodicity. Spatially, the model constructs static and dynamic pattern-aware convolutions, leveraging geographical and area-functional information to capture intricate spatial patterns, including dynamics and heterogeneity. Evaluations across four traffic benchmark datasets consistently demonstrate the state-of-the-art performance of our model against eleven existing approaches, with particularly large improvements in RMSE (Root Mean Squared Error).
ISSN: 2220-9964
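
The abstract describes two architectural ideas: a temporal block that couples a dilated causal convolution (TCN) with self-attention, and spatial convolutions that mix a static, geography-based graph with a learned, pattern-derived one. The PyTorch sketch below illustrates one plausible shape for each component. It is reconstructed from the abstract alone, so every class, layer, and parameter name (PatternAwareTemporalBlock, PatternAwareGraphConv, node_emb, and so on) is a hypothetical stand-in, not the authors' implementation.

import torch
import torch.nn as nn

class PatternAwareTemporalBlock(nn.Module):
    """Dilated causal convolution (TCN) followed by self-attention over time,
    a rough stand-in for the 'TCN enriched with pattern-aware self-attention'."""
    def __init__(self, channels: int, kernel_size: int = 3, dilation: int = 1, heads: int = 4):
        super().__init__()
        self.pad = (kernel_size - 1) * dilation  # left-pad so the convolution stays causal
        self.conv = nn.Conv1d(channels, channels, kernel_size, dilation=dilation)
        self.attn = nn.MultiheadAttention(channels, heads, batch_first=True)
        self.norm = nn.LayerNorm(channels)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, time, channels)
        h = x.transpose(1, 2)                    # (batch, channels, time) for Conv1d
        h = nn.functional.pad(h, (self.pad, 0))  # causal left padding, local dependencies
        h = torch.relu(self.conv(h)).transpose(1, 2)
        a, _ = self.attn(h, h, h)                # self-attention for global dependencies
        return self.norm(h + a)                  # residual connection + normalization

class PatternAwareGraphConv(nn.Module):
    """Graph convolution mixing a static (e.g., distance-based) adjacency with a
    learned adjacency derived from node embeddings, standing in for the
    static/dynamic pattern-aware spatial convolutions."""
    def __init__(self, num_nodes: int, channels: int, static_adj: torch.Tensor):
        super().__init__()
        self.register_buffer("static_adj", static_adj)             # geographic graph
        self.node_emb = nn.Parameter(torch.randn(num_nodes, 16))   # functional "patterns"
        self.lin = nn.Linear(channels, channels)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, nodes, channels)
        dyn_adj = torch.softmax(self.node_emb @ self.node_emb.T, dim=-1)
        mixed = self.static_adj + dyn_adj        # combine static and learned patterns
        return torch.relu(self.lin(mixed @ x))   # propagate features over the mixed graph

# Toy usage on random data: 8 sequences, 12 timesteps, 32 channels, 20 nodes.
t_block = PatternAwareTemporalBlock(channels=32)
print(t_block(torch.randn(8, 12, 32)).shape)     # torch.Size([8, 12, 32])
adj = torch.eye(20)                              # placeholder geographic adjacency
s_block = PatternAwareGraphConv(num_nodes=20, channels=32, static_adj=adj)
print(s_block(torch.randn(8, 20, 32)).shape)     # torch.Size([8, 20, 32])

The softmax over the embedding product is a common way to learn an adaptive adjacency in traffic models (as in Graph WaveNet); the paper's actual dynamic pattern-aware construction, which also uses area-functional information, may differ.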