A Topological Approach to Enhancing Consistency in Machine Learning via Recurrent Neural Networks

Bibliographic Details
Main Authors: Muhammed Adil Yatkin, Mihkel Kõrgesaar, Ümit Işlak
Format: Article
Language: English
Published: MDPI AG 2025-01-01
Series: Applied Sciences
Subjects:
Online Access: https://www.mdpi.com/2076-3417/15/2/933
Description
Summary: The analysis of continuous events in any application involves discretizing an event into sequences with potential historical dependencies. These sequences represent time stamps, or samplings, of a continuous process, and collectively they form a time series dataset used to train recurrent neural networks (RNNs) such as Long Short-Term Memory (LSTM) and Gated Recurrent Unit (GRU) networks for pattern prediction. The challenge is to ensure that the estimates from the trained models are consistent over the same input domain for different discretizations of the same or similar continuous history-dependent events. In other words, if different time stamps are used during the prediction phase after training, the model is still expected to give consistent predictions based on the knowledge it has learned. To address this, we present a novel RNN transition formula intended to produce consistent estimates across a wide range of engineering applications. The approach was validated with synthetically generated datasets in 1D, 2D, and 3D spaces, intentionally designed to exhibit high non-linearity and complexity. Furthermore, we verified our results with real-world datasets to ensure practical applicability and robustness. These assessments show that the proposed method, which restructures the transition formula of conventional RNN architectures and extends them, provides reliable and consistent estimates for complex time series data.
ISSN:2076-3417
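
The consistency problem described in the summary can be illustrated with a minimal sketch. The code below does not reproduce the authors' transition formula; it uses a hypothetical time-step-aware update, h <- h + (t_{k+1} - t_k) * tanh(Wx x + Wh h), which behaves like an Euler step of a continuous-time system, and compares it against the standard update h <- tanh(Wx x + Wh h), which ignores the step size. The weights, sizes, and signal are all illustrative assumptions.

```python
import numpy as np

# Illustrative sketch only: the paper's actual transition formula is not
# reproduced here. A hypothetical time-step-aware update scales the state
# change by the elapsed time between samples, so refining the sampling grid
# of the same continuous signal perturbs the final hidden state far less
# than a standard RNN update that treats every step as identical.

rng = np.random.default_rng(0)
Wx = rng.normal(scale=0.3, size=(4, 1))   # input weights (toy sizes)
Wh = rng.normal(scale=0.3, size=(4, 4))   # recurrent weights

def final_state(ts, xs, dt_aware):
    """Run the untrained RNN over one discretization; return h at the end."""
    h = np.zeros(4)
    for t0, t1, x in zip(ts[:-1], ts[1:], xs[:-1]):
        pre = np.tanh(Wx @ np.atleast_1d(x) + Wh @ h)
        h = h + (t1 - t0) * pre if dt_aware else pre
    return h

signal = lambda t: np.sin(2 * np.pi * t)   # one continuous "event"
coarse = np.linspace(0.0, 1.0, 11)         # 10 samples of the event
fine = np.linspace(0.0, 1.0, 101)          # 100 samples of the same event

gap_aware = np.linalg.norm(final_state(coarse, signal(coarse), True)
                           - final_state(fine, signal(fine), True))
gap_standard = np.linalg.norm(final_state(coarse, signal(coarse), False)
                              - final_state(fine, signal(fine), False))
print(f"dt-aware gap: {gap_aware:.3f}, standard gap: {gap_standard:.3f}")
```

Under these assumptions, the gap between the two discretizations is markedly smaller for the time-step-aware update, which is the kind of consistency the abstract describes.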