A Topological Approach to Enhancing Consistency in Machine Learning via Recurrent Neural Networks
The analysis of continuous events for any application involves the discretization of an event into sequences with potential historical dependencies. These sequences represent time stamps or samplings of a continuous process, collectively forming a time series dataset utilized for training recurrent neural networks (RNNs) such as Long Short-Term Memory (LSTM) and Gated Recurrent Unit (GRU) for pattern prediction. The challenge is to ensure that the estimates from the trained models are consistent in the same input domain for different discretizations of the same or similar continuous history-dependent events. In other words, if different time stamps are used during the prediction phase after training, the model is still expected to give consistent predictions based on the knowledge it has learned. To address this, we present a novel RNN transition formula intended to produce consistent estimates in a wide range of engineering applications. The approach was validated with synthetically generated datasets in 1D, 2D, and 3D spaces, intentionally designed to exhibit high non-linearity and complexity. Furthermore, we have verified our results with real-world datasets to ensure practical applicability and robustness. These assessments show the ability of the proposed method, which involves restructuring the mathematical structure and extending conventional RNN architectures, to provide reliable and consistent estimates for complex time series data.
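The inconsistency the abstract describes can be seen in a minimal sketch (an illustration of the problem only, not the paper's proposed transition formula): a one-unit vanilla RNN with fixed weights, fed two discretizations of the same continuous signal, ends in different hidden states because the update is applied once per sample rather than per unit of elapsed time.

```python
import math

def rnn_final_state(xs, w_h=0.5, w_x=1.0, b=0.0):
    """One-unit vanilla (Elman) RNN: h_t = tanh(w_h*h_{t-1} + w_x*x_t + b).
    Weights are arbitrary fixed values chosen for illustration."""
    h = 0.0
    for x in xs:
        h = math.tanh(w_h * h + w_x * x + b)
    return h

# The same continuous signal sin(t) on [0, 2], discretized at two rates.
coarse = [math.sin(0.20 * i) for i in range(11)]  # dt = 0.20, 11 samples
fine = [math.sin(0.05 * i) for i in range(41)]    # dt = 0.05, 41 samples

h_coarse = rnn_final_state(coarse)
h_fine = rnn_final_state(fine)

# A conventional RNN is not discretization-consistent: the final state
# depends on how many time stamps represent the same underlying event.
print(h_coarse, h_fine)
```

Under this toy setup the two final states differ even though both input sequences sample the identical continuous history, which is the behavior the article's restructured transition formula is designed to avoid.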
Saved in:
Main Authors: Muhammed Adil Yatkin; Mihkel Kõrgesaar; Ümit Işlak
Format: Article
Language: English
Published: MDPI AG, 2025-01-01
Series: Applied Sciences
Subjects: recurrent neural networks (RNNs); surrogate modelling; consistency; forming limit curves (FLCs); sequence to sequence learning
Online Access: https://www.mdpi.com/2076-3417/15/2/933
_version_ | 1832589185243938816 |
author | Muhammed Adil Yatkin; Mihkel Kõrgesaar; Ümit Işlak |
author_facet | Muhammed Adil Yatkin; Mihkel Kõrgesaar; Ümit Işlak |
author_sort | Muhammed Adil Yatkin |
collection | DOAJ |
description | The analysis of continuous events for any application involves the discretization of an event into sequences with potential historical dependencies. These sequences represent time stamps or samplings of a continuous process collectively forming a time series dataset utilized for training recurrent neural networks (RNNs) such as Long Short-Term Memory (LSTM) and Gated Recurrent Unit (GRU) for pattern prediction. The challenge is to ensure that the estimates from the trained models are consistent in the same input domain for different discretizations of the same or similar continuous history-dependent events. In other words, if different time stamps are used during the prediction phase after training, the model is still expected to give consistent predictions based on the knowledge it has learned. To address this, we present a novel RNN transition formula intended to produce consistent estimates in a wide range of engineering applications. The approach was validated with synthetically generated datasets in 1D, 2D, and 3D spaces, intentionally designed to exhibit high non-linearity and complexity. Furthermore, we have verified our results with real-world datasets to ensure practical applicability and robustness. These assessments show the ability of the proposed method, which involves restructuring the mathematical structure and extending conventional RNN architectures, to provide reliable and consistent estimates for complex time series data. |
format | Article |
id | doaj-art-5b8980d8de824acd8372964c7c57b1e7 |
institution | Kabale University |
issn | 2076-3417 |
language | English |
publishDate | 2025-01-01 |
publisher | MDPI AG |
record_format | Article |
series | Applied Sciences |
spelling | doaj-art-5b8980d8de824acd8372964c7c57b1e7 | 2025-01-24T13:21:23Z | eng | MDPI AG | Applied Sciences | ISSN 2076-3417 | 2025-01-01 | vol. 15, iss. 2, art. 933 | DOI 10.3390/app15020933 | A Topological Approach to Enhancing Consistency in Machine Learning via Recurrent Neural Networks | Muhammed Adil Yatkin (School of Engineering, Kuressaare College, Tallinn University of Technology, 19086 Tallinn, Estonia); Mihkel Kõrgesaar (School of Engineering, Kuressaare College, Tallinn University of Technology, 19086 Tallinn, Estonia); Ümit Işlak (Faculty of Arts and Sciences, Department of Mathematics, Boğaziçi University, 34342 Istanbul, Türkiye) | https://www.mdpi.com/2076-3417/15/2/933 | recurrent neural networks (RNNs); surrogate modelling; consistency; forming limit curves (FLCs); sequence to sequence learning |
spellingShingle | Muhammed Adil Yatkin; Mihkel Kõrgesaar; Ümit Işlak | A Topological Approach to Enhancing Consistency in Machine Learning via Recurrent Neural Networks | Applied Sciences | recurrent neural networks (RNNs); surrogate modelling; consistency; forming limit curves (FLCs); sequence to sequence learning |
title | A Topological Approach to Enhancing Consistency in Machine Learning via Recurrent Neural Networks |
title_full | A Topological Approach to Enhancing Consistency in Machine Learning via Recurrent Neural Networks |
title_fullStr | A Topological Approach to Enhancing Consistency in Machine Learning via Recurrent Neural Networks |
title_full_unstemmed | A Topological Approach to Enhancing Consistency in Machine Learning via Recurrent Neural Networks |
title_short | A Topological Approach to Enhancing Consistency in Machine Learning via Recurrent Neural Networks |
title_sort | topological approach to enhancing consistency in machine learning via recurrent neural networks |
topic | recurrent neural networks (RNNs); surrogate modelling; consistency; forming limit curves (FLCs); sequence to sequence learning |
url | https://www.mdpi.com/2076-3417/15/2/933 |
work_keys_str_mv | AT muhammedadilyatkin atopologicalapproachtoenhancingconsistencyinmachinelearningviarecurrentneuralnetworks AT mihkelkorgesaar atopologicalapproachtoenhancingconsistencyinmachinelearningviarecurrentneuralnetworks AT umitislak atopologicalapproachtoenhancingconsistencyinmachinelearningviarecurrentneuralnetworks AT muhammedadilyatkin topologicalapproachtoenhancingconsistencyinmachinelearningviarecurrentneuralnetworks AT mihkelkorgesaar topologicalapproachtoenhancingconsistencyinmachinelearningviarecurrentneuralnetworks AT umitislak topologicalapproachtoenhancingconsistencyinmachinelearningviarecurrentneuralnetworks |