Training Neural Networks with a Procedure Guided by BNF Grammars
Main Authors: | Ioannis G. Tsoulos, Vasileios Charilogis |
---|---|
Format: | Article |
Language: | English |
Published: | MDPI AG, 2025-01-01 |
Series: | Big Data and Cognitive Computing |
Subjects: | neural networks; genetic algorithms; grammatical evolution; evolutionary algorithms |
Online Access: | https://www.mdpi.com/2504-2289/9/1/5 |
_version_ | 1832589113292750848 |
---|---|
author | Ioannis G. Tsoulos; Vasileios Charilogis |
author_facet | Ioannis G. Tsoulos; Vasileios Charilogis |
author_sort | Ioannis G. Tsoulos |
collection | DOAJ |
description | Artificial neural networks are parametric machine learning models that have been applied successfully to a wide range of classification and regression problems in the recent literature. A variety of optimization techniques have been proposed for the effective identification of the parameters of artificial neural networks. Although these techniques give good results in many cases, either the optimization method is not efficient and the training error of the network becomes trapped in sub-optimal values, or the neural network exhibits overfitting, meaning that it performs poorly on data that was not present during training. This paper proposes an innovative technique for constructing the weights of artificial neural networks based on appropriate BNF grammars used in the evolutionary process of Grammatical Evolution. The new procedure locates an interval of values for the parameters of the artificial neural network, and the optimization method then effectively locates the network parameters within this interval. The new technique was applied to a wide range of data classification and regression problems covering a number of scientific areas, and the experimental results were more than promising. An illustrative sketch of this two-stage procedure follows the record below. |
format | Article |
id | doaj-art-54301a1d388c4772b19182adfb6fa266 |
institution | Kabale University |
issn | 2504-2289 |
language | English |
publishDate | 2025-01-01 |
publisher | MDPI AG |
record_format | Article |
series | Big Data and Cognitive Computing |
spelling | doaj-art-54301a1d388c4772b19182adfb6fa266; 2025-01-24T13:22:31Z; eng; MDPI AG; Big Data and Cognitive Computing; ISSN 2504-2289; 2025-01-01; 9(1), article 5; DOI 10.3390/bdcc9010005; Training Neural Networks with a Procedure Guided by BNF Grammars; Ioannis G. Tsoulos (Department of Informatics and Telecommunications, University of Ioannina, 45110 Ioannina, Greece); Vasileios Charilogis (Department of Informatics and Telecommunications, University of Ioannina, 45110 Ioannina, Greece); abstract as above; https://www.mdpi.com/2504-2289/9/1/5; neural networks; genetic algorithms; grammatical evolution; evolutionary algorithms |
spellingShingle | Ioannis G. Tsoulos; Vasileios Charilogis; Training Neural Networks with a Procedure Guided by BNF Grammars; Big Data and Cognitive Computing; neural networks; genetic algorithms; grammatical evolution; evolutionary algorithms |
title | Training Neural Networks with a Procedure Guided by BNF Grammars |
title_full | Training Neural Networks with a Procedure Guided by BNF Grammars |
title_fullStr | Training Neural Networks with a Procedure Guided by BNF Grammars |
title_full_unstemmed | Training Neural Networks with a Procedure Guided by BNF Grammars |
title_short | Training Neural Networks with a Procedure Guided by BNF Grammars |
title_sort | training neural networks with a procedure guided by bnf grammars |
topic | neural networks; genetic algorithms; grammatical evolution; evolutionary algorithms |
url | https://www.mdpi.com/2504-2289/9/1/5 |
work_keys_str_mv | AT ioannisgtsoulos trainingneuralnetworkswithaprocedureguidedbybnfgrammars AT vasileioscharilogis trainingneuralnetworkswithaprocedureguidedbybnfgrammars |
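The abstract above describes a two-stage idea: a grammatical-evolution search over a BNF grammar proposes an interval of values for the network parameters, and an optimizer then trains the network within that interval. The Python sketch below is only a minimal illustration of that idea, not the authors' implementation: the toy grammar, the two-codon genome, the random-search "evolution" loop, the hill-climbing trainer, and the helper names (`map_genome_to_bound`, `train_in_interval`) are all assumptions made for the example.

```python
# Hypothetical sketch: (1) a GE-style search over a toy BNF grammar proposes a
# bound B, so the parameter interval is [-B, B]; (2) a one-hidden-layer network
# is trained with all weights kept inside that interval. Illustrative only.
import numpy as np

rng = np.random.default_rng(0)

# Toy BNF grammar for a positive bound B:
#   <bound> ::= <digit> | <digit> * 10
#   <digit> ::= 1 | 2 | 5
# Each codon of the genome picks a production modulo the number of choices.
def map_genome_to_bound(genome):
    digit = [1.0, 2.0, 5.0][genome[0] % 3]
    scale = [1.0, 10.0][genome[1] % 2]
    return digit * scale

def net_forward(x, W1, b1, W2, b2):
    # Single hidden layer with tanh activation.
    return np.tanh(x @ W1 + b1) @ W2 + b2

def train_in_interval(X, y, bound, hidden=8, steps=300, lr=0.05):
    # Initialise the parameters inside [-bound, bound] and keep them there by
    # clipping after each random hill-climbing step (a stand-in optimizer).
    n_in = X.shape[1]
    shapes = [(n_in, hidden), (hidden,), (hidden, 1), (1,)]
    params = [rng.uniform(-bound, bound, s) for s in shapes]

    def loss(p):
        pred = net_forward(X, *p)
        return float(np.mean((pred.ravel() - y) ** 2))

    best = loss(params)
    for _ in range(steps):
        trial = [np.clip(p + lr * rng.normal(size=p.shape), -bound, bound)
                 for p in params]
        l = loss(trial)
        if l < best:
            params, best = trial, l
    return best

# Tiny regression problem: y = sin(3x).
X = rng.uniform(-1, 1, (64, 1))
y = np.sin(3 * X).ravel()

# "Evolve" genomes that encode candidate bounds; keep the best interval found.
best_bound, best_err = None, float("inf")
for _ in range(6):
    genome = rng.integers(0, 100, size=2)
    bound = map_genome_to_bound(genome)
    err = train_in_interval(X, y, bound)
    if err < best_err:
        best_bound, best_err = bound, err

print(f"selected interval: [-{best_bound}, {best_bound}], training MSE: {best_err:.4f}")
```

In a faithful implementation, the interval would be evolved by Grammatical Evolution over the paper's actual BNF grammar and the in-interval training would use the paper's chosen optimization method; both are replaced here by deliberately simple stand-ins so the two-stage structure stays visible.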