Convergence of Batch Split-Complex Backpropagation Algorithm for Complex-Valued Neural Networks

The batch split-complex backpropagation (BSCBP) algorithm for training complex-valued neural networks is considered. For a constant learning rate, it is proved that the error function of the BSCBP algorithm decreases monotonically during the training iteration process and that the gradient of the error function tends to zero. Under an additional mild condition, the weight sequence itself is also proved to be convergent. A numerical example is given to support the theoretical analysis.
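
The update rule the abstract refers to is not spelled out in this record, so the following is a minimal, hypothetical sketch of one batch split-complex gradient step for a single complex-valued neuron with a split tanh activation. All identifiers (split_tanh, bscbp_step, eta, X, T, w) are invented for illustration; the paper itself analyses a more general network, so this only conveys the split real/imaginary update idea rather than the authors' exact model.

```python
# Hypothetical sketch of one batch split-complex gradient step for a single
# complex neuron o = f(x . w), where the split activation applies a real tanh
# separately to the real and imaginary parts of the net input.
import numpy as np

def split_tanh(u):
    # Split-complex activation: tanh applied separately to Re(u) and Im(u).
    return np.tanh(u.real) + 1j * np.tanh(u.imag)

def bscbp_step(w, X, T, eta=0.05):
    # One batch gradient step on E(w) = 0.5 * sum_n |f(x_n . w) - t_n|^2.
    U = X @ w                                   # complex net input per sample
    E = split_tanh(U) - T                       # complex output errors
    gR = E.real * (1.0 - np.tanh(U.real) ** 2)  # dE/d(Re U_n)
    gI = E.imag * (1.0 - np.tanh(U.imag) ** 2)  # dE/d(Im U_n)
    # Gradients w.r.t. the real part a and imaginary part b of w = a + i*b,
    # accumulated over the whole batch (the "batch" in BSCBP).
    grad_a = X.real.T @ gR + X.imag.T @ gI
    grad_b = -(X.imag.T @ gR) + X.real.T @ gI
    return w - eta * (grad_a + 1j * grad_b)

# Tiny usage example on synthetic data: with a suitably small constant learning
# rate the batch error decreases monotonically, mirroring the first claim above.
rng = np.random.default_rng(0)
X = rng.standard_normal((20, 3)) + 1j * rng.standard_normal((20, 3))
w_true = np.array([0.5 - 0.3j, -0.2 + 0.4j, 0.3 + 0.1j])
T = split_tanh(X @ w_true)
w = np.zeros(3, dtype=complex)
for _ in range(200):
    w = bscbp_step(w, X, T)
```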

Bibliographic Details
Main Authors: Huisheng Zhang, Chao Zhang, Wei Wu
Format: Article
Language: English
Published: Wiley 2009-01-01
Series: Discrete Dynamics in Nature and Society
Online Access: http://dx.doi.org/10.1155/2009/329173
_version_ 1832563877035900928
author Huisheng Zhang
Chao Zhang
Wei Wu
author_facet Huisheng Zhang
Chao Zhang
Wei Wu
author_sort Huisheng Zhang
collection DOAJ
description The batch split-complex backpropagation (BSCBP) algorithm for training complex-valued neural networks is considered. For a constant learning rate, it is proved that the error function of the BSCBP algorithm decreases monotonically during the training iteration process and that the gradient of the error function tends to zero. Under an additional mild condition, the weight sequence itself is also proved to be convergent. A numerical example is given to support the theoretical analysis.
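
For orientation, the three results summarized in the description can be stated schematically as below. The notation is generic (E for the batch error function, w^m for the weight sequence after the m-th training cycle, w* for a limit point) and need not match the paper's own symbols; the labels follow the usual weak/strong convergence terminology for batch backpropagation analyses.

```latex
% Schematic form of the stated results, under the paper's assumptions
% (constant learning rate, plus one additional condition for the last claim):
\begin{align*}
  &\text{(monotonicity)}       && E\bigl(w^{m+1}\bigr) \le E\bigl(w^{m}\bigr), \quad m = 0, 1, 2, \dots \\
  &\text{(weak convergence)}   && \lim_{m \to \infty} \bigl\lVert \nabla E\bigl(w^{m}\bigr) \bigr\rVert = 0, \\
  &\text{(strong convergence)} && \lim_{m \to \infty} w^{m} = w^{*} \quad \text{under an additional mild condition.}
\end{align*}
```
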
format Article
id doaj-art-e61ae57dded743dda74e0ca4eb98eb0f
institution Kabale University
issn 1026-0226
1607-887X
language English
publishDate 2009-01-01
publisher Wiley
record_format Article
series Discrete Dynamics in Nature and Society
spelling doaj-art-e61ae57dded743dda74e0ca4eb98eb0f
2025-02-03T01:12:19Z
eng
Wiley
Discrete Dynamics in Nature and Society
1026-0226, 1607-887X
2009-01-01
2009
10.1155/2009/329173
329173
Convergence of Batch Split-Complex Backpropagation Algorithm for Complex-Valued Neural Networks
Huisheng Zhang, Chao Zhang, Wei Wu (all: Applied Mathematics Department, Dalian University of Technology, Dalian 116024, China)
The batch split-complex backpropagation (BSCBP) algorithm for training complex-valued neural networks is considered. For a constant learning rate, it is proved that the error function of the BSCBP algorithm decreases monotonically during the training iteration process and that the gradient of the error function tends to zero. Under an additional mild condition, the weight sequence itself is also proved to be convergent. A numerical example is given to support the theoretical analysis.
http://dx.doi.org/10.1155/2009/329173
spellingShingle Huisheng Zhang
Chao Zhang
Wei Wu
Convergence of Batch Split-Complex Backpropagation Algorithm for Complex-Valued Neural Networks
Discrete Dynamics in Nature and Society
title Convergence of Batch Split-Complex Backpropagation Algorithm for Complex-Valued Neural Networks
title_full Convergence of Batch Split-Complex Backpropagation Algorithm for Complex-Valued Neural Networks
title_fullStr Convergence of Batch Split-Complex Backpropagation Algorithm for Complex-Valued Neural Networks
title_full_unstemmed Convergence of Batch Split-Complex Backpropagation Algorithm for Complex-Valued Neural Networks
title_short Convergence of Batch Split-Complex Backpropagation Algorithm for Complex-Valued Neural Networks
title_sort convergence of batch split complex backpropagation algorithm for complex valued neural networks
url http://dx.doi.org/10.1155/2009/329173
work_keys_str_mv AT huishengzhang convergenceofbatchsplitcomplexbackpropagationalgorithmforcomplexvaluedneuralnetworks
AT chaozhang convergenceofbatchsplitcomplexbackpropagationalgorithmforcomplexvaluedneuralnetworks
AT weiwu convergenceofbatchsplitcomplexbackpropagationalgorithmforcomplexvaluedneuralnetworks