Convergence of Batch Split-Complex Backpropagation Algorithm for Complex-Valued Neural Networks
The batch split-complex backpropagation (BSCBP) algorithm for training complex-valued neural networks is considered. For a constant learning rate, it is proved that the error function of the BSCBP algorithm is monotone during the training iteration process, and the gradient of the error function tends to...
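The split-complex scheme named in the abstract can be sketched as follows for a single complex-valued neuron: the complex weight vector is treated as two real parameter vectors, a real activation is applied separately to the real and imaginary parts of the net input, and a full-batch gradient step with a constant learning rate is taken. This is a minimal illustrative sketch; the activation choice, network shape, data, and all names (`g`, `bscbp_step`, `eta`) are assumptions, not details from the paper.

```python
import numpy as np

def g(x):
    """Real activation applied componentwise (illustrative choice: tanh)."""
    return np.tanh(x)

def g_prime(x):
    """Derivative of the activation."""
    return 1.0 - np.tanh(x) ** 2

def forward(w, z):
    """Split-complex activation: g on real and imaginary parts separately."""
    u = z @ w                              # complex net input, shape (N,)
    return g(u.real) + 1j * g(u.imag)

def batch_error(w, z, d):
    """Batch squared error E(w) = (1/2N) * sum |y_k - d_k|^2."""
    e = forward(w, z) - d
    return 0.5 * np.mean(np.abs(e) ** 2)

def bscbp_step(w, z, d, eta):
    """One full-batch gradient step with constant learning rate eta.

    Differentiating E w.r.t. the real parts wR and imaginary parts wI
    and recombining as grad = dE/dwR + i*dE/dwI gives
    grad = conj(z)^T (dR + i*dI) / N, where dR, dI are the split deltas.
    """
    u = z @ w
    e = (g(u.real) + 1j * g(u.imag)) - d
    dR = e.real * g_prime(u.real)
    dI = e.imag * g_prime(u.imag)
    grad = (np.conj(z).T @ (dR + 1j * dI)) / len(d)
    return w - eta * grad

# Synthetic, realizable data: targets generated by the same split model.
rng = np.random.default_rng(0)
N, n = 50, 3
z = rng.standard_normal((N, n)) + 1j * rng.standard_normal((N, n))
w_true = rng.standard_normal(n) + 1j * rng.standard_normal(n)
d = forward(w_true, z)

w = np.zeros(n, dtype=complex)
errors = [batch_error(w, z, d)]
for _ in range(200):
    w = bscbp_step(w, z, d, eta=0.1)
    errors.append(batch_error(w, z, d))
```

Tracking `errors` over the iterations lets one observe the behaviour the abstract proves: for a small enough constant learning rate, the batch error decreases over training.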
Main Authors: Huisheng Zhang, Chao Zhang, Wei Wu
Format: Article
Language: English
Published: Wiley, 2009-01-01
Series: Discrete Dynamics in Nature and Society
Online Access: http://dx.doi.org/10.1155/2009/329173
Similar Items
- Optimization of Backpropagation Neural Network under the Adaptive Genetic Algorithm
  by: Junxi Zhang, et al.
  Published: (2021-01-01)
- Analysis and Prediction of Body Test Results Based on Improved Backpropagation Neural Network Algorithm
  by: Zhanju Ma, et al.
  Published: (2022-01-01)
- Backpropagation Neural Network Implementation for Medical Image Compression
  by: Kamil Dimililer
  Published: (2013-01-01)
- An Adaboost-Backpropagation Neural Network for Automated Image Sentiment Classification
  by: Jianfang Cao, et al.
  Published: (2014-01-01)
- Development of Deep Convolutional Neural Network with Adaptive Batch Normalization Algorithm for Bearing Fault Diagnosis
  by: Chao Fu, et al.
  Published: (2020-01-01)