A new neurodynamic model with Adam optimization method for solving generalized eigenvalue problem
In this paper we propose a new neurodynamic model with a recurrent learning process for solving the ill-conditioned generalized eigenvalue problem (GEP) Ax = λBx. Our method is based on recurrent neural networks with a customized energy function for finding the smallest (largest) or all eigenpairs. We evaluate the method on structural engineering data from the Harwell-Boeing collection, featuring a high-dimensional parameter space and ill-conditioned sparse matrices. The experiments demonstrate that our algorithm with the Adam optimizer works well in practice and, in comparison with other stochastic optimization methods such as gradient descent, improves both the complexity and the accuracy of convergence.
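The abstract describes minimizing a customized energy function with Adam to recover eigenpairs of Ax = λBx, but gives no formulas. Below is a minimal, hedged sketch of that idea only, assuming a generalized Rayleigh-quotient energy E(x) = (xᵀAx)/(xᵀBx) with symmetric matrices and positive definite B; the function name `smallest_eigenpair_adam` and all hyperparameters are illustrative and not taken from the paper.

```python
# Sketch (not the authors' model): approximate the smallest eigenpair of
# A x = lambda B x by minimizing the generalized Rayleigh quotient with Adam.
import numpy as np

def smallest_eigenpair_adam(A, B, steps=5000, lr=1e-2,
                            beta1=0.9, beta2=0.999, eps=1e-8, seed=0):
    """Minimize E(x) = (x'Ax)/(x'Bx) with hand-rolled Adam updates."""
    rng = np.random.default_rng(seed)
    n = A.shape[0]
    x = rng.standard_normal(n)
    m = np.zeros(n)  # first-moment estimate
    v = np.zeros(n)  # second-moment estimate
    for t in range(1, steps + 1):
        Ax, Bx = A @ x, B @ x
        denom = x @ Bx
        lam = (x @ Ax) / denom                  # current Rayleigh quotient
        grad = 2.0 * (Ax - lam * Bx) / denom    # gradient of the quotient
        m = beta1 * m + (1 - beta1) * grad
        v = beta2 * v + (1 - beta2) * grad**2
        m_hat = m / (1 - beta1**t)
        v_hat = v / (1 - beta2**t)
        x = x - lr * m_hat / (np.sqrt(v_hat) + eps)
        x = x / np.linalg.norm(x)               # the quotient is scale-invariant
    lam = (x @ A @ x) / (x @ B @ x)
    return lam, x

if __name__ == "__main__":
    # Toy symmetric positive definite pair (not the Harwell-Boeing data).
    rng = np.random.default_rng(1)
    M = rng.standard_normal((50, 50))
    N = rng.standard_normal((50, 50))
    A = M @ M.T + 50 * np.eye(50)
    B = N @ N.T + 50 * np.eye(50)
    lam, x = smallest_eigenpair_adam(A, B)
    print("estimated smallest eigenvalue:", lam)
```

The paper's recurrent-network dynamics and its procedure for the largest or all eigenpairs are not reproduced here; the sketch only illustrates how Adam can drive an energy-minimization view of the GEP.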
Main Authors: | Ebrahim Ganjalipour, Khadijeh Nemati, Amir Hosein Refahi Sheikhani, Hashem Saberi Najafi |
---|---|
Format: | Article |
Language: | English |
Published: | REA Press, 2021-06-01 |
Series: | Big Data and Computing Visions |
Subjects: | recurrent neural network; eigenpairs; Adam optimizer; positive definite matrix; ill-conditioned |
Online Access: | https://www.bidacv.com/article_142589_fa224602c4b67cbe6ac9c0e5272481a1.pdf |
_version_ | 1832579324467740672 |
---|---|
author | Ebrahim Ganjalipour Khadijeh Nemati Amir Hosein Refahi Sheikhani Hashem Saberi Najafi |
author_facet | Ebrahim Ganjalipour Khadijeh Nemati Amir Hosein Refahi Sheikhani Hashem Saberi Najafi |
author_sort | Ebrahim Ganjalipour |
collection | DOAJ |
description | In this paper we propose a new neurodynamic model with a recurrent learning process for solving the ill-conditioned generalized eigenvalue problem (GEP) Ax = λBx. Our method is based on recurrent neural networks with a customized energy function for finding the smallest (largest) or all eigenpairs. We evaluate the method on structural engineering data from the Harwell-Boeing collection, featuring a high-dimensional parameter space and ill-conditioned sparse matrices. The experiments demonstrate that our algorithm with the Adam optimizer works well in practice and, in comparison with other stochastic optimization methods such as gradient descent, improves both the complexity and the accuracy of convergence. |
format | Article |
id | doaj-art-3af9e10e9c7c4ba19e23c3a901a045de |
institution | Kabale University |
issn | 2783-4956 2821-014X |
language | English |
publishDate | 2021-06-01 |
publisher | REA Press |
record_format | Article |
series | Big Data and Computing Visions |
spelling | doaj-art-3af9e10e9c7c4ba19e23c3a901a045de2025-01-30T12:21:15ZengREA PressBig Data and Computing Visions2783-49562821-014X2021-06-0112839510.22105/bdcv.2021.142589142589A new neurodynamic model with Adam optimization method for solving generalized eigenvalue problemEbrahim Ganjalipour0Khadijeh Nemati1Amir Hosein Refahi Sheikhani2Hashem Saberi Najafi3Department of Mathematics and Computer Sciences, Faculty of Mathematical Sciences, Lahijan Branch, Islamic Azad University, Lahijan, Iran.Department of Mathematics and Computer Sciences, Faculty of Mathematical Sciences, Lahijan Branch, Islamic Azad University, Lahijan, Iran.Department of Mathematics and Computer Sciences, Faculty of Mathematical Sciences, Lahijan Branch, Islamic Azad University, Lahijan, Iran.Department of Applied Mathematics, Ayandegan Institute of Higher Education, Tonekabon, Iran.In this paper we proposed a new neurodynamic model with recurrent learning process for solving ill-condition Generalized eigenvalue problem (GEP) Ax = lambda Bx. our method is based on recurrent neural networks with customized energy function for finding smallest (largest) or all eigenpairs. We evaluate our method on collected structural engineering data from Harwell Boeing collection with high dimensional parameter space and ill-conditioned sparse matrices. The experiments demonstrate that our algorithm using Adam optimizer, in comparison with other stochastic optimization methods like gradient descent works well in practice and improves complexity and accuracy of convergence.https://www.bidacv.com/article_142589_fa224602c4b67cbe6ac9c0e5272481a1.pdfrecurrent neural networkeigenpairsadam optimizerpositive definite matrixill-condition |
spellingShingle | Ebrahim Ganjalipour Khadijeh Nemati Amir Hosein Refahi Sheikhani Hashem Saberi Najafi A new neurodynamic model with Adam optimization method for solving generalized eigenvalue problem Big Data and Computing Visions recurrent neural network eigenpairs adam optimizer positive definite matrix ill-condition |
title | A new neurodynamic model with Adam optimization method for solving generalized eigenvalue problem |
title_full | A new neurodynamic model with Adam optimization method for solving generalized eigenvalue problem |
title_fullStr | A new neurodynamic model with Adam optimization method for solving generalized eigenvalue problem |
title_full_unstemmed | A new neurodynamic model with Adam optimization method for solving generalized eigenvalue problem |
title_short | A new neurodynamic model with Adam optimization method for solving generalized eigenvalue problem |
title_sort | new neurodynamic model with adam optimization method for solving generalized eigenvalue problem |
topic | recurrent neural network eigenpairs adam optimizer positive definite matrix ill-condition |
url | https://www.bidacv.com/article_142589_fa224602c4b67cbe6ac9c0e5272481a1.pdf |
work_keys_str_mv | AT ebrahimganjalipour anewneurodynamicmodelwithadamoptimizationmethodforsolvinggeneralizedeigenvalueproblem AT khadijehnemati anewneurodynamicmodelwithadamoptimizationmethodforsolvinggeneralizedeigenvalueproblem AT amirhoseinrefahisheikhani anewneurodynamicmodelwithadamoptimizationmethodforsolvinggeneralizedeigenvalueproblem AT hashemsaberinajafi anewneurodynamicmodelwithadamoptimizationmethodforsolvinggeneralizedeigenvalueproblem AT ebrahimganjalipour newneurodynamicmodelwithadamoptimizationmethodforsolvinggeneralizedeigenvalueproblem AT khadijehnemati newneurodynamicmodelwithadamoptimizationmethodforsolvinggeneralizedeigenvalueproblem AT amirhoseinrefahisheikhani newneurodynamicmodelwithadamoptimizationmethodforsolvinggeneralizedeigenvalueproblem AT hashemsaberinajafi newneurodynamicmodelwithadamoptimizationmethodforsolvinggeneralizedeigenvalueproblem |