Several Guaranteed Descent Conjugate Gradient Methods for Unconstrained Optimization
This paper investigates a general form of guaranteed descent conjugate gradient methods which satisfies the descent condition g_k^T d_k ≤ −(1 − 1/(4θ_k))‖g_k‖² (θ_k > 1/4) and which is strongly convergent whenever the weak Wolfe line search is fulfilled. Moreover, we present several specific guaranteed descent conju...
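The descent condition above can be checked numerically. As a hedged sketch (not the paper's own methods): the well-known Hager-Zhang conjugate gradient direction satisfies this condition with θ_k = 2, i.e. g_k^T d_k ≤ −(7/8)‖g_k‖², independently of the line search, whenever d_{k−1}^T y_k ≠ 0. The quadratic objective, fixed step size, and function names below are illustrative assumptions.

```python
import numpy as np

def hz_direction(g, g_prev, d_prev):
    # Hager-Zhang direction d = -g + beta * d_prev, with
    # beta = (y - 2*d_prev*||y||^2 / (d_prev.y))^T g / (d_prev.y),  y = g - g_prev.
    y = g - g_prev
    dy = d_prev @ y
    beta = ((y - 2.0 * d_prev * (y @ y) / dy) @ g) / dy
    return -g + beta * d_prev

def descent_gap(g, d, theta=2.0):
    # g^T d + (1 - 1/(4*theta)) * ||g||^2; the descent condition holds iff this is <= 0.
    return g @ d + (1.0 - 1.0 / (4.0 * theta)) * (g @ g)

# Toy convex quadratic f(x) = 0.5 x^T A x, so grad f(x) = A x (illustrative choice).
A = np.diag([1.0, 4.0, 16.0])
x = np.array([1.0, -2.0, 0.5])
g = A @ x
d = -g                               # first direction: steepest descent
for _ in range(5):
    x = x + 1e-2 * d                 # fixed toy step, standing in for a Wolfe search
    g_new = A @ x
    d = hz_direction(g_new, g, d)
    g = g_new
    assert descent_gap(g, d) <= 1e-12   # condition with theta_k = 2 holds at every step
```

The check passes here because the θ_k = 2 bound for the Hager-Zhang direction does not depend on the step size used; the paper's general family parametrizes this bound by θ_k > 1/4.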
Main Authors: | San-Yang Liu, Yuan-Yuan Huang
---|---
Format: | Article
Language: | English
Published: | Wiley, 2014-01-01
Series: | Journal of Applied Mathematics
Online Access: | http://dx.doi.org/10.1155/2014/825958
Similar Items
- A Conjugate Gradient Method for Unconstrained Optimization Problems
  by: Gonglin Yuan
  Published: (2009-01-01)
- A Modified Hybrid Conjugate Gradient Method for Unconstrained Optimization
  by: Minglei Fang, et al.
  Published: (2021-01-01)
- The Global Convergence of a New Mixed Conjugate Gradient Method for Unconstrained Optimization
  by: Yang Yueting, et al.
  Published: (2012-01-01)
- An Efficient Modified AZPRP Conjugate Gradient Method for Large-Scale Unconstrained Optimization Problem
  by: Ahmad Alhawarat, et al.
  Published: (2021-01-01)
- A Conjugate Gradient Method with Global Convergence for Large-Scale Unconstrained Optimization Problems
  by: Shengwei Yao, et al.
  Published: (2013-01-01)