A new class of nonlinear conjugate gradient coefficients with exact and inexact line searches

Bibliographic Details
Published in:Applied Mathematics and Computation
Main Authors: Rivaie M.; Mamat M.; Abashar A.
Format: Article
Language:English
Published: Elsevier Inc. 2015
Online Access:https://www.scopus.com/inward/record.uri?eid=2-s2.0-84938635473&doi=10.1016%2fj.amc.2015.07.019&partnerID=40&md5=0ac00abeae108129ee61f24fcdfb6dd2
Description
Summary:Conjugate gradient (CG) methods have played an important role in solving large-scale unconstrained optimization problems. In this paper, we propose a new family of CG coefficients (βk) that satisfy the sufficient descent condition and possess global convergence properties. This new βk is an extension of the already proven βkRMIL from Rivaie et al. [19] (A new class of nonlinear conjugate gradient coefficient with global convergence properties, Appl. Math. Comput. 218 (2012) 11323-11332). Global convergence results are established under both exact and inexact line searches. Numerical results show that the performance of the newly proposed formula is quite similar to that of βkRMIL and is suited to both line searches. Importantly, this βk is more efficient than, and superior to, the other well-known βk. © 2015 Elsevier Inc. All rights reserved.
ISSN:0096-3003
DOI:10.1016/j.amc.2015.07.019
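The record gives only the abstract, but the underlying iteration it refers to is standard nonlinear CG: x_{k+1} = x_k + αk dk with dk = −gk + βk d_{k−1}. Below is a minimal, illustrative Python sketch using the βkRMIL coefficient from the cited Rivaie et al. [19], βk = gkᵀ(gk − g_{k−1}) / ||d_{k−1}||², paired with a backtracking Armijo line search as a stand-in for the exact and inexact line searches analysed in the paper. The paper's new family of coefficients is not reproduced in this record, so the function name cg_rmil, the restart safeguard, and the test problem are assumptions for illustration, not the authors' implementation.

```python
import numpy as np


def cg_rmil(f, grad, x0, tol=1e-6, max_iter=500):
    """Nonlinear CG with the RMIL coefficient of Rivaie et al. [19]:
    beta_k = g_k^T (g_k - g_{k-1}) / ||d_{k-1}||^2,
    using a backtracking Armijo line search (illustrative only)."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                                # first direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) <= tol:
            break
        # Backtracking (Armijo) line search -- one simple inexact search.
        alpha, rho, c = 1.0, 0.5, 1e-4
        while f(x + alpha * d) > f(x) + c * alpha * g.dot(d) and alpha > 1e-12:
            alpha *= rho
        x_next = x + alpha * d
        g_next = grad(x_next)
        # RMIL coefficient, clipped at zero as a common safeguard (assumption).
        beta = max(g_next.dot(g_next - g) / d.dot(d), 0.0)
        d = -g_next + beta * d
        if g_next.dot(d) >= 0:            # restart if descent is lost (assumption)
            d = -g_next
        x, g = x_next, g_next
    return x


if __name__ == "__main__":
    # Smoke test on a strictly convex quadratic; minimizer is (1, -2).
    f = lambda x: (x[0] - 1.0) ** 2 + 10.0 * (x[1] + 2.0) ** 2
    grad = lambda x: np.array([2.0 * (x[0] - 1.0), 20.0 * (x[1] + 2.0)])
    print(cg_rmil(f, grad, np.array([5.0, 5.0])))
```

Replacing the Armijo loop with an exact minimization of f(x + α d) over α, or with a strong-Wolfe search, gives the two line-search settings the abstract contrasts; the CG update itself is unchanged.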