The convergence properties of some descent conjugate gradient algorithms for optimization models

Bibliographic Details
Published in: Journal of Mathematics and Computer Science
Main Authors: Sulaiman I. M.; Mamat M.; Owoyemi A. E.; Ghazali P. L.; Rivaie M.; Malik M.
Format: Article
Language: English
Published: International Scientific Research Publications, 2020
Online Access: https://www.scopus.com/inward/record.uri?eid=2-s2.0-85090667940&doi=10.22436%2fjmcs.022.03.02&partnerID=40&md5=b0ba4a56d8bb8279e0ee559661bc3158
Description
Summary: The three-term conjugate gradient (CG) algorithms are among the efficient variants of the CG method for convex and non-convex functions. This is because most three-term algorithms are constructed from classical CG methods whose numerical performance has been tested and whose convergence has been proved. In this paper, we present a modification of the RMIL+ CG method proposed by Dai [Z. Dai, Appl. Math. Comput., 267 (2016), 297–300], based on the convergence analysis of the RMIL (2012) CG method. Interestingly, the modified method possesses the sufficient descent property, and its global convergence is established under the exact minimization condition. We further extend the results of the modified RMIL+ method to construct a three-term CG algorithm and show that this method also satisfies the sufficient descent condition under the strong Wolfe line search. Preliminary numerical results on known benchmark problems show that the proposed methods are efficient and promising compared to other CG methods. © 2020, International Scientific Research Publications. All rights reserved.
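For context, the equations below sketch the standard CG framework the abstract refers to; the specific coefficient formulas of RMIL, RMIL+, and the proposed three-term method are not given in the abstract, so $\beta_k$ and $\theta_k$ are left as placeholders and the three-term direction is shown only in one common generic form.
\[
  x_{k+1} = x_k + \alpha_k d_k, \qquad
  d_k = -g_k + \beta_k d_{k-1} + \theta_k y_{k-1}, \qquad
  y_{k-1} = g_k - g_{k-1},
\]
with $d_0 = -g_0$, $g_k = \nabla f(x_k)$, and $\theta_k \equiv 0$ recovering a two-term CG direction. The sufficient descent condition requires
\[
  g_k^{\top} d_k \le -c\,\|g_k\|^2 \quad \text{for some } c > 0 \text{ and all } k,
\]
and the strong Wolfe line search chooses $\alpha_k$ so that
\[
  f(x_k + \alpha_k d_k) \le f(x_k) + \delta\,\alpha_k\, g_k^{\top} d_k, \qquad
  \bigl|\,g(x_k + \alpha_k d_k)^{\top} d_k\,\bigr| \le \sigma\,\bigl|\,g_k^{\top} d_k\,\bigr|, \qquad 0 < \delta < \sigma < 1.
\]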
ISSN: 2008-949X
DOI: 10.22436/jmcs.022.03.02