Summary: The Conjugate Gradient (CG) method is a well-established computational technique for solving unconstrained optimization problems. Its appeal stems from its simplicity, which makes it straightforward to implement, and from its proven track record in real-world applications. Despite the recent surge of interest in this field, some newer variants of the CG algorithm have failed to surpass their predecessors in efficiency. Consequently, this paper introduces a new CG variant that retains essential properties of the original CG methods, including sufficient descent and global convergence. Three new types of CG coefficients are presented, with applications to data optimization. Numerical experiments show that the proposed methods successfully solve test problems under the exact line search.
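The abstract does not state the paper's new CG coefficients, so the following is only a minimal sketch of the generic CG structure it builds on: a quadratic objective for which the exact line search step has a closed form, and a Fletcher-Reeves-type coefficient standing in for the proposed ones. All names and parameters here are illustrative assumptions, not the authors' method.

```python
# Sketch only: classical CG on the strictly convex quadratic
# f(x) = 0.5 * x^T A x - b^T x, where the exact line search step is
# alpha_k = (r_k^T r_k) / (p_k^T A p_k). The beta update below is the
# Fletcher-Reeves form, used as a placeholder for the paper's coefficients.
import numpy as np

def conjugate_gradient(A, b, x0, tol=1e-10, max_iter=1000):
    """Minimize 0.5 x^T A x - b^T x for symmetric positive definite A."""
    x = x0.astype(float)
    r = b - A @ x          # residual = negative gradient at x
    p = r.copy()           # initial search direction (steepest descent)
    rs_old = r @ r
    for _ in range(max_iter):
        if np.sqrt(rs_old) < tol:
            break
        Ap = A @ p
        alpha = rs_old / (p @ Ap)   # exact line search step length
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        beta = rs_new / rs_old      # CG coefficient (Fletcher-Reeves form)
        p = r + beta * p            # new conjugate search direction
        rs_old = rs_new
    return x

# Usage on a small SPD problem; the minimizer satisfies A x = b.
A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
print(conjugate_gradient(A, b, np.zeros(2)))  # approx [0.0909, 0.6364]
```

In nonlinear CG variants such as those the paper proposes, the closed-form step is replaced by a line search on a general objective and the beta formula is where the new coefficients would enter.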