A Novel Conjugate Gradient Algorithm as a Convex Combination of Classical Conjugate Gradient Methods

https://doi.org/10.24017/science.2025.1.6

Authors

S. S. Mohammed Zaki, H. N. Jabbar, and S. S. Haider

Abstract

Conjugate gradient (CG) algorithms are well suited to large-scale nonlinear optimization problems. The hybrid conjugate gradient (HCG) algorithm is one technique intended to solve unconstrained optimization problems efficiently: by merging features of different CG methods, it aims to improve convergence properties while keeping the computations simple. In this paper, a new hybrid conjugate gradient algorithm is proposed and analyzed, obtained as a convex combination of the Dai-Yuan (DY), Hestenes-Stiefel (HS), and Hager-Zhang (HZ) conjugate gradient methods. The primary objective is to improve convergence efficiency and computational performance. The proposed algorithm is designed to reduce the number of iterations and the computational cost compared with traditional CG methods. Numerical experiments on standard unconstrained optimization test problems show that the hybrid method achieves faster convergence, often requiring far fewer iterations to reach a specified gradient-norm tolerance or objective function value. In addition, the per-iteration computational cost remains competitive, since the convex combination framework introduces minimal overhead. Theoretical analysis establishes the global convergence of the algorithm under standard assumptions. The results highlight the superior performance of the hybrid method in terms of the number of iterations and the total computational cost, especially for large-scale unconstrained problems. This work advances the development of efficient and robust CG algorithms, offering a practical tool for unconstrained optimization.
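
The abstract describes the direction update only at a high level, so the following minimal Python sketch illustrates the general idea of a convex-combination hybrid CG step using the standard textbook formulas for the DY, HS, and HZ parameters. The function name hybrid_cg, the fixed weights theta, and the Armijo backtracking line search are illustrative assumptions; they are not the weight-selection rule or line search proposed in the paper.

import numpy as np

def hybrid_cg(f, grad, x0, theta=(1/3, 1/3), tol=1e-6, max_iter=5000):
    """Sketch of a hybrid CG iteration whose beta is a convex combination
    of the DY, HS and HZ formulas (weights are placeholders)."""
    t1, t2 = theta
    t3 = 1.0 - t1 - t2                       # convex weights: nonnegative, sum to 1
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                                   # first direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) <= tol:         # stop on gradient-norm tolerance
            break
        if g @ d >= 0:                       # safeguard: restart if not a descent direction
            d = -g
        alpha, c, rho = 1.0, 1e-4, 0.5       # simple Armijo backtracking (placeholder)
        while f(x + alpha * d) > f(x) + c * alpha * (g @ d):
            alpha *= rho
        x_new = x + alpha * d
        g_new = grad(x_new)
        y = g_new - g
        dy = d @ y
        if abs(dy) < 1e-12:                  # avoid division by zero
            x, g, d = x_new, g_new, -g_new
            continue
        beta_hs = (g_new @ y) / dy                             # Hestenes-Stiefel
        beta_dy = (g_new @ g_new) / dy                         # Dai-Yuan
        beta_hz = ((y - 2.0 * d * (y @ y) / dy) @ g_new) / dy  # Hager-Zhang
        beta = t1 * beta_dy + t2 * beta_hs + t3 * beta_hz      # convex combination
        d = -g_new + beta * d
        x, g = x_new, g_new
    return x

# Example use on the 2-D Rosenbrock function:
# rosen = lambda x: (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2
# rosen_grad = lambda x: np.array([-2*(1 - x[0]) - 400*x[0]*(x[1] - x[0]**2),
#                                  200*(x[1] - x[0]**2)])
# x_star = hybrid_cg(rosen, rosen_grad, [-1.2, 1.0])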

Keywords:

Unconstrained Optimization, Hybrid Conjugate Gradient, Convex Combination, Global Convergence, Sufficient Descent Condition

Author Biography

  • Hawraz Nadhim Jabbar, Mathematics Department, College of Science, Kirkuk University, Kirkuk, Iraq

    Associate Professor of Mathematics, College of Science, Kirkuk University, Kirkuk, Iraq

References

B. A. Hassan, I. A. R. Moghrabi, A. L. Ibrahim, and H. N. Jabbar, “Improved Conjugate Gradient Methods for Unconstrained Minimization Problems and Training Recurrent Neural Network,” Engineering Reports, vol. 7, no. 2, pp. 1-13, Feb. 2025, doi: 10.1002/eng2.70019.

D. H. Omar, A. L. Ibrahim, M. M. Hassan, B. G. Fathi, and D. A. Sulaiman, “Enhanced Conjugate Gradient Method for Unconstrained Optimization and Its Application in Neural Networks,” European Journal of Pure and Applied Mathematics, vol. 17, no. 4, pp. 2692-2705, Oct. 2024, URL: https://www.ejpam.com/index.php/ejpam/article/view/5354/1727.

R. Fletcher and C. Reeves, “Function minimization by conjugate gradients,” The Computer Journal, vol. 7, pp. 149-154, 1964, doi: 10.1093/comjnl/7.2.149.

S. A. Hamad, D. H. Omar, D. A. Sulaiman, and A. L. Ibrahim, “A New Conjugate Gradient Method Based on Logistic Mapping for Unconstrained Optimization and Its Application in Regression Analysis,” Science Journal of University of Zakho, vol. 12, no. 4, pp. 484-489, Nov. 2024, doi: 10.25271/sjuoz.2024.12.4.1310.

J. E. Dennis and R. B. Schnabel, Numerical Methods for Unconstrained Optimization and Nonlinear Equations, Philadelphia, PA, USA: SIAM, 1983, ISBN: 978-0-89871-364-0.

K. K. Abbo and N. H. Hameed, “New Hybrid Conjugate Method as a Convex Combination of Liu-Storey and Dixon Methods,” Journal of Multidisciplinary Modeling and Optimization, vol. 1, no. 2, pp. 91-99, 2018, URL: https://journals.indexcopernicus.com/search/article?articleId=2631251.

Y. N. Huda and I. A. Huda, “A Modification of Dai-Yuan’s Conjugate Gradient Algorithms for Solving Unconstrained Optimization,” Bulletin of the South Ural State University. Ser. Mathematical Modelling, Programming & Computer Software, vol. 15, no. 3, pp. 127-133, 2022, doi: 10.14529/mmp220309.

M. R. Hestenes and E. Stiefel, “Methods of conjugate gradients for solving linear systems,” Journal of Research of the National Bureau of Standards, vol. 49, pp. 409-436, 1952, doi: 10.6028/jres.049.044.

S. S. Djordjevic, “New hybrid conjugate gradient method as a convex combination of FR and PRP methods,” Filomat, vol. 30, no. 11, pp. 3083-3100, 2016, doi: 10.2298/FIL1611083D.

R. Fletcher, Practical Methods of Optimization: Unconstrained Optimization, vol. 1, 2nd ed. New York, NY, USA: Wiley, 1997, ISBN: 9780471915478.

Y. Liu and C. Storey, “Efficient generalized conjugate gradient algorithms,” Journal of Optimization Theory and Applications, vol. 69, no. 1, pp. 129-137, 1991, doi: 10.1007/BF00940464.

Y. Dai and Y. Yuan, “A nonlinear conjugate gradient method with a strong global convergence property,” SIAM Journal on Optimization, vol. 10, pp. 177-182, 2000, doi: 10.1137/S1052623497318992.

W. W. Hager and H. Zhang, “A survey of nonlinear conjugate gradient methods,” Pacific Journal of Optimization, vol. 2, pp. 35-58, 2006, URL: https://people.clas.ufl.edu/hager/files/cg_survey.pdf.

S. B. Hanachi, B. Sellami, and M. Belloufi, “A New Family of Hybrid Conjugate Gradient Method for Unconstrained Optimization and Its Application to Regression Analysis,” RAIRO-Operations Research, vol. 58, pp. 613-627, 2024, doi: 10.1051/ro/2023196.

N. Andrei, “Another nonlinear conjugate gradient algorithm for unconstrained optimization,” Optimization Methods and Software, vol. 24, no. 1, pp. 89-104, 2009, doi: 10.1080/10556780802393326.

S. B. Hanachi, B. Sellami, and M. Belloufi, “New Iterative Conjugate Gradient Method for Nonlinear Unconstrained Optimization,” RAIRO-Operations Research, vol. 56, no. 3, pp. 2315-2327, 2022, doi: 10.1051/ro/2022109.

A. Hallal, M. Belloufi, and B. Sellami, “A new hybrid CG method as convex combination,” Mathematical Foundations of Computing, vol. 7, no. 4, pp. 522-530, 2024, doi: 10.3934/mfc.2023028.

Y. Yu, Y. Wang, R. Deng, and Y. Yin, “New DY-HS hybrid conjugate gradient algorithm for solving optimization problem of unsteady partial differential equations with convection term,” Mathematics and Computers in Simulation, vol. 208, no. 4, pp. 677-701, 2023, doi: 10.1016/j.matcom.2023.01.033.

S. S. Djordjevic, “New Hybrid Conjugate Gradient Method as a Convex Combination of LS and CD Methods,” Filomat, vol. 31, no. 6, pp. 1813-1825, 2017, doi: 10.2298/FIL1706813D.

N. S. Mohamed, M. Mamat, M. Rivaie, and S. M. Shaharudin, “A new hybrid coefficient of conjugate gradient method,” Indonesian Journal of Electrical Engineering and Computer Science, vol. 18, no. 3, pp. 1454-1463, 2020, doi: 10.11591/ijeecs.v18.i3.pp1454-1463.

A. Hallal, M. Belloufi, and B. Sellami, “An Efficient New Hybrid CG-Method as Convex Combination of DY and CD and HS Algorithms,” RAIRO-Operations Research, vol. 56, no. 6, pp. 4047-4056, 2022, doi: 10.1051/ro/2022200.

W. W. Hager and H. Zhang, “A New Conjugate Gradient Method with Guaranteed Descent and an Efficient Line Search,” SIAM Journal on Optimization, vol. 16, no. 1, pp. 170-192, 2005, doi: 10.1137/030601880.

B. A. Hassan, Z. M. Abdullah, and S. A. Hussein, “A new family of conjugate gradient methods to solve unconstrained optimization problems,” Journal of Information and Optimization Sciences, vol. 43, no. 4, pp. 811-820, Aug. 2022, doi: 10.1080/02522667.2022.2103299.

N. Andrei, “Another hybrid conjugate gradient algorithm for unconstrained optimization,” Numerical Algorithms, vol. 47, pp. 143-156, 2008, doi: 10.1007/s11075-007-9152-9.

Y. Salih, M. A. Hamoda, Sukono, and M. Mamat, “The convergence properties of new hybrid conjugate gradient method,” IOP Conference Series: Materials Science and Engineering, vol. 567, 2019, doi: 10.1088/1757-899X/567/1/012031.

M. J. D. Powell, “Restart procedures for the conjugate gradient method,” Mathematical Programming, vol. 12, no. 1, pp. 241-254, 1977, doi: 10.1007/BF01593790.

D. Touati-Ahmed and C. Storey, “Efficient hybrid conjugate gradient techniques,” Journal of Optimization Theory and Applications, vol. 64, pp. 379-397, 1990, doi: 10.1007/BF00939455.

G. Zoutendijk, “Nonlinear programming, computational methods,” in Integer and Nonlinear Programming, J. Abadie, Ed. Amsterdam, Netherlands: North-Holland, 1970, pp. 37-86, ISBN: 0-444-10000-8.

X. Jiang, W. Liao, J. Yin, and J. Jian, “A new family of hybrid three-term conjugate gradient methods with applications in image restoration,” Numerical Algorithms, vol. 91, pp. 161-191, 2022, doi: 10.1007/s11075-022-01258-2.

S. Badreddine, Y. Laskri, and R. Benzine, “A new two-parameter family of nonlinear conjugate gradient methods,” Optimization, vol. 64, no. 4, pp. 1-17, 2013, doi: 10.1080/02331934.2013.830118.

E. D. Dolan and J. J. More, “Benchmarking optimization software with performance profiles,” Mathematical Programming, vol. 91, pp. 201-213, 2002, doi: 10.1007/s101070100263.

J. J. More, B. S. Garbow, and K. E. Hillstrom, “Testing unconstrained optimization software,” ACM Transactions on Mathematical Software (TOMS), vol. 7, no. 1, pp. 17-41, 1981, doi: 10.1145/355934.355936.

J. Mo, N. Gu, and Z. Wei, “Hybrid conjugate gradient methods for unconstrained optimization,” Optimization Methods and Software, vol. 22, no. 2, pp. 297-307, 2007, doi: 10.1080/10556780500518653.

How to Cite

[1]
S. S. Mohammed Zaki, H. N. Jabbar, and S. S. Haider, “A Novel Conjugate Gradient Algorithm as a Convex Combination of Classical Conjugate Gradient Methods”, KJAR, vol. 10, no. 1, pp. 83–98, Apr. 2025, doi: 10.24017/science.2025.1.6.

Published

16-04-2025

Issue

Vol. 10 No. 1 (2025)

Section

Pure and Applied Science