A New Modified Conjugate Gradient Method and Its Global Convergence Theorem

Section: Research Paper
Published: Mar 1, 2020
Pages: 325–335

Abstract

In this article, we propose a new conjugate gradient method for solving unconstrained optimization problems. We focus on conjugate gradient methods applied to non-linear unconstrained optimization problems: the positive step size is obtained by a line search, and the new scalar defining the new search direction of the conjugate gradient method is derived from the quadratic function and the Taylor series, using the quasi-Newton condition and the Newton direction in the derivation of the new formula. We also prove that the search direction of the new conjugate gradient method satisfies the sufficient descent condition, and all assumptions required for the global convergence property are stated and proved. To complete the study, we report numerical results, obtained with a FORTRAN implementation, in which the new algorithm is compared with the HS and PRP methods on the same set of unconstrained optimization test problems; the results show that the new method is efficient and encouraging.
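The abstract does not give the new scalar formula itself, so the following is only a minimal sketch of the generic non-linear conjugate gradient loop the paper builds on, together with the two baseline scalars it compares against (HS and PRP). The function names `cg`, `hs_beta`, and `prp_beta`, the Armijo backtracking line search, and the test quadratic are all illustrative choices, not the authors' algorithm.

```python
import numpy as np

def hs_beta(g_new, g_old, d_old):
    # Hestenes-Stiefel scalar: beta = g_{k+1}^T y_k / d_k^T y_k, y_k = g_{k+1} - g_k
    y = g_new - g_old
    return (g_new @ y) / (d_old @ y)

def prp_beta(g_new, g_old, d_old):
    # Polak-Ribiere-Polyak scalar: beta = g_{k+1}^T y_k / ||g_k||^2
    y = g_new - g_old
    return (g_new @ y) / (g_old @ g_old)

def cg(f, grad, x0, beta_fn, tol=1e-8, max_iter=1000):
    """Generic non-linear CG: d_0 = -g_0, d_{k+1} = -g_{k+1} + beta_k d_k,
    with a positive step size from an Armijo backtracking line search."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        if g @ d >= 0.0:          # safeguard: restart if d is not a descent direction
            d = -g
        # Armijo backtracking: shrink alpha until sufficient decrease holds
        alpha, c, rho = 1.0, 1e-4, 0.5
        while f(x + alpha * d) > f(x) + c * alpha * (g @ d) and alpha > 1e-16:
            alpha *= rho
        x_new = x + alpha * d
        g_new = grad(x_new)
        beta = max(beta_fn(g_new, g, d), 0.0)   # non-negative restart variant
        d = -g_new + beta * d
        x, g = x_new, g_new
    return x

# Illustrative strongly convex quadratic: f(x) = 0.5 x^T A x - b^T x,
# whose minimizer solves A x = b (here x* = [0.2, 0.4]).
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
f = lambda x: 0.5 * x @ A @ x - b @ x
grad = lambda x: A @ x - b

x_hs = cg(f, grad, np.zeros(2), hs_beta)
x_prp = cg(f, grad, np.zeros(2), prp_beta)
```

Both baselines recover the exact minimizer on this quadratic; the paper's contribution is a different choice of the scalar `beta`, derived via the quasi-Newton condition, plugged into this same loop.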

How to Cite

[1]
A. Moayad Qasim and اسیل, “A New Modified Conjugate Gradient Method and Its Global Convergence Theorem”, EDUSJ, vol. 29, no. 1, pp. 325–335, Mar. 2020.