New Modification Nonlinear Conjugate Gradient Method for Optimization

Section: Research Paper
Published: Dec 1, 2019
Pages: 270–281

Abstract

This study proposes a nonlinear conjugate gradient algorithm. Conjugate gradient methods are widely used in optimization, especially for large-scale problems, because they do not require the storage of any matrices. The proposed algorithm modifies Hideaki and Yasushi's (HY) conjugate gradient algorithm. It satisfies a parameterized sufficient descent condition, with the parameter calculated using the conjugacy condition. The new algorithm always produces descent search directions and is shown to be convergent under certain assumptions. The main contribution of this work is a proof of global convergence for the modified nonlinear conjugate gradient method. The numerical results demonstrate the effectiveness of the proposed algorithm on the given test problems.
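To illustrate the general scheme the abstract describes, the sketch below implements a standard nonlinear conjugate gradient iteration with an Armijo backtracking line search and a descent-direction safeguard (restart). The Hestenes–Stiefel beta formula and the quadratic test objective used here are illustrative assumptions; the paper's modified HY beta and its parameterized sufficient descent condition are not reproduced.

```python
# Hedged sketch of a generic nonlinear conjugate gradient method.
# NOTE: the Hestenes-Stiefel beta below stands in for the paper's
# modified HY formula, which is not given in the abstract.
import numpy as np

def conjugate_gradient(f, grad, x0, tol=1e-8, max_iter=500):
    x = x0.astype(float)
    g = grad(x)
    d = -g                      # initial direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Armijo backtracking line search for the step size
        alpha, c, rho = 1.0, 1e-4, 0.5
        while f(x + alpha * d) > f(x) + c * alpha * g.dot(d):
            alpha *= rho
        x_new = x + alpha * d
        g_new = grad(x_new)
        y = g_new - g
        denom = d.dot(y)
        # Hestenes-Stiefel conjugacy parameter (illustrative choice)
        beta = g_new.dot(y) / denom if abs(denom) > 1e-12 else 0.0
        d = -g_new + beta * d
        if g_new.dot(d) >= 0:   # safeguard: keep d a descent direction
            d = -g_new
        x, g = x_new, g_new
    return x

# Illustrative test problem: minimize f(x) = 0.5 x^T A x - b^T x,
# whose minimizer solves A x = b.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
f = lambda x: 0.5 * x.dot(A).dot(x) - b.dot(x)
grad = lambda x: A.dot(x) - b
x_star = conjugate_gradient(f, grad, np.zeros(2))
```

The restart step mirrors, in a crude way, the descent-direction guarantee the abstract claims for the proposed method: whenever the update would fail to produce a descent direction, the iteration falls back to steepest descent.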


How to Cite

[1]
Z. Mohammed Abdullah and Ziyad, “New Modification Nonlinear Conjugate Gradient Method for Optimization”, EDUSJ, vol. 28, no. 4, pp. 270–281, Dec. 2019.