An Efficient Line Search Algorithm for Large Scale Optimization

Section: Research Paper
Published: Jun 25, 2025
Pages: 35–49

Abstract

In this work we present a new gradient-descent-type algorithm in which the stepsize is computed by means of a simple approximation of the Hessian matrix, for solving nonlinear unconstrained optimization problems. The proposed algorithm uses a new approximation of the Hessian based on the function values and gradients at two successive iterates, employing Biggs' modified formula to locate the new points. The algorithm belongs to the class of superlinearly convergent descent methods and has been programmed to obtain numerical results for a selected class of nonlinear test functions of various dimensions. Numerical experiments show that the new choice of step length requires less computational work and greatly speeds up the convergence of the gradient algorithm, especially for large-scale unconstrained optimization problems.
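The abstract describes a gradient-descent iteration whose step length is derived from a cheap two-point (secant) approximation of the Hessian rather than a full line search. The paper's exact Biggs-modified formula is not given in the abstract, so the sketch below is only illustrative: it substitutes a generic Barzilai–Borwein-style scalar step built from the change in iterates and gradients at two successive points. All names and the fallback step `alpha0` are assumptions, not the authors' method.

```python
import numpy as np

def grad_descent_secant_step(grad, x0, tol=1e-8, max_iter=1000, alpha0=1e-3):
    """Gradient descent with a step length from a scalar secant
    (two-successive-point) approximation of the Hessian.
    Illustrative stand-in for the paper's Biggs-based step; not the
    authors' exact formula."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    x_new = x - alpha0 * g              # first step: small fixed stepsize
    for _ in range(max_iter):
        g_new = grad(x_new)
        if np.linalg.norm(g_new) < tol:
            break
        s = x_new - x                   # change in iterates
        y = g_new - g                   # change in gradients
        denom = s @ y                   # curvature estimate along s
        # scalar Hessian approximation H ~ (s.y)/(s.s); step = 1/H
        alpha = (s @ s) / denom if denom > 0 else alpha0
        x, g = x_new, g_new
        x_new = x - alpha * g_new
    return x_new

# Usage: minimize a strictly convex quadratic f(x) = 0.5 x^T A x,
# whose unique minimizer is the origin.
A = np.diag([1.0, 10.0, 100.0])
grad = lambda x: A @ x
xmin = grad_descent_secant_step(grad, np.ones(3))
```

The appeal of such steps, as the abstract notes, is that each iteration costs only a gradient evaluation and a few inner products, which matters most in large-scale problems where exact line searches or full Hessians are too expensive.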


How to Cite

Al-Bayati, A. Y., Abbas, & Latif, I. S. (2025). An Efficient Line Search Algorithm for Large Scale Optimization. AL-Rafidain Journal of Computer Sciences and Mathematics, 7(1), 35–49. Retrieved from https://rjps.uomosul.edu.iq/index.php/csmj/article/view/19550