Conjugate Gradient Back-propagation with Modified Polak-Ribière Updates for Training Feed-forward Neural Networks

Section: Article
Published: Jun 25, 2025
Pages: 164–173

Abstract

Several learning algorithms for feed-forward neural networks (FFN) have been developed. Many of these algorithms are based on the gradient descent method, which is well known in optimization theory but often performs poorly in practical applications. In this paper we modify the Polak-Ribière conjugate gradient method to train feed-forward neural networks. Our modification is based on the secant equation (quasi-Newton condition). The suggested algorithm is tested on some well-known test problems and compared with other algorithms in this field.
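
The abstract names only the ingredients of the method. As a rough illustration, the sketch below trains a small one-hidden-layer network with back-propagated gradients and Polak-Ribière conjugate gradient directions. The paper's secant-equation (quasi-Newton) modification of the beta parameter is not given in the abstract, so the standard PR+ formula stands in for it; the network size, toy data, and fixed step length are illustrative assumptions, not the authors' setup.

    import numpy as np

    # Minimal sketch: one-hidden-layer feed-forward network trained with
    # Polak-Ribiere conjugate gradient directions. The paper's
    # secant-equation-based modification of beta is not specified in the
    # abstract, so the classical PR+ update is used as a placeholder.

    rng = np.random.default_rng(0)
    X = rng.standard_normal((64, 2))              # toy inputs (assumption)
    y = (X[:, :1] * X[:, 1:] > 0).astype(float)   # XOR-like targets

    def unpack(w):
        W1 = w[0:16].reshape(2, 8); b1 = w[16:24]
        W2 = w[24:32].reshape(8, 1); b2 = w[32:]
        return W1, b1, W2, b2

    def loss_and_grad(w):
        W1, b1, W2, b2 = unpack(w)
        h = np.tanh(X @ W1 + b1)                  # hidden layer
        out = h @ W2 + b2                         # linear output
        err = out - y
        loss = 0.5 * np.mean(err ** 2)
        # back-propagate the mean-squared error
        d_out = err / len(X)
        gW2 = h.T @ d_out; gb2 = d_out.sum(0)
        d_h = (d_out @ W2.T) * (1 - h ** 2)
        gW1 = X.T @ d_h; gb1 = d_h.sum(0)
        return loss, np.concatenate([gW1.ravel(), gb1, gW2.ravel(), gb2])

    w = rng.standard_normal(33) * 0.5
    loss, g = loss_and_grad(w)
    d = -g                                        # first direction: steepest descent
    for k in range(200):
        w = w + 0.1 * d                           # fixed step; a line search is usual
        loss, g_new = loss_and_grad(w)
        # Polak-Ribiere beta, clipped at zero (PR+) so d stays a descent direction
        beta = max(0.0, g_new @ (g_new - g) / (g @ g + 1e-12))
        d = -g_new + beta * d
        g = g_new
    print("final loss:", loss)

In a full implementation the fixed step would be replaced by a line search satisfying suitable conditions, and beta would come from the authors' secant-equation-based formula rather than the PR+ placeholder shown here.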

How to Cite

Al-Bayati, A., Saleh, I. A., & Abbo, K. K. (2025). Conjugate Gradient Back-propagation with Modified Polak-Ribière Updates for Training Feed-forward Neural Networks. IRAQI JOURNAL OF STATISTICAL SCIENCES, 11(2), 164–173. Retrieved from https://rjps.uomosul.edu.iq/index.php/stats/article/view/20952