Scaled Fletcher-Reeves Method for Training Feedforward Neural Networks

Section: Article
Published: Jun 25, 2025
Pages: 237-245

Abstract

The training phase of a Back-Propagation (BP) network is an unconstrained optimization problem. The goal of training is to search for an optimal set of connection weights such that the error of the network output is minimized. In this paper we extend the classical Fletcher-Reeves (CFRB) nonlinear conjugate gradient method to a scaled conjugate gradient method (denoted SFRB) for training feedforward neural networks. The development is based on the sufficient descent property and the pure conjugacy condition. Comparative results for SFRB, CFRB, and standard Back-Propagation (BP) are presented for several test problems.
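The paper itself gives the scaled (SFRB) update, which is not reproduced on this page. As background, the classical Fletcher-Reeves conjugate gradient method referred to in the abstract can be sketched as follows for a tiny feedforward network; the network size, XOR task, and Armijo line search below are illustrative assumptions, not the paper's experimental setup.

```python
import numpy as np

# Illustrative sketch: classical Fletcher-Reeves (CFRB) conjugate gradient
# training a 2-4-1 sigmoid network on XOR. The paper's scaled variant (SFRB)
# and its test problems are NOT reproduced here.
rng = np.random.default_rng(0)
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
T = np.array([[0.], [1.], [1.], [0.]])
H = 4  # hidden units (arbitrary choice)
sizes = [(2, H), (H,), (H, 1), (1,)]           # W1, b1, W2, b2
splits = np.cumsum([int(np.prod(s)) for s in sizes])[:-1]
n_params = sum(int(np.prod(s)) for s in sizes)

def unpack(w):
    return [p.reshape(s) for p, s in zip(np.split(w, splits), sizes)]

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def loss_grad(w):
    """Sum-of-squares error and its gradient via back-propagation."""
    W1, b1, W2, b2 = unpack(w)
    a1 = sigmoid(X @ W1 + b1)
    y = sigmoid(a1 @ W2 + b2)
    e = y - T
    E = 0.5 * np.sum(e * e)
    d2 = e * y * (1 - y)                  # output-layer delta
    d1 = (d2 @ W2.T) * a1 * (1 - a1)      # hidden-layer delta
    g = np.concatenate([(X.T @ d1).ravel(), d1.sum(0),
                        (a1.T @ d2).ravel(), d2.sum(0)])
    return E, g

def armijo(w, d, E, g, alpha=1.0, rho=0.5, c=1e-4):
    # Simple backtracking line search (illustrative step-size rule).
    while loss_grad(w + alpha * d)[0] > E + c * alpha * (g @ d):
        alpha *= rho
        if alpha < 1e-10:
            break
    return alpha

w = rng.normal(scale=0.5, size=n_params)
E, g = loss_grad(w)
E0 = E
d = -g                                    # first direction: steepest descent
for k in range(500):
    alpha = armijo(w, d, E, g)
    w = w + alpha * d
    E_new, g_new = loss_grad(w)
    beta = (g_new @ g_new) / (g @ g)      # Fletcher-Reeves beta
    d = -g_new + beta * d
    if g_new @ d >= 0:                    # restart if descent is lost
        d = -g_new
    E, g = E_new, g_new

print(f"initial error {E0:.4f} -> final error {E:.4f}")
```

The sufficient-descent safeguard (restarting with the negative gradient when `g_new @ d >= 0`) reflects the descent property the abstract mentions; the paper's SFRB method additionally scales the search direction, which this sketch omits.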

How to Cite

M. Khalaf, B., & K. Abbo, K. (2025). Scaled Fletcher-Revees Method for Training Feed forward Neural Network. IRAQI JOURNAL OF STATISTICAL SCIENCES, 11(2), 237–245. Retrieved from https://rjps.uomosul.edu.iq/index.php/stats/article/view/20948