Scaled Fletcher-Reeves Method for Training Feedforward Neural Networks
Abstract
The training phase of a Back-Propagation (BP) network is an unconstrained optimization problem. The goal of training is to find an optimal set of connection weights such that the error of the network output is minimized. In this paper we extend the classical Fletcher-Reeves (CFRB) method for nonlinear conjugate gradient optimization to a scaled conjugate gradient method (denoted SFRB) for training feedforward neural networks. Our development is based on the sufficient descent property and the pure conjugacy conditions. Comparative results for SFRB, CFRB, and standard Back-Propagation (BP) are presented on several test problems.
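For readers unfamiliar with the baseline, the following minimal sketch illustrates the classical Fletcher-Reeves conjugate gradient update, beta_k = ||g_k||^2 / ||g_{k-1}||^2, applied to training a small feedforward network. The network size, the XOR training data, the finite-difference gradient, and the Armijo line search are all illustrative assumptions; this shows only the unscaled CFRB baseline, not the SFRB method developed in the paper.

    import numpy as np

    rng = np.random.default_rng(0)

    # XOR training set: 2 inputs -> 1 target output (an assumed toy problem).
    X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
    y = np.array([[0.], [1.], [1.], [0.]])

    # One hidden layer of 4 tanh units; all weights packed into one vector w.
    shapes = [(2, 4), (1, 4), (4, 1), (1, 1)]      # W1, b1, W2, b2
    sizes = [int(np.prod(s)) for s in shapes]

    def unpack(w):
        parts, i = [], 0
        for s, n in zip(shapes, sizes):
            parts.append(w[i:i + n].reshape(s))
            i += n
        return parts

    def loss(w):
        W1, b1, W2, b2 = unpack(w)
        h = np.tanh(X @ W1 + b1)
        return 0.5 * np.mean((h @ W2 + b2 - y) ** 2)

    def grad(w, eps=1e-6):
        # Central-difference gradient; a real trainer would use backpropagation.
        g = np.zeros_like(w)
        for i in range(w.size):
            e = np.zeros_like(w)
            e[i] = eps
            g[i] = (loss(w + e) - loss(w - e)) / (2 * eps)
        return g

    w = rng.normal(scale=0.5, size=sum(sizes))
    g = grad(w)
    d = -g                                         # start with steepest descent

    for k in range(200):
        if g @ d >= 0:                             # restart if d is not a descent direction
            d = -g
        f0, slope, alpha = loss(w), g @ d, 1.0
        for _ in range(30):                        # backtracking Armijo line search
            if loss(w + alpha * d) <= f0 + 1e-4 * alpha * slope:
                break
            alpha *= 0.5
        w = w + alpha * d
        g_new = grad(w)
        beta = (g_new @ g_new) / (g @ g)           # Fletcher-Reeves coefficient
        d = -g_new + beta * d
        g = g_new
        if np.linalg.norm(g) < 1e-5:
            break

    print("final loss:", loss(w))

Note that without an exact line search, Fletcher-Reeves directions are not guaranteed to be descent directions, which motivates the steepest-descent restart above and is the practical issue that the sufficient descent property underlying the scaled variant addresses.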