Many studies have used a genetic algorithm with a bit-string genotype to represent network architectures and improve the performance of back-propagation networks (BPNs). However, the limitations of gradient search techniques applied to complex nonlinear optimization problems have often resulted in inconsistent and unpredictable performance. This study focuses on how to collect and re-evaluate the weight matrices of a BPN while the genetic algorithm operations proceed in each generation, so as to optimize the weight matrices. In this way, overfitting, a drawback of BPNs that usually occurs during the later stage of neural network training with descending training error and ascending prediction error, can also be avoided. This study extends the parameters and topology of the neural network to enhance the feasibility of the solution space for complex nonlinear problems. The value of the proposed model is compared with previous studies using a Monte Carlo study on in-sample, interpolation, and extrapolation data for six test functions. (c) 2007 Elsevier Ltd. All rights reserved.
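The core idea described in the abstract, using genetic-algorithm operations (selection, crossover, mutation) to search over a BPN's weight matrices directly rather than relying solely on gradient descent, can be sketched as below. This is a minimal illustrative sketch, not the paper's actual method: the network size, GA operators, population parameters, and test function are all assumptions chosen for clarity.

```python
import numpy as np

rng = np.random.default_rng(0)

def forward(weights, X):
    # One-hidden-layer feedforward network; weights = (W1, b1, W2, b2).
    W1, b1, W2, b2 = weights
    h = np.tanh(X @ W1 + b1)
    return h @ W2 + b2

def mse(weights, X, y):
    # Fitness: mean squared error of the network's predictions.
    return float(np.mean((forward(weights, X) - y) ** 2))

def random_weights(n_in, n_hidden, n_out):
    # A random individual: one full set of weight matrices.
    return (rng.normal(0, 0.5, (n_in, n_hidden)),
            rng.normal(0, 0.5, n_hidden),
            rng.normal(0, 0.5, (n_hidden, n_out)),
            rng.normal(0, 0.5, n_out))

def mutate(weights, sigma=0.1):
    # Gaussian perturbation of every weight matrix.
    return tuple(w + rng.normal(0, sigma, w.shape) for w in weights)

def crossover(a, b):
    # Uniform crossover: each weight drawn from either parent.
    return tuple(np.where(rng.random(wa.shape) < 0.5, wa, wb)
                 for wa, wb in zip(a, b))

def evolve(X, y, pop_size=30, generations=50):
    # Each generation, re-evaluate every candidate weight set,
    # keep the elite third, and refill by crossover + mutation.
    pop = [random_weights(X.shape[1], 8, 1) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda w: mse(w, X, y))
        elite = pop[:pop_size // 3]
        children = []
        while len(elite) + len(children) < pop_size:
            i, j = rng.choice(len(elite), 2, replace=False)
            children.append(mutate(crossover(elite[i], elite[j])))
        pop = elite + children
    return min(pop, key=lambda w: mse(w, X, y))

# Toy regression target (illustrative only, not one of the paper's six test functions).
X = rng.uniform(-1, 1, (100, 1))
y = np.sin(3 * X)
best = evolve(X, y)
```

Because the GA evaluates fitness on whole weight sets rather than following a local gradient, it is less prone to the gradient-search pathologies the abstract mentions; a practical system would also track error on a held-out set each generation to address the overfitting issue the paper targets.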
Published in: Expert Systems with Applications 36 (2): 1459-1465, Part 1