Many studies have used genetic algorithms with bit-string genotypes to represent network architectures and thereby improve the performance of back-propagation networks (BPNs). However, the limitations of gradient search techniques on complex nonlinear optimization problems have often led to inconsistent and unpredictable performance. This study focuses on collecting and re-evaluating the weight matrices of a BPN during the genetic algorithm operations in each generation, so that the weight matrices themselves are optimized. In this way the model also avoids overfitting, a drawback of BPNs that typically arises late in training, when training error continues to descend while prediction error ascends. The study further extends the parameters and topology of the neural network to expand the feasible solution space for complex nonlinear problems. The value of the proposed model is compared with previous studies through a Monte Carlo study on in-sample, interpolation, and extrapolation data for six test functions.
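The abstract does not specify the genome encoding, genetic operators, or re-evaluation schedule, so the following is only a minimal sketch of the general idea: a genetic algorithm that evolves a BPN's weight matrices directly (here as real-valued genomes rather than bit strings) and re-evaluates the per-generation champion on held-out data to guard against overfitting. The network size, operators, test function, and all parameter names (POP, GENS, MUT_STD, etc.) are illustrative assumptions, not the paper's settings.

```python
import numpy as np

rng = np.random.default_rng(0)

# Tiny 1-4-1 feedforward network; an individual is its flattened weights.
N_IN, N_HID, N_OUT = 1, 4, 1
GENOME_LEN = N_IN * N_HID + N_HID + N_HID * N_OUT + N_OUT

def unpack(genome):
    """Split a flat genome into the network's weight matrices and biases."""
    i = 0
    W1 = genome[i:i + N_IN * N_HID].reshape(N_IN, N_HID); i += N_IN * N_HID
    b1 = genome[i:i + N_HID]; i += N_HID
    W2 = genome[i:i + N_HID * N_OUT].reshape(N_HID, N_OUT); i += N_HID * N_OUT
    b2 = genome[i:]
    return W1, b1, W2, b2

def forward(genome, x):
    W1, b1, W2, b2 = unpack(genome)
    h = np.tanh(x @ W1 + b1)   # hidden layer with tanh activation
    return h @ W2 + b2         # linear output layer

def mse(genome, x, y):
    return float(np.mean((forward(genome, x) - y) ** 2))

# A stand-in test function (the paper's six test functions are not reproduced here).
x_train = rng.uniform(-1, 1, size=(64, 1))
y_train = np.sin(np.pi * x_train)
x_val = rng.uniform(-1, 1, size=(32, 1))   # held out to watch for overfitting
y_val = np.sin(np.pi * x_val)

POP, GENS, ELITE, MUT_STD = 60, 200, 4, 0.1
pop = rng.normal(0.0, 1.0, size=(POP, GENOME_LEN))

best_val = np.inf
best_genome = pop[0].copy()
for gen in range(GENS):
    fitness = np.array([mse(g, x_train, y_train) for g in pop])
    pop = pop[np.argsort(fitness)]          # lower training MSE = fitter
    # Re-evaluate the current champion on held-out data each generation,
    # keeping the genome with the best validation error seen so far.
    val_err = mse(pop[0], x_val, y_val)
    if val_err < best_val:
        best_val, best_genome = val_err, pop[0].copy()
    # Elitism, uniform crossover over the top individuals, Gaussian mutation.
    children = [pop[k].copy() for k in range(ELITE)]
    while len(children) < POP:
        pa, pb = pop[rng.integers(ELITE * 4)], pop[rng.integers(ELITE * 4)]
        mask = rng.random(GENOME_LEN) < 0.5
        children.append(np.where(mask, pa, pb) + rng.normal(0, MUT_STD, GENOME_LEN))
    pop = np.array(children)

print(f"best validation MSE: {best_val:.4f}")
```

Keeping the validation-selected genome, rather than the final generation's training champion, is one simple way to realize the abstract's claim of avoiding the late-stage divergence between training and prediction error.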