Avoiding Local Minima in ANN by Genetic Evolution
A novel approach is proposed to avoid the problem of local minima in neural-network learning with the back-propagation algorithm. The problem is addressed by applying genetic evolution to the training process. The unification of these two powerful techniques can be viewed as a mix of Darwinian and Lamarckian learning. The wine data set, a high-dimensional classification problem, is given as an example.
Keywords: Artificial Neural Network · Hidden Layer · Training Epoch · Wine Data · Good Classification Rate