Self-organization of connection patterns within brain areas of animals begins prenatally and has been shown to depend on internally generated patterns of neural activity. These neural structures continue to develop postnatally through externally driven activity, as the sensory systems are exposed to stimuli from the environment. The internally generated patterns have been proposed to give the neural system an appropriate bias so that it can learn reliably from complex environmental stimuli. This paper evaluates the hypothesis that complex artificial learning systems can benefit from a similar approach, consisting of initial training with patterns from an evolved pattern generator, followed by training with the actual training set. To test this hypothesis, competitive learning networks were trained to recognize handwritten digits. The results demonstrate how the approach can improve learning performance by discovering appropriate initial weight biases, thereby compensating for weaknesses of the learning algorithm. Because of the smaller evolutionary search space, this approach was also found to require far fewer generations than direct evolution of network weights. Since discovering the right biases efficiently is critical for solving large-scale problems with learning, these results suggest that internal training-pattern generation is an effective method for constructing complex systems.
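The two-stage scheme described above can be sketched in code. The sketch below is a minimal illustration, not the authors' implementation: it uses standard winner-take-all competitive learning, random vectors stand in for both the evolved generator's patterns and the handwritten-digit data, and the genetic algorithm that evolves the pattern generator itself is omitted.

```python
import numpy as np

def competitive_learning(patterns, weights, lr, epochs=5):
    """Winner-take-all competitive learning: for each input, the weight
    vector closest to the input is moved a step toward it."""
    for _ in range(epochs):
        for x in patterns:
            winner = np.argmin(np.linalg.norm(weights - x, axis=1))
            weights[winner] += lr * (x - weights[winner])
    return weights

rng = np.random.default_rng(0)
n_units, dim = 10, 64  # hypothetical network size and input dimension

weights = rng.random((n_units, dim))

# Stage 1: "prenatal" pretraining on internally generated patterns.
# In the paper these come from an evolved pattern generator; random
# vectors are used here purely as a placeholder.
generated_patterns = rng.random((200, dim))
weights = competitive_learning(generated_patterns, weights, lr=0.1)

# Stage 2: "postnatal" training on the actual training set
# (again a random stand-in for the handwritten-digit data).
training_set = rng.random((500, dim))
weights = competitive_learning(training_set, weights, lr=0.05)
```

The key design point is that stage 1 only sets the initial weight biases; the same unsupervised update rule is then applied unchanged to the real data in stage 2.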
Number of pages: 18
Journal: IEEE Transactions on Evolutionary Computation
Publication status: Published - 2007