Extending Genetic Programming to Evolve Perceptron-Like Learning Programs
We extend genetic programming (GP) with a local memory and vectorization to evolve simple, perceptron-like programs capable of learning by error correction. The local memory allows a scalar value or vector to be stored and manipulated within a local scope of the GP tree. Vectorization groups input variables and processes them as vectors. We demonstrate that these extensions, combined with an island model, allow the evolution of general perceptron-like programs, i.e., programs that work for any number of inputs. This contrasts with standard GP, where inputs are represented explicitly as scalars, so that scaling up the problem would require evolving a new solution. Moreover, we find that vectorization yields more compact program representations and facilitates the evolutionary search.
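To make the target behavior concrete, the following is a minimal sketch of the perceptron-style error-correction learning that the evolved programs are meant to reproduce. The stored weight vector plays the role of the local memory, and the update treats the inputs as a whole vector rather than as individual scalars, mirroring the vectorization extension. All names and parameters here are illustrative assumptions, not taken from the paper.

```python
# Hedged sketch: perceptron-like learning by error correction.
# The weight vector `w` stands in for the GP local memory; processing
# `x` as a vector mirrors the paper's vectorization extension.

def dot(w, x):
    """Inner product of two equal-length vectors."""
    return sum(wi * xi for wi, xi in zip(w, x))

def perceptron_train(samples, n_inputs, lr=1.0, epochs=10):
    """Learn weights and bias from (input vector, target) pairs."""
    w = [0.0] * n_inputs
    b = 0.0
    for _ in range(epochs):
        for x, target in samples:
            y = 1 if dot(w, x) + b > 0 else 0   # threshold output
            err = target - y                     # error-correction signal
            # Vectorized update: the same code works for any input length.
            w = [wi + lr * err * xi for wi, xi in zip(w, x)]
            b += lr * err
    return w, b

# Example: learning logical AND on two inputs.
and_data = [([0, 0], 0), ([0, 1], 0), ([1, 0], 0), ([1, 1], 1)]
w, b = perceptron_train(and_data, n_inputs=2)
```

Because the update loop never mentions a fixed number of inputs, the same program generalizes across input dimensionalities, which is precisely the generality property the abstract emphasizes over scalar-input GP.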
Keywords: Genetic programming, evolutionary neural networks, learning programs, supervised learning