On Evolvability: The Swapping Algorithm, Product Distributions, and Covariance
Valiant recently introduced a learning-theoretic framework for evolution, and showed that his swapping algorithm evolves monotone conjunctions efficiently over the uniform distribution. We continue the study of the swapping algorithm for monotone conjunctions. A modified presentation is given for the uniform distribution, which leads to a characterization of best approximations, a simplified analysis, and improved complexity bounds. It is shown that for product distributions a similar characterization does not hold, and there may be local optima of the fitness function. However, the characterization does hold if the correlation fitness function is replaced by covariance. Evolvability results are given for product distributions using the covariance fitness function, assuming either arbitrary tolerances, or a non-degeneracy condition on the distribution and a size bound on the target.
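To make the distinction between the two fitness functions concrete, here is a small illustrative sketch (not taken from the paper). For a monotone conjunction represented as a set of variable indices, and a product distribution where variable $i$ is 1 with probability $p_i$, both the correlation $\mathrm{E}[h(x)c(x)]$ and the covariance $\mathrm{E}[h(x)c(x)] - \mathrm{E}[h(x)]\,\mathrm{E}[c(x)]$ have simple closed forms, assuming the usual $\{-1,+1\}$-valued convention for hypotheses and target. The function names below are hypothetical and chosen for this sketch.

```python
from math import prod

def conj_prob(variables, p):
    """P(all variables in the conjunction are 1) under the product distribution p."""
    return prod(p[i] for i in variables)

def correlation(h, c, p):
    """E[h(x)c(x)] for {-1,+1}-valued monotone conjunctions h and c (index sets).

    With a = 1[h satisfied], b = 1[c satisfied], h = 2a-1 and c = 2b-1, so
    E[hc] = 4 P(h ∧ c) - 2 P(h) - 2 P(c) + 1.
    """
    ph, pc, phc = conj_prob(h, p), conj_prob(c, p), conj_prob(h | c, p)
    return 4 * phc - 2 * ph - 2 * pc + 1

def covariance(h, c, p):
    """Cov(h, c) = E[hc] - E[h]E[c] = 4 (P(h ∧ c) - P(h) P(c))."""
    ph, pc, phc = conj_prob(h, p), conj_prob(c, p), conj_prob(h | c, p)
    return 4 * (phc - ph * pc)

# Under the uniform distribution (all p_i = 1/2) with h = x0 and c = x0 ∧ x1,
# E[h] = 0, so the two fitness values coincide.
uniform = [0.5, 0.5]
print(correlation({0}, {0, 1}, uniform))   # 0.5
print(covariance({0}, {0, 1}, uniform))    # 0.5

# Under a biased product distribution, disjoint conjunctions are independent,
# so their covariance is exactly 0 while their correlation need not be.
biased = [0.9, 0.9]
print(covariance({0}, {1}, biased))        # 0.0
```

Note how the covariance factors through $P(h \wedge c) - P(h)P(c)$: it is zero exactly when the events "$h$ satisfied" and "$c$ satisfied" are independent, which is the structural property underlying the characterization of best approximations discussed in the abstract.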