On Evolvability: The Swapping Algorithm, Product Distributions, and Covariance

  • Dimitrios I. Diochnos
  • György Turán
Part of the Lecture Notes in Computer Science book series (LNCS, volume 5792)

Abstract

Valiant recently introduced a learning-theoretic framework for evolution and showed that his swapping algorithm evolves monotone conjunctions efficiently over the uniform distribution. We continue the study of the swapping algorithm for monotone conjunctions. A modified presentation is given for the uniform distribution, which leads to a characterization of best approximations, a simplified analysis, and improved complexity bounds. It is shown that for product distributions a similar characterization does not hold, and the fitness function may have local optima. However, the characterization does hold if the correlation fitness function is replaced by covariance. Evolvability results are given for product distributions using the covariance fitness function, assuming either arbitrary tolerances, or a non-degeneracy condition on the distribution and a size bound on the target.
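To make the quantities in the abstract concrete, the following Python sketch evaluates the two fitness functions mentioned above, the correlation E[h(x)c(x)] and the covariance Cov(h, c) = E[hc] - E[h]E[c], for monotone conjunctions with outputs in {-1, +1}, exactly under a product distribution with Pr[x_i = 1] = p_i, and enumerates the add/remove/swap mutation neighbourhood searched by a swapping-style algorithm. It is only an illustrative sketch under these assumptions, not the paper's algorithm or analysis; all identifiers (prob_true, correlation, covariance, swap_neighbours) are hypothetical.

# Illustrative sketch (not the paper's algorithm): exact fitness of monotone
# conjunctions under a product distribution, and the add/remove/swap neighbourhood.
from math import prod

def prob_true(conj, p):
    """Pr[the conjunction over the given variable indices is satisfied] = product of p_i."""
    return prod(p[i] for i in conj)

def correlation(hyp, target, p):
    """E[h(x) * c(x)] with outputs in {-1, +1}, computed exactly."""
    ph, pc = prob_true(hyp, p), prob_true(target, p)
    p_both = prob_true(set(hyp) | set(target), p)   # Pr[h = 1 and c = 1]
    p_agree = 1 + 2 * p_both - ph - pc              # Pr[h(x) = c(x)]
    return 2 * p_agree - 1

def covariance(hyp, target, p):
    """Cov(h, c) = E[h c] - E[h] E[c]; the fitness considered for product distributions."""
    ph, pc = prob_true(hyp, p), prob_true(target, p)
    return correlation(hyp, target, p) - (2 * ph - 1) * (2 * pc - 1)

def swap_neighbours(hyp, n):
    """Hypotheses reachable by adding, removing, or swapping one variable."""
    hyp = set(hyp)
    add = [hyp | {i} for i in range(n) if i not in hyp]
    rem = [hyp - {i} for i in hyp]
    swp = [(hyp - {i}) | {j} for i in hyp for j in range(n) if j not in hyp]
    return add + rem + swp

if __name__ == "__main__":
    p = [0.5, 0.5, 0.5, 0.5]      # uniform distribution as a special case
    target = {0, 1}               # target conjunction x_0 AND x_1
    for h in swap_neighbours({0, 2}, len(p)):
        print(sorted(h), round(correlation(h, target, p), 3),
              round(covariance(h, target, p), 3))

Running the sketch over the uniform distribution (all p_i = 1/2) prints, for each neighbour of the hypothesis on {x_0, x_2}, its correlation and covariance with the target x_0 AND x_1, which is the kind of comparison the two fitness functions are used for.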

Keywords

learning · evolution

Copyright information

© Springer-Verlag Berlin Heidelberg 2009

Authors and Affiliations

  • Dimitrios I. Diochnos (1)
  • György Turán (1, 2)
  1. Dept. of Mathematics, Statistics, and Computer Science, University of Illinois at Chicago, Chicago, USA
  2. Research Group on Artificial Intelligence of the Hungarian Academy of Sciences, University of Szeged, Hungary
