
New Lower Bounds for Statistical Query Learning

Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 2375)

Abstract

We prove two lower bounds in the Statistical Query (SQ) learning model. The first lower bound concerns weak learning. We prove that for a concept class of SQ-dimension d, a running time of Ω(d/log d) is needed. The SQ-dimension of a concept class is defined to be the maximum number of concepts that are “uniformly correlated”, in that every pair of them has nearly the same correlation. This lower bound matches the upper bound in [BFJ+94], up to a logarithmic factor. We prove this lower bound against an “honest SQ-oracle”, which gives a stronger result than one against the more frequently used “adversarial SQ-oracle”. The second lower bound is more general. It gives a continuous trade-off between the “advantage” of an algorithm in learning the target function and the number of queries it needs to make, where the advantage of an algorithm is the probability that it predicts a label correctly minus the probability that it does not. Both lower bounds extend and/or strengthen previous results, and solve an open problem left in [Y01].
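
To make the abstract’s two central quantities concrete, here is a hedged restatement in standard notation. The symbols $\mathcal{C}$ (concept class), $D$ (input distribution), $f_i$, $h$, and the tolerance $1/d^3$ are illustrative choices in the style of [BFJ+94] and [Y01]; the paper’s exact parameters may differ.

% SQ-dimension: the largest d for which d concepts exist whose pairwise
% correlations under D all lie within a small tolerance of a common value c
% (the "uniformly correlated" condition; 1/d^3 is an illustrative tolerance).
\[
  \mathrm{SQdim}(\mathcal{C}, D)
  \;=\;
  \max\Bigl\{ d \;:\; \exists\, f_1,\dots,f_d \in \mathcal{C},\; c \in [-1,1]
  \text{ s.t. }
  \bigl|\,\mathbb{E}_{x\sim D}[f_i(x)f_j(x)] - c\,\bigr| \le \tfrac{1}{d^3}
  \ \text{for all } i \ne j \Bigr\}.
\]

% Advantage of a hypothesis h against the target f: probability of a correct
% prediction minus probability of an incorrect one.
\[
  \mathrm{adv}(h)
  \;=\;
  \Pr_{x\sim D}\bigl[h(x) = f(x)\bigr] \;-\; \Pr_{x\sim D}\bigl[h(x) \ne f(x)\bigr]
  \;=\;
  \mathbb{E}_{x\sim D}\bigl[h(x)\,f(x)\bigr],
\]

where the last equality holds when $h$ and $f$ are $\pm 1$-valued.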

Keywords

Parity Function · Concept Class · Target Concept · Logarithmic Factor · Statistical Query


References

  1. [AS95]
    Javed A. Aslam and Scott E. Decatur. Specification and Simulation of Learning Algorithms for Efficiency and Noise-Tolerance. In COLT 1995, pages 437–446. ACM Press, July 1995.
  2. [B95]
    Christopher Bishop. Neural Networks for Pattern Recognition. Oxford University Press, 1995.
  3. [BFJ+94]
    Avrim Blum, Merrick Furst, Jeffrey Jackson, Michael Kearns, Yishay Mansour, and Steven Rudich. Weakly Learning DNF and Characterizing Statistical Query Learning Using Fourier Analysis. In STOC 1994, pages 253–262, 1994.
  4. [CC80]
    Christopher Chatfield and Alexander Collins. Introduction to Multivariate Analysis. Chapman and Hall, 1980.
  5. [HJ85]
    Roger Horn and Charles Johnson. Matrix Analysis. Cambridge University Press, 1985.
  6. [K59]
    S. Kullback. Information Theory and Statistics. New York: Dover Publications, 1959.
  7. [KL51]
    S. Kullback and R. A. Leibler. On Information and Sufficiency. Annals of Mathematical Statistics 22, pages 79–86, 1951.
  8. [J00]
    Jeff Jackson. On the Efficiency of Noise-Tolerant PAC Algorithms Derived from Statistical Queries. In Proceedings of the 13th Annual Workshop on Computational Learning Theory, 2000.
  9. [K93]
    Michael Kearns. Efficient Noise-Tolerant Learning from Statistical Queries. In Proceedings of the 25th Annual ACM Symposium on Theory of Computing, pages 392–401, 1993.
  10. [S88]
    Gilbert Strang. Linear Algebra and Its Applications, Third Edition. Harcourt Brace Jovanovich Inc., 1988.
  11. [V84]
    Leslie Valiant. A Theory of the Learnable. Communications of the ACM, 27(11): 1134–1142, November 1984.
  12. [Y01]
    Ke Yang. On Learning Correlated Boolean Concepts Using Statistical Query. In Proceedings of the 12th International Conference on Algorithmic Learning Theory (ALT’01), LNAI 2225, pages 59–76, 2001.

Copyright information

© Springer-Verlag Berlin Heidelberg 2002

Authors and Affiliations

  • Ke Yang
    Computer Science Department, Carnegie Mellon University, Pittsburgh, USA
