Learning Under a Fixed Probability Measure

Chapter in Learning and Generalisation

Part of the book series: Communications and Control Engineering (CCE)

Abstract

In this chapter, we study the problems of concept and function learning in the case where the samples are drawn in accordance with a known, fixed distribution. Various necessary and/or sufficient conditions are presented for a concept class or a function class to be learnable. The principal results of the chapter can be summarized as follows. Suppose the input sequence to the learning algorithm is i.i.d. Then the following hold:

  1. If a function class F (or a concept class C) has the property of uniform convergence of empirical means (the UCEM property), then it is also ASEC learnable. However, the converse is not true in general: there exist function classes that are ASEC learnable even though they do not possess the UCEM property. (A sketch of the UCEM property is given after this list.)

  2. A function class is PUAC learnable if it possesses a property known as the "shrinking width" property. The shrinking width property is also a necessary condition for every consistent algorithm to be PUAC.

  3. Similarly, there is a necessary and sufficient condition for a function family to be consistently PAC learnable.

  4. It can be shown that PUAC learnability is equivalent to consistent PUAC learnability. In contrast, PAC learnability is not in general equivalent to consistent PAC learnability.

  5. A function class (or a concept class) is learnable if it satisfies a property known as "finite metric entropy." (A covering-number formulation of this condition is sketched after this list.)

  6. For a concept class to be learnable, the finite metric entropy condition is necessary as well as sufficient; for a function class, the condition is sufficient but not necessary in general.
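For reference, the UCEM (uniform convergence of empirical means) property of item 1 can be written as follows; the notation (F for the function class, P for the fixed probability measure on the input space X, and x_1, ..., x_n for the i.i.d. sample) is standard usage assumed here, not quoted from the chapter:

\[
  \sup_{f \in \mathcal{F}} \left| \frac{1}{n} \sum_{i=1}^{n} f(x_i) \;-\; \int_X f \, dP \right| \;\to\; 0
  \quad \text{in probability as } n \to \infty .
\]

Item 1 states that this uniform convergence is sufficient for ASEC learnability, but not necessary in general.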
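Similarly, the "finite metric entropy" condition of items 5 and 6 is usually phrased in terms of covering numbers in the pseudometric induced by the fixed measure P; again, the symbols below are assumed notation rather than a quotation from the chapter:

\[
  \rho_P(f, g) \;:=\; \int_X |f - g| \, dP ,
  \qquad
  N(\epsilon, \mathcal{F}, \rho_P) \;<\; \infty \quad \text{for every } \epsilon > 0 ,
\]

where N(ε, F, ρ_P) denotes the smallest number of ρ_P-balls of radius ε needed to cover F. In other words, F must be totally bounded in ρ_P; by items 5 and 6, this is sufficient for a function class to be learnable, and both necessary and sufficient for a concept class.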



Copyright information

© 2003 Springer-Verlag London

About this chapter

Cite this chapter

Vidyasagar, M. (2003). Learning Under a Fixed Probability Measure. In: Learning and Generalisation. Communications and Control Engineering. Springer, London. https://doi.org/10.1007/978-1-4471-3748-1_6

  • DOI: https://doi.org/10.1007/978-1-4471-3748-1_6

  • Publisher Name: Springer, London

  • Print ISBN: 978-1-84996-867-6

  • Online ISBN: 978-1-4471-3748-1
