Abstract
Concept classes can be canonically represented by matrices with entries 1 and -1. We use the singular value decomposition of this matrix to determine the optimal margins of embeddings of the concept classes of singletons and of half intervals in homogeneous Euclidean half spaces. For these concept classes the singular value decomposition can be used to construct optimal embeddings and also to prove the corresponding best possible upper bounds on the margin. We show that the optimal margin for embedding \( n \) singletons is \( \frac{n}{3n-4} \) and that the optimal margin for half intervals over \( \{1,\ldots,n\} \) is \( \frac{\pi}{2\ln n} + \Theta\left(\frac{1}{(\ln n)^2}\right) \). For the upper bounds on the margins we generalize a bound given in [6]. We also discuss the concept class of monomials to point out limitations of our approach.
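The singleton result can be partially checked numerically. In the canonical representation, the concept class of n singletons over an n-point domain corresponds to the sign matrix M = 2I - J (entry +1 on the diagonal, -1 elsewhere). A short sketch (an illustration, not the paper's construction) computes its singular values and observes that their sum, the trace norm, equals 3n - 4, the denominator of the stated optimal margin:

```python
import numpy as np

def singleton_sign_matrix(n):
    """Sign matrix of the class of n singletons: +1 on the diagonal, -1 off it."""
    return 2 * np.eye(n) - np.ones((n, n))

n = 6
M = singleton_sign_matrix(n)
sigma = np.linalg.svd(M, compute_uv=False)

# The all-ones vector is an eigenvector with eigenvalue 2 - n, every direction
# orthogonal to it has eigenvalue 2, so the singular values are n - 2 (once)
# and 2 (with multiplicity n - 1), giving trace norm (n - 2) + 2(n - 1) = 3n - 4.
print(sorted(sigma))            # n - 1 copies of 2, one copy of n - 2
print(sigma.sum(), 3 * n - 4)   # trace norm vs. the margin denominator 3n - 4
```

How the trace norm enters the upper-bound argument is the subject of the paper's generalization of the bound from [6]; the sketch only verifies the spectrum of this particular matrix.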
References
Ben-David, S. (2000). Personal communication.
Ben-David, S., Eiron, N., & Simon, H.U. (2000). Limitations of learning via embeddings in Euclidean Half-Spaces. The Fourteenth Annual Conference on Computational Learning Theory and The Fifth European Conference on Computational Learning Theory.
Ben-David, S., Eiron, N., & Simon, H.U. (2000). Unpublished manuscript.
Blumer, A., Ehrenfeucht, A., Haussler, D., & Warmuth, M.K. (1989). Learnability and the Vapnik-Chervonenkis dimension. Journal of the ACM, 36, 929–965.
Cristianini, N., & Shawe-Taylor, J. (2000). An Introduction to Support Vector Machines. Cambridge, United Kingdom: Cambridge University Press.
Forster, J. (2001). A Linear Lower Bound on the Unbounded Error Probabilistic Communication Complexity. Sixteenth Annual IEEE Conference on Computational Complexity.
Horn, R.A., & Johnson, C.R. (1985). Matrix Analysis. Cambridge, United Kingdom: Cambridge University Press.
Kearns, M.J., & Vazirani, U.V. (1994). An Introduction to Computational Learning Theory. Cambridge, Massachusetts: MIT Press.
Krause, M. (1996). Geometric arguments yield better bounds for threshold circuits and distributed computing. Theoretical Computer Science, 156, 99–117.
Maass, W., & Turán, G. (1992). Lower bound methods and separation results for on-line learning models. Machine Learning, 9, 107–145.
Novikoff, A.B. (1962). On convergence proofs on perceptrons. Symposium on the Mathematical Theory of Automata, 12, 615–622. Polytechnic Institute of Brooklyn.
Vapnik, V. (1998). Statistical Learning Theory. New York: John Wiley & Sons, Inc.
Vapnik, V.N., & Chervonenkis, A.Y. (1971). On the uniform convergence of relative frequencies of events to their probabilities. Theory of Probability and its Applications, 16, 264–280.
© 2001 Springer-Verlag Berlin Heidelberg
Cite this paper
Forster, J., Schmitt, N., Simon, H.U. (2001). Estimating the Optimal Margins of Embeddings in Euclidean Half Spaces. In: Helmbold, D., Williamson, B. (eds) Computational Learning Theory. COLT 2001. Lecture Notes in Computer Science, vol 2111. Springer, Berlin, Heidelberg. https://doi.org/10.1007/3-540-44581-1_26
DOI: https://doi.org/10.1007/3-540-44581-1_26
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-540-42343-0
Online ISBN: 978-3-540-44581-4