Abstract
In this paper, we derive lower and upper bounds on the probability of error of a linear classifier when the random vectors representing the underlying classes obey the multivariate normal distribution. The expression for the error is derived in a one-dimensional space, independently of the dimensionality of the original problem. Based on the two bounds, we propose an approximating expression for the error of a generic linear classifier. In particular, we derive the corresponding bounds and the approximating expression for the error of Fisher’s classifier. Our empirical results on synthetic data, including samples of up to five hundred dimensions, show that the computations for the error are extremely fast and quite accurate; the approximation differs from the actual error by at most ε = 0.0184340683.
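The one-dimensional reduction mentioned in the abstract rests on a standard fact: the projection w·x + b of a multivariate normal vector is itself univariate normal, so the error of any linear rule can be computed exactly from two means and two variances. As a minimal sketch (not the paper’s bounds or its approximation, just the underlying closed-form error for a given linear discriminant, with hypothetical parameter names):

```python
import math

def normal_cdf(z):
    """Standard normal CDF, computed via the error function."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def linear_classifier_error(w, b, mu1, cov1, mu2, cov2, p1=0.5):
    """Exact probability of error of the rule 'decide class 1 iff
    w.x + b > 0' when class i is N(mu_i, cov_i) with prior p_i.

    The d-dimensional problem collapses to one dimension: the
    projection w.x + b is univariate normal with mean w.mu_i + b
    and variance w' cov_i w for each class."""
    def dot(u, v):
        return sum(a * c for a, c in zip(u, v))

    def quad(cov, v):
        # v' cov v
        return sum(v[i] * sum(cov[i][j] * v[j] for j in range(len(v)))
                   for i in range(len(v)))

    m1 = dot(w, mu1) + b
    m2 = dot(w, mu2) + b
    s1 = math.sqrt(quad(cov1, w))
    s2 = math.sqrt(quad(cov2, w))
    # Class 1 errs when its projection falls below 0; class 2 when above 0.
    return p1 * normal_cdf(-m1 / s1) + (1.0 - p1) * (1.0 - normal_cdf(-m2 / s2))
```

For the symmetric case mu1 = (1, 0), mu2 = (−1, 0), identity covariances, w = (1, 0), b = 0, and equal priors, this returns Φ(−1) ≈ 0.1587, the familiar Bayes error for unit-separated normal classes.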
Keywords
- Covariance Matrices
- Actual Probability
- Actual Error
- Multivariate Normal Distribution
- Statistical Pattern Recognition
These keywords were added by machine and not by the authors. This process is experimental and the keywords may be updated as the learning algorithm improves.
Copyright information
© 2004 Springer-Verlag Berlin Heidelberg
Cite this paper
Rueda, L. (2004). New Bounds and Approximations for the Error of Linear Classifiers. In: Sanfeliu, A., Martínez Trinidad, J.F., Carrasco Ochoa, J.A. (eds) Progress in Pattern Recognition, Image Analysis and Applications. CIARP 2004. Lecture Notes in Computer Science, vol 3287. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-540-30463-0_42
DOI: https://doi.org/10.1007/978-3-540-30463-0_42
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-540-23527-9
Online ISBN: 978-3-540-30463-0