Abstract
The paper addresses the problem of pattern classification when distortions are present in the observed data, whether they occur while classifying a test pattern or during the learning phase of classifier design. In particular, we consider the case in which the measurements are corrupted by noise; additive, multiplicative, and generalized noise models are taken into account. A unified framework for the posed problems, based on the Bayesian paradigm, is given, and learning algorithms stemming from nonparametric curve estimation techniques are derived. We examine the effect of measurement distortions on the proposed classification algorithms and derive a class of Bayes risk consistent classification techniques; the latter result makes use of the idea of deconvolution.
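The additive-noise setting described above can be illustrated with a small simulation. The sketch below is not the paper's algorithm: it is a minimal, self-contained example (all names, class means, and bandwidths are illustrative choices) of a kernel plug-in classification rule trained and tested on features corrupted by additive Gaussian noise, i.e. the model W = X + ε.

```python
import math
import random

random.seed(0)

def gaussian_kernel(u):
    """Standard Gaussian kernel (unnormalized; the sign of the vote is all that matters)."""
    return math.exp(-0.5 * u * u)

def make_sample(n, noise_sd):
    """Draw labelled scalar features with class means at -2 and +2,
    then corrupt each feature with additive Gaussian noise."""
    data = []
    for _ in range(n):
        y = 1 if random.random() < 0.5 else -1
        x = random.gauss(2.0 * y, 1.0)        # clean feature X
        w = x + random.gauss(0.0, noise_sd)   # observed noisy feature W = X + eps
        data.append((w, y))
    return data

def kernel_classify(x, train, h):
    """Plug-in kernel rule: classify by the sign of the kernel-weighted vote."""
    s = sum(y * gaussian_kernel((x - w) / h) for w, y in train)
    return 1 if s >= 0 else -1

train = make_sample(500, noise_sd=0.5)    # learning phase with noisy features
test = make_sample(2000, noise_sd=0.5)    # noisy test observations
errors = sum(1 for w, y in test if kernel_classify(w, train, h=0.5) != y)
error_rate = errors / len(test)
```

With this amount of noise the two classes remain well separated, so the naive rule still performs reasonably; a Bayes risk consistent rule of the kind the abstract refers to would instead replace the ordinary kernel with a deconvolution kernel built from the characteristic function of the noise, so that the estimate targets the clean-feature densities.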
© 1998 Springer-Verlag Berlin Heidelberg
Cite this paper
Pawlak, M., Siu, D. (1998). Pattern classification with noisy features. In: Amin, A., Dori, D., Pudil, P., Freeman, H. (eds) Advances in Pattern Recognition. SSPR/SPR 1998. Lecture Notes in Computer Science, vol 1451. Springer, Berlin, Heidelberg. https://doi.org/10.1007/BFb0033310
DOI: https://doi.org/10.1007/BFb0033310
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-540-64858-1
Online ISBN: 978-3-540-68526-5