The paper addresses the problem of pattern classification when distortions are present in the observed data, occurring either while a test pattern is being classified or during the learning phase of the classifier design. In particular, we consider the case in which the measurements are corrupted by noise; additive, multiplicative, and generalized noise models are taken into account. A unified framework for the posed problems, based on the Bayesian paradigm, is given. Learning algorithms stemming from nonparametric curve-estimation techniques are derived, and the effect of measurement distortions on the proposed classification algorithms is examined. A class of Bayes-risk-consistent classification techniques is derived; the latter result makes use of the idea of deconvolution.
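The additive-noise setting described above can be illustrated with a minimal sketch. The example below is not the authors' algorithm; it merely simulates training data distorted as Y = X + ε (the class means, noise level `sigma`, and bandwidth `h` are hypothetical choices for illustration) and applies a plug-in kernel classification rule of the general nonparametric type the abstract refers to:

```python
import numpy as np

rng = np.random.default_rng(0)

# Additive noise model: the learner observes Y = X + eps instead of the
# clean feature X (sigma is an illustrative noise level, not from the paper).
n, sigma = 500, 0.3
labels = rng.integers(0, 2, size=n)                  # two classes, P(1) = 0.5
x_clean = rng.normal(loc=2.0 * labels, scale=1.0)    # class means 0 and 2
x_noisy = x_clean + rng.normal(scale=sigma, size=n)  # distorted training data

def kernel_classify(x, train_x, train_y, h=0.5):
    """Plug-in kernel rule: choose the class whose kernel-weighted vote,
    computed from the (noisy) training sample, is larger at the point x."""
    w = np.exp(-0.5 * ((x - train_x) / h) ** 2)      # Gaussian kernel weights
    score1 = np.sum(w * train_y)                     # weighted votes for class 1
    score0 = np.sum(w * (1 - train_y))               # weighted votes for class 0
    return int(score1 > score0)

# Classify a few clean test points with the rule learned from noisy data.
test_x = [-0.5, 1.0, 2.5]
preds = [kernel_classify(x, x_noisy, labels) for x in test_x]
```

A naive rule such as this estimates the distribution of the *noisy* features; the deconvolution idea mentioned in the abstract corrects the density estimates for the known noise distribution so that consistency with respect to the Bayes risk of the clean problem can be recovered.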


Keywords: Feature Vector · Classification Rule · Pattern Classification · Distortion Model · Distorted Version



Copyright information

© Springer-Verlag Berlin Heidelberg 1998

Authors and Affiliations

  • M. Pawlak (1, 2)
  • D. Siu (1, 2)
  1. Department of Electrical and Computer Engineering, University of Manitoba, Canada
  2. Motorola Inc., Hong Kong
