Relaxation of Hard Classification Targets for LSE Minimization

  • Kar-Ann Toh
  • Xudong Jiang
  • Wei-Yun Yau
Part of the Lecture Notes in Computer Science book series (LNCS, volume 3757)


In the spirit of stabilizing the solution against possible over-fitting of the data, which is especially common for high-order models, we propose a relaxed-target training method for regression models that are linear in their parameters. Relaxing the training targets from the conventional binary values to disjoint classification spaces preserves classification fidelity under a threshold treatment during the decision process. A particular design for relaxing the training targets is provided under practical considerations. An extension to multiple-class problems is formulated before the method is applied to a plug-in full multivariate polynomial model and a reduced model on synthetic data sets to illustrate the idea. Additional experiments were performed using real-world data from the UCI data repository [1] to derive empirical evidence.
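To make the general idea concrete, the following is a minimal sketch of relaxed-target least-squares classification with a polynomial model that is linear in its parameters. It is an illustration only, not the paper's exact formulation: the paper relaxes targets to disjoint classification spaces, whereas this sketch uses one simple relaxation choice (shifting the binary targets outward past the decision threshold), and all function names, the feature expansion, and the margin value are illustrative assumptions.

```python
# Illustrative sketch only: relaxed-target least-squares classification.
# The relaxation design in the paper (disjoint target spaces) differs;
# here binary targets are simply shifted outward beyond the threshold.
import numpy as np

def polynomial_features(X, order=2):
    # Simplified multivariate polynomial expansion: bias, linear terms,
    # and elementwise powers up to the given order (not the full model).
    cols = [np.ones((X.shape[0], 1))]
    for r in range(1, order + 1):
        cols.append(X ** r)
    return np.hstack(cols)

def fit_lse(P, t, reg=1e-4):
    # Regularized least-squares estimate; the model is linear in alpha.
    return np.linalg.solve(P.T @ P + reg * np.eye(P.shape[1]), P.T @ t)

def relax_targets(y, margin=0.5):
    # Relax hard binary targets {0, 1} outward to {-margin, 1 + margin},
    # so outputs beyond the 0.5 decision threshold are penalized less.
    return np.where(y == 1, 1.0 + margin, -margin)

# Two synthetic Gaussian classes in 2-D.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-1.0, 0.6, size=(50, 2)),
               rng.normal(+1.0, 0.6, size=(50, 2))])
y = np.r_[np.zeros(50), np.ones(50)]

P = polynomial_features(X, order=2)
alpha = fit_lse(P, relax_targets(y))
pred = (P @ alpha > 0.5).astype(int)  # threshold treatment at decision time
print("training accuracy:", np.mean(pred == y))
```

Note that the decision threshold stays at 0.5 even though the training targets are relaxed away from {0, 1}; only the regression targets change, not the decision rule.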


Keywords: Pattern Classification · Parameter Estimation · Pattern Recognition · Multivariate Polynomials · Machine Learning


References

  1. Blake, C.L., Merz, C.J.: UCI Repository of Machine Learning Databases. University of California, Irvine, Dept. of Information and Computer Sciences (1998)
  2. Duda, R.O., Hart, P.E., Stork, D.G.: Pattern Classification, 2nd edn. John Wiley & Sons, New York (2001)
  3. Schürmann, J.: Pattern Classification: A Unified View of Statistical and Neural Approaches. John Wiley & Sons, New York (1996)
  4. Poggio, T., Rifkin, R., Mukherjee, S., Niyogi, P.: General Conditions for Predictivity in Learning Theory. Nature 428, 419–422 (2004)
  5. Baram, Y.: Soft Nearest Neighbor Classification. In: International Conference on Neural Networks (ICNN), vol. 3, pp. 1469–1473 (1997)
  6. Baram, Y.: Partial Classification: The Benefit of Deferred Decision. IEEE Trans. Pattern Analysis and Machine Intelligence 20(8), 769–776 (1998)
  7. Toh, K.-A., Yau, W.-Y., Jiang, X.: A Reduced Multivariate Polynomial Model for Multimodal Biometrics and Classifiers Fusion. IEEE Trans. Circuits and Systems for Video Technology (Special Issue on Image- and Video-Based Biometrics) 14(2), 224–233 (2004)
  8. Toh, K.-A., Tran, Q.-L., Srinivasan, D.: Benchmarking a Reduced Multivariate Polynomial Pattern Classifier. IEEE Trans. Pattern Analysis and Machine Intelligence 26(6), 740–755 (2004)
  9. Tipping, M.E.: Sparse Bayesian Learning and the Relevance Vector Machine. Journal of Machine Learning Research 1, 211–244 (2001)
  10. Tipping, M.E.: The Relevance Vector Machine. In: Solla, S.A., Leen, T.K., Müller, K.-R. (eds.) Advances in Neural Information Processing Systems, vol. 12, pp. 652–658 (2000)
  11. Figueiredo, M.A.T.: Adaptive Sparseness for Supervised Learning. IEEE Trans. Pattern Analysis and Machine Intelligence 25(9), 1150–1159 (2003)
  12. The MathWorks: Matlab and Simulink (2003)
  13. Ma, J., Zhao, Y., Ahalt, S.: OSU SVM Classifier Matlab Toolbox (ver. 3.00). The Ohio State University (2002)
  14. Tipping, M.: Sparse Bayesian Learning and the Relevance Vector Machine. Microsoft Research (2004)
  15. Vapnik, V.N.: Statistical Learning Theory. Wiley-Interscience, Hoboken (1998)
  16. Soares, C., Brazdil, P.B., Kuba, P.: A Meta-Learning Method to Select the Kernel Width in Support Vector Regression. Machine Learning 54(3), 195–209 (2004)

Copyright information

© Springer-Verlag Berlin Heidelberg 2005

Authors and Affiliations

  • Kar-Ann Toh (1)
  • Xudong Jiang (2)
  • Wei-Yun Yau (3)

  1. Biometrics Engineering Research Center (BERC), School of Electrical & Electronic Engineering, Yonsei University, Seoul, Korea
  2. School of Electrical & Electronic Engineering, Nanyang Technological University, Singapore
  3. Institute for Infocomm Research, Singapore
