
Two Types of Partial Least Squares Method in Linear Discriminant Analysis

  • Hyun Bin Kim
  • Yutaka Tanaka
Conference paper
Part of the Studies in Classification, Data Analysis, and Knowledge Organization book series (STUDIES CLASS)

Summary

Partial least squares linear discriminant function (PLSD) is a discriminant function proposed by Kim and Tanaka (1995a). PLSD applies the idea of the partial least squares (PLS) method, originally developed for multiple regression analysis, to discriminant analysis. In this paper, two types of PLSD are investigated and evaluated in a simulation study. In the first type, named PLSDA ("all"), a pooled within-group covariance matrix computed from all groups is used in modeling PLSD to discriminate every pair of groups. In the second type, named PLSDT ("two"), the pooled within-group covariance matrix is computed from only the two groups concerned when modeling PLSD for each pair. A minimal numerical sketch contrasting the two pooling strategies is given below the summary. The simulation study shows that PLSDA performs better than PLSDT in all situations when the covariance matrices are equal across the groups, whereas PLSDT outperforms PLSDA in well-conditioned situations when the covariance matrices differ among the groups.
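
To make the distinction between the two pooling strategies concrete, the following minimal sketch (Python with NumPy) builds a pooled within-group covariance matrix from all groups (PLSDA-style) and from only the two groups being separated (PLSDT-style), and forms a PLS-type discriminant weight for one pair of groups. The Krylov-subspace form of the PLS weights, the artificial data, the group sizes, and all function names are illustrative assumptions for this sketch; they are not the construction given in Kim and Tanaka (1995a).

    # Sketch only: Fisher's linear discriminant weight is w = S^{-1} d, where S is a
    # pooled within-group covariance matrix and d a difference of group mean vectors.
    # A PLS-type discriminant can be approximated by restricting w to the Krylov space
    # span{d, S d, ..., S^{k-1} d}, which avoids inverting an ill-conditioned S.
    # The two pooling choices below mirror PLSDA (all groups) and PLSDT (two groups).
    import numpy as np

    rng = np.random.default_rng(0)

    # Artificial data: 3 groups, 10 correlated variables, 15 cases per group.
    p, n_per_group = 10, 15
    base = rng.normal(size=(p, p))
    cov = base @ base.T / p + 0.1 * np.eye(p)          # common covariance matrix
    means = [np.zeros(p), 0.5 * np.ones(p), np.ones(p)]
    groups = [rng.multivariate_normal(m, cov, size=n_per_group) for m in means]

    def pooled_within_cov(samples):
        # Pooled within-group covariance of the listed group samples.
        dev = [g - g.mean(axis=0) for g in samples]
        ss = sum(d.T @ d for d in dev)
        df = sum(g.shape[0] - 1 for g in samples)
        return ss / df

    def pls_discriminant_weights(S, d, k):
        # k-component Krylov (PLS-type) approximation to S^{-1} d:
        # solve the projected system (K^T S K) a = K^T d, then w = K a.
        K = np.column_stack([np.linalg.matrix_power(S, j) @ d for j in range(k)])
        a = np.linalg.solve(K.T @ S @ K, K.T @ d)
        return K @ a

    # Discriminate groups 0 and 1 under the two pooling strategies.
    d01 = groups[0].mean(axis=0) - groups[1].mean(axis=0)
    S_all = pooled_within_cov(groups)                   # PLSDA-style: all 3 groups
    S_two = pooled_within_cov(groups[:2])               # PLSDT-style: groups 0 and 1
    w_A = pls_discriminant_weights(S_all, d01, k=3)
    w_T = pls_discriminant_weights(S_two, d01, k=3)

    # Classify by projecting onto each weight vector and thresholding at the
    # midpoint of the projected group means (apparent error rate only).
    for name, w in (("PLSDA", w_A), ("PLSDT", w_T)):
        cut = 0.5 * (groups[0].mean(axis=0) + groups[1].mean(axis=0)) @ w
        err = np.mean([(groups[0] @ w < cut).mean(), (groups[1] @ w > cut).mean()])
        print(name, "apparent error rate:", round(float(err), 3))

With equal covariance matrices, as here, the all-group pooling uses more degrees of freedom to estimate the same matrix, which is one intuition for why PLSDA tends to win in that case.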

Keywords

Partial Least Squares, Covariance Matrix, Partial Least Squares Regression, Principal Component Regression, Covariance Vector
These keywords were added by machine and not by the authors. This process is experimental and the keywords may be updated as the learning algorithm improves.

References

  1. Efron, B., and Morris, C. (1976), Multivariate Empirical Bayes and Estimation of Covariance Matrices, The Annals of Statistics, Vol. 4, pp. 22–32.
  2. Flury, B., Schmid, M. J., and Narayanan, A. (1994), Error Rates in Quadratic Discrimination with Constraints on the Covariance Matrices, Journal of Classification, Vol. 11, pp. 101–120.
  3. Frank, I. E., and Friedman, J. H. (1993), A Statistical View of Some Chemometrics Regression Tools, Technometrics, Vol. 35, No. 2, pp. 109–148.
  4. Friedman, J. H. (1989), Regularized Discriminant Analysis, Journal of the American Statistical Association, Vol. 84, pp. 165–175.
  5. James, W., and Stein, C. (1961), Estimation with Quadratic Loss, Proceedings of the Fourth Berkeley Symposium on Mathematical Statistics and Probability, Vol. 1, pp. 361–379, Berkeley: University of California Press.
  6. Kim, H. B., and Tanaka, Y. (1994), A Numerical Study of Partial Least Squares Regression with an Emphasis on the Comparison with Principal Component Regression, Proceedings of the Eighth Japan and Korea Joint Conference of Statistics, pp. 83–88, Okayama, Japan.
  7. Kim, H. B., and Tanaka, Y. (1995a), Linear Discriminant Function Using Partial Least Squares Method, Proceedings of the International Conference on Statistical Methods and Statistical Computing for Quality and Productivity Improvement (ICSQP '95), Vol. 2, pp. 875–881, Seoul, Korea.
  8. Kim, H. B., and Tanaka, Y. (1995b), Generating Artificial Data with Preassigned Degree of Multicollinearity by Using Singular Value Decomposition, The Journal of the Japanese Society of Computational Statistics, Vol. 8, pp. 1–8.
  9. Kim, H. B., and Tanaka, Y. (1996), Application of Partial Least Squares Linear Discriminant Function to Writer Identification in Pattern Recognition, Journal of the Faculty of Environmental Science and Technology, Okayama University, Vol. 1, pp. 65–76.
  10. O'Sullivan, F. (1986), A Statistical Perspective on Ill-Posed Inverse Problems, Statistical Science, Vol. 1, pp. 502–527.
  11. Rao, C. R. (1973), Linear Statistical Inference and Its Applications, 2nd Edition, John Wiley and Sons, Inc., New York.
  12. Stigler, S. M. (1990), The 1988 Neyman Memorial Lecture: A Galtonian Perspective on Shrinkage Estimators, Statistical Science, Vol. 5, pp. 147–155.
  13. Titterington, D. M. (1985), Common Structure of Smoothing Techniques in Statistics, International Statistical Review, Vol. 53, pp. 141–170.
  14. Wold, H. (1975), Soft Modeling by Latent Variables: the Non-linear Iterative Partial Least Squares Approach, in Perspectives in Probability and Statistics, Papers in Honour of M. S. Bartlett, edited by J. Gani, Academic Press, Inc., London.
  15. Wold, S., Wold, H., Dunn, W. J., and Ruhe, A. (1984), The Collinearity Problem in Linear Regression. The Partial Least Squares (PLS) Approach to Generalized Inverse, SIAM Journal on Scientific and Statistical Computing, Vol. 5, pp. 735–743.
  16. Yoshimura, M., Yoshimura, I., and Kim, H. B. (1993), A Text-Independent Off-Line Writer Identification Method for Japanese and Korean Sentences, IEICE Trans. Inf. and Syst., Vol. E76-D, No. 4, pp. 454–461.

Copyright information

© Springer Japan 1998

Authors and Affiliations

  • Hyun Bin Kim (1)
  • Yutaka Tanaka (2)
  1. System Engineering Research Institute, Yusung-Gu, Taejon, Korea
  2. Department of Environmental and Mathematical Sciences, Okayama University, Tsushima, Okayama 700, Japan
