Ridge-forward quadratic discriminant analysis in high-dimensional situations

  • Published in: Journal of Systems Science and Complexity

Abstract

Quadratic discriminant analysis is a classical and popular classification tool, but it fails in high-dimensional situations where the dimension p exceeds the sample size n. To address this issue, the authors propose a ridge-forward quadratic discriminant (RFQD) analysis method that screens relevant predictors successively to reduce the misclassification rate. The authors use the extended Bayesian information criterion to determine the final model and prove that RFQD is selection consistent. Monte Carlo simulations are conducted to examine its performance.
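The abstract only sketches the method, so the code below is a rough, illustrative reconstruction rather than the authors' RFQD algorithm: it shrinks each class covariance as S_k + λI so that the quadratic discriminant is computable when p > n, screens predictors forward one at a time, and stops using a BIC-style score inflated by a γ·k·log p term in the spirit of the extended Bayesian information criterion. The conditional-likelihood scoring, the exact penalty, the default λ and γ, and all function names are assumptions made for this sketch; the full paper should be consulted for the actual procedure.

```python
# A minimal sketch of the general idea only, NOT the authors' RFQD algorithm.
# Ingredients: ridge-shrunk class covariances (S_k + lam * I), greedy forward
# screening of predictors, and a simplified extended-BIC-style penalty.
import numpy as np

def fit_ridge_qda(X, y, lam=0.1):
    """Per-class means and ridge-shrunk covariances for the selected predictors."""
    params, n = {}, len(y)
    for k in np.unique(y):
        Xk = X[y == k]
        mu = Xk.mean(axis=0)
        Xc = Xk - mu
        S = Xc.T @ Xc / max(len(Xk) - 1, 1) + lam * np.eye(X.shape[1])
        _, logdet = np.linalg.slogdet(S)
        params[k] = (mu, np.linalg.inv(S), logdet, len(Xk) / n)
    return params

def class_posteriors(params, X):
    """Posterior class probabilities under the fitted ridge-QDA model."""
    classes = sorted(params)
    logd = np.empty((len(X), len(classes)))
    for c, k in enumerate(classes):
        mu, S_inv, logdet, prior = params[k]
        D = X - mu
        # log prior + log Gaussian density (up to a constant shared by all classes)
        logd[:, c] = np.log(prior) - 0.5 * logdet \
                     - 0.5 * np.einsum('ij,jk,ik->i', D, S_inv, D)
    logd -= logd.max(axis=1, keepdims=True)        # stabilise before exponentiating
    post = np.exp(logd)
    return classes, post / post.sum(axis=1, keepdims=True)

def conditional_loglik(params, X, y):
    """Sum of log P(y_i | x_i): rewards predictors that actually separate the classes."""
    classes, post = class_posteriors(params, X)
    idx = np.array([classes.index(yi) for yi in y])
    return np.log(post[np.arange(len(y)), idx] + 1e-300).sum()

def forward_screen(X, y, lam=0.1, gamma=0.5, max_vars=10):
    """Greedily add the predictor that most improves a penalized likelihood score."""
    n, p = X.shape
    selected, best_score = [], -np.inf
    while len(selected) < max_vars:
        best_j, best_j_score = None, best_score
        for j in range(p):
            if j in selected:
                continue
            cols = selected + [j]
            params = fit_ridge_qda(X[:, cols], y, lam)
            k = len(cols)
            # BIC-type term plus gamma * k * log(p) for the large model space;
            # a deliberate simplification of the extended BIC penalty.
            penalty = 0.5 * k * np.log(n) + gamma * k * np.log(p)
            score = conditional_loglik(params, X[:, cols], y) - penalty
            if score > best_j_score:
                best_j, best_j_score = j, score
        if best_j is None:      # no remaining predictor improves the score
            break
        selected.append(best_j)
        best_score = best_j_score
    return selected

# Tiny synthetic check: two classes in p = 50 dimensions, n = 60 observations,
# differing only in the first two coordinates.
rng = np.random.default_rng(0)
X0 = rng.normal(size=(30, 50))
X1 = rng.normal(size=(30, 50))
X1[:, :2] += 2.0
X = np.vstack([X0, X1])
y = np.array([0] * 30 + [1] * 30)
print(forward_screen(X, y))     # the informative coordinates should be picked first
```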

Author information

Correspondence to Cui Xiong.

Additional information

ZHANG Jun’s research was supported by the National Natural Science Foundation of China under Grant No. 11401391.

This paper was recommended for publication by Editor SHAO Jun.

Cite this article

Xiong, C., Zhang, J. & Luo, X. Ridge-forward quadratic discriminant analysis in high-dimensional situations. J Syst Sci Complex 29, 1703–1715 (2016). https://doi.org/10.1007/s11424-016-6024-1
