Feature Classification Based on Manifold Dimension Reduction for Night-Vision Images

  • Lianfa Bai
  • Jing Han
  • Jiang Yue


To improve the speed of data processing and the accuracy of data classification, it is necessary to apply dimensionality reduction methods. This chapter introduces dimension-reduction methods based on manifold learning. First, to improve classification accuracy using sample class information, corresponding samples are chosen via data similarity to construct the intra-class and inter-class scatter matrices. Second, to solve the parameter-selection problem in locality-preserving projections (LPP), the correlation coefficient is used to adaptively construct a label graph that characterises the discriminative information of different manifolds and a local relation graph that characterises the distribution of each manifold. Finally, a kernel maximum likelihood (KML) similarity measure is defined to calculate the outlier probability of high-dimensional data and detect outliers. The kernel LLE (KLLE) is weighted by the KML measure, which chooses the best neighbours to generate a precise mapping from high-dimensional night-vision data to the low-dimensional space.
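The first and third steps above can be illustrated with a minimal sketch. The scatter-matrix construction follows the standard intra-class/inter-class definitions; the `kml_outlier_scores` function is a hypothetical stand-in for the chapter's KML measure, here approximated by each point's average RBF-kernel similarity to the rest of the data (low average similarity suggests an outlier). Function names and the `sigma` parameter are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def scatter_matrices(X, y):
    """Intra-class (Sw) and inter-class (Sb) scatter matrices.
    X: (n_samples, n_features) data, y: (n_samples,) class labels."""
    mean_all = X.mean(axis=0)
    d = X.shape[1]
    Sw = np.zeros((d, d))
    Sb = np.zeros((d, d))
    for c in np.unique(y):
        Xc = X[y == c]
        mc = Xc.mean(axis=0)
        # Within-class spread around the class mean
        Sw += (Xc - mc).T @ (Xc - mc)
        # Between-class spread of the class mean around the global mean
        diff = (mc - mean_all).reshape(-1, 1)
        Sb += len(Xc) * (diff @ diff.T)
    return Sw, Sb

def kml_outlier_scores(X, sigma=1.0):
    """Hypothetical kernel-likelihood score: average RBF similarity of
    each point to all other points. Low scores indicate likely outliers."""
    sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)  # pairwise squared distances
    K = np.exp(-sq / (2 * sigma ** 2))                   # RBF kernel matrix
    np.fill_diagonal(K, 0.0)                             # exclude self-similarity
    return K.sum(axis=1) / (len(X) - 1)
```

In a KLLE-style pipeline, these scores would then down-weight low-likelihood points when selecting neighbours for the embedding.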


Copyright information

© Springer Nature Singapore Pte Ltd. 2019

Authors and Affiliations

  1. School of Electronic and Optical Engineering, Nanjing University of Science and Technology, Nanjing, China
  2. School of Electronic and Optical Engineering, Nanjing University of Science and Technology, Nanjing, China
  3. National Key Laboratory of Transient Physics, Nanjing University of Science and Technology, Nanjing, China