Generalized Regression Neural Network Optimized by Genetic Algorithm for Solving Out-of-Sample Extension Problem in Supervised Manifold Learning

  • Hong-Bing Huang
  • Zhi-Hong Xie

Abstract

With the advent of big data, massive amounts of high-dimensional data have been accumulated in many fields. The assimilation and processing of such high-dimensional data can be particularly challenging. Manifold learning offers an effective means of addressing this challenge. However, the results of applying manifold learning to supervised classification have remained unsatisfactory. The out-of-sample extension problem is a critical issue that must be properly solved in this regard. Genetic algorithms (GAs) have excellent global search capabilities. This paper proposes a generalized regression neural network (GRNN) optimized by a GA to solve the out-of-sample extension problem. The prediction performance of a GRNN depends mainly on the appropriateness of the chosen smoothing factor. The essence of the GA optimization is the determination of the optimal smoothing factor of the GRNN, and the optimized network is subsequently used to predict the low-dimensional embeddings of the test samples. A GA can obtain a better smoothing factor over a larger search space, resulting in enhanced prediction performance. Experiments were performed to enable a detailed analysis of the important parameters that affect the performance of the proposed algorithm. The results confirmed the effectiveness of the algorithm.
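The core idea of the abstract can be illustrated with a short sketch: a GRNN (in the sense of Specht's 1991 formulation, i.e. a Nadaraya-Watson kernel-weighted average of training targets) maps high-dimensional samples to their low-dimensional embeddings, and a GA searches for the smoothing factor that minimizes prediction error. The sketch below is a simplified, hypothetical implementation, not the paper's exact configuration: the function names (`grnn_predict`, `ga_optimize_sigma`), the leave-one-out fitness criterion, and the GA operators (truncation selection, arithmetic crossover, Gaussian mutation) are all illustrative assumptions.

```python
import numpy as np

def grnn_predict(X_train, Y_train, X_test, sigma):
    """GRNN prediction: a kernel-weighted average of training targets.
    X_train: (n, d_high) samples; Y_train: (n, d_low) low-dim embeddings."""
    # Squared Euclidean distances between each test and training sample
    d2 = ((X_test[:, None, :] - X_train[None, :, :]) ** 2).sum(axis=2)
    w = np.exp(-d2 / (2.0 * sigma ** 2))      # Gaussian pattern-layer weights
    w_sum = w.sum(axis=1, keepdims=True)
    w_sum[w_sum == 0] = 1e-12                 # guard against underflow
    return (w @ Y_train) / w_sum

def ga_optimize_sigma(X, Y, pop_size=20, generations=30,
                      bounds=(0.01, 2.0), seed=0):
    """Toy real-coded GA: each individual is a candidate smoothing factor;
    fitness is the leave-one-out squared prediction error of the GRNN."""
    rng = np.random.default_rng(seed)
    pop = rng.uniform(bounds[0], bounds[1], size=pop_size)

    def fitness(sigma):
        err = 0.0
        for i in range(len(X)):               # leave-one-out error
            mask = np.arange(len(X)) != i
            pred = grnn_predict(X[mask], Y[mask], X[i:i + 1], sigma)
            err += ((pred - Y[i]) ** 2).sum()
        return err

    for _ in range(generations):
        scores = np.array([fitness(s) for s in pop])
        parents = pop[np.argsort(scores)][: pop_size // 2]  # truncation selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = rng.choice(parents, 2)     # arithmetic crossover
            c = 0.5 * (a + b) + rng.normal(0.0, 0.05)  # Gaussian mutation
            children.append(np.clip(c, bounds[0], bounds[1]))
        pop = np.concatenate([parents, np.array(children)])
    scores = np.array([fitness(s) for s in pop])
    return float(pop[np.argmin(scores)])
```

In this sketch the GA replaces the grid or gradient search often used for the GRNN smoothing factor; because the fitness function is evaluated as a black box, the population can explore the whole admissible interval rather than a neighborhood of an initial guess, which is the "larger search space" advantage the abstract refers to.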

Keywords

Manifold learning · Dimensionality reduction · Out-of-sample extension · Genetic algorithm · Generalized regression neural network · Optimization

Acknowledgements

The authors would like to thank the anonymous referees and the Associate Editor for their constructive comments and suggestions. They would also like to thank Editage (www.editage.com) for English language editing. This work was partially supported by the Natural Science Foundation of Fujian Province, China, under Grant 2016J01279, and the Natural Science Foundation of the Education Department of Fujian Province, China, under Grant JB14003.

Copyright information

© Springer Science+Business Media, LLC, part of Springer Nature 2019

Authors and Affiliations

  1. School of Information and Management, Guangxi Medical University, Nanning, China
  2. Department of Preschool Education, Guangxi Preschool Vocational College, Nanning, China
