A Hybrid Technique for Unsupervised Dimensionality Reduction by Utilizing Enriched Kernel Based PCA and DBSCAN Clustering Algorithm

  • D. Hemavathi
  • H. Srimathi
  • K. Sornalakshmi
Conference paper
Part of the Lecture Notes in Networks and Systems book series (LNNS, volume 98)

Abstract

Selection of relevant features is an important technique in real-time applications, which involve ever-growing volumes of data from domains such as finance and education. For the results to be useful, the relevant features must be identified accurately. In this work, the Terry dataset is used for unsupervised learning: significant features are selected with an embedded method, the selected features are processed with an enriched kernel-based Principal Component Analysis (PCA) for dimensionality reduction, and the reduced representation is then evaluated with the Density-Based Spatial Clustering of Applications with Noise (DBSCAN) algorithm to achieve better performance.
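The pipeline outlined in the abstract (feature selection, then kernel PCA, then DBSCAN) can be sketched as follows. Since the Terry dataset and the paper's "enriched" kernel are not specified here, this is an illustrative assumption only: it uses synthetic data, variance-based selection as a stand-in for the embedded selector, and scikit-learn's standard RBF `KernelPCA` and `DBSCAN`.

```python
# Illustrative sketch: feature selection -> kernel PCA -> DBSCAN clustering.
# The Terry dataset and the "enriched" kernel are assumptions filled in with
# synthetic data and scikit-learn's standard components.
import numpy as np
from sklearn.cluster import DBSCAN
from sklearn.datasets import make_blobs
from sklearn.decomposition import KernelPCA
from sklearn.feature_selection import VarianceThreshold
from sklearn.metrics import silhouette_score
from sklearn.preprocessing import StandardScaler

# Synthetic stand-in: 10 informative features plus 2 constant (irrelevant) ones.
X, _ = make_blobs(n_samples=300, centers=3, n_features=10, random_state=0)
X = np.hstack([X, np.ones((X.shape[0], 2))])

# Unsupervised feature selection (stand-in for the embedded method):
# drop near-constant features, then standardize the survivors.
X_sel = VarianceThreshold(threshold=1e-3).fit_transform(X)
X_sel = StandardScaler().fit_transform(X_sel)

# Kernel PCA for non-linear dimensionality reduction to two components.
X_red = KernelPCA(n_components=2, kernel="rbf", gamma=0.1).fit_transform(X_sel)
X_red = StandardScaler().fit_transform(X_red)  # put components on a common scale

# Density-based clustering of the reduced representation.
labels = DBSCAN(eps=0.5, min_samples=5).fit_predict(X_red)

n_clusters = len(set(labels)) - (1 if -1 in labels else 0)
print("features kept:", X_sel.shape[1], "| clusters found:", n_clusters)
if n_clusters >= 2:
    print("silhouette:", round(silhouette_score(X_red, labels), 3))
```

The two constant columns are removed by the variance filter before reduction, mirroring the idea that embedded selection discards uninformative features before kernel PCA is applied; `eps` and `min_samples` would need tuning for real data.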

Keywords

Embedded · DBSCAN · Terry data · Kernel-based PCA


Copyright information

© Springer Nature Switzerland AG 2020

Authors and Affiliations

  1. Department of Information Technology, School of Computing, SRM Institute of Science and Technology, Chennai, India