Framework for Feature Selection in Health Assessment Systems

Conference paper
Part of the Advances in Intelligent Systems and Computing book series (AISC, volume 926)


Anomaly detection in health assessment systems has gained much attention in recent years, and various feature selection techniques have been proposed to support it. However, these methods do not address the specific feature-selection needs of health assessment systems: most existing techniques are purely data-driven and offer no way to incorporate domain knowledge. This paper proposes a novel domain knowledge-driven feature selection framework, domain-driven selective wrapping (DSW), that helps select a correlated feature subset. The proposed framework uses an expert's domain knowledge to propose candidate subsets and evaluates them with a custom-designed logic-driven anomaly detection block (LDAB) acting as the wrapper. Experimental results show that the proposed framework selects feature subsets more efficiently than traditional sequential selection methods and is highly effective at detecting anomalies.
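The abstract describes a wrapper-style selection loop: expert-proposed feature subsets are scored by an anomaly detector and the best-scoring subset is kept. The paper's actual DSW and LDAB designs are not given here, so the sketch below is only a minimal illustration of that general idea; the rule threshold, data, and function names are all hypothetical stand-ins.

```python
# Minimal sketch of wrapper-based feature-subset selection guided by
# expert-supplied candidate subsets. The simple threshold rule below is a
# placeholder for the paper's logic-driven anomaly detection block (LDAB);
# all names and values here are illustrative, not the authors' method.

def evaluate_subset(data, labels, subset):
    """Score a subset by how accurately a toy rule-based detector
    flags anomalies using only the selected features."""
    correct = 0
    for row, label in zip(data, labels):
        # Placeholder domain rule: flag the record as anomalous if any
        # selected feature exceeds a threshold of 1.0.
        predicted = any(row[f] > 1.0 for f in subset)
        correct += (predicted == label)
    return correct / len(data)

def select_subset(data, labels, expert_subsets):
    """Return the expert-proposed subset with the best wrapper score,
    rather than sequentially searching the full subset space."""
    return max(expert_subsets, key=lambda s: evaluate_subset(data, labels, s))

# Example: three features; the expert proposes two candidate subsets.
data = [(0.2, 1.5, 0.1), (0.3, 0.2, 0.4), (1.8, 0.1, 0.2)]
labels = [True, False, True]      # True = anomalous record
expert_subsets = [(0,), (0, 1)]
best = select_subset(data, labels, expert_subsets)  # picks (0, 1) here
```

Restricting the search to expert-proposed subsets is what makes the approach cheaper than exhaustive or sequential wrapper search, at the cost of depending on the quality of the domain knowledge.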


Keywords: Features · Health assessment · Smart questionnaire



The work presented in this paper is supported by SIEF STEM+ funding. Fan Dong is the recipient of a SIEF STEM+ Business Fellowship.



Copyright information

© Springer Nature Switzerland AG 2020

Authors and Affiliations

  1. Centre for Artificial Intelligence, UTS, Sydney, Australia
