An interactive nonparametric evidential regression algorithm with instance selection



The nonparametric evidential regression (EVREG) method provides flexible forms of prediction for the output value, allowing the outputs of training instances to be partially unknown. However, superfluous training instances still degrade parameter learning in EVREG. To relax this limitation, this paper introduces an interactive nonparametric evidential regression (IEVREG) algorithm with instance selection. More specifically, the significance of an instance is first measured by evaluation functions that take into account both the prediction accuracy of the regression model and the spatial relationship between that instance and the others. According to a search strategy, the instances with a high degree of significance are then selected so as to maximize an objective function. Unlike existing instance selection methods, the selection of training instances in IEVREG is accomplished synchronously with parameter learning, rather than as a separate data-preprocessing step as in traditional methods. Furthermore, noisy and redundant instances can be removed simultaneously, and the performance of IEVREG is robust to the order in which instances appear in the raw data set. Experimental results show that the proposed IEVREG algorithm achieves good prediction accuracy while selecting representative training instances from the raw data set. Simulations on synthetic and UCI real-world data sets validate our conclusions.
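The interactive idea described above (instance selection interleaved with model evaluation, rather than done as a one-off preprocessing pass) can be loosely illustrated with a much simpler stand-in model. The sketch below is not the authors' IEVREG algorithm: it replaces the evidential regression model with a plain k-nearest-neighbor regressor and uses greedy forward selection, adding the instance whose inclusion most reduces prediction error on the not-yet-selected pool. All function names and the selection criterion are illustrative assumptions.

```python
import numpy as np

def knn_predict(train_X, train_y, query, k=3):
    """Predict a query output as the mean of its k nearest training outputs."""
    d = np.linalg.norm(train_X - query, axis=1)
    idx = np.argsort(d)[:k]
    return train_y[idx].mean()

def greedy_instance_selection(X, y, k=3, tol=1e-4):
    """Greedy forward selection: repeatedly add the pool instance whose
    inclusion most reduces squared prediction error on the remaining pool.
    Selection and model evaluation are interleaved, loosely mimicking the
    'interactive' flavor of selecting instances while fitting the model."""
    n = len(X)
    selected = list(range(min(k, n)))          # seed with the first k instances
    pool = [i for i in range(n) if i not in selected]

    def pool_error(sel):
        errs = [(knn_predict(X[sel], y[sel], X[j], k) - y[j]) ** 2
                for j in range(n) if j not in sel]
        return np.mean(errs) if errs else 0.0

    current = pool_error(selected)
    improved = True
    while improved and pool:
        improved = False
        best_i, best_err = None, current
        for i in pool:                          # try each candidate instance
            err = pool_error(selected + [i])
            if err < best_err - tol:            # keep the best strict improver
                best_i, best_err = i, err
        if best_i is not None:
            selected.append(best_i)
            pool.remove(best_i)
            current = best_err
            improved = True
    return np.array(sorted(selected))
```

Because candidates that do not improve pool error are never added, noisy or redundant instances tend to be left out, which is the behavior the abstract attributes to IEVREG (there achieved with evidential significance measures rather than this simple error criterion).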





Acknowledgements


The authors would like to thank the editors and anonymous referees for their invaluable comments and suggestions. This work is supported in part by the National Natural Science Foundation of China under Grants 51876035 and 51976032.

Author information

Correspondence to Pei-hong Wang.

Ethics declarations

Conflict of interest

The authors declare that they have no conflict of interest.

Ethical standards

All procedures performed in studies involving human participants were in accordance with the ethical standards of the institutional and/or national research committee and with the 1964 Declaration of Helsinki and its later amendments or comparable ethical standards.

Human and animal rights

This article does not contain any studies with animals performed by any of the authors.

Informed consent

Informed consent was obtained from all individual participants included in the study.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Communicated by A. Di Nola.



Cite this article

Gong, C., Wang, P. & Su, Z. An interactive nonparametric evidential regression algorithm with instance selection. Soft Comput (2020). https://doi.org/10.1007/s00500-020-04667-4



Keywords

  • Nonparametric evidential regression
  • Belief functions
  • Instance selection