Abstract
Universal Nearest Neighbours (unn) is a recently proposed classifier that also effectively estimates the posterior probability of each classification act. The algorithm is intrinsically binary, so multiclass problems must first be handled by a decomposition method, which splits them into simpler binary subtasks; a reconstruction rule then provides the final classification. In this paper we show that applying the unn algorithm in conjunction with a reconstruction rule based on the posterior probabilities yields a classification scheme that is robust across different biomedical image datasets. To this aim, we compare unn performance with that achieved by a Support Vector Machine with two different kernels and by a k-Nearest Neighbours classifier, applying two different reconstruction rules for each of the aforementioned classification paradigms. The results on one private and five public biomedical datasets show satisfactory performance.
© 2012 Springer-Verlag Berlin Heidelberg
D’Ambrosio, R., Bel Haj Ali, W., Nock, R., Soda, P., Nielsen, F., Barlaud, M. (2012). Biomedical Images Classification by Universal Nearest Neighbours Classifier Using Posterior Probability. In: Wang, F., Shen, D., Yan, P., Suzuki, K. (eds) Machine Learning in Medical Imaging. MLMI 2012. Lecture Notes in Computer Science, vol 7588. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-35428-1_15
DOI: https://doi.org/10.1007/978-3-642-35428-1_15
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-642-35427-4
Online ISBN: 978-3-642-35428-1
eBook Packages: Computer Science; Computer Science (R0)