M-ary Random Forest

  • Vikas Jain
  • Ashish Phophalia
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 11941)

Abstract

Random forest (RF) is a supervised ensemble method built from decision trees. Each decision tree recursively partitions the feature space into two disjoint sub-regions using axis-parallel splits until each sub-region becomes homogeneous with respect to a particular class or a stopping criterion is reached. Conventional RF uses one feature at a time for splitting and therefore does not account for inter-feature dependency. With this in mind, the current paper introduces an approach that splits on multiple features at once, partitioning the feature space into M regions using axis-parallel splits. The forest created in this way is therefore named the M-ary Random Forest (MaRF). The suitability of the proposed method is tested on various heterogeneous UCI datasets. Experimental results show that the proposed MaRF performs better for both classification and regression. The proposed MaRF method has also been tested on hyperspectral imaging (HSI) classification, where it shows satisfactory improvement over other state-of-the-art methods.
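The abstract does not detail how the M regions are formed, but one plausible reading of multi-feature, axis-parallel M-ary splitting can be sketched as follows: choose the m best individual features by impurity reduction, threshold each one, and route samples by the joint comparison pattern into up to M = 2^m regions. All names here (`gini`, `best_threshold`, `mary_split`, the parameter `m`) are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def gini(y):
    # Gini impurity of a label array
    _, counts = np.unique(y, return_counts=True)
    p = counts / counts.sum()
    return 1.0 - np.sum(p ** 2)

def best_threshold(x, y):
    # Best axis-parallel threshold for one feature by weighted Gini
    best_t, best_score = None, np.inf
    for t in np.unique(x)[:-1]:
        left, right = y[x <= t], y[x > t]
        score = (len(left) * gini(left) + len(right) * gini(right)) / len(y)
        if score < best_score:
            best_t, best_score = t, score
    return best_t, best_score

def mary_split(X, y, m=2):
    # Score every feature individually, keep the m best, then partition
    # samples into up to 2**m regions by the joint comparison pattern.
    # This is a hypothetical reading of M-ary splitting, not the paper's code.
    scored = [(f,) + best_threshold(X[:, f], y) for f in range(X.shape[1])]
    scored = [s for s in scored if s[1] is not None]  # drop constant features
    scored.sort(key=lambda s: s[2])
    chosen = scored[:m]  # (feature, threshold, score) triples
    # Region code: bit j is set iff the sample exceeds the j-th threshold
    codes = np.zeros(len(y), dtype=int)
    for j, (f, t, _) in enumerate(chosen):
        codes |= (X[:, f] > t).astype(int) << j
    regions = {c: np.where(codes == c)[0] for c in np.unique(codes)}
    return chosen, regions
```

Because each region is defined by a conjunction of single-feature comparisons, the split remains axis-parallel while still capturing dependency between the chosen features.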

Keywords

Classification · Ensemble method · Hyperspectral imaging · Random forest

Copyright information

© Springer Nature Switzerland AG 2019

Authors and Affiliations

  1. Indian Institute of Information Technology Vadodara, Gandhinagar, India