Comparison of Ensemble Approaches: Mixture of Experts and AdaBoost for a Regression Problem

Conference paper, Intelligent Information and Database Systems (ACIIDS 2014)

Part of the book series: Lecture Notes in Computer Science (LNAI, volume 8398)

Abstract

Two machine learning approaches, mixture of experts and AdaBoost.R2, were adapted to a real-world regression problem: predicting the prices of residential premises from historical data on sales/purchase transactions. Computationally intensive experiments were conducted to compare empirically the prediction accuracy of the ensemble models generated by the two methods. The results were analysed with a statistical methodology comprising nonparametric tests followed by post-hoc procedures designed for multiple n×n comparisons. No statistically significant differences were observed among the best ensembles: two generated by mixture of experts and two by AdaBoost.R2, employing multilayer perceptrons and general linear models as base learning algorithms.
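
The comparison methodology can be illustrated with a short, self-contained sketch. The snippet below is not the authors' experimental setup: it uses synthetic data from scikit-learn's make_regression in place of the (non-public) transaction data, scikit-learn's AdaBoostRegressor (which implements Drucker's AdaBoost.R2) with a general linear model as one assumed base learner, and SciPy's Friedman test over per-fold errors as a stand-in for the full nonparametric analysis with post-hoc n×n procedures. All model choices and parameter values are illustrative assumptions, and the mixture-of-experts side of the comparison is not reproduced here.

# Illustrative sketch only: synthetic data and assumed settings,
# not the configuration used in the paper.
from scipy.stats import friedmanchisquare
from sklearn.datasets import make_regression
from sklearn.ensemble import AdaBoostRegressor
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import KFold, cross_val_score

# Synthetic stand-in for the residential premises data set
# (the real sales/purchase transaction data is not public).
X, y = make_regression(n_samples=500, n_features=6, noise=15.0, random_state=0)

# AdaBoost.R2 with a general linear model as the base learner;
# scikit-learn's AdaBoostRegressor implements Drucker's AdaBoost.R2
# (the `estimator` keyword assumes scikit-learn >= 1.2).
ada_glm = AdaBoostRegressor(estimator=LinearRegression(),
                            n_estimators=50, loss="linear", random_state=0)

# Two further competitors, so the Friedman test has at least three models.
ada_tree = AdaBoostRegressor(n_estimators=50, random_state=0)  # tree base learner
glm = LinearRegression()

models = {"AdaBoost.R2 + GLM": ada_glm,
          "AdaBoost.R2 + tree": ada_tree,
          "GLM": glm}

# Per-fold mean squared errors from 10-fold cross-validation.
cv = KFold(n_splits=10, shuffle=True, random_state=0)
fold_mse = {name: -cross_val_score(model, X, y, cv=cv,
                                   scoring="neg_mean_squared_error")
            for name, model in models.items()}

# Friedman test over the per-fold errors; a significant result would then
# be followed by post-hoc procedures for the pairwise (n x n) comparisons.
stat, p = friedmanchisquare(*fold_mse.values())
print(f"Friedman chi2 = {stat:.2f}, p = {p:.4f}")
for name, errors in fold_mse.items():
    print(f"{name}: mean MSE = {errors.mean():.1f}")

Whether the null hypothesis of equal performance is rejected naturally depends on the data and models; the sketch only shows the shape of the comparison, not the paper's results.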




Copyright information

© 2014 Springer International Publishing Switzerland

About this paper

Cite this paper

Lasota, T., Londzin, B., Telec, Z., Trawiński, B. (2014). Comparison of Ensemble Approaches: Mixture of Experts and AdaBoost for a Regression Problem. In: Nguyen, N.T., Attachoo, B., Trawiński, B., Somboonviwat, K. (eds) Intelligent Information and Database Systems. ACIIDS 2014. Lecture Notes in Computer Science (LNAI), vol 8398. Springer, Cham. https://doi.org/10.1007/978-3-319-05458-2_11

  • DOI: https://doi.org/10.1007/978-3-319-05458-2_11

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-05457-5

  • Online ISBN: 978-3-319-05458-2

