
Adaptive Boosting for Spatial Functions with Unstable Driving Attributes

Conference paper in: Knowledge Discovery and Data Mining. Current Issues and New Applications (PAKDD 2000)

Part of the book series: Lecture Notes in Computer Science (LNAI, volume 1805)

Abstract

Combining multiple global models (e.g., back-propagation neural networks) is an effective technique for improving classification accuracy, since manipulating the training data distribution reduces variance. Standard combining methods do not improve local classifiers (e.g., k-nearest neighbors) because of their low sensitivity to data perturbation. Here, we propose an adaptive attribute boosting technique that coalesces multiple local classifiers, each using different relevant attribute information. In addition, a modification of the boosting method is developed for heterogeneous spatial databases with unstable driving attributes: spatial blocks of data are drawn at each boosting round. To reduce the computational cost of k-nearest neighbor (k-NN) classifiers, a novel fast k-NN algorithm is designed. Applied to real-life and artificial spatial data, adaptive attribute boosting yielded observable improvements in prediction accuracy for both local and global classifiers when unstable driving attributes were present in the data. The “spatial” variant of boosting applied to the same data sets resulted in highly significant improvements for the k-NN classifier, making it competitive with boosted neural networks.
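The abstract describes two mechanisms worth making concrete: at each boosting round a local (k-NN) classifier is trained on different attribute information, and in the spatial variant the round's training sample is drawn as whole spatial blocks rather than individual points. The sketch below shows how such a round structure fits into an AdaBoost.M1-style loop. It is a minimal illustration under stated assumptions, not the paper's algorithm: the function names are made up, attribute subsets are drawn at random (a random-subspace stand-in for the paper's relevance-based selection), and blocks are resampled with probability proportional to their accumulated weight.

# Illustrative sketch only; assumes NumPy and scikit-learn. The attribute
# selection rule and block sampler are hypothetical stand-ins.
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

def spatial_attribute_boost(X, y, blocks, rounds=10, n_attrs=3, k=5, seed=0):
    # X: (n, d) attributes; y: (n,) labels; blocks: (n,) spatial block ids
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = np.full(n, 1.0 / n)              # per-sample boosting weights
    block_ids = np.unique(blocks)
    models, alphas, subsets = [], [], []
    for _ in range(rounds):
        # Draw whole spatial blocks, with probability proportional to their
        # accumulated weight, instead of resampling individual points.
        p = np.array([w[blocks == b].sum() for b in block_ids])
        chosen = rng.choice(block_ids, size=block_ids.size, replace=True,
                            p=p / p.sum())
        idx = np.concatenate([np.flatnonzero(blocks == b) for b in chosen])
        # Train the local classifier on a different attribute subset each
        # round (random subspaces stand in for relevance-based selection).
        attrs = rng.choice(d, size=min(n_attrs, d), replace=False)
        clf = KNeighborsClassifier(n_neighbors=k)
        clf.fit(X[np.ix_(idx, attrs)], y[idx])
        pred = clf.predict(X[:, attrs])
        err = w[pred != y].sum()
        if err <= 0.0 or err >= 0.5:     # AdaBoost.M1 validity condition
            break
        beta = err / (1.0 - err)
        w[pred == y] *= beta             # down-weight correctly classified points
        w /= w.sum()
        models.append(clf)
        alphas.append(np.log(1.0 / beta))
        subsets.append(attrs)
    return models, alphas, subsets

def boosted_predict(models, alphas, subsets, X, classes):
    # Final hypothesis: weighted vote of the per-round k-NN classifiers.
    votes = np.zeros((X.shape[0], classes.size))
    for clf, a, attrs in zip(models, alphas, subsets):
        pred = clf.predict(X[:, attrs])
        for j, c in enumerate(classes):
            votes[pred == c, j] += a
    return classes[votes.argmax(axis=1)]

Resampling at the block level is what makes such a scheme plausible for k-NN: as the abstract notes, k-NN is largely insensitive to point-level perturbation, but swapping whole spatial regions in and out changes the local neighborhoods enough for the boosted members to disagree.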

Partial support by the INEEL University Research Consortium project No. C94-175936 to T. Fiez and Z. Obradovic is gratefully acknowledged.





Copyright information

© 2000 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Lazarevic, A., Fiez, T., Obradovic, Z. (2000). Adaptive Boosting for Spatial Functions with Unstable Driving Attributes. In: Terano, T., Liu, H., Chen, A.L.P. (eds) Knowledge Discovery and Data Mining. Current Issues and New Applications. PAKDD 2000. Lecture Notes in Computer Science, vol. 1805. Springer, Berlin, Heidelberg. https://doi.org/10.1007/3-540-45571-X_38

  • DOI: https://doi.org/10.1007/3-540-45571-X_38

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-67382-8

  • Online ISBN: 978-3-540-45571-4
