Abstract
Combining multiple global models (e.g., back-propagation neural networks) is an effective technique for improving classification accuracy, since manipulating training data distributions reduces variance. Standard combining methods do not improve local classifiers (e.g., k-nearest neighbors) because of their low sensitivity to data perturbation. Here, we propose an adaptive attribute boosting technique that coalesces multiple local classifiers, each using different relevant attribute information. In addition, a modification of the boosting method is developed for heterogeneous spatial databases with unstable driving attributes, drawing spatial blocks of data at each boosting round. To reduce the computational cost of k-nearest neighbor (k-NN) classifiers, a novel fast k-NN algorithm is designed. Applied to real-life and artificial spatial data, adaptive attribute boosting shows observable improvements in prediction accuracy for both local and global classifiers when unstable driving attributes are present in the data. The "spatial" variant of boosting applied to the same data sets yields highly significant improvements for the k-NN classifier, making it competitive with boosted neural networks.
Partial support by the INEEL University Research Consortium project No. C94-175936 to T. Fiez and Z. Obradovic is gratefully acknowledged.
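The abstract's central idea, boosting a local learner by varying the attribute subset each round rather than relying on reweighting alone, can be sketched roughly as follows. This is a minimal illustration of the general concept using an AdaBoost.M1-style loop with weighted resampling and a brute-force k-NN; the function names, random attribute selection, and all parameters are illustrative assumptions, not the paper's actual algorithm (which selects *relevant* attributes adaptively and, in the spatial variant, draws spatial blocks of data):

```python
import numpy as np

def knn_predict(train_X, train_y, test_X, k=3):
    # Brute-force k-NN with majority vote (illustrative, not the
    # paper's fast k-NN algorithm).
    preds = []
    for x in test_X:
        d = np.sum((train_X - x) ** 2, axis=1)
        idx = np.argsort(d)[:k]
        preds.append(np.bincount(train_y[idx]).argmax())
    return np.array(preds)

def attribute_boosting(X, y, rounds=5, k=3, n_attrs=2, seed=None):
    """AdaBoost.M1-style ensemble where each round trains k-NN on a
    different attribute subset, so the local learner varies across
    rounds despite its low sensitivity to data perturbation.
    Here the subset is drawn at random; the paper's method instead
    uses adaptively chosen relevant attributes."""
    rng = np.random.default_rng(seed)
    n = len(y)
    w = np.full(n, 1.0 / n)          # instance weights
    members = []                     # (attr subset, resampled idx, alpha)
    for _ in range(rounds):
        attrs = rng.choice(X.shape[1], size=n_attrs, replace=False)
        sample = rng.choice(n, size=n, p=w)   # resample by weight
        pred = knn_predict(X[sample][:, attrs], y[sample], X[:, attrs], k)
        err = np.sum(w * (pred != y))
        if err >= 0.5:               # skip members worse than chance
            continue
        err = max(err, 1e-10)
        alpha = 0.5 * np.log((1 - err) / err)
        w *= np.exp(alpha * np.where(pred == y, -1.0, 1.0))
        w /= w.sum()
        members.append((attrs, sample, alpha))

    def predict(test_X):
        # alpha-weighted vote over ensemble members
        scores = np.zeros((len(test_X), y.max() + 1))
        for attrs, sample, alpha in members:
            p = knn_predict(X[sample][:, attrs], y[sample],
                            test_X[:, attrs], k)
            scores[np.arange(len(test_X)), p] += alpha
        return scores.argmax(axis=1)
    return predict
```

The key design point this sketch captures is that perturbing the training sample alone barely changes a k-NN decision boundary, so each boosting round must also change the attribute view to obtain diverse ensemble members.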
Copyright information
© 2000 Springer-Verlag Berlin Heidelberg
Cite this paper
Lazarevic, A., Fiez, T., Obradovic, Z. (2000). Adaptive Boosting for Spatial Functions with Unstable Driving Attributes. In: Terano, T., Liu, H., Chen, A.L.P. (eds) Knowledge Discovery and Data Mining. Current Issues and New Applications. PAKDD 2000. Lecture Notes in Computer Science(), vol 1805. Springer, Berlin, Heidelberg. https://doi.org/10.1007/3-540-45571-X_38
Print ISBN: 978-3-540-67382-8
Online ISBN: 978-3-540-45571-4