
Part of the book series: Perspectives in Neural Computing

Summary

This chapter provides an introduction to the main methods and issues in the combination of Artificial Neural Nets. A distinction is made between ensemble and modular modes of combination, and the two are then treated separately. The reasons for ensemble combination are considered, and an account is given of the main methods for creating and combining ANNs in ensembles. This account is accompanied by a discussion of the relative effectiveness of these methods, in which the concepts of diversity and selection are explained. The review of modular combination outlines the main methods of creating and combining modules, depending on whether the relationship between the modules is co-operative, competitive, sequential or supervisory. An overview of the chapters in the book forms the concluding section.
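The ensemble mode of combination described above can be illustrated with a minimal sketch. The two most common combination rules for classifier ensembles are averaging the member networks' output vectors and taking a majority vote over their hard decisions; the networks and outputs below are hypothetical stand-ins, not code from the chapter.

```python
from collections import Counter

def average_combiner(predictions):
    """Combine class-probability vectors from several nets by simple averaging."""
    n = len(predictions)
    dims = len(predictions[0])
    return [sum(p[i] for p in predictions) / n for i in range(dims)]

def majority_vote(labels):
    """Combine hard class labels by plurality voting."""
    return Counter(labels).most_common(1)[0][0]

# Softmax-style outputs of three hypothetical member networks for one input:
outputs = [
    [0.7, 0.2, 0.1],
    [0.6, 0.3, 0.1],
    [0.2, 0.5, 0.3],
]
combined = average_combiner(outputs)
predicted = combined.index(max(combined))  # ensemble's chosen class: 0
vote = majority_vote([0, 0, 1])            # plurality decision: 0
```

Note that averaging only helps when the members err on different inputs; if all three networks made the same mistakes, the combined output would reproduce them, which is why the diversity of ensemble members matters as much as their individual accuracy.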




Copyright information

© 1999 Springer-Verlag London Limited

About this chapter

Cite this chapter

Sharkey, A.J.C. (1999). Multi-Net Systems. In: Sharkey, A.J.C. (eds) Combining Artificial Neural Nets. Perspectives in Neural Computing. Springer, London. https://doi.org/10.1007/978-1-4471-0793-4_1


  • DOI: https://doi.org/10.1007/978-1-4471-0793-4_1

  • Publisher Name: Springer, London

  • Print ISBN: 978-1-85233-004-0

  • Online ISBN: 978-1-4471-0793-4

  • eBook Packages: Springer Book Archive
