A Fusion Centre for Intelligent Robotic Systems

  • Gerard T. McKee
Part of the Microprocessor-Based and Intelligent Systems Engineering book series (ISCA, volume 9)


In this paper we present a database-type architecture for intelligent robotic systems, where the database contains a model of the environment surrounding the robot, generated by fusing data provided by a heterogeneous set of sensors, and updated continually in response to changes in the stimuli reaching the sensors. We discuss various forms of sensor fusion, and illustrate our approach with an example involving fusion of vision and taction for object recognition. We also discuss a database-type approach to the development of robotic system applications.
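The database-style fusion centre described above can be sketched in a few lines. This is a minimal illustration under assumed names (`FusionCentre`, `update`, `query` are hypothetical, not from the paper): heterogeneous sensors post confidence-weighted observations, the centre fuses each one into a continually updated environment model, and applications read the model as they would a database.

```python
# Hypothetical sketch of a database-style fusion centre: sensors write
# confidence-weighted observations; the centre maintains a fused
# environment model that applications query like a database.

class FusionCentre:
    def __init__(self):
        # environment model: object id -> (fused estimate, accumulated confidence)
        self.model = {}

    def update(self, sensor, object_id, estimate, confidence):
        """Fuse a new observation from `sensor` into the model
        using a simple confidence-weighted average."""
        if object_id not in self.model:
            self.model[object_id] = (estimate, confidence)
            return
        old_est, old_conf = self.model[object_id]
        total = old_conf + confidence
        fused = (old_est * old_conf + estimate * confidence) / total
        self.model[object_id] = (fused, total)

    def query(self, object_id):
        """Database-style read of the current fused estimate."""
        return self.model.get(object_id)


# Example: fusing a visual and a tactile size estimate for one object.
fc = FusionCentre()
fc.update("vision", "cup", 0.8, 0.6)  # visual estimate, higher confidence
fc.update("touch", "cup", 1.0, 0.4)   # tactile estimate, lower confidence
est, conf = fc.query("cup")           # est = 0.88, conf = 1.0
```

The weighted average stands in for whatever fusion rule a real system would use (e.g. a Kalman-style update); the point is the architectural pattern of a shared, continually updated model between sensors and applications.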


Keywords: Object Recognition · Robotic System · Fusion Centre · Sensory Field · Sensor Fusion




References

[1] Luo, R. C. and Kay, M. G. Multisensor Integration and Fusion in Intelligent Systems, IEEE Transactions on Systems, Man, and Cybernetics, vol. SMC-19, no. 5, pp. 901–931, 1989.
[2] Nandhakumar, N. and Aggarwal, J. K. Synergistic Integration of Thermal and Visual Images for Computer Vision, in Proc. SPIE, vol. 782, Infrared Sensors and Sensor Fusion, Buser, R. G. and Warren, F. B. (Eds.), 1987, pp. 28–36.
[3] Blackman, S. S. Theoretical Approaches to Data Association and Fusion, in Proc. SPIE, vol. 931, Sensor Fusion, Weaver, C. W. (Ed.), Orlando, FL, Apr. 1988, pp. 50–55.
[4] Yarbus, A. L. Eye Movements and Vision. Plenum Press, New York, 1967.
[5] Brooks, R. A. A Robust Layered Control System for a Mobile Robot, IEEE J. Robotics and Automation, vol. 2, no. 1, pp. 14–23, 1986.
[6] Date, C. J. An Introduction to Database Systems, Volume 1, Fifth Edition, Addison-Wesley, 1990.
[7] Waltz, E. L. and Buede, D. M. Data Fusion and Decision Support for Command and Control, IEEE Transactions on Systems, Man, and Cybernetics, vol. SMC-16, no. 6, pp. 865–879, 1986.
[8] McKee, G. T. "What can be fused?" Paper presented at the 1988 NATO ARW on Multisensor Fusion for Computer Vision, [to be published].
[9] Seetharaman, G. and Chu, C.-H. Hierarchical Fusion of Geometric Constraints for Image Segmentation, in Proc. SPIE, vol. 1383, Sensor Fusion III: 3-D Perception and Recognition, Schenker, P. S. (Ed.), 1990, pp. 582–588.
[10] Broida, T. J. Feature Correspondence in Multiple Sensor Data Fusion, in Proc. SPIE, vol. 1383, Sensor Fusion III: 3-D Perception and Recognition, Schenker, P. S. (Ed.), 1990, pp. 635–651.
[11] Ayache, N. and Faugeras, O. D. Building, Registrating, and Fusing Noisy Visual Maps, The International Journal of Robotics Research, vol. 7, no. 6, December 1988.
[12] Allen, P. K. and Bajcsy, R. Object Recognition Using Vision and Touch, in Proc. 9th Int. Joint Conf. on Artificial Intelligence, Los Angeles, CA, Aug. 1985, pp. 1131–1137.

Copyright information

© Springer Science+Business Media Dordrecht 1991

Authors and Affiliations

  • Gerard T. McKee
    Department of Computer Science, School of Engineering and Information Sciences, University of Reading, Whiteknights, Reading, England
