A Phenomenological Approach to Thermal and Visual Sensor Fusion

  • N. Nandhakumar
  • J. K. Aggarwal
Conference paper
Part of the NATO ASI Series book series (volume 58)


A new computer vision technique is developed that is based on analyzing different imaging modalities simultaneously; such an approach is termed multisensory computer vision. This paper describes a system that integrates information from thermal (infrared) imagery and visual imagery to classify objects in outdoor scenes. The integration is synergistic in that it makes new information available, in this case estimates of surface heat fluxes, which cannot be obtained by processing the thermal and visual imagery separately. The approach establishes a quantitative measure of an imaged object's relative ability to sink or source heat, and a way of categorizing the object based on this property. Information integration is implemented at different levels of abstraction in the interpretation hierarchy, i.e., at the pixel level and at the symbolic level. Heuristic rules are employed in a decision tree classifier to categorize imaged objects as vegetation, building, pavement, or vehicle.
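The fusion idea described above can be sketched in code. The snippet below is a minimal illustration, not the authors' implementation: it assumes a simple per-pixel surface energy balance (absorbed solar flux from the visual image, convective and radiative losses from the thermal image's surface temperature) whose residual is the heat conducted into the object, and it feeds the ratio of conducted to absorbed flux into a toy decision tree. The convection coefficient, emissivity, and all class thresholds are hypothetical placeholders.

```python
SIGMA = 5.67e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

def conducted_heat_flux(absorptivity, irradiation_w_m2,
                        surface_temp_k, ambient_temp_k,
                        emissivity=0.9, h_conv=10.0):
    """Surface energy balance: absorbed solar flux (estimated from the
    visual image) minus convective and re-radiated losses (estimated
    from the thermal image) leaves the flux conducted into the object.
    The emissivity and convection coefficient are assumed values."""
    w_abs = absorptivity * irradiation_w_m2                # absorbed flux
    w_conv = h_conv * (surface_temp_k - ambient_temp_k)    # convective loss
    w_rad = emissivity * SIGMA * surface_temp_k ** 4       # re-radiated flux
    return w_abs - w_conv - w_rad

def classify(flux_ratio):
    """Toy decision-tree rules on the conducted/absorbed flux ratio;
    thresholds and class ordering are purely illustrative."""
    if flux_ratio < 0.05:
        return "vegetation"
    elif flux_ratio < 0.3:
        return "vehicle"
    elif flux_ratio < 0.6:
        return "pavement"
    else:
        return "building"

# Example pixel: reflectance from the visual image fixes absorptivity,
# radiometry from the thermal image fixes surface temperature.
w_cd = conducted_heat_flux(absorptivity=0.7, irradiation_w_m2=800.0,
                           surface_temp_k=300.0, ambient_temp_k=295.0)
label = classify(w_cd / (0.7 * 800.0))
```

The point of the sketch is the synergy claim: neither `w_cd` nor the flux ratio can be computed from the thermal or visual image alone, since one supplies the absorbed flux and the other the surface temperature.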


Keywords: Heat Flux, Visual Image, Surface Heat Flux, Thermal Image, Visual Imagery





Copyright information

© Springer-Verlag Berlin Heidelberg 1990

Authors and Affiliations

  • N. Nandhakumar, Computer and Vision Research Center, College of Engineering, The University of Texas at Austin, Austin, USA
  • J. K. Aggarwal, Computer and Vision Research Center, College of Engineering, The University of Texas at Austin, Austin, USA
