
A Fault-Tolerant Distributed Vision System Architecture for Object Tracking in a Smart Room

  • Conference paper
Computer Vision Systems (ICVS 2001)

Part of the book series: Lecture Notes in Computer Science (LNCS, volume 2095)

Abstract

In recent years, distributed computer vision has gained considerable attention within the computer vision community for applications such as video surveillance and object tracking. The collective information gathered by multiple, strategically placed cameras has many advantages. For example, aggregating information from multiple viewpoints reduces uncertainty about the scene. Further, there is no single point of failure, so the system as a whole can continue to perform the task at hand even when individual sensors fail. However, the advantages of such cooperation can be realized only if the sensors share information in a timely manner. This paper discusses the design of a distributed vision system that enables several heterogeneous sensors with different processing rates to exchange information in a timely manner in order to achieve a common goal, such as tracking multiple human subjects and mobile robots in an indoor smart environment.
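The claim that aggregating multiple viewpoints reduces uncertainty can be illustrated with a standard textbook result (not taken from this paper): fusing two independent Gaussian estimates of the same quantity by inverse-variance weighting always yields a lower variance than either estimate alone. The function name and values below are illustrative only.

```python
def fuse(est_a, var_a, est_b, var_b):
    """Combine two independent Gaussian estimates of the same quantity
    by inverse-variance weighting; the fused variance is always smaller
    than either input variance."""
    w_a = 1.0 / var_a
    w_b = 1.0 / var_b
    fused = (w_a * est_a + w_b * est_b) / (w_a + w_b)
    fused_var = 1.0 / (w_a + w_b)
    return fused, fused_var

# Two cameras each estimate a target coordinate with variance 0.5;
# the fused estimate has variance 0.25, half that of either camera.
est, var = fuse(2.0, 0.5, 2.4, 0.5)
```

With equal input variances the fused estimate is simply the mean of the two measurements, and the variance halves; with unequal variances the more certain camera dominates the weighted average.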

In our fault-tolerant distributed vision system, a resource manager manages the individual cameras and buffers the time-stamped object candidates received from them. A user agent with a given task specification contacts the resource manager, first to learn which resources (cameras) are available and later to receive object candidates from the resources of interest. The resource manager thus acts as a proxy between user agents and cameras, freeing the cameras to perform dedicated feature detection and extraction only. In such a scenario, many failures are possible: a camera may suffer a hardware failure, or it may lose a target that has moved out of its field of view. In this context, the paper discusses important issues such as failure detection and handling, synchronization of data from multiple sensors, and sensor reconfiguration by view planning. Experimental results with real scene images are presented.
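The proxy role described above can be sketched as follows. This is a minimal illustration under our own assumptions, not the paper's actual API: all class, method, and camera names are hypothetical. Cameras push time-stamped object candidates to a resource manager, which buffers them per camera; user agents first query which cameras are available (a camera silent for longer than a timeout is presumed failed) and then pull candidates from the cameras of interest.

```python
from collections import defaultdict, deque

class ResourceManager:
    """Hypothetical sketch of the proxy between cameras and user agents."""

    def __init__(self, timeout=1.0, buffer_len=100):
        self.timeout = timeout  # seconds of silence before a camera is presumed failed
        self.buffers = defaultdict(lambda: deque(maxlen=buffer_len))
        self.last_seen = {}

    def report(self, camera_id, timestamp, candidate):
        """Called by a camera; buffers one time-stamped object candidate."""
        self.buffers[camera_id].append((timestamp, candidate))
        self.last_seen[camera_id] = timestamp

    def available_resources(self, now):
        """Cameras heard from within the timeout; the rest are treated as failed."""
        return [cid for cid, t in self.last_seen.items() if now - t <= self.timeout]

    def candidates_since(self, camera_id, since):
        """Called by a user agent; returns buffered candidates newer than `since`."""
        return [(t, c) for (t, c) in self.buffers[camera_id] if t > since]

rm = ResourceManager(timeout=1.0)
rm.report("cam0", 10.0, {"bbox": (5, 8)})
rm.report("cam1", 10.2, {"bbox": (6, 9)})
alive = rm.available_resources(now=10.5)   # both cameras within the timeout
failed = rm.available_resources(now=12.0)  # both silent too long: failure detected
```

The timeout-based liveness check is one simple failure-detection policy; it also shows why the time stamps on candidates matter, since a user agent aggregating data from cameras with different processing rates must align observations by time rather than by arrival order.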

This work is supported by DARPA/ITO Mobile Autonomous Robots S/W (MARS) (Contract Number DOD DABT63-99-1-004) and Software for Distributed Robotics (SDR) (Contract Number DOD DABT63-99-1-0022)




Copyright information

© 2001 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Karuppiah, D.R., Zhu, Z., Shenoy, P., Riseman, E.M. (2001). A Fault-Tolerant Distributed Vision System Architecture for Object Tracking in a Smart Room. In: Schiele, B., Sagerer, G. (eds) Computer Vision Systems. ICVS 2001. Lecture Notes in Computer Science, vol 2095. Springer, Berlin, Heidelberg. https://doi.org/10.1007/3-540-48222-9_14

  • DOI: https://doi.org/10.1007/3-540-48222-9_14

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-42285-3

  • Online ISBN: 978-3-540-48222-2
