
Visual Observation of a Moving Agent

  • Tarek M. Sobh
  • Ruzena Bajcsy
Chapter
Part of the Microprocessor-Based and Intelligent Systems Engineering book series (ISCA, volume 9)

Abstract

We address the problem of observing a moving agent. In particular, we propose a system for observing a manipulation process in which a robot hand manipulates an object. A discrete event dynamic system (DEDS) framework is developed for the hand/object interaction over time, and a stabilizing observer is constructed. Low-level modules are developed for recognizing the “events” that cause state transitions within the dynamic manipulation system. The work closely examines the possibilities for errors, mistakes and uncertainties in the manipulation system, the observer construction process and the event identification mechanisms. The system utilizes different tracking techniques in order to observe and recognize the task in an active, adaptive and goal-directed manner.
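To make the DEDS idea concrete, the following is a minimal sketch of an observer that tracks the set of plant states consistent with the events seen so far. The particular states and events (`approach`, `grasp`, `contact`, etc.) are illustrative placeholders, not the paper's actual manipulation model.

```python
# Minimal sketch of a discrete event dynamic system (DEDS) observer.
# The model below is a hypothetical hand/object interaction automaton,
# not the one developed in the paper.

GRASP_MODEL = {
    # state: {event: next_state}
    "approach": {"contact": "grasp"},
    "grasp":    {"lift": "transfer", "release": "approach"},
    "transfer": {"place": "approach"},
}

class Observer:
    """Maintains the set of states consistent with the observed events."""

    def __init__(self, model, possible_states):
        self.model = model
        self.possible = set(possible_states)

    def update(self, event):
        # Keep only states that admit `event`, advanced by its transition.
        self.possible = {
            self.model[s][event]
            for s in self.possible
            if event in self.model.get(s, {})
        }
        return self.possible

# Starting from full uncertainty, each recognized event narrows the estimate:
obs = Observer(GRASP_MODEL, GRASP_MODEL.keys())
obs.update("contact")   # only "approach" admits "contact" -> {"grasp"}
obs.update("lift")      # -> {"transfer"}
```

In this simplified picture, the low-level visual modules would play the role of the event recognizer feeding `update`, and a stabilizing observer is one whose possible-state set eventually shrinks to a singleton.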

Keywords

Image Flow; Displacement Error; Manipulation System; Robot Hand; Manipulation Action


References

  1. P. Anandan, “A Unified Perspective on Computational Techniques for the Measurement of Visual Motion”. In Proceedings of the 1st International Conference on Computer Vision, 1987.
  2. R. Bajcsy, “Active Perception”, Proceedings of the IEEE, Vol. 76, No. 8, August 1988.
  3. R. Bajcsy and T. M. Sobh, A Framework for Observing a Manipulation Process. Technical Report MS-CIS-90-34 and GRASP Lab. TR 216, University of Pennsylvania, June 1990.
  4. R. Bajcsy and T. M. Sobh, Observing a Moving Agent. Technical Report MS-CIS-91-01 and GRASP Lab. TR 247, Computer Science Dept., School of Engineering and Applied Science, University of Pennsylvania, January 1991.
  5. P. J. Burt, et al., “Object Tracking with a Moving Camera”, IEEE Workshop on Visual Motion, March 1989.
  6. B. K. P. Horn and B. G. Schunck, “Determining Optical Flow”, Artificial Intelligence, vol. 17, 1981, pp. 185–203.
  7. Y. Li and W. M. Wonham, “Controllability and Observability in the State-Feedback Control of Discrete-Event Systems”, Proc. 27th Conf. on Decision and Control, 1988.
  8. H. C. Longuet-Higgins and K. Prazdny, “The Interpretation of a Moving Retinal Image”, Proc. Royal Society of London B, 208, pp. 385–397.
  9. C. M. Ozveren, Analysis and Control of Discrete Event Dynamic Systems: A State Space Approach, Ph.D. Thesis, Massachusetts Institute of Technology, August 1989.
  10. P. J. Ramadge and W. M. Wonham, “Modular Feedback Logic for Discrete Event Systems”, SIAM Journal of Control and Optimization, September 1987.
  11. T. M. Sobh and K. Wohn, “Recovery of 3-D Motion and Structure by Temporal Fusion”. In Proceedings of the 2nd SPIE Conference on Sensor Fusion, November 1989.
  12. M. Subbarao and A. M. Waxman, On the Uniqueness of Image Flow Solutions for Planar Surfaces in Motion, CAR-TR-113, Center for Automation Research, University of Maryland, April 1985.
  13. S. Ullman, “Analysis of Visual Motion by Biological and Computer Systems”, IEEE Computer, August 1981.
  14. S. Ullman, Maximizing Rigidity: The Incremental Recovery of 3-D Structure from Rigid and Rubbery Motion, AI Memo 721, MIT AI Lab, 1983.

Copyright information

© Springer Science+Business Media Dordrecht 1991

Authors and Affiliations

  • Tarek M. Sobh (1)
  • Ruzena Bajcsy (1)
  1. GRASP Laboratory, Department of Computer and Information Science, School of Engineering and Applied Science, University of Pennsylvania, Philadelphia, USA
