Overview of the MVP Sensor Planning System for Robotic Vision Tasks

  • Konstantinos Tarabanis
  • Roger Y. Tsai
  • Peter K. Allen
Chapter
Part of the Microprocessor-Based and Intelligent Systems Engineering book series (ISCA, volume 9)

Abstract

In this paper, we present an overview of the MVP sensor planning system that we have developed. MVP automatically determines viewpoints for a robotic vision system so that object features of interest are simultaneously visible, inside the field of view, in focus, and magnified as required. We have analytically characterized the domain of admissible camera locations, orientations, and optical settings for which each of these feature detectability requirements is satisfied separately. In addition, we have posed the problem in an optimization setting in order to determine viewpoints that satisfy all requirements simultaneously and with a margin. Experimental results of this technique are shown with all of the above feature detectability constraints included. Camera views are then acquired from the computed viewpoints by a robot vision system whose position and lens settings are adjusted according to the results of this method.
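To illustrate the optimization setting mentioned above, the following is a minimal sketch, not the authors' MVP implementation: it maximizes the smallest margin by which a set of feature detectability constraints is satisfied over candidate camera positions. All constraint models, thresholds, and parameter names (FEATURE_RADIUS, MIN_PIX_PER_M, HALF_DOF, OCCLUDER_Z, and so on) are hypothetical placeholders standing in for MVP's analytical characterizations.

    # Toy sketch of viewpoint planning as constrained optimization -- NOT the
    # authors' MVP implementation.  Constraint models and numbers are assumed.
    import numpy as np
    from scipy.optimize import minimize

    FEATURE = np.zeros(3)           # feature of interest at the origin (assumed)
    FEATURE_RADIUS = 0.02           # feature extent in metres (assumed)
    HALF_FOV = np.deg2rad(15.0)     # half field-of-view angle of the lens (assumed)
    PIXELS = 640.0                  # image width in pixels (assumed)
    MIN_PIX_PER_M = 2000.0          # required magnification, pixels per metre (assumed)
    FOCUS_DIST = 0.5                # distance at which the lens is focused (assumed)
    HALF_DOF = 0.15                 # half depth of field about FOCUS_DIST (assumed)
    OCCLUDER_Z = 0.1                # crude visibility model: camera must stay above z = 0.1 (assumed)

    def margins(cam):
        """Normalized signed margins of four detectability constraints
        (field of view, resolution, focus, visibility); all must be positive
        for the viewpoint `cam` to be admissible."""
        r = np.linalg.norm(cam - FEATURE)                    # camera-to-feature range
        fov = (HALF_FOV - np.arctan2(FEATURE_RADIUS, r)) / HALF_FOV
        pix_per_m = PIXELS / (2.0 * r * np.tan(HALF_FOV))    # image scale at range r
        res = (pix_per_m - MIN_PIX_PER_M) / MIN_PIX_PER_M
        focus = (HALF_DOF - abs(r - FOCUS_DIST)) / HALF_DOF
        vis = cam[2] - OCCLUDER_Z
        return np.array([fov, res, focus, vis])

    # Epigraph trick: maximize a slack variable t subject to margin_i(cam) >= t,
    # so the solver pushes the worst constraint margin as high as possible.
    def objective(x):                                        # x = [cx, cy, cz, t]
        return -x[3]

    constraints = [{"type": "ineq", "fun": lambda x, i=i: margins(x[:3])[i] - x[3]}
                   for i in range(4)]

    x0 = np.array([0.1, 0.1, 0.4, 0.0])                      # feasible initial guess
    result = minimize(objective, x0, method="SLSQP", constraints=constraints)

    viewpoint, worst_margin = result.x[:3], result.x[3]
    print("viewpoint:", viewpoint, "worst-case margin:", worst_margin)

Maximizing the smallest normalized margin is just one simple way to realize "satisfy all requirements simultaneously and with a margin"; MVP's actual criterion and its analytical constraint characterizations are developed in the full paper.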

Keywords

Machine Vision, Camera View, Sensor Placement, Optical Setting, Occluded Region

Copyright information

© Springer Science+Business Media Dordrecht 1991

Authors and Affiliations

  • Konstantinos Tarabanis (1)
  • Roger Y. Tsai (2)
  • Peter K. Allen (1)

  1. Computer Science Department, Columbia University, New York, USA
  2. Manufacturing Research, IBM T.J. Watson Research Center, New York, USA
