An Evaluation of Camera Pose Methods for an Augmented Reality System: Application to Teaching Industrial Robots
In the automotive industry, industrial robots are widely used on production lines for tasks such as welding, painting, and assembly. Operating them requires users to master both manipulation and robot control. Recently, new tools have been developed to produce fast and accurate trajectories in many production sectors, either on a real vehicle prototype or on a generalized design within a virtual simulation platform. However, both approaches raise issues: the delay between the design of a vehicle and its production is often long, and virtual models do not render the real robot and vehicle faithfully, which can introduce localization inaccuracies in the performed trajectories. Our work is part of the TRI project (Teleteaching Industrial Robots), which aims to build a demonstrator in which industrial robots interact with virtual components, allowing users to be trained to perform their tasks successfully on a virtual representation of a production facility.
In this project we use Augmented Reality (AR) techniques to overlay virtual objects onto the real world in order to enhance the user's perception and interaction while performing a specific industrial task. The idea is to teach trajectories of an automotive task to the real robot using a virtual model of the vehicle. Pose accuracy is a prerequisite for our application, since it enables reliable teaching of the real trajectory. We therefore survey several vision-based pose computation algorithms and present a method that offers increased robustness and accuracy in the context of real-time AR tracking. Our aim is to assess the performance of these pose estimation methods in terms of error and distance evaluation. The methods were evaluated through a series of tests under an experimental protocol, and the analysis of the results characterizes the algorithms in terms of accuracy, stability, and convergence.
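To make the evaluation concrete, the following sketch illustrates one classical vision-based pose computation approach of the kind surveyed here: estimating a camera projection matrix from 2D–3D point correspondences (e.g. marker corners) via the Direct Linear Transform, then measuring accuracy as reprojection error in pixels. This is a minimal illustration with synthetic data, not the specific method evaluated in the paper; the intrinsics, pose, and point values are arbitrary assumptions.

```python
import numpy as np

def project(P, X):
    """Project 3-D points X (N,3) with a 3x4 projection matrix P."""
    Xh = np.hstack([X, np.ones((len(X), 1))])      # homogeneous coordinates
    x = (P @ Xh.T).T
    return x[:, :2] / x[:, 2:3]                    # perspective division

def dlt_pose(X, x):
    """Estimate a 3x4 projection matrix from >= 6 2D-3D correspondences
    (Direct Linear Transform: stack two linear equations per point,
    solve the homogeneous system by SVD)."""
    A = []
    for (Xw, Yw, Zw), (u, v) in zip(X, x):
        A.append([Xw, Yw, Zw, 1, 0, 0, 0, 0, -u*Xw, -u*Yw, -u*Zw, -u])
        A.append([0, 0, 0, 0, Xw, Yw, Zw, 1, -v*Xw, -v*Yw, -v*Zw, -v])
    _, _, Vt = np.linalg.svd(np.asarray(A))
    return Vt[-1].reshape(3, 4)                    # right singular vector

# Synthetic ground truth (assumed values): intrinsics K and extrinsics [R|t].
K = np.array([[800., 0., 320.],
              [0., 800., 240.],
              [0.,   0.,   1.]])
Rt = np.hstack([np.eye(3), np.array([[0.1], [-0.2], [2.0]])])
P_true = K @ Rt

X = np.random.default_rng(0).uniform(-1, 1, (8, 3))   # 3-D marker points
x = project(P_true, X)                                # noiseless observations
P_est = dlt_pose(X, x)

# Accuracy metric used throughout such evaluations: reprojection error.
err = np.abs(project(P_est, X) - x).max()
```

With noiseless correspondences the recovered matrix reprojects the points essentially exactly; adding image noise and comparing the resulting error, stability, and convergence across algorithms (iterative, analytic, hybrid) is the kind of protocol the evaluation above follows.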
Keywords: Augmented Reality, pose estimation, industrial robot, computer vision, real-time tracking