Abstract
In this chapter, a unified passivity-based visual servoing control structure considering a vision system mounted on the robot is presented. The controller is suitable for robotic arms, mobile robots, and mobile manipulators alike. The proposed control law enables the robot to track a moving target in its workspace. Taking advantage of the passivity properties of the control system and assuming exact knowledge of the target velocity, the asymptotic convergence of the control errors to zero is proved. A robustness analysis based on L2-gain performance is then carried out, proving that the control errors remain ultimately bounded even when bounded errors exist in the estimation of the target velocity. Both numerical simulation and experimental results illustrate the performance of the algorithm on a robotic manipulator, a mobile robot, and a mobile manipulator.
Abbreviations
- DOF: Degree of freedom
References
Basaca-Preciado, L. C., Sergiyenko, O. Y., Rodríguez-Quinonez, J. C., Garcia, X., Tyrsa, V. V., Rivas-Lopez, M., Hernandez-Balbuena, D., Mercorelli, P., Podrygalo, M., Gurko, A., Tabakova, I., & Starostenko, O. (2014). Optical 3D laser measurement system for navigation of autonomous mobile robot. Optics and Lasers in Engineering, 54, 159–169.
Toibero, J. M., Roberti, F., & Carelli, R. (2009). Stable contour-following control of wheeled mobile robots. Robotica, 27(1), 1–12.
Toibero, J. M., Roberti, F., Carelli, R., & Fiorini, P. (2011). Switching control approach for stable navigation of mobile robots in unknown environments. Robotics and Computer Integrated Manufacturing, 27(3), 558–568.
Rodríguez-Quiñonez, J. C., Sergiyenko, O., Flores-Fuentes, W., Rivas-Lopez, M., Hernandez-Balbuena, D., Rascón, R., & Mercorelli, P. (2017). Improve a 3D distance measurement accuracy in stereo vision systems using optimization methods’ approach. Opto-Electronics Review, 25(1), 24–32.
Toibero, J. M., Soria, C., Roberti, F., Carelli, R., & Fiorini, P. (2009). Switching visual servoing approach for stable corridor navigation. In Proceedings of the International Conference on Advanced Robotics, Munich, Germany, 22–26 June 2009.
Traslosheros, A., Sebastián, J. M., Torrijos, J., Carelli, R., & Roberti, F. (2014). Using a 3DOF parallel robot and a spherical bat to hit a Ping-Pong ball. International Journal of Advanced Robotic Systems, 11(5). https://doi.org/10.5772/58526.
Weiss, L. E., Sanderson, A. C., & Neuman, C. P. (1987). Dynamic sensor-based control of robots with visual feedback. IEEE Journal of Robotics and Automation, 3(5), 404–417.
Carelli, R., De la Cruz, C., & Roberti, F. (2006). Centralized formation control of non-holonomic mobile robots. Latin American Applied Research, 36(2), 63–69.
Gong, Z., Tao, B., Yang, H., Yin, Z., & Ding, H. (2018). An uncalibrated visual servo method based on projective homography. IEEE Transactions on Automation Science and Engineering, 15(2), 806–817.
López-Nicolás, G., Guerrero, J. J., & Sagüés, C. (2010). Visual control of vehicles using two-view geometry. Mechatronics, 20(2), 315–325.
Carelli, R., Kelly, R., Nasisi, O. H., Soria, C., & Mut, V. (2006). Control based on perspective lines of a nonholonomic mobile robot with camera-on-board. International Journal of Control, 79, 362–371.
Wang, H., Guo, D., Xu, H., Chen, W., Liu, T., & Leang, K. K. (2017). Eye-in-hand tracking control of a free-floating space manipulator. IEEE Transactions on Aerospace and Electronic Systems, 53(4), 1855–1865.
Carelli, R., Santos-Victor, J., Roberti, F., & Tosetti, S. (2006). Direct visual tracking control of remote cellular robots. Robotics and Autonomous Systems, 54(10), 805–814.
Taryudi, & Wang, M. S. (2017). 3D object pose estimation using stereo vision for object manipulation system. In Proceedings of the IEEE International Conference on Applied System Innovation, Sapporo, Japan, 13–17 May 2017.
López-Nicolás, G., Guerrero, J. J., & Sagüés, C. (2010). Visual control through the trifocal tensor for nonholonomic robots. Robotics and Autonomous Systems, 58(2), 216–226.
Andaluz, V., Carelli, R., Salinas, L., Toibero, J. M., & Roberti, F. (2012). Visual control with adaptive dynamical compensation for 3D target tracking by mobile manipulators. Mechatronics, 22(4), 491–502.
Roberti, F., Toibero, J. M., Soria, C., Vassallo, R., & Carelli, R. (2009). Hybrid collaborative stereo vision system for mobile robots formation. International Journal of Advanced Robotic Systems, 6(4), 257–266.
Zhang, K., Chen, J., Li, Y., & Gao, Y. (2018). Unified visual servoing tracking and regulation of wheeled mobile robots with an uncalibrated camera. IEEE/ASME Transactions on Mechatronics, 23(4), 1728–1739.
Fujita, M., Kawai, H., & Spong, M. W. (2007). Passivity-based dynamic visual feedback control for three dimensional target tracking: Stability and L2-gain performance analysis. IEEE Transactions on Control Systems Technology, 15(1), 40–52.
Kawai, H., Murao, T., & Fujita, M. (2006). Image-based dynamic visual feedback control via passivity approach. In Proceedings of the IEEE International Conference on Control Applications, Munich, Germany, 4–6 October 2006.
Murao, T., Kawai, H., & Fujita, M. (2005). Passivity-based dynamic visual feedback control with a movable camera. In Proceedings of the 44th IEEE International Conference on Decision and Control, Sevilla, Spain, 12–15 December 2005.
Soria, C., Roberti, F., Carelli, R., & Sebastián, J. M. (2008). Control Servo-visual de un robot manipulador planar basado en pasividad. Revista Iberoamericana de Automática e Informática Industrial, 5(4), 54–61.
Martins, F., Sarcinelli, M., Freire Bastos, T., & Carelli, R. (2008). Dynamic modeling and trajectory tracking control for unicycle-like mobile robots. In Proceedings of the 3rd International Symposium on Multibody Systems and Mechatronics, San Juan, Argentina, 8–12 April 2008.
Andaluz, V., Roberti, F., Salinas, L., Toibero, J. M., & Carelli, R. (2015). Passivity-based visual feedback control with dynamic compensation of mobile manipulators: Stability and L2-gain performance analysis. Robotics and Autonomous Systems, 66, 64–74.
Morales, B., Roberti, F., Toibero, J. M., & Carelli, R. (2012). Passivity based visual servoing of mobile robots with dynamics compensation. Mechatronics, 22(4), 481–490.
El-Hawwary, M. I., & Maggiore, M. (2008). Global path following for the unicycle and other results. In Proceedings of the American Control Conference, Seattle, Washington, 11–13 June 2008.
Lee, D. (2007). Passivity-based switching control for stabilization of wheeled mobile robots. In Proceedings of the Robotics: Science and Systems, Atlanta, Georgia, 27–30 June 2007.
Arcak, M. (2007). Passivity as a design tool for group coordination. IEEE Transactions on Automatic Control, 52(8), 1380–1390.
Igarashi, Y., Hatanaka, T., Fujita, M., & Spong, M. W. (2007). Passivity-based 3D attitude coordination: Convergence and connectivity. In Proceedings of the IEEE Conference on Decision and Control, New Orleans, LA, 10–11 December 2007.
Ihle, I., Arcak, M., & Fossen, T. (2007). Passivity-based designs for synchronized path-following. Automatica, 43(9), 1508–1518.
Spong, M. W., Holm, J. K., & Lee, D. J. (2007). Passivity-based control of biped locomotion. IEEE Robotics & Automation Magazine, 14(2), 30–40.
Fujita, M., Hatanaka, T., Kobayashi, N., Ibuki, T., & Spong, M. (2009). Visual motion observer-based pose synchronization: A passivity approach. In Proceedings of the IEEE International Conference on Decision and Control, Shanghai, China, 16–18 December 2009.
Kawai, H., Murao, T., & Fujita, M. (2011). Passivity-based visual motion observer with panoramic camera for pose control. Journal of Intelligent & Robotic Systems, 64(3–4), 561–583.
Hu, Y. M., & Guo, B. H. (2004). Modeling and motion planning of a three-link wheeled mobile manipulator. In Proceedings of the International Conference on Control, Automation, Robotics and Vision, Kunming, China, 6–9 December 2004.
Andaluz, V., Roberti, F., Toibero, J. M., & Carelli, R. (2012). Adaptive unified motion control of mobile manipulators. Control Engineering Practice, 20(12), 1337–1352.
Hutchinson, S., Hager, G., & Corke, P. (1996). A tutorial on visual servo control. IEEE Transactions on Robotics and Automation, 12(5), 651–670.
Hill, D., & Moylan, P. (1976). Stability results for nonlinear feedback systems. Automatica, 13, 373–382.
Byrnes, C. I., Isidori, A., & Willems, J. C. (1991). Passivity, feedback equivalence, and the global stabilization of minimum phase nonlinear systems. IEEE Transactions on Automatic Control, 36(11), 1228–1240.
Ortega, R., Loria, A., Kelly, R., & Praly, L. (1995). On passivity-based output feedback global stabilization of Euler-Lagrange systems. International Journal of Robust and Nonlinear Control, 5, 313–324.
Vidyasagar, M. (1979). New passivity-type criteria for large-scale interconnected systems. IEEE Transactions on Automatic Control, 24, 575–579.
Vidyasagar, M. (1978). Nonlinear systems analysis. Englewood Cliffs, NJ: Prentice Hall International Editions.
Bayle, B., & Fourquet, J. Y. (2001). Manipulability analysis for mobile manipulators. In Proceedings of the IEEE International Conference on Robotics and Automation, Seoul, Korea, May 2001.
Kalata, P. (1984). The tracking index: A generalized parameter for α-β and α-β-γ target trackers. IEEE Transactions on Aerospace and Electronic Systems, 20(2), 174–182.
Van der Schaft, A. (2000). L2 gain and passivity techniques in nonlinear control. London: Springer.
Das, A. K., Fierro, R., Kumar, V., Ostrowski, J. P., Spletzer, J., & Taylor, C. J. (2002). A vision-based formation control framework. IEEE Transactions on Robotics and Automation, 18(5), 813–825.
Zulli, R., Fierro, R., Conte, G., & Lewis, F. L. (1995). Motion planning and control for nonholonomic mobile robots. In Proceedings of the IEEE International Symposium on Intelligent Control, CA, 27–29 August 1995.
Appendices
Appendix 1
1.1 Mobile Robot Model
This chapter considers a unicycle-like robot, consisting of two self-driven wheels located on the same axle and a castor wheel, as Fig. 12.37 shows. Therefore, if the robot is considered as a concentrated mass placed at the point C in the middle of the wheels' axle, the kinematic model that represents the pose of the robot in the plane is
\( \dot{x}=u\cos \phi, \kern1em \dot{y}=u\sin \phi, \kern1em \dot{\phi}=\omega \)
where (x, y) represents the position of the robot, ϕ represents the robot’s orientation, u is the linear velocity of the robot, and ω is the angular velocity of the robot.
This type of robot has a non-holonomic constraint given by
\( \dot{x}\sin \phi -\dot{y}\cos \phi =0 \)
This restriction states that the robot cannot move laterally; it can only navigate in the direction perpendicular to the wheels' axle.
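The kinematic model and its constraint can be checked numerically. The following sketch (plain Python, an illustration rather than code from the chapter) integrates the unicycle model with Euler steps and verifies that the velocity satisfies the non-holonomic constraint at every step:

```python
import math

def unicycle_step(x, y, phi, u, w, dt):
    """One Euler step of the unicycle kinematic model:
    x_dot = u*cos(phi), y_dot = u*sin(phi), phi_dot = w."""
    return (x + u * math.cos(phi) * dt,
            y + u * math.sin(phi) * dt,
            phi + w * dt)

# Drive straight along the heading (u = 1 m/s, w = 0); the constraint
# x_dot*sin(phi) - y_dot*cos(phi) = 0 holds at every step.
x, y, phi = 0.0, 0.0, math.pi / 4
for _ in range(100):
    xd = 1.0 * math.cos(phi)
    yd = 1.0 * math.sin(phi)
    assert abs(xd * math.sin(phi) - yd * math.cos(phi)) < 1e-12
    x, y, phi = unicycle_step(x, y, phi, 1.0, 0.0, 0.01)
```

After 1 s the robot has advanced one meter along the 45° heading, with equal displacements in x and y, as the constraint predicts.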
1.2 Feature Selection
Without loss of generality for the control laws proposed in this chapter, a cylindrical object is selected, defining the vector of image features as \( \boldsymbol{\upxi} ={\left[\begin{array}{cc}{\xi}_1& {\xi}_2\end{array}\right]}^{\mathrm{T}}={\left[\begin{array}{cc}{x}_{\mathrm{m}}& {d}_{\mathrm{m}}\end{array}\right]}^{\mathrm{T}} \), where \( {x}_{\mathrm{m}} \) is the projection of the x-coordinate of the cylinder's middle point on the image plane, and \( {d}_{\mathrm{m}} \) represents the projection of the actual width D of the cylinder on the image plane [25]. This feature definition is depicted in Fig. 12.38. According to the camera projection model (12.4), the image features are straightforwardly obtained as
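The projection relations themselves are not reproduced in this excerpt. Purely as an illustration, and assuming a standard pinhole camera with focal length f in pixels (an assumption, since model (12.4) is not shown here), the two features can be sketched as follows:

```python
def image_features(f, xc, zc, D):
    """Pinhole projection of the cylinder midpoint and width (a sketch,
    not the chapter's model (12.4)).

    f      : focal length in pixels (hypothetical parameter)
    xc, zc : midpoint coordinates in the camera frame, zc > 0 (in front)
    D      : actual cylinder width
    Returns (x_m, d_m): projected midpoint x-coordinate and projected
    width, using the first-order approximation d_m = f*D/zc.
    """
    x_m = f * xc / zc
    d_m = f * D / zc
    return x_m, d_m

# A 0.1 m wide cylinder, 2 m ahead and 0.2 m to the side, f = 500 px:
x_m, d_m = image_features(500.0, 0.2, 2.0, 0.1)  # -> (50.0, 25.0)
```

Note how both features shrink as the target moves away, which is what makes \( {d}_{\mathrm{m}} \) usable as a depth cue.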
Now, the problem consists of obtaining the vision system model. This model has to describe the time variation of the image features \( \dot{\boldsymbol{\upxi}} \) as a function of both the robot motion \( {\left[\begin{array}{cc}u& \omega \end{array}\right]}^{\mathrm{T}} \) and the target velocity \( {\mathbf{v}}_{\mathrm{T}} \). For this purpose, the pose of the target on the \( {X}_{\mathrm{mc}}-{Z}_{\mathrm{mc}} \) plane with respect to the vision system can be written as a function of the distance d and the angle φ, defined as depicted in Fig. 12.39, as follows:
Substituting (12.44) into (12.43), the vector of image features is expressed as a function of the relative pose between the target and the camera
and the time derivative of (12.45) is
with
The variation in relative position between the robot and the target is due to both the robot motion and the target motion: \( {\left[\begin{array}{cc}\dot{\varphi}& \dot{d}\end{array}\right]}^{\mathrm{T}}={\left[\begin{array}{cc}\dot{\varphi}& \dot{d}\end{array}\right]}_{\mathrm{R}}^{\mathrm{T}}+{\left[\begin{array}{cc}\dot{\varphi}& \dot{d}\end{array}\right]}_{\mathrm{T}}^{\mathrm{T}} \). Now, from the kinematic model of the mobile robot in polar coordinates (see Fig. 12.39) and first considering a static object, the time variation of the relative posture between the target and the robot (time variation of d and φ) is obtained as a function of the linear and angular velocities of the robot \( \boldsymbol{\upmu} ={\left[\begin{array}{cc}u& \omega \end{array}\right]}^{\mathrm{T}} \) as follows:
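Since (12.48) and Fig. 12.39 are not reproduced in this excerpt, the sign conventions below are an assumption; the sketch uses one common polar-coordinate form of the robot-to-target kinematics for a static target:

```python
import math

def relative_polar_rates(u, w, d, phi):
    """Rates of the robot-to-target polar coordinates for a STATIC target,
    under one common convention (assumed, not taken from Fig. 12.39):
    phi is the target bearing measured from the robot heading, d > 0.
        d_dot   = -u*cos(phi)
        phi_dot =  u*sin(phi)/d - w
    """
    return -u * math.cos(phi), u * math.sin(phi) / d - w

# Driving straight at a target dead ahead (phi = 0) shrinks d at the
# robot's linear velocity and leaves the bearing unchanged:
d, phi = 5.0, 0.0
for _ in range(100):
    dd, pd = relative_polar_rates(1.0, 0.0, d, phi)
    d, phi = d + dd * 0.01, phi + pd * 0.01
```

With u = 1 m/s for 1 s, the distance drops from 5 m to 4 m while the bearing stays at zero, matching the intuition behind (12.48).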
Now consider a static position for the robot, that is, constant values for x, y, ϕ; obtaining the time derivative of (12.44), the target velocity \( {\mathbf{v}}_{\mathrm{T}}={\left[\begin{array}{cc}{\dot{x}}_{\mathrm{T}\mathrm{mc}}& {\dot{z}}_{\mathrm{T}\mathrm{mc}}\end{array}\right]}^{\mathrm{T}}=\mathbf{A}{\left[\begin{array}{cc}\dot{\varphi}& \dot{d}\end{array}\right]}_{\mathrm{T}}^{\mathrm{T}} \) is expressed as
Then, as A is invertible, the following expression can be obtained
Finally, by introducing the motions of both the robot (12.48) and the target (12.50) into (12.46), the model of the vision system is obtained
Defining
a compact form for the vision system model is obtained
where
Appendix 2
The formal definitions associated with the passivity of operators on functional spaces, as used in this work, follow [44].
Definition 1
\( {L}_p \) signal spaces.

For all \( p\in \left[1,\infty \right) \), the \( {L}_p \) signal spaces are defined as
\( {L}_p=\left\{f:{\mathbb{R}}_{+}\to \mathbb{R}\ \left|\ {\int}_0^{\infty }{\left|f(t)\right|}^p dt<\infty \right.\right\} \)
\( {L}_p \) spaces are Banach spaces with respect to the norm
\( {\left\Vert f\right\Vert}_p={\left({\int}_0^{\infty }{\left|f(t)\right|}^p dt\right)}^{1/p} \)
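As a concrete check of this definition, the norm integral can be approximated numerically. The sketch below (an illustration, not chapter code) computes the \( {L}_2 \) norm of \( {e}^{-t} \), whose exact value is \( \sqrt{1/2} \):

```python
import math

def lp_norm(f, T, n, p):
    """Approximate the L_p norm of f over [0, T] by a left Riemann sum
    with n subintervals: (integral of |f(t)|^p dt)^(1/p)."""
    dt = T / n
    total = sum(abs(f(i * dt)) ** p * dt for i in range(n))
    return total ** (1.0 / p)

# ||e^{-t}||_2 over [0, inf) is sqrt(1/2); truncating the integral at
# T = 20 and using a fine grid gets very close:
val = lp_norm(lambda t: math.exp(-t), 20.0, 200000, 2)
```

The same routine with larger p illustrates how the \( {L}_p \) norms weight large signal values more heavily as p grows.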
Definition 2
\( {L}_{\infty } \) signal space.

The \( {L}_{\infty } \) signal space is defined as
\( {L}_{\infty }=\left\{f:{\mathbb{R}}_{+}\to \mathbb{R}\ \left|\ \underset{t\ge 0}{\mathrm{ess}\ \sup }\left|f(t)\right|<\infty \right.\right\} \)
\( {L}_{\infty } \) is a Banach space with respect to the norm
\( {\left\Vert f\right\Vert}_{\infty }=\underset{t\ge 0}{\mathrm{ess}\ \sup }\left|f(t)\right| \)
Definition 3
Truncated function.
Let \( f:{\mathbb{R}}_{+}\to \mathbb{R} \). Then, for each \( T\in {\mathbb{R}}_{+} \), the function \( {f}_T:{\mathbb{R}}_{+}\to \mathbb{R} \) is defined by
\( {f}_T(t)=\left\{\begin{array}{ll}f(t),& t\le T\\ {}0,& t>T\end{array}\right. \)
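Truncation is what makes the extended spaces of Definition 4 useful: a signal with unbounded energy can still have every truncation in \( {L}_p \). A minimal sketch:

```python
def truncate(f, T):
    """Return the truncated function f_T: equal to f on [0, T], zero
    afterwards. Signals that are not in L_2 (e.g. a ramp) still have
    finite-energy truncations, hence belong to the extended space L_2e."""
    return lambda t: f(t) if t <= T else 0.0

ramp = lambda t: t      # unbounded energy: not in L_2
r5 = truncate(ramp, 5.0)  # but r5 has finite L_2 norm
```

Here `ramp` and `r5` are hypothetical names used only for this illustration.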
Definition 4
Extended L p signal spaces.
The extended \( {L}_p \) spaces are defined as
\( {L}_{p\mathrm{e}}=\left\{f:{\mathbb{R}}_{+}\to \mathbb{R}\ \left|\ {f}_T\in {L}_p\ \mathrm{for}\ \mathrm{all}\ T\in {\mathbb{R}}_{+}\right.\right\} \)
Definition 5
Given \( g,h\in {L}_{2\mathrm{e}} \), the inner product and the norm \( {\left\Vert \bullet \right\Vert}_{2\mathrm{e}} \) in the set \( {L}_{2\mathrm{e}} \) are defined as
\( {\left\langle g,h\right\rangle}_T={\int}_0^T{g}^{\mathrm{T}}(t)h(t) dt,\kern1em {\left\Vert g\right\Vert}_{2\mathrm{e}}=\sqrt{{\left\langle g,g\right\rangle}_T} \)
Definition 6
Let \( G:{L}_{2\mathrm{e}}\to {L}_{2\mathrm{e}} \) be an input-output mapping. Then, G is passive if there exists some constant \( \beta \in \mathbb{R} \) such that
\( {\left\langle Gu,u\right\rangle}_T\ge \beta \kern1em \forall u\in {L}_{2\mathrm{e}},\kern0.5em \forall T\in {\mathbb{R}}_{+} \)
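The textbook example of a passive mapping is the integrator \( y(t)={\int}_0^t u(s) ds \), for which \( {\left\langle Gu,u\right\rangle}_T=\frac{1}{2}{y}^2(T)\ge 0 \), so the definition holds with β = 0. A numerical sketch (an illustration, not chapter code):

```python
import math

def passivity_inner_product(u, T, n):
    """Approximate <Gu, u>_T for the integrator G: y(t) = int_0^t u(s) ds,
    on a grid of n points. For the integrator this equals y(T)^2/2 plus a
    nonnegative discretization term, so it is never negative."""
    dt = T / n
    y, acc = 0.0, 0.0
    for i in range(n):
        ui = u(i * dt)
        y += ui * dt          # integrator output
        acc += y * ui * dt    # accumulate the inner product <y, u>
    return acc

# Even for a sign-changing input the inner product stays nonnegative:
val = passivity_inner_product(lambda t: math.sin(3 * t) - 0.5, 10.0, 100000)
```

Trying other inputs and horizons T leaves `val` nonnegative, which is exactly the β = 0 passivity bound of Definition 6.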
Definition 7
Let \( G:{L}_{2\mathrm{e}}\to {L}_{2\mathrm{e}} \) be an input-output mapping. Then, G is strictly input passive if there exist constants \( \beta \in \mathbb{R} \) and \( \delta >0 \) such that
\( {\left\langle Gu,u\right\rangle}_T\ge \delta {\left\Vert u\right\Vert}_{2\mathrm{e}}^2+\beta \kern1em \forall u\in {L}_{2\mathrm{e}},\kern0.5em \forall T\in {\mathbb{R}}_{+} \)
Definition 8
Let \( G:{L}_{2\mathrm{e}}\to {L}_{2\mathrm{e}} \) be an input-output mapping. Then, G is strictly output passive if there exist constants \( \beta \in \mathbb{R} \) and \( \delta >0 \) such that
\( {\left\langle Gu,u\right\rangle}_T\ge \delta {\left\Vert Gu\right\Vert}_{2\mathrm{e}}^2+\beta \kern1em \forall u\in {L}_{2\mathrm{e}},\kern0.5em \forall T\in {\mathbb{R}}_{+} \)
© 2020 Springer Nature Switzerland AG
Roberti, F., Toibero, J.M., Sarapura, J.A., Andaluz, V., Carelli, R., Sebastián, J.M. (2020). Unified Passivity-Based Visual Control for Moving Object Tracking. In: Sergiyenko, O., Flores-Fuentes, W., Mercorelli, P. (eds) Machine Vision and Navigation. Springer, Cham. https://doi.org/10.1007/978-3-030-22587-2_12
Print ISBN: 978-3-030-22586-5
Online ISBN: 978-3-030-22587-2