Trade-off between resolution and frame rate of visual tracking of mini-robots on an experimental planar platform

Abstract

Accurate and fast visual localization is required in many applications of mini-robotics. Obtaining the best possible result for a given platform requires a balanced combination of camera settings and efficient processing of the acquired images. In this paper, we study the trade-off between a high-resolution and a high-speed acquisition mode of a conventional camera on an experimental platform for magnetically propelled mini-robots. Specifically, we propose a two-stage localization algorithm: a fast pre-location based on block matching, followed by an optical-flow correction that provides subpixel localization accuracy. In the experimental evaluation, we show that the localization difference between two images of the same scene at resolutions of 1000 × 1000 px and 200 × 200 px is only 0.2 px of the higher resolution, while the computational cost at the lower resolution is 16 times lower. This allows more accurate localization at a higher frame rate, which significantly improves the dynamics of control. Experimental results demonstrate the precision and speed of the proposed algorithm in the task of tracking a magnetically propelled robot on the platform.
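The two-stage pipeline described above can be sketched as follows. This is a minimal NumPy illustration under simplifying assumptions, not the paper's implementation (which is written in MATLAB): the pre-location stage uses a plain sum-of-squared-differences criterion on a sparse grid, and the subpixel correction is condensed to a single Lucas-Kanade-style least-squares step. All function names are illustrative.

```python
import numpy as np

def block_match(frame, template, step):
    """Coarse pre-location: SSD block matching on a sparse grid of
    candidate positions spaced `step` pixels apart."""
    th, tw = template.shape
    best, best_pos = np.inf, (0, 0)
    for y in range(0, frame.shape[0] - th + 1, step):
        for x in range(0, frame.shape[1] - tw + 1, step):
            ssd = np.sum((frame[y:y + th, x:x + tw] - template) ** 2)
            if ssd < best:
                best, best_pos = ssd, (y, x)
    return best_pos

def lk_refine(frame, template, y, x):
    """Subpixel correction: one Lucas-Kanade-style step that solves a
    linear least-squares problem for the residual shift (dy, dx)."""
    th, tw = template.shape
    patch = frame[y:y + th, x:x + tw].astype(float)
    gy, gx = np.gradient(patch)                     # image gradients
    e = (template.astype(float) - patch).ravel()    # intensity residual
    A = np.stack([gy.ravel(), gx.ravel()], axis=1)
    dy, dx = np.linalg.lstsq(A, e, rcond=None)[0]
    return y + dy, x + dx
```

The `step` parameter of the coarse stage corresponds to the experimentally chosen step-size formula discussed in the notes; a larger step trades localization robustness for a proportionally cheaper search.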



Notes

  1. The empty scene should either be updated in every run, or the lighting conditions must not change rapidly; otherwise, the detected maximum does not correspond to the robot's position.

  2. The step size is given by the experimentally chosen formula round(size(robot_template,1)/10) + 2, which gave the best results in both high and low resolutions.

  3. 1 px is equal to 45 µm.
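The step-size formula and the pixel-to-micrometre conversion from the notes above can be written as follows. This is a hedged Python transcription of the MATLAB expression; note that Python's round() uses banker's rounding while MATLAB's round() rounds halves away from zero, so results differ only for exact .5 cases.

```python
def block_match_step(template_height: int) -> int:
    """Grid step for the coarse block-matching search, per note 2:
    round(size(robot_template,1)/10) + 2."""
    return round(template_height / 10) + 2

def px_to_um(px: float) -> float:
    """Convert a pixel distance to micrometres (note 3: 1 px = 45 um)."""
    return px * 45.0
```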


Acknowledgements

This research has been supported by the Ministry of Education, Youth and Sports of the Czech Republic under the RICE New Technologies and Concepts for Smart Industrial Systems, project No. LO1607 and by the University of West Bohemia under the project SGS-2018-043.

Author information


Corresponding author

Correspondence to Martin Juřík.

Appendix

The algorithms were written in MATLAB 2019a because of its fast development cycle and easy integration with our other algorithms (e.g., the control algorithms). Using a lower-level programming language would probably increase the overall speed. All tests were run on a computer equipped with an Intel Core i5-7200U CPU and 16 GB of RAM, running Windows 10 (64-bit). The camera used was a Logitech Brio with the following resolution/frame-rate modes: UHD at 30 FPS, FHD at 60 FPS, and HD at 90 FPS.


Cite this article

Juřík, M., Šmídl, V. & Mach, F. Trade-off between resolution and frame rate of visual tracking of mini-robots on an experimental planar platform. J Micro-Bio Robot (2020). https://doi.org/10.1007/s12213-020-00134-3


Keywords

  • Visual localization
  • Block matching
  • Optical flow
  • Mini-robots