Robust Laparoscopic Instruments Tracking Using Colored Strips
Surgical simulators are used to assist surgeons in acquiring the skills required for the proper execution of laparoscopic procedures. During simulator training it is useful to provide a quantitative evaluation of surgical performance. Recent research has shown that such an evaluation can be obtained by tracking the laparoscopic instruments using only the images provided by the laparoscope, without hindering the surgical scene. In this work the state-of-the-art method is improved so that robust tracking runs even against the noisy background provided by realistic simulators. The method was validated by comparison with the tracking of a chessboard pattern, and further tests were performed to check the robustness of the developed algorithm. Despite the noisy environment, the implemented method was able to track the tip of the surgical instrument with good accuracy compared to other studies in the literature.
Keywords: Optical tracking · Single camera · Laparoscopic training · Surgical simulation · Surgical performance evaluation
This research work was supported by the VALVETECH project, FAS fund – Tuscany Region (realization of a newly developed polymeric aortic valve, implantable through a robotic platform with minimally invasive surgical techniques), and by the SThARS project, grant “Ricerca finalizzata e Giovani ricercatori 2011–2012”, Young Researchers – Italian Ministry of Health (surgical training in the identification and isolation of deformable tubular structures with hybrid augmented reality simulation).
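The abstract describes tracking the instrument tip from laparoscope images by segmenting colored strips attached to the instrument shaft. As a purely illustrative sketch (not the authors' algorithm), colour-based segmentation of a strip can be reduced to thresholding each frame against a per-channel colour range and taking the centroid of the matching pixels; the function name `track_strip`, the RGB colour space, and the fixed thresholds are all assumptions for this example:

```python
import numpy as np

def track_strip(frame, lo, hi):
    """Return the centroid (row, col) of the pixels whose RGB values
    fall within [lo, hi] per channel, or None if no pixel matches.

    frame: (H, W, 3) uint8 image; lo, hi: length-3 colour bounds.
    This naive global centroid assumes the strip is the dominant
    region in the colour range; a real system would also need
    connected-component filtering to reject background speckle.
    """
    mask = np.all((frame >= lo) & (frame <= hi), axis=-1)
    ys, xs = np.nonzero(mask)
    if ys.size == 0:
        return None
    return (ys.mean(), xs.mean())

# Hypothetical usage: a synthetic frame with one green strip.
frame = np.zeros((100, 100, 3), dtype=np.uint8)
frame[40:60, 20:30] = (0, 200, 0)
centroid = track_strip(frame,
                       np.array([0, 150, 0]),
                       np.array([50, 255, 50]))
```

In practice a fixed RGB threshold is fragile under the varying illumination of a surgical scene, which is one reason the noisy background of realistic simulators makes robust tracking non-trivial.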