Objective classification of psychomotor laparoscopic skills of surgeons based on three different approaches

  • Fernando Pérez-Escamirosa
  • Antonio Alarcón-Paredes
  • Gustavo Adolfo Alonso-Silverio
  • Ignacio Oropesa
  • Oscar Camacho-Nieto
  • Daniel Lorias-Espinoza
  • Arturo Minor-Martínez
Original Article

Abstract

Background

Determining surgeons’ psychomotor skills in minimally invasive surgery is a major concern of surgical training programs in many hospitals. It is therefore important to objectively assess and classify the level of experience of surgeons and residents during their training. The aim of this study was to investigate three classification methods for automatically establishing surgeons’ level of surgical competence based on their psychomotor laparoscopic skills.

Methods

A total of 43 participants, divided into an experienced group of ten expert surgeons (> 100 laparoscopic procedures performed) and a non-experienced group of 24 residents and nine medical students (< 10 laparoscopic procedures performed), performed three tasks in the EndoViS training system. Motion data of the instruments were captured with a video-tracking system built into the EndoViS simulator and analyzed using 13 motion analysis parameters (MAPs). Radial basis function networks (RBFNets), K-star (K*), and random forest (RF) were used to classify surgeons based on the MAP scores of all participants. The performance of the three classifiers was examined using hold-out and leave-one-out validation techniques.
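
The abstract does not name the 13 MAPs; as an illustration only, the sketch below computes a few motion analysis parameters commonly reported in this literature (task time, path length, average speed, and economy of movement) from a sampled 3D instrument trajectory. The function and the choice of parameters are assumptions for illustration, not the authors' implementation.

```python
import numpy as np

def motion_analysis_parameters(positions, timestamps):
    """Compute a few illustrative motion analysis parameters (MAPs)
    from a sampled 3D instrument-tip trajectory.

    positions  : (N, 3) array of x, y, z coordinates (e.g. in mm)
    timestamps : (N,) array of acquisition times in seconds
    """
    positions = np.asarray(positions, dtype=float)
    timestamps = np.asarray(timestamps, dtype=float)

    # Task time: elapsed time between the first and last sample.
    task_time = timestamps[-1] - timestamps[0]

    # Path length: sum of Euclidean distances between consecutive samples.
    steps = np.diff(positions, axis=0)
    path_length = np.linalg.norm(steps, axis=1).sum()

    # Average speed of the instrument tip over the task.
    avg_speed = path_length / task_time if task_time > 0 else 0.0

    # Economy of movement: straight-line distance over distance travelled
    # (values closer to 1 indicate more direct motion).
    straight_line = np.linalg.norm(positions[-1] - positions[0])
    economy = straight_line / path_length if path_length > 0 else 0.0

    return {
        "task_time_s": task_time,
        "path_length": path_length,
        "avg_speed": avg_speed,
        "economy_of_movement": economy,
    }
```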

Results

For all three tasks, the K-star method was superior in terms of accuracy and AUC in both validation techniques. The mean accuracy of the classifiers was 93.33% for K-star, 87.58% for RBFNets, and 84.85% for RF in hold-out validation, and 91.47% for K-star, 89.92% for RBFNets, and 83.72% for RF in leave-one-out cross-validation.
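
As a sketch of how such a comparison could be set up, the snippet below evaluates classifiers on a MAP feature matrix with both hold-out and leave-one-out validation and reports accuracy and AUC. Scikit-learn does not provide RBF networks or the K* learner, so an RBF-kernel SVM and a k-nearest-neighbour model are used here as stand-ins alongside a random forest; the feature matrix X (MAP scores per participant) and labels y (experienced vs. non-experienced) are assumed inputs, and this is not the authors' exact pipeline.

```python
from sklearn.ensemble import RandomForestClassifier
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC
from sklearn.model_selection import LeaveOneOut, cross_val_predict, train_test_split
from sklearn.metrics import accuracy_score, roc_auc_score

def compare_classifiers(X, y):
    """Evaluate three classifiers with hold-out and leave-one-out validation.

    X : (n_participants, n_maps) array of MAP scores
    y : (n_participants,) binary labels, 1 = experienced, 0 = non-experienced
    """
    models = {
        "RBF-kernel SVM (stand-in for RBFNet)": SVC(kernel="rbf", probability=True),
        "k-NN (stand-in for K*)": KNeighborsClassifier(n_neighbors=3),
        "Random forest": RandomForestClassifier(n_estimators=100, random_state=0),
    }

    # Hold-out split: train on 70% of participants, test on the remaining 30%.
    X_tr, X_te, y_tr, y_te = train_test_split(
        X, y, test_size=0.3, stratify=y, random_state=0
    )

    for name, model in models.items():
        model.fit(X_tr, y_tr)
        holdout_acc = accuracy_score(y_te, model.predict(X_te))
        holdout_auc = roc_auc_score(y_te, model.predict_proba(X_te)[:, 1])

        # Leave-one-out cross-validation over all participants.
        loo_pred = cross_val_predict(model, X, y, cv=LeaveOneOut())
        loo_acc = accuracy_score(y, loo_pred)

        print(f"{name}: hold-out acc={holdout_acc:.3f}, "
              f"hold-out AUC={holdout_auc:.3f}, LOO acc={loo_acc:.3f}")
```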

Conclusions

The three proposed methods demonstrated high performance in classifying laparoscopic surgeons according to their level of psychomotor skills. Together with motion analysis and three laparoscopic tasks of the Fundamentals of Laparoscopic Surgery (FLS) program, these classifiers provide a means of objectively classifying the surgical competence of surgeons using existing laparoscopic box trainers.

Keywords

Laparoscopic surgery · Training · Motion analysis · Objective assessment · Classification · Video-based tracking

Notes

Acknowledgements

The authors thank all the surgeons, residents, and medical students for their enthusiastic and kind participation in this study.

Funding

The authors declare that no grants or other funding were received for this work.

Compliance with ethical standards

Conflict of interest

The authors declare that they have no conflict of interest.

Informed consent

Informed consent was obtained from all individual participants included in the study.

Copyright information

© CARS 2019

Authors and Affiliations

  1. Instituto de Ciencias Aplicadas y Tecnología (ICAT), Universidad Nacional Autónoma de México (UNAM), Ciudad de México, Mexico
  2. Department of Biomedical Informatics, Faculty of Medicine, Universidad Nacional Autónoma de México (UNAM), Ciudad de México, Mexico
  3. Laboratory of Computing Technologies and Electronics, Faculty of Engineering, Universidad Autónoma de Guerrero, Chilpancingo, Mexico
  4. Biomedical Engineering and Telemedicine Centre (GBT), ETSI Telecomunicación, Center for Biomedical Technology, Universidad Politécnica de Madrid (UPM), Madrid, Spain
  5. Intelligent Computing Laboratory, Centro de Innovación y Desarrollo Tecnológico en Computación (CIDETEC-IPN), Ciudad de México, Mexico
  6. Department of Electrical Engineering, Bioelectronics Section, Centro de Investigación y de Estudios Avanzados del Instituto Politécnico Nacional (CINVESTAV-IPN), Ciudad de México, Mexico
