Abstract
We introduce a novel 3D gaze-guided robotic scrub nurse (RN) and test the platform in simulated surgery to determine its usability and acceptability among clinical teams. Surgeons and trained scrub nurses performed an ex vivo task on pig colon. Surgeons used gaze, via wearable eye-tracking glasses, to select surgical instruments on a screen, in turn prompting the RN to deliver the instrument. Human- and robot-assisted tasks (HT vs. RT) were compared. Real-time gaze-screen interaction was based on a framework combining conventional wearable eye-tracking, a motion capture system, and RGB-D cameras. NASA-TLX and Van der Laan's technology acceptance questionnaires were collected and analyzed. Ten teams of surgical trainees (ST) and scrub nurses (HN) participated. Overall, NASA-TLX feedback was positive: ST and HN revealed no statistically significant difference in overall task load, and perceptions of task performance were unaffected, although ST reported frustration. Van der Laan's scores showed positive usefulness and satisfaction following RN use. There was no significant difference in task interruptions across HT vs. RT, and no statistical difference in time to task completion between the two groups. Quantitative and qualitative feedback was positive, and the source of frustration was identified. Importantly, there was no significant difference in task workflow or operative time, with overall perceptions of task performance remaining unchanged between HT and RT.
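The abstract describes instrument selection by fixating a screen region, which then triggers the robot's delivery. The paper does not specify the selection logic; a common approach in gaze interaction is dwell-time selection, sketched below as a minimal illustration. The region names, frame-based dwell threshold, and function name are all hypothetical, not taken from the authors' system.

```python
# Illustrative sketch of dwell-based gaze selection (NOT the authors'
# implementation): the instrument whose on-screen region holds the gaze
# for a consecutive run of samples is selected, which would then
# trigger a delivery request to the robotic scrub nurse.

DWELL_FRAMES = 30  # hypothetical threshold, e.g. ~0.5 s at 60 Hz


def select_instrument(gaze_points, regions, dwell_frames=DWELL_FRAMES):
    """Return the name of the instrument region fixated for
    `dwell_frames` consecutive gaze samples, or None if no region
    accumulates enough dwell.

    gaze_points: iterable of (x, y) screen coordinates.
    regions: dict mapping instrument name -> (x0, y0, x1, y1) box.
    """
    current, count = None, 0
    for x, y in gaze_points:
        # Find which instrument region (if any) contains this sample.
        hit = None
        for name, (x0, y0, x1, y1) in regions.items():
            if x0 <= x <= x1 and y0 <= y <= y1:
                hit = name
                break
        # Extend the dwell run if still on the same region, else reset.
        if hit is not None and hit == current:
            count += 1
        else:
            current, count = hit, 1
        if current is not None and count >= dwell_frames:
            return current
    return None
```

A dwell threshold trades selection speed against accidental triggers from glances, which is one plausible place the reported frustration could arise if tuned too conservatively.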
Acknowledgements
This research project is supported by the NIHR Imperial Biomedical Research Centre (BRC).
Copyright information
© 2019 Springer Nature Switzerland AG
About this paper
Cite this paper
Kogkas, A., Ezzat, A., Thakkar, R., Darzi, A., Mylonas, G. (2019). Free-View, 3D Gaze-Guided Robotic Scrub Nurse. In: Shen, D., et al. Medical Image Computing and Computer Assisted Intervention – MICCAI 2019. MICCAI 2019. Lecture Notes in Computer Science(), vol 11768. Springer, Cham. https://doi.org/10.1007/978-3-030-32254-0_19
DOI: https://doi.org/10.1007/978-3-030-32254-0_19
Publisher Name: Springer, Cham
Print ISBN: 978-3-030-32253-3
Online ISBN: 978-3-030-32254-0