Background
The artificial intelligence (AI) model reported by Kumazu et al. has been improved and can now highlight connective tissues and nerves in the surgical field intraoperatively on a monitor.1 This improved surgical AI prototype, named Eureka (Anaut, Inc., Tokyo, Japan), is currently available only for research and education.
We examined whether AI navigation (AIN) using Eureka assists trainees in recognizing nerves during colorectal surgery.
Methods
This study comprised 10 patients who underwent laparoscopic left colorectal surgery between November 2022 and December 2022. All surgeries were performed by one attending doctor, who was qualified under the endoscopic surgical skill qualification system of the Japan Society for Endoscopic Surgery, together with several trainees.2 We examined whether the attending doctor and the trainee were able to recognize each nerve at the same time. When the trainee was unable to recognize a nerve, we examined whether viewing the AIN monitor connected to the laparoscopic system could assist the trainee in recognizing it (Figs. 1 and 2). The AIN monitor was viewed only at the time of evaluation. This study was approved by the Research Ethics Committee of our institute (2022–27). All patients provided consent to participate in this study. The results are expressed as the median and interquartile range.
Results
There were three patients with sigmoid colon cancer, six with rectal cancer, and one with sigmoid diverticulum. The median operative duration was 246 [198–272] min, the median blood loss was 5 [1–9] ml, and the median duration of postoperative hospital stay was 9 [8–11] days.
The frequencies at which the trainees were unable to recognize the nerves without AIN were as follows: right hypogastric nerve during sigmoid colon mobilization, 9/24 (38%); lumbar splanchnic nerves during dissection of the dorsal inferior mesenteric artery, 8/24 (33%); bilateral hypogastric nerves during dissection of the dorsal rectum, 10/24 (42%); and pelvic visceral nerves (S3, S4) during rectal dissection, 12/16 (75%).
When the trainees could not recognize the hypogastric nerves or lumbar splanchnic nerves, AIN aided in anatomical recognition in all cases. However, for the pelvic visceral nerves, AIN provided assistance in only 25% of cases.
Differences between the two junior trainees with less than 5 years of experience and the three senior trainees with 5–10 years of experience are shown in Table 1.
Discussion
AIN using Eureka provided intraoperative real-time visualization of nerves, which was safe for education and did not negatively affect surgical outcomes.
Because Eureka can highlight nerves on recorded surgical videos, AIN may improve the efficiency of trainees’ self-study.
Although the utility of AIN varies by anatomical structure, AIN can potentially assist in anatomical recognition and contribute to surgical education, especially for young trainees.
Data Availability
The data that support the findings of this study are available from the corresponding author upon request.
References
Kumazu Y, Kobayashi N, Kitamura N, Rayan E, Neculoiu P, Misumi T, Hojo Y, Nakamura T, Kumamoto T, Kurahashi Y, Ishida Y, Masuda M, Shinohara H. Automated segmentation by deep learning of loose connective tissue fibers to define safe dissection planes in robot-assisted gastrectomy. Sci Rep 2021;11:21198.
Akagi T, Endo H, Inomata M, Yamamoto H, Mori T, Kojima K, Kuroyanagi H, Sakai Y, Nakajima K, Shiroshita H, Etoh T, Saida Y, Yamamoto S, Hasegawa H, Ueno H, Kakeji Y, Miyata H, Kitagawa Y, Watanabe M. Clinical impact of Endoscopic Surgical Skill Qualification System (ESSQS) by Japan Society for Endoscopic Surgery (JSES) for laparoscopic distal gastrectomy and low anterior resection based on the National Clinical Database (NCD) registry. Ann Gastroenterol Surg 2020;4:721-734.
Acknowledgements
The authors would like to thank the operating room staff of the Kawaguchi Municipal Medical Center. The authors would also like to thank Dr. Ken Hanyu (Kajigaya Clinic) for total support. Finally, the authors would like to express special thanks to Nao Kobayashi, Yuta Kumazu, Sakiko Tamatani, and Kyohei Fukata (Anaut, Inc.).
Funding
This work was supported by JSPS KAKENHI Grant Number 22K16524.
Author information
Contributions
SR was responsible for conceptualization of the study, the methodology, data curation, formal analysis, project administration, writing — original draft, and writing — review and editing. TK, KG, and TK jointly performed the surgery and aided in the investigation. JS and RI aided in the investigation. YN supervised the study.
Ethics declarations
Conflict of Interest
The authors declare no competing interests.
Additional information
Publisher's Note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Rights and permissions
Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/.
Cite this article
Ryu, S., Goto, K., Kitagawa, T. et al. Real-time Artificial Intelligence Navigation-Assisted Anatomical Recognition in Laparoscopic Colorectal Surgery. J Gastrointest Surg 27, 3080–3082 (2023). https://doi.org/10.1007/s11605-023-05819-1