Intelligent HMI in Orthopedic Navigation
The human-machine interface (HMI) is an essential component of image-guided orthopedic navigation systems. The HMI provides the primary platform for merging surgically relevant pre- and intraoperative images from different modalities with 3D models of anatomical structures and implants to support surgical planning and navigation. With the various input-output techniques of the HMI, surgeons can intuitively manipulate anatomical models generated from medical images and/or implant models for surgical planning. Furthermore, the HMI recreates sight, sound, and touch feedback to guide surgical operations, helping surgeons perceive more relevant information, e.g., anatomical structures and surrounding tissue, the mechanical axis of the limbs, and even the mechanical properties of tissue. Thus, with the help of an interactive HMI, precision operations such as cutting, drilling, and implantation can be performed more easily and safely.
Classic HMIs are based on 2D displays and standard computer input devices. In contrast, modern virtual reality (VR) and augmented reality (AR) techniques allow more information to be displayed for surgical navigation, and various such approaches have been applied to image-guided orthopedic therapy. To realize rapid image-based modeling and to create effective interaction and feedback, intelligent algorithms have been developed: these enable fast image-to-image and image-to-patient registration, and algorithms to compensate for the visual offset in AR displays have also been investigated. To accomplish more effective human-computer interaction, various input methods and force-sensing/force-reflecting methods have been developed. This chapter reviews human-machine interface techniques for image-guided orthopedic navigation, analyzes several examples of clinical applications, and discusses trends in intelligent HMI for orthopedic navigation.
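To make the image-to-patient registration step concrete, the core of point-based rigid registration can be sketched with the standard least-squares SVD (Kabsch/Horn) solution. This is a generic illustration of the technique, not the specific algorithm used in any system reviewed in this chapter; the function name and interface are hypothetical.

```python
import numpy as np

def register_rigid(source, target):
    """Find rotation R and translation t minimizing
    sum_i || R @ source[i] + t - target[i] ||^2,
    given paired Nx3 point sets (e.g., fiducials located
    in the image and on the patient)."""
    src_c = source.mean(axis=0)          # centroids
    tgt_c = target.mean(axis=0)
    H = (source - src_c).T @ (target - tgt_c)  # cross-covariance
    U, _, Vt = np.linalg.svd(H)
    # Guard against a reflection (det = -1) in the solution
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = tgt_c - R @ src_c
    return R, t
```

Given at least three non-collinear corresponding points, the recovered transform maps image coordinates into patient (tracker) coordinates; in practice the residual fiducial registration error is also reported as a quality check.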
Keywords: Intelligent human-machine interface · Virtual reality (VR) · Augmented reality (AR) · Orthopedic navigation