A practical marker-less image registration method for augmented reality oral and maxillofacial surgery

  • Junchen Wang
  • Yu Shen
  • Shuo Yang
Original Article



Image registration lies at the core of augmented reality (AR), as it aligns the virtual scene with reality. In AR surgical navigation, the performance of image registration is vital to the surgical outcome.


This paper presents a practical marker-less image registration method for AR-guided oral and maxillofacial surgery, in which a virtual scene is generated and mixed with reality to guide the surgical operation or visualize the surgical outcome as a video see-through overlay. An intraoral 3D scanner is employed to acquire the patient's teeth shape model intraoperatively. The shape model is then registered with a custom-made stereo camera system using a novel 3D stereo matching algorithm, and with the patient's CT-derived 3D model using an iterative closest point (ICP) scheme. By leveraging the intraoral 3D scanner, the CT space and the stereo camera space are associated, so that surrounding anatomical models and virtual implants can be overlaid on the camera's view to achieve AR surgical navigation.
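The CT-to-scan alignment step uses an iterative closest point (ICP) scheme. The following is a minimal rigid-ICP sketch, not the authors' implementation, assuming point clouds given as NumPy arrays and using SciPy's `cKDTree` for nearest-neighbour matching and an SVD-based (Kabsch) rigid fit at each iteration:

```python
import numpy as np
from scipy.spatial import cKDTree

def best_rigid_transform(src, dst):
    """Least-squares rigid transform (R, t) mapping src onto dst (Kabsch/SVD)."""
    cs, cd = src.mean(axis=0), dst.mean(axis=0)
    H = (src - cs).T @ (dst - cd)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:      # guard against a reflection solution
        Vt[-1] *= -1
        R = Vt.T @ U.T
    return R, cd - R @ cs

def icp(source, target, iters=50, tol=1e-7):
    """Basic ICP: alternate nearest-neighbour matching and rigid refitting."""
    tree = cKDTree(target)
    R_total, t_total = np.eye(3), np.zeros(3)
    cur = source.copy()
    prev_err = np.inf
    for _ in range(iters):
        dist, idx = tree.query(cur)                 # closest-point correspondences
        R, t = best_rigid_transform(cur, target[idx])
        cur = cur @ R.T + t                         # apply the incremental update
        R_total, t_total = R @ R_total, R @ t_total + t
        err = dist.mean()
        if abs(prev_err - err) < tol:               # stop when the error plateaus
            break
        prev_err = err
    return R_total, t_total
```

Like all plain ICP variants, this converges only from a reasonable initial pose; a globally optimal alternative such as Go-ICP avoids that limitation at higher cost.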


Jaw phantom experiments were performed to evaluate the target registration error of the overlay, yielding an average error of less than 0.50 mm at a time cost of less than 0.5 s. A volunteer trial was also conducted to demonstrate clinical feasibility.
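The reported phantom accuracy is a target registration error (TRE): the distance between target points mapped by the estimated rigid transform and their measured ground-truth positions. A small illustrative helper (hypothetical, not taken from the paper) makes the metric concrete:

```python
import numpy as np

def target_registration_error(R, t, targets_model, targets_true):
    """Mean Euclidean distance (e.g. in mm) between model-space targets
    mapped by the estimated rigid transform (R, t) and their true positions."""
    mapped = targets_model @ R.T + t
    return np.linalg.norm(mapped - targets_true, axis=1).mean()
```

Note that TRE is measured at clinically relevant target points, not at the points used to compute the registration, so it reflects the accuracy a surgeon would actually experience at the surgical site.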


The proposed registration method does not rely on any external fiducial markers attached to the patient. It runs automatically to maintain a correct AR scene, overcoming the misalignment caused by patient movement. It is therefore noninvasive and practical for oral and maxillofacial surgery.


Keywords: Augmented reality · Oral and maxillofacial surgery · Video see-through · Image registration



This work was partially supported by National Natural Science Foundation of China (Grant No. 61701014).

Compliance with ethical standards

Conflict of interest

The authors declare that they have no conflict of interest.

Ethical approval

All procedures performed in studies involving human participants were in accordance with the ethical standards of our institutional review board and with the 1964 Helsinki Declaration and its later amendments or comparable ethical standards.

Informed consent

Informed consent was obtained from all individual participants included in the study.


Copyright information

© CARS 2019

Authors and Affiliations

  1. School of Mechanical Engineering and Automation, Beihang University, Beijing, China
  2. Beijing Advanced Innovation Center for Biomedical Engineering, Beihang University, Beijing, China
  3. Stomatological Hospital, Southern Medical University, Guangzhou, China
