Projector-based surgeon–computer interaction on deformable surfaces

  • Bojan Kocev
  • Felix Ritter
  • Lars Linsen
Original Article



Purpose

Intuitive and easy-to-operate interaction is essential for medical augmented reality in the operating room. Commonly, intra-operative navigation information is displayed on an installed monitor, forcing the operating surgeon to repeatedly shift focus between the monitor and the surgical site during navigation. Projector-based augmented reality has the potential to alleviate this problem. The aim of our work is to use a projector for visualization and to provide intuitive means for direct interaction with the projected information.


Methods

A consumer-grade projector is used to visualize preoperatively defined surgical planning data. The virtual information can be projected on any deformable surface, and the surgeon can interact with it directly. A Microsoft Kinect camera captures both the surgeon's interactions and the deformations of the surface over time. After calibration of the projector and the Kinect camera, the fingertips are localized automatically. A point cloud representation of the surface is used to determine the surgeon's interaction with the projected virtual information. Interaction is detected by estimating the proximity of the surgeon's fingertips to the interaction zone, applying the projector–Kinect calibration information, and is performed using multi-touch gestures.
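The proximity test between fingertips and the deformable surface can be sketched as a nearest-neighbour distance check against the point cloud. The minimal Python sketch below assumes fingertip and surface positions (in metres) have already been extracted from the calibrated Kinect depth data; the function name and the 1 cm touch threshold are illustrative choices, not values from the paper.

```python
import numpy as np

def detect_touches(fingertips, surface_points, touch_threshold=0.01):
    """Flag each fingertip that lies within touch_threshold metres of the
    surface point cloud (brute-force nearest-neighbour distance; a real
    pipeline would use a spatial index over the live point cloud)."""
    diffs = surface_points[None, :, :] - fingertips[:, None, :]
    dists = np.linalg.norm(diffs, axis=2).min(axis=1)
    return dists < touch_threshold

# Toy example: a flat 20 x 20 cm surface patch sampled at z = 0,
# one fingertip 5 mm above it (touching) and one 8 cm above it (hovering).
surface = np.array([[x, y, 0.0] for x in np.linspace(0, 0.2, 20)
                                for y in np.linspace(0, 0.2, 20)])
tips = np.array([[0.1, 0.1, 0.005],
                 [0.1, 0.1, 0.08]])
print(detect_touches(tips, surface))  # -> [ True False]
```

Because the surface is re-acquired every frame, the same test works unchanged when the surface deforms; only `surface_points` is refreshed.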


Results

In our experimental surgical scenario, the surgeon stands in front of the Microsoft Kinect camera while relevant medical information is projected onto the interaction zone. A hand-wave gesture initiates the tracking of the hand. The user can then interact with the projected virtual information using the defined multi-touch gestures. Thus, all information, such as preoperative planning data, is provided to the surgeon and his/her team intra-operatively in a familiar context.
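The hand-wave initiation gesture can be sketched as counting direction reversals of the tracked hand centroid over a sliding window of depth frames. The class name, window size, reversal count, and amplitude threshold below are illustrative assumptions, not values from the paper.

```python
import math
from collections import deque

class WaveDetector:
    """Declares a hand-wave once enough left/right direction reversals of
    the hand-centroid x position occur within the sliding frame window."""

    def __init__(self, window=30, min_reversals=4, min_move=0.02):
        self.xs = deque(maxlen=window)   # recent x positions in metres
        self.min_reversals = min_reversals
        self.min_move = min_move         # ignore jitter below this amplitude

    def update(self, x):
        self.xs.append(x)
        reversals, direction = 0, 0
        prev = self.xs[0]
        for cur in list(self.xs)[1:]:
            step = cur - prev
            if abs(step) < self.min_move:
                continue                 # accumulate until motion is significant
            d = 1 if step > 0 else -1
            if direction and d != direction:
                reversals += 1
            direction, prev = d, cur
        return reversals >= self.min_reversals

# Simulated wave: hand centroid oscillating with 10 cm amplitude.
det = WaveDetector()
fired = any(det.update(0.1 * math.sin(0.8 * i)) for i in range(30))
print(fired)  # -> True
```

A stationary hand never accumulates reversals, so incidental hand positions in front of the camera do not trigger tracking.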


Conclusions

We enabled the projection of virtual information onto an arbitrarily shaped surface and used a Microsoft Kinect camera to capture the interaction zone and the surgeon's actions. The system eliminates the need for the surgeon to alternate between viewing the surgical site and the monitor, removing unnecessary distractions, and may enhance the surgeon's performance.


Keywords: Surgeon–computer interaction · Multi-touch gestures · Projector-based medical data visualization · Image processing



Supplementary material

Supplementary material 1 (MPG, 33,078 KB)

Supplementary material 2 (MPG, 12,208 KB)



Copyright information

© CARS 2013

Authors and Affiliations

  1. Fraunhofer MEVIS, Bremen, Germany
  2. Jacobs University, Bremen, Germany
