Use of a Mixed-Reality System to Improve the Planning of Brain Tumour Resections: Preliminary Results
The lack of intuitive visualization techniques for neurosurgical planning is a challenging hurdle faced by neurosurgeons and neurosurgery residents. Within this context, this paper describes the development and evaluation of an Augmented Reality (AR) system geared towards planning brain tumour resection interventions. Successful resection of a tumour or hematoma requires careful pre-operative planning to avoid damaging the brain. We hypothesize that our proposed AR system facilitates the planning of tumour resection operations by making more effective use of individuals' visuospatial abilities to assess patient-specific data. To test our hypothesis, a number of experiments were conducted in which subjects performed relevant spatial judgment tasks using three conventional visualization approaches as well as the proposed AR system. Our preliminary results indicate that, compared to traditional methods, the proposed AR system a) greatly improves user performance in tasks involving 3D spatial reasoning about the tumour relative to the anatomical context, b) reduces error associated with mental transformation, and c) supports generic spatial reasoning skills, independent of the sensory-motor tasks performed.
Keywords: Augmented Reality · Neurosurgical Planning · Tumour Resection