A novel gaze-supported multimodal human–computer interaction for ultrasound machines

  • Hongzhi Zhu
  • Septimiu E. Salcudean
  • Robert N. Rohling
Original Article

Abstract

Purpose

Conventional ultrasound (US) machines employ a physical control panel (PCP) as the primary user interface for machine control. This panel sits beside the main machine display, which requires the operator’s constant attention, so shifting attention to the control panel can interrupt the flow of a medical examination. Some ultraportable machines also lack many physical controls. Furthermore, the need to both control the US machine and observe the US image may lead practitioners to adopt unergonomic postures and repetitive motions that can cause work-related injuries. Therefore, there is a need for a more efficient human–computer interaction method for US machines.

Methods

To address some of the limitations of the PCP, we propose to merge the PCP into the main screen of the US machine. We propose to use gaze tracking and a handheld controller so that machine control can be achieved via a multimodal human–computer interaction (HCI) method that does not require the operator to touch the screen or look away from the US image. As a first step, a gaze-positioned pop-up menu and a measurement tool were designed on top of the US image for efficient machine control.
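The abstract does not detail the interaction logic, so the following is only a minimal sketch of the gaze-anchored pop-up menu pattern it describes: the handheld controller's button acts as a clutch (gaze alone never triggers an action, avoiding the well-known Midas-touch problem), the menu opens at the filtered gaze point on button press, and the item nearest the gaze is committed on release. All class, item, and parameter names (GazeSample, PopupMenu, the circular layout, the moving-average filter) are illustrative assumptions, not the authors' implementation or any eye-tracker SDK API.

```python
# Minimal sketch of a gaze-anchored pop-up menu with a handheld "clutch"
# button. All names are hypothetical; this is not the authors' code.
from collections import deque
from dataclasses import dataclass
import math


@dataclass
class GazeSample:
    x: float  # gaze point in screen pixels
    y: float


class GazeFilter:
    """Moving-average filter to damp tracker jitter before anchoring UI."""

    def __init__(self, window: int = 5):
        self.samples = deque(maxlen=window)

    def update(self, s: GazeSample) -> GazeSample:
        self.samples.append(s)
        n = len(self.samples)
        return GazeSample(sum(p.x for p in self.samples) / n,
                          sum(p.y for p in self.samples) / n)


class PopupMenu:
    """Menu anchored at the gaze point when the handheld button is pressed.

    Gaze alone never triggers an action (avoiding the Midas-touch problem);
    the item nearest the gaze point is committed only on button release.
    """

    def __init__(self, items, radius: float = 80.0):
        self.items = items    # e.g. freeze, measure, zoom, depth controls
        self.radius = radius  # menu radius in pixels; must exceed typical
                              # gaze-tracking error so targets stay separable
        self.anchor = None

    def open(self, gaze: GazeSample):
        self.anchor = gaze  # button pressed: anchor the menu under the gaze

    def item_positions(self):
        # Lay the items out on a circle around the anchor point.
        n = len(self.items)
        for i, item in enumerate(self.items):
            a = 2 * math.pi * i / n
            yield item, (self.anchor.x + self.radius * math.cos(a),
                         self.anchor.y + self.radius * math.sin(a))

    def select(self, gaze: GazeSample) -> str:
        # Button released: commit the item nearest the current gaze point.
        return min(self.item_positions(),
                   key=lambda ip: math.hypot(ip[1][0] - gaze.x,
                                             ip[1][1] - gaze.y))[0]


# Usage: press opens the menu at the filtered gaze point; release commits.
filt = GazeFilter()
menu = PopupMenu(["Freeze", "Measure", "Zoom", "Depth"])
menu.open(filt.update(GazeSample(512.0, 384.0)))  # button pressed
gaze = filt.update(GazeSample(512.0, 300.0))      # operator looks upward
print(menu.select(gaze))                           # prints "Depth"
```

A circular layout and a jitter filter are natural choices here because gaze-tracking accuracy is typically on the order of a degree of visual angle, so on-screen menu targets must be kept well separated.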

Results

A comparative study was performed on the BK Medical SonixTOUCH US machine. Participants were asked to complete the task of measuring the area of an ellipse-shaped tumor in a phantom using both our gaze-supported HCI method and the traditional method. The user study indicates that task completion time can be reduced by 20.6% when using our gaze-supported HCI, while imposing no extra workload on the operators.
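The abstract does not state how the area is computed, but for an ellipse-shaped region the standard formula is A = πab, where a and b are the semi-major and semi-minor axes. Below is a minimal sketch of such a caliper-style computation; the function name and the pixel-to-millimetre scale input are assumptions for illustration, not the authors' implementation.

```python
import math


def ellipse_area_mm2(major_px: float, minor_px: float, mm_per_px: float) -> float:
    """Area of an ellipse from on-screen axis lengths.

    major_px / minor_px: full axis lengths in pixels, e.g. spanned by two
    pairs of gaze-placed caliper points; mm_per_px: the US image scale
    (assumed known from the machine's imaging settings).
    """
    a = 0.5 * major_px * mm_per_px  # semi-major axis in mm
    b = 0.5 * minor_px * mm_per_px  # semi-minor axis in mm
    return math.pi * a * b


# Example: a 240 px by 160 px ellipse at 0.1 mm/px -> about 301.6 mm^2
print(round(ellipse_area_mm2(240, 160, 0.1), 1))
```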

Conclusions

Our preliminary study suggests that, when combined with a simple handheld controller, eye gaze tracking can be integrated into the US machine HCI for more efficient machine control.

Keywords

Gaze tracking · Ultrasound machine · Human–computer interaction · Multimodal interaction

Notes

Acknowledgements

The authors would like to thank Mrs. Vickie Lessoway, Mrs. Irene Tong, Mr. Zhaoshuo Li, and Mr. Neerav Patel for their support and help during the study. The authors are also grateful to the volunteers who took part in the user study; their comments and feedback were essential to improving the design of our system.

Compliance with ethical standards

Conflict of interest

The authors declare that they have no conflict of interest.

Ethical approval

Our research was approved by the Behavioral Research Ethics Board at the University of British Columbia (UBC), Vancouver, Canada. All procedures performed in our study involving human participants were in accordance with the ethical standards of the institutional and/or national research committee and with the 1964 Helsinki Declaration and its later amendments or comparable ethical standards.

Supplementary material

Supplementary material 1 (MP4 52,293 KB)

Copyright information

© CARS 2019

Authors and Affiliations

  1. Department of Biomedical Engineering, University of British Columbia, Vancouver, Canada
  2. Department of Electrical and Computer Engineering, University of British Columbia, Vancouver, Canada
