Usability Evaluation of an Online Workplace Health and Safety Return on Investment Calculator

  • Olivia Yu
  • Kelly Johnstone
  • Margaret Cook
Conference paper
Part of the Advances in Intelligent Systems and Computing book series (AISC, volume 819)

Abstract

The online Return on Investment (ROI) calculator developed by the Queensland Government (Australia) is a publicly available financial evaluation tool that captures the direct and indirect costs and benefits of implementing a workplace health and safety (WHS) intervention to estimate a comprehensive ROI figure. Whilst cost is an important factor in decision-making, how useful is such a tool to WHS professionals in deciding whether to implement a WHS control? Aim: This study aimed to evaluate user perceptions of the online ROI calculator’s usefulness, ease of use, and value. Methods: Google Analytics data and the ROI calculator’s results were obtained from the Queensland Government database, and a usability questionnaire was administered to capture how users perceive the online calculator’s ease of use and usefulness. Results: Google Analytics recorded 18,633 sessions, 12,803 new users, an average of 1.12 page views per session, an average session duration of 3 min 30 s, and a bounce rate of 86.24%. The ROI calculator yielded 548 observations with a 13.50% conversion rate (74 completions). Overall, users (n = 17) perceived the ROI calculator to be average to slightly positive in terms of usefulness and ease of use. Conclusions: The results indicate effective site promotion and interest in ROI for WHS; however, the calculator retains few users through to completion. Research is ongoing to better understand the roadblocks to greater tool use and to improve the tool’s design and user experience.
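The conversion figure reported in the abstract is simple arithmetic, and the ROI metric the calculator estimates follows the standard benefit/cost ratio. A minimal sketch of both (the `roi` formula here is the conventional definition, not necessarily the calculator's exact internals, and the function names are illustrative):

```python
def roi(total_benefits: float, total_costs: float) -> float:
    """Standard ROI: net benefit expressed as a fraction of costs."""
    return (total_benefits - total_costs) / total_costs


def conversion_rate(completions: int, observations: int) -> float:
    """Share of recorded calculator observations that reached completion."""
    return completions / observations


# Figures reported in the abstract: 74 completions out of 548 observations.
print(f"{conversion_rate(74, 548):.2%}")  # → 13.50%
```

An intervention costing $100,000 that returns $150,000 in benefits would score `roi(150_000, 100_000) == 0.5`, i.e. a 50% return.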

Keywords

Return on investment · Usability evaluation · Cost benefit analysis · Occupational health and safety

Notes

Acknowledgements

I would like to thank The Office of Industrial Relations, Queensland Government, Australia for funding this PhD research project, with special thanks to Sebastian Bielen and Nita Maynard.

Copyright information

© Springer Nature Switzerland AG 2019

Authors and Affiliations

  1. School of Earth and Environmental Sciences, The University of Queensland, Brisbane, Australia
  2. The Office of Industrial Relations, Queensland Government, Brisbane, Australia