UX Evaluation Design of UTAssistant: A New Usability Testing Support Tool for Italian Public Administrations

  • Stefano Federici
  • Maria Laura Mele
  • Rosa Lanzilotti
  • Giuseppe Desolda
  • Marco Bracalenti
  • Fabio Meloni
  • Giancarlo Gaudino
  • Antonello Cocco
  • Massimo Amendola
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 10901)

Abstract

Since 2012, usability testing in Italian public administration (PA) has been guided by the eGLU 2.1 technical protocol, which provides a set of principles and procedures for carrying out usability assessments in a controlled and predictable way. This paper describes a new support tool for usability testing, designed to facilitate the application of eGLU 2.1, and the design of the methodology used to evaluate the tool's User eXperience (UX). The tool, called UTAssistant (Usability Tool Assistant), has been developed entirely as a Web platform: it supports evaluators in designing usability tests and analyzing the data gathered during them, and it guides Web users step by step through the tasks set by the evaluator. It also provides a library of questionnaires to administer to Web users at the end of a usability test. The UX evaluation methodology adopted to assess the UTAssistant platform combines standard and new bio-behavioral evaluation methods. From a technological point of view, UTAssistant is an important step forward in the assessment of Web services in PA, fostering a standardized usability testing procedure that, unlike existing usability testing software and platforms, requires no dedicated devices.

Keywords

Experimental UX evaluation methodology · Usability evaluation tool · Public administration · UX · Semi-automatic assessment · International usability standards

Copyright information

© Springer International Publishing AG, part of Springer Nature 2018

Authors and Affiliations

  • Stefano Federici (1)
  • Maria Laura Mele (1)
  • Rosa Lanzilotti (2)
  • Giuseppe Desolda (2)
  • Marco Bracalenti (1)
  • Fabio Meloni (1)
  • Giancarlo Gaudino (3)
  • Antonello Cocco (3)
  • Massimo Amendola (3)
  1. Department of Philosophy, Social and Human Sciences and Education, University of Perugia, Perugia, Italy
  2. Department of Computer Science, University of Bari Aldo Moro, Bari, Italy
  3. ISCOM – Superior Institute of Communication and Information Technologies, Ministry of Economic Development, Rome, Italy