Back to User-Centered Usability Testing

  • Kimmo Tarkkanen
  • Pekka Reijonen
  • Franck Tétard
  • Ville Harkke
Part of the Lecture Notes in Computer Science book series (LNCS, volume 7946)

Abstract

Usability testing is a widely used evaluation method for product design during and after development. Conventional usability testing applies short, discrete test tasks and task scenarios based on the tasks the product is designed to support. Conventional test task design therefore relies heavily on representations of the specified context of use and the specified user requirements of the proposed design solution. However, a premature commitment to the specified context, requirements, and proposed solutions may limit the scope of usability testing in a manner that hinders its capability to elicit and validate new user requirements, which is one of the objectives of the evaluation phase in the iterative user-centered design process. In this paper, we introduce a user-centered task design approach that allows test participants to follow their natural work flow and freely express their needs during a test session. The main idea of this open-ended task approach is to break the tight link between the produced design solutions and the tasks used in the usability test, and in this way increase the probability that novel user needs can emerge during a test session. Empirical results from a case study are used to illustrate the approach, and its prerequisites, strengths, and limitations are discussed.

Keywords

Usability testing · Task design · Usability evaluation · User-centered design · Requirements elicitation

Copyright information

© Springer-Verlag Berlin Heidelberg 2013

Authors and Affiliations

  • Kimmo Tarkkanen (1)
  • Pekka Reijonen (1)
  • Franck Tétard (2)
  • Ville Harkke (1)
  1. Information Systems Science, University of Turku, Turku, Finland
  2. Department of Informatics and Media, Uppsala University, Uppsala, Sweden
